Most people don’t know how good their bodies are designed to feel. I’m not sure whether it’s because we were never taught to eat well or because we simply don’t care. Somewhere along the way we started treating our bodies like garbage dumps instead of temples. We get more satisfaction from the way a cookie tastes than we do from physically feeling good.
It’s time to break the cycle and show our children that healthy food is delicious and that physically feeling good is worth it.
What do you think? Were most people simply never taught to eat well, or do they just not care about their health?