It seems to me that telling women to accept their bodies just as they are is sort of the opposite of empowering. Let me explain. The problem with our ideas about our bodies (at least in my humble opinion) is not that we want to change them; changing our bodies requires willpower, hard work, dedication, and commitment, and all of those are good things. The problem is that we want to change our bodies into what society tells us they should be. I see absolutely no problem with people wanting to be the best they can be in every way. We should not be telling women to accept their bodies as they are; we should be telling them to make their bodies what they want them to be. There is a great sense of empowerment that comes from watching your body change and improve because of your own hard work. That is something we should be cultivating, not discouraging.
I personally don't believe I will ever stop trying to improve my body; I'm just not the type of person who is ever completely satisfied with anything. That in no way means I don't appreciate the progress I have made, though. I feel good about all the work I have done and all the benefits I have gained from it. This is not because I look the way society tells me I should (because I don't), but because I look more and more the way I want to every day. Instead of telling each other to accept our bodies the way they are, we should be telling each other to decide who we want to be, and then do whatever it takes to become that. Myself, I'm sort of going for the "warrior woman" look.
"Have you ever been with a warrior woman?"