It’s March, which means that it’s Women’s History Month, and what better
time to talk about… Feminism.
What is feminism?
Feminism is the advocacy of women’s rights on the basis of the equality of the sexes.
Feminism is also one of the most controversial words on the planet. Mention it around almost anyone and a wildfire starts, whether that person is against it or for it. Most people associate the word with women wanting special treatment, but in all honesty that's not what it means. The definition is simply believing that women are equal to men and should be treated that way.
Now, don't get me wrong, women's rights have improved since the 1900s. Sure, we're no longer forced to stay home all day with the kids, and we have the right to own our own houses without a man, but that doesn't mean our fight is over. We got our rights, but we still don't always receive the respect that we deserve. You hear every day about how women can't even walk down the street without being catcalled or followed, then called names when they don't respond. Or how turning down a date or a drink can get you attacked in an alley, and how we have to wear the "right" clothes so no one has the urge to take them off without our consent.