Feminism
By Saagarika Verma, Grade 10, GEMS Modern Academy, Dubai
All the people out there who think this is a bad word, this is for you. Feminism, according to Urban Dictionary, is the belief that women are, and should be treated as, intellectual and social equals of men. Now go back and read the definition again. Yes, equals. A feminist is a person who believes in the social, political and economic equality of the sexes. When did feminists become haters of men? I think you have been confusing feminists with the Hunters of Artemis.
There is nothing wrong with being a feminist. In fact, feminism is a sign of strength. All those women out there who hesitate at the question ‘Are you a feminist?’ should begin to comprehend that being a feminist doesn’t mean being a man-hater, ugly, or a hater of non-feminists. It’s simply striving for gender equality. And what could possibly be wrong with that?
I might be just a teen, but it’s never too early to have opinions. The misunderstanding surrounding this simple word is mind-boggling. Feminists aren’t ‘out to get you’. They are merely saying that gender shouldn’t matter if you have the skills to do the job. Gender isn’t a basis for discrimination.
Mia Wasikowska, the Australian actress who played Alice in the film Alice in Wonderland, said, “Feminism is just about equality, really, and there’s so much stuff attached to the word, when it’s actually so simple. I don’t know why it’s always so bogged down.” And who can argue with that?