Feminism – Should it Be Advocated?




Feminism, by definition, is the advocacy of women’s rights on the basis of the equality of the sexes. Today, the political ideology of feminism is only growing: away with the misogynistic society of the past, and in with a future that women are a part of. It is incredibly disappointing that it has only been about a century since women gained equal rights in many countries. Sadly, there is still an enormous amount of bigotry and sexism in this world. Discrimination, unequal pay, catcalling, and sexist slurs are only some of the ways sexism intrudes on women’s daily lives.

It has become a disgusting norm for women to be subjected to all types of harassment for the pleasure of men. For example, Harvey Weinstein was accused of sexual harassment by several women. Because he was one of the first high-profile men to be accused, several more accusations followed, and more and more women started standing up for themselves. Brave women have brought the true nature of many authority figures, specifically men in power, into the light. Thanks to these women, the hashtag “#MeToo” was created. This hashtag has brought awareness of sexual abuse and assault to women all around the world who are struggling with inequality in their own lives.

So, of course, feminism should be advocated. Women’s empowerment is a powerful way of creating a more equal and ideal society. While a “perfect” society may never be achievable, recognizing and stopping sexism around the world is the first step toward a fairer civilization.

While feminism is certainly a valuable addition to modern society, radical feminism takes a drastically different approach to celebrating the strength of women. Radical feminists generally oppose existing political and social organizations because those institutions are tied to patriarchy, the system of society or government in which men hold the power and women are largely excluded from it. While they are not necessarily wrong, radical feminists hold a very strong view of the inequality in the world and may have far more extreme visions for the future of society.

To conclude, women have been oppressed for too long. Feminism is not a movement that wants all women to riot and take over the world; it is simply the empowerment of women everywhere, as well as the raising of awareness of global issues like sexism. Feminists are the reason American society will change for the better. By removing the injustices of today’s civilization, we will be on the path to a more perfect society.