The world needs more feminists
Feminism is the idea that women deserve to be placed on a pedestal and worshipped while men bow their heads in reverence and kiss the ground they walk on. Or at least, that’s how the media portrays it.
Contrary to popular belief, the purpose of feminism is to advocate for equality of the sexes, not to make men feel guilty for existing. It is recognizing that women’s contributions need to be woven into our education and our history, rather than spotlighting only white men and their impact. It is looking critically at literature, social media, and even advertisements for the ways women are objectified and oversexualized. It is breaking the notion that femininity must be defined as “gentle and soft,” because we are beyond these dainty adjectives.
Through my women’s studies course this semester, I have entirely redefined for myself what it means to be a woman. I have developed a deeper understanding of the inequality women face politically, socially, and economically. That discrimination extends beyond feeling threatened while walking alone at night. We as women dread losing access to reproductive healthcare, earning lower wages, and having our voices silenced by the fear of being dismissed as “hormonal” or “crazy.” The reality is that men in positions of power make decisions that affect women, and women are far too powerful to be reduced to someone else’s decision.
What can we do to change that? For starters, rework the mindset of, “What if this were your mother, your daughter, or your sister being treated that way?” Instead, it is about respecting and including a fellow human being, regardless of their relation to you or whether they identify along the gender binary at all. Compassion and a recognition of our own privileges are required not only to make our nation safer and more accessible for women, but also to improve conditions for women worldwide.