Sexism
Five Myths About Feminists
I didn’t know what a feminist was.
--
My father was a minister. I grew up attending a church that didn’t allow women elders or ministers. Women led the Ladies Aid and the Women’s Missionary League. That’s it.
When I got a college scholarship, I took a new course offered at my university called Intro to Women’s Studies.
I knew about women’s suffrage, but that was it. Who knew there was a whole history no one had told us about?
What history was “new” to me?
- While I was in high school, it became legal for men and women to get birth control and abortion in the United States. The government didn’t do it; it was the Supreme Court that made the change.
- In my health classes, I found out about menstruation.
- In my high school accounting class, the teacher skipped the news that it had just become legal for women to get credit cards and loans without a male co-signer. I learned how to balance a checkbook.
So what was feminism? The Merriam-Webster dictionary defines feminism as:
the theory of the political, economic, and social equality of the sexes
I found out what women could do now because of lawsuits and the U.S. Supreme Court.
But I also heard from the media and ministers what feminists supposedly thought. It didn’t match what I thought, what my friends thought, or what I heard at my university. Here are a few of the feminist myths I have heard repeated more times than I can count:
“I myself have never been able to find out precisely what feminism is: I only know that people call me a feminist whenever I express sentiments that differentiate me from a doormat.” — Rebecca West
1. Feminists are man-hating lesbians
Betty Friedan, a co-founder of the National Organization for Women in 1966, worried that if lesbians were part of the feminist…