As I’m applying to college, the East Coast women’s schools are at the top of my list. There are a lot of things that draw me to an all-girls school. Women there are said to pursue their interests more seriously, and to be more confident in their ability to do so, than their co-ed counterparts. My mom, a women’s college alumna, has always encouraged me to consider her alma mater as well as the other sister schools. She loved her experience at a women’s college and feels it prepared her well to enter the male-dominated working world. She made good friends and built connections that she’s maintained for decades, and having that kind of bond with such strong and amazing women is something I’ve always admired. However, it’s slowly becoming clear to me that I’m probably the only girl at my school who has all five of the East Coast women’s colleges on her list.
Women’s schools aren’t really respected in my town. The attitude is sort of everywhere, with no sense that a counterargument even exists. If I mention that I’m applying to any of the women’s colleges, friends will tell me that they can’t imagine being the kind of girl who could go without boys for four years, or that their parents would never let them go to “one of those dyke schools.” Even friends who are more accepting say they need to have guy friends because girls are too much drama.
But luckily, women’s colleges don’t attract people with this kind of mindset. They attract motivated women with a passion for learning and intelligent discussion. That’s not to say you couldn’t find women like that at co-ed schools, but there’s an awareness at women’s schools that is unique. Women cannot be forgotten or subtly pushed aside. And most people there are aware of gender issues, including those affecting the trans community. (For example, Smith’s bathrooms are divided into male-identified and female-identified, instead of just male and female.) It’s unfortunate that so many people don’t recognize the importance of a school that both acknowledges sexism as a real, ongoing problem and works to empower women. And it’s even more unfortunate that that kind of education is stigmatized.
But are women’s colleges still necessary? Originally, they were founded because all other forms of higher education were reserved exclusively for men. Now that colleges are (for the most part) co-ed, women’s colleges may seem dated. So maybe they’re not necessary in that original sense. But the community they create is incredibly important for women who want to go into a male-dominated workforce. The supportive, welcoming environment that women’s colleges foster protects women from being ignored and treated as less-than. And that environment is something I see as very necessary.
What do you think about women’s colleges? Did you, or would you go to one? Share with us in the comments!
Written by Madeleine Minke