There’s a particularly rousing discussion going on in the comments over at Indies Unlimited regarding gender bias in fiction, and it got me thinking (always a good thing for a blog post to do). In contrast to many of the comments, I see things as having changed a lot since I was a kid (admittedly, that was a loooong time ago).
My mother got married and raised a family in the 50s (and absolutely hated the times) and vowed to bring up her two daughters as people who could do anything they wanted to, regardless of gender (Dad agreed, obviously). Yes, I’ve come up against a shitload of gender bias throughout my lifetime, but when I look back, I can see the tide definitely turning, at least here in the States. (Related: here’s a blog post I wrote celebrating kick-ass women in the movies.)

Most people I talk to accept strong women as normal and necessary. Yes, there are still stories where the male is the equivalent of Underdog and is all, “Here I come to save the day,” but most women I know hate that stereotype and will usually avoid reading/watching/spending their money on those kinds of stories. Now, I can’t speak for other countries--I realize women’s rights are abysmal in much of the world, and we need to keep agitating and holding the perpetrators responsible--but why not also celebrate the achievements?
What do you think? Have we come a long way as a culture or am I just looking at the issue through rose-colored glasses?