Originally Posted by formald3hyde
I think the whole idea of feminism got warped into something ridiculous when gender roles really started breaking down at the turn of the century. A lot of the extremists started taking feminism to mean that women are superior to men and all men are scum who should go DIAF. It seems to be less about equality now and more about women aggressively demanding their 'right' to be put on a pedestal and worshiped for being a strong independent woman who don't need no man. I'm all for equality, but in general I don't feel that feminists are seeking equality anymore. I hear more and more people arguing about ridiculous 'injustices' still being done to women, most of which aren't even true. Not only that, but because of this new surge of independence, I feel like men are facing injustices of their own that no one's fighting against. Why are feminists not called equalists? If there are feminists and people support them, why shouldn't there also be masculists defending the rights of men? I know that sounds backwards because, hah, men have everything easy.
But just as there are downsides to being a woman, there are downsides to being a man. Dating, for example: men are still expected to make the first move. A lot of girls STILL wait 'for him to call' instead of taking action themselves. Men are expected to put themselves in the line of rejection, which, trust me, is no less devastating for a man than it is for a woman. Men still have lower success rates when seeking child custody after divorce. While it's true that some jobs have historically employed mostly men, there have also been jobs that were historically 'for women'. Now more and more women are entering trades and doing hard physical labor, which is great if that's what they want to do, and they actually have a better chance of being hired because employers like improving their gender-equality statistics. But from another perspective, every woman in the workplace is one less man in the workplace. That would be fine if society were making up for it somewhere and other jobs were starting to fill with men, but that hasn't really happened. You still don't see a lot of male nurses, receptionists, hair stylists, flight attendants, etc. - and if you ARE a man working in a field like that, more often than not people will assume you're gay.
I just think people take feminism too far these days, to the point where it's actually making things less equal in a lot of cases. Of course, the above is just my personal opinion.