toasega
(?)Community Member
Offline
- Posted: Fri, 25 Jul 2014 11:39:26 +0000
wahmbulance Disclaimer: THIS IS GOING TO PISS MANY, IF NOT ALL, OF YOU OFF. It is merely some observations of mine, based on things, as well as people that I've personally seen and heard. If you still feel like losing your rationality to butthurt emotions, so be it. Just rest assured that I will ignore you completely if you do. And know that if you do the whole "yeah, but men do that too!" thing, that is not the topic of discussion, and I'd really appreciate it if you would just respond without taking the "vice versa" stance and being angrily evasive. And quite frankly, I just want to see what you'll all say (although I have a good enough idea already). wahmbulance
Ok, let's get started.
It seems like women are always talking about how men have the massive egos and how we always want to be the center of attention, be in the top positions of power, "Patriarchy", evil, dumb, etc, etc. Yet, in every aspect of our society, women are the ones who are trying to always grab the power, and have it without even trying. So why are all the movements still going when women seem to have achieved exactly what they wanted? Why has it gone so far?
Socially:
1. Women may do as they please, and say as they please, yet men have to walk on eggshells because heaven forbid there's ever equal freedom of speech and action in the workplace or anywhere else.
2. Men must not deign to approach any woman, regardless of whether they know them, because women are lofty, unattainable creatures, and men are lowly serfs (unless they happen to make lots of money).
3. If a man hits a woman for no reason, he's automatically the aggressor (which makes sense, of course), but a woman can key up his car, burn his clothes, kill his dog, poison his friends and demolish his house (also for no real reason), but he still can't hit her. And to make matters worse, people will laugh and think it's funny, regardless of the severity of the situation. Case in point:
http://www.youtube.com/watch?v=VKgwczruOSQ
4. If a man gets drunk and does something he regrets, the responsibility rests solely on his shoulders. If a woman gets drunk, that same responsibility rests on someone else's shoulders.
In the Media:
1. Everybody Loves Raymond. Debra treats Ray like absolute s**t, but Ray is the bad guy whenever he even slightly returns the favor.
2. The male characters in "Friends", compared to the female ones. Dumb guys and super capable, dominant women.
3. Homer Simpson compared to Marge. Dumb guy, smart woman.
Commercials:
Politics:
Hillary Clinton said that "Women are the primary victims of war." She was speaking of the women who are left behind when a soldier dies in war. I suppose the guy who got blasted into a million pieces isn't a victim, then. The guy who survived the war, sans his legs and mobility? Oh, please! The guy with brain damage who will never be the same again? Hah! Quit whining!
There's something weird going on. Many women seem to want "Equality", but only as long as it benefits them in a superior fashion. If this mythical "equality" gives anything less than absolute gender superiority, then it's not true "equality". I don't even know if they realize it fully and are going through with all of this anyway, if they are lying to themselves, if they honestly don't know/realize, or if it's some weird combination of everything.
I just don't know. I see and hear things like this every day, and I always wonder why no one says a word if it's directed at men, but will turn it into a national incident if it happens to even one woman.