Men being victims of sexism
Before I sound insane, let me acknowledge that life hasn't been fair for women: they weren't allowed to vote until fairly recently, they haven't always received equal pay (and still often don't), and in some countries they aren't even allowed to drive or do leisure activities without male permission.
However, for the most part, I feel that the overall well-being of women is respected more than that of men. When it comes to crime and punishment, women seem to get a slap on the wrist while men receive harsher sentences. Metaphorically speaking, women can get away with murder whereas men will get the needle.
When it comes to personal freedom, men are policed a lot more. For example, lesbianism seems to get a thumbs-up, whereas gay men still get mocked and stigmatized. (Let's also not forget that a man who hangs around a bunch of women is a stud, whereas a woman who associates with a bunch of men is called slutty or whorish.) Women can hold hands or release songs about making out, but that would never fly if the couple were male.
As far as safety goes, men have always been forced into war, whereas women serve by their own choice. Vigilante fuckheads can't seem to leave male culprits alone, but if the wrongdoer is female, mobs suddenly know how to mind their own fucking business.
Women can flirt their way out of tickets and get into establishments for free if they're attractively dressed, but can men ever brag about those kinds of things?
I could go on, but I've wasted enough time. It pisses the living shit out of me that my fate can be determined by something as seemingly minor as my genitalia.