Is it normal that I hate being a woman?
I know I'm supposed to be proud of who I am and love the life I have, and I am grateful to be a woman in America, where I can pretty much do what I want with my day and my body without overt threats of violence. So yay, first and second wave feminism. Good job on that.
But, it really does seem like everything men do is more fun, more exciting, and more socially acceptable. It makes navigating the world a bit easier for them. For example, I wish it was easier for me to build muscle, I wish I wasn't so risk-averse, and I wish I had better spatial awareness and could keep my head quiet. I'm thinking about getting an IUD just to quiet my female hormones a bit so I can think straight.
Sometimes being female feels like a disease, hence my desire to treat it with an IUD. We're more likely to get chronic illnesses, quit our jobs early or have less advanced careers, end up in poverty if single, get raped or attacked, get scoffed at in every performing arts field (music, comedy, acting) except adult entertainment; the list goes on. And when people try to set up programs to help women, we just get told we have it so easy because of all the handouts we get just for being women.
I can't take it anymore. I'd get a sex change, but I don't want to mutilate my body, and I'm attracted to men. I'm sure men have some things harder, but generally speaking, the world was created in their image and likeness, so it can't really be THAT much harder. I wouldn't mind being expected to pick up heavy things, hold doors for people, pay for dates, fight wars, and support a family if it meant I also got to be sane, strong, and confident, with the freedom to think and dress and go where I want without having to look over my shoulder, seek constant medical attention just to deal with the world, or apologize for who I am all the time.