Is it normal that I think women owe me something?
I keep hearing the line "Women don't owe you ANYTHING" being spouted, and I disagree. By virtue of living in civilization, they owe somebody something, and to a greater or lesser degree I'm included in that set. They owe me basic decency, meaning they're not going to try to rip me off, or fart in an elevator when I'm standing next to them. They owe it to me and everyone else to treat us like human beings, and they owe it to all of us to pay their fair share toward the upkeep of civilization, through taxes and the like. The idea that you can exist and not owe anything to anyone is something a child would come up with: it's snappy, but it sounds moronic the second you look beyond the surface.