The left is destroying America?
I think that the appreciable decline in the quality of our nation can be traced to the entry of leftists, feminists, and cultural Marxists into our culture. They have been attacking the nuclear family for decades, and they're finally putting the nails in society's coffin. They've hammered away at things that are natural to developed societies, such as patriarchal rule, and want to establish a sort of castrated matriarchy in its place. They have vilified white people to the point of encouraging white genocide and replacing Shakespeare with Lil Wayne. Our personal liberties are being discarded by an ever-imposing central government, reminiscent of Soviet-style authoritarianism. They are trying to alter the definition of marriage to something so far beyond its original meaning that any attempt to define it becomes meaningless, and in doing so they have eroded the once-functional family unit of Western society beyond recognition. They are undermining the authority of the family and the church in order to promote factionalism, and I believe they are doing this to justify the state's ever-increasing presence in our lives. IIN?