I think the West nowadays is one of the least Christian places on earth
Personally, while most people in the Western world (notably Anglo-America and the EU) nowadays proclaim themselves to be Christian, I feel that in reality it is perhaps one of the least Christian places on the planet. So many un-Christian things are mainstream in Western culture: promiscuity, pornography, revenge, materialism, and so on. In fact, many people are ridiculed for not embracing them. While I often see people proudly say they are Christian, they are typically non-practising. Even a Buddhist temple is more "Christian", and Buddhism is not even an Abrahamic religion!
Now, I am not in any way claiming that the Western world nowadays is a bad place, nor am I saying that all people in the West are like this. It is still perhaps the best place on the planet to live, and I was born, grew up, and live here. I am only saying that, despite its history with Christianity and the many people who claim to be religious, the Western world nowadays strikes me as one of the least Christian places on the planet.
Is it normal to hold this viewpoint?