If Christianity is truly the yoke that has been holding Western society down, you'd think that things would be getting better as religion erodes. That is not what is happening. Instead, society has grown progressively more insane, nihilistic and degenerate. Nor do we see the erasure of all religion around the world; it's only Christianity that is collapsing, while all the other religions are still going strong, especially Islam, which is slowly taking over the spiritual life of the West.
Maybe it's time to admit that Christianity did have positive things to contribute to Western society, and if so, to ask what we can do about it.
Keep in mind that I'm not interested in any kind of polemics, or in debating the veracity of the Bible or the existence of God; I'm talking exclusively about Christianity as a key part of Western civilization and culture.