The End of Christianity in America?

I spent a few minutes this afternoon reading some of the current political and societal commentaries. Some of them talked about the decline of Christian America and of Christianity in America.

Churches today teach a watered-down, feel-good religion of love. I’ve been told by several preachers and teachers in churches that we only focus on Jesus. These teachers would never include the Jesus who said, “But bring those enemies of mine who didn’t want me to reign over them here, and kill them before me” (Luke 19:27).

It’s time that the preachers teach the full gospel. It’s not all love and good times. There is a lot of non-politically correct and even racial teaching. It’s hidden in the Greek and Hebrew, but it’s there. Most churches don’t even know or teach the real meaning of adultery, adulteration, yes even mongrelization. It’s in the language, but is ignored.

If we don’t get with it and proclaim true Christianity, one that treats all fairly but holds to the whole word of God, then maybe God will let us fall until a new generation has had enough of the nonsense and returns to Him with their whole heart.
