In this video, I explore cultural Christianity and its link to the rise of Critical Social Justice.
Cultural Christianity is when people identify with Christian traditions and mores more for their cultural significance than for their religious claims (e.g., Jesus walking on water). Do ideologies like wokeism fill a void left by the decline of religious belief? Are societal and cultural changes solely related to the decline of Christianity?
I think wokeism, and social justice in general, is actually a type of "cultural Christianity". Specifically, I think both are the result of retaining Christian morality while subtracting the belief in the kingdom of heaven. If you take away the idea that "our rewards are in heaven", then Christian morality (turn the other cheek, the meek shall inherit the earth, give them your coat also, etc.) makes no sense whatsoever. Unless, of course, you create that kingdom on your own and give it to those you deem meek enough...
There is an old saying in Evangelical circles that applies to Woke Christianity: just because you call yourself a Christian doesn't mean you are one.
"Many will say to me on that day, ‘Lord, Lord, did we not prophesy in your name and in your name drive out demons and in your name perform many miracles?’
Then I will tell them plainly, ‘I never knew you. Away from me, you evildoers!’"
Matthew 7:22-23
A lot of these people (the Woke) are going to be very surprised when they die. They probably don't think they are doing evil, but they are.
Since the start, there has been a desire to fit in with the surrounding culture. There is nothing per se wrong with that; see Christmas trees, Halloween, and Easter eggs as examples. But there are limits, and Progressive (Woke) Christianity goes over those limits. To put it simply, it's a heresy. As James Lindsay puts it, a Gnostic heresy.
I could go on if people want.