Has America had it with woke culture? Is there going to be a backlash? I saw a YouTube comment that said woke culture will be my generation’s disco. What happens when we try to balance making society fairer and more equitable with the seemingly endless social requirements on speech and actions?
The new “woke” language includes things that seem unnecessary to most, and people often confront it for the first time at work. This is not your father’s or mother’s diversity training. Things like putting pronouns in an email signature are a daily reminder of the new standards. The benefits are not well explained to the average person, and for many it all seems incredibly tedious.
News stories about organizations making radical changes to intentionally appear more inclusive and diverse strike some as nonsense. Diversity used to be just a concept, but woke culture has gone beyond trainings and a few HR policies to the next level: it is actually attempting to solve the problem. Thanks to social media, companies can’t just proclaim platitudes and ignore structural issues. Transparency is regularly leading to real progress. At least it seems that way. I have my doubts. Is all this shifting around actually doing any good? Is society beginning to make the shift?