Was America always this way, or did it use to be more moral? It’s a question many have asked, but few have answered honestly.
The truth is that people have always been flawed and behaved imperfectly; that’s human nature.
Yet things have, in fact, gotten a lot worse morally in America. So when did the decline start?
Some say the real breakdown of families and moral authority began with World War One in 1914.
According to historian Norman Cantor, human life was completely devalued by the mass killing of World War I. Was he correct? Was WWI the beginning of the end for America and the West? Let’s take a look…
The Beginning of Sharp Decline
Some historians go further than Cantor and place WWI as the real collapse of the West as a whole.
Civilized nations mechanically butchering one another is a nightmare that is hard to come back from, and it led directly to the horrors of WWII and Nazism.
Look closely at World War I and you’ll see its devastating effect on people’s moral sense. People stopped broadly trusting authority; the war also toppled many of Europe’s monarchies and shook trust in a higher power.
For Americans, WWI was our introduction to empire and to the horrors of trench warfare. Yet, here at home, it was not our birth into the true horrors of relativism, sexual degeneracy, and woke weakness.
No, for America that point didn’t come in WWI, WWII, or Korea.
It came with the horrible war in Vietnam in the 1960s and early 1970s. The truth is that the growth of the counterculture and “woke” ideas in the 1960s has led us to the edge of a cliff from which we may never come back.
Unbelievable, the moral rot of our country continues with our so called leaders shoving this garbage down America's throats Lunatic Pelosi more concerned with Trans folks, than the safety of Supreme Court Justices
— Carter Phillips (@CarterP88782853) June 11, 2022
It’s clear that a growing openness to relativism and sexual perversion had already begun during WWI and WWII.
In Germany, WWII was preceded by the Weimar Republic, where sexual fetishes and perversion reached a level unseen before in the world and spurred part of the nationalist backlash.
In the 1940s, a study of the sexual behavior of Americans pointed to a greater openness in talking about sex, which until then had been taboo.
Still, America truly cracked open as a nation in the 1960s. Indeed, for William Bennett, a former US Secretary of Education, moral decline intensified in the 1960s.
He called the period one of “decivilization.”
It is no surprise that precisely in this decade the women’s “liberation” movement and the sexual revolution arrived almost simultaneously as the so-called new morality.
With the invention of the birth control pill, people no longer feared pregnancy from sexual intercourse.
Behold: “free love” was established, and sex without commitment became common. “Make love, not war” became a rallying cry.
If traditional, patriarchal society was “bad,” then surely a revolt against it could only lead to positive growth and freedom, right? Wrong. Absolutely dead wrong.
If history were taught in America from preschool on, citizens would know sexual degeneracy both symptom & cause of dying civilization throughout the ages.
— madisonroad (@madisonroad) June 10, 2022
The Monster Isn’t Under Our Beds
America is already weakened and contaminated. We have become easy prey for a hungry monster. It’s not under our beds, either. It’s inside our kids’ heads.