An idea prompts an action, which in turn suggests another idea, which leads to yet another action, and so on. That has been the pattern for as long as human beings have existed. The results of the pattern are known as consequences—some good, some bad; some intended, others unexpected. History has shown that individuals who achieve more positive consequences than negative ones tend to prosper, and civilizations that value and cultivate such performance tend to achieve greater prominence and longevity. They achieve both by teaching each new generation to consider possible consequences before speaking or acting and to evaluate actual consequences after doing so. The first strategy helps to avoid mistakes; the second, to correct those that occur.
Today, in many Western countries, notably the United States, the consequences of words and actions are increasingly ignored. The focus is instead on the intentions that guide behavior. The prevailing idea is that if people mean well by what they say and do, they are not only worthy of praise; their thoughts and words are also wise and good and therefore not to be questioned.
This dramatic change is found more in attitude toward consequences than in any formulated claim or principle. In other words, people seldom say that consequences are unimportant; rather, they act as if they were, simply by ignoring them. This ignoring of consequences has ironically produced what may well be the most sweeping consequence of our time. Here are just a few examples:
Many politicians argue that the Biden administration’s open-border policy should be continued because it is humanitarian. Critics point out that it has led to a dramatic increase in the trafficking of drugs and people, abuse of the migrants during their journey, the entry of a number of criminals, and economic strains on both the states and the federal government. The politicians avoid discussing these alleged consequences and instead accuse the critics of lacking compassion and/or being racists.
A school board member states that parents have no business questioning what teachers are teaching; they should instead trust teachers, who are trained to know what is best for students. A parent says in response, “My second grader is being shown videos recommending sexual experimentation, and that infringes on my responsibility to protect her from harmful and age-inappropriate materials.” The board member refuses to engage that issue, saying instead, “This is about the teacher doing her job, not your parenting problem.”
Officials in a large city enact legislation classifying shoplifting as a crime only if the value of what is taken exceeds $900. Immediately thereafter, hordes of “shoppers” descend on stores, smash displays, and carry off all the merchandise in sight while cleverly keeping under the stated limit per person. No one is arrested. Store owners protest, blaming the legislation for causing an intolerable rise in crime. City officials respond with a public statement explaining that the legislation is intended to bring about economic equity in the city, so citizens should applaud rather than criticize it.
In each of these examples a serious problem is identified, but the people responsible for creating it cannot see that it is a consequence of their policies because they believe their good intentions make it impossible for them to err. Similar examples can be found every day in the news. People who still understand that intentions are not the measure of ideas or actions are appalled by their neighbors’ apparent ignorance of this fact. They wonder, “How could such a fundamental change in our culture have occurred?”
The answer is that it came about through a series of developments over roughly the past six decades, each one compounding the preceding one(s):
1) The idea of human imperfection, which is as old as the Judeo-Christian belief in Original Sin (and as fresh as the evening news), was challenged by those who called it a false notion that shames people and destroys their confidence. They contended that instead of being flawed, every human is actually wise and good from birth. Many, including more than a few religious leaders, embraced this idea.
2) Once people were considered wise and good from birth, guidance toward wisdom and virtue was considered unnecessary. People who internalized this idea began to feel exempt not only from guidance but also from rules and regulations associated with guidance.
3) Once guidance toward wisdom and virtue was considered unnecessary, truth was redefined as being within the mind (subjective) and personal, rather than outside the mind (objective) and impersonal. Moreover, opinion was no longer “a viewpoint that required testing for validation” but instead became synonymous with truth. People who accepted these new ideas began speaking of “my truth” rather than “the truth” and expected others to accept their opinions without question.
4) When people began regarding their opinions as necessarily true, they stopped testing their opinions for accuracy and truth. Thus, they gave up comparing different viewpoints, weighing pros and cons, rejecting implausible ideas, and deciding issues based on evidence. In other words, they ceased valuing thinking and instead made decisions based on feelings, impressions, moods, and wishes. Richard Weaver presciently noted the consequences this idea would produce when he wrote in 1948, “If we attach more significance to feeling than to thinking, we shall soon, by a simple extension, attach more to wanting than to deserving.”
5) By this time, an idea that had been part of the everyone-is-wise-and-good movement became a separate force. Known as the Self-Esteem Movement, it became a central part of education. The main idea was that self-esteem is necessary for every kind of achievement and people should therefore think wonderful thoughts about themselves, love themselves unsparingly, and reject all criticism from self or others. Children were led in classroom chants of “I am wonderful,” tests were made easier or abolished altogether, and grading was inflated so that everyone received an “A” (and, in sports, a trophy). The teaching of thinking, which necessitates finding errors and correcting them, was considered harmful to students’ self-image and discarded.
6) The belief that everyone is wise and wonderful at birth and deserves the highest level of self-esteem possible led to the idea that children should set their own standards of right and wrong. Predictably, the standards children set (if they set any) were less demanding than those that parents and teachers would have set and that society traditionally expected them to meet in adulthood. Moreover, the principles of right and wrong that students chose were modeled on the notion of truth they had learned in the very same classrooms—if you believe something is morally right, then (voila!) it is so, at least for you.
7) Next, whatever standards of right and wrong people chose over traditional morality became entangled with their presumed need for self-esteem. In other words, rather than testing their personal values for reasonableness (as earlier generations had learned to do), they accepted them unquestioningly as further expressions of their wonderfulness. Sadly, doing so increased their expectation that others would agree with their self-assessment. When others did not agree, they felt disappointment so deep that in many cases it found expression in rage.
8) Another development in the dramatic changing of our culture has been the concept of self-actualization. It began at roughly the same time that the traditional view of human imperfection was challenged (#1 above) and grew stronger over time. The term self-actualization may seem little different from the traditional term it replaced, self-improvement. In fact, however, “improvement” means changing the self to become better. In contrast, “actualization” can mean projecting perfection that is already present. And that is how self-actualization has been understood.
9) One final development deserves mention. It has always been a fact of life that, as the poet said, even “the best laid plans” tend to go “awry,” so it has always been difficult to reconcile this fact with the idea that people are thoroughly wise and wonderful from birth. (If they are all so wise and wonderful, how can they so often be foolish and awful?) Those who argued that humans are perfect found the solution to this dilemma in a book that was not necessarily on their required reading list, the first book of the Bible, which states that Adam blamed Eve and she blamed the serpent. In other words, the argument became that whatever goes wrong in life is never the consequence of the (perfect) person’s behavior, but must be the fault of the person’s family, school, church, or society in general. From this perspective, it is considered appropriate to take credit for anything good but inappropriate to feel guilty or responsible for anything bad. Similarly, everyone is thought to have a right to expect apologies from others for real or even imagined offenses, but to owe none in return.
As noted earlier, each of these nine developments is a consequence of one or more preceding developments and is based on the denial that consequences are important. The time frame of the developments has been about six decades in total, which means they have impacted, to a greater or lesser extent, every generation born since the middle of the last century. This means that the vast majority of today’s leaders in science, social science, the arts and humanities, education, government, business, journalism, and so on, have grown up in a culture in which self is more important than others and good intentions are more important than consequences. Given that context, it is hardly surprising that most of those leaders lack facility in the behaviors fundamental to social harmony and human progress.
To sum up, the result of ignoring consequences has been the greater and vastly more dangerous MEGA-consequence we are now experiencing: CULTURAL CHAOS.
Copyright © 2021 by Vincent Ryan Ruggiero. All rights reserved