Saturday, May 26, 2018

Does IQ Inequality Deny the Declaration of Independence?

When I first began thinking, as a college student, about how a Darwinian science of evolved human nature might provide the foundation for political philosophy, I noticed that such thinking provoked vehement scorn in the academic world; and I worried that pursuing this line of thinking would make it impossible for me to have a successful academic career.  But now as I look back over my lifetime and see how this debate over Darwinian social science has changed, I am astonished at how much it has shifted in favor of the Darwinian position, because the weight of the accumulating evidence has become too great to ignore.  One of the most dramatic examples of this is the history of the debate over the science of intelligence as measured by IQ.

In 1969, in my junior year at the University of Dallas, Arthur Jensen published a long article in the Harvard Educational Review--"How Much Can We Boost IQ and Scholastic Achievement?"  He answered his question in the first sentence: "Compensatory education has been tried, and it apparently has failed."  In the United States and elsewhere, children from lower class families were not as successful on average as children from higher class families.  The lower class children seemed to be less intelligent on average, as measured by their low scores on IQ tests.  Since it was commonly assumed that human intelligence, like most other human capabilities, was shaped mostly, if not entirely, by the social environment, public policy makers believed that if lower class children were given advanced educational opportunities at an early age (such as the Head Start program in the U.S.), this would raise their intelligence so that they would show the same scholastic achievement as the upper class children.  Jensen's survey of the evidence that this had failed, and that the failure was due to genetically innate differences in intelligence that could not be easily changed by environmental factors, provoked outrage: he was denounced as a racist and a fascist.  His teaching and his lectures were disrupted by violent protests, and many people demanded that he be fired from his job at the University of California-Berkeley.  Jensen had provoked this anger because he had challenged the egalitarian claim of left liberalism that human beings are born with equal capacities that can be cultivated in any direction by the social environment of their early childhood.

In September of 1971, when I was beginning my graduate work at the University of Chicago, Richard Herrnstein published an article in The Atlantic entitled "I.Q."  Two years later, he expanded his article into a book--I.Q. in the Meritocracy.  Like Jensen, he argued that while general intelligence (g) as measured by IQ tests was shaped by both genes and environment, the variation in intelligence was due mostly to genes--perhaps as much as 80%. Moreover, he claimed that in modern liberal societies, which strive to remove the social and legal obstacles to social mobility, actual social mobility would be blocked by the innate human differences in intelligence.  When people are free to rise and fall by their own merit, they will sort themselves out according to their innate differences.  So societies that increase equality of opportunity for everyone will inevitably produce an unequal class structure where the smartest people will be the ruling class.

This tendency to meritocracy with a cognitive elite is strengthened by the growing complexity of modern societies in which the most highly paid and prestigious occupations require people who can handle cognitively challenging tasks, so that high IQ is correlated with economic success.  Thus, the class structure in an open liberal society will be built on natural human inequalities.

Herrnstein put his argument into the form of a syllogism:

1. If differences in mental abilities are inherited, and
2. If success requires those abilities, and
3. If earnings and prestige depend on success,
4. Then social standing (which reflects earnings and prestige) will be based to some extent on inherited differences among people (I.Q. in the Meritocracy, 198-199).

Herrnstein thought this had profound implications for political philosophy, because it refuted "the egalitarian society of our philosophical heritage" (221), a heritage that included not only Marxism but also the Declaration of Independence.  Both the Communist Manifesto and the Declaration of Independence had affirmed the "vision of a classless society," but Herrnstein seemed to show that we were not moving to a classless society.  If he was right, then the arbitrary barriers to social mobility in a traditional aristocracy would be replaced by the biological barriers to social mobility in a modern meritocracy.

This bothered me because I was not willing to give up on the Lockean liberal principle of equal liberty as expressed in the Declaration of Independence.  I wondered whether there could be a Darwinian defense of this principle.

But while I was open to Herrnstein's reasoning, it seemed that most people in the academic world were not.  Like Jensen, he was subjected to angry persecution.  As a result of this, the scientific study of intelligence became a taboo subject.  Only a few people continued this research, and it was often hard for them to find the necessary funding.

Then, in 1994, the controversy was reignited by the publication of The Bell Curve: Intelligence and Class Structure in American Life, coauthored by Herrnstein and Charles Murray.  Herrnstein died before the publication of the book, so Murray was left to face the vitriolic attacks that it elicited.  As usual, he was denounced as a racist and a fascist.

The mob violence against Murray last year at Middlebury College shows that the Darwinian science of intelligence is still taboo for many professors and students.  And yet, it seems to me that in general the angry resistance is not as great as it once was, because the research on the genetic basis of intelligence has become so impressive that it has to be taken seriously.

Perhaps the best recent survey of that research is Richard Haier's The Neuroscience of Intelligence (Cambridge University Press), published last year. Haier shows the overwhelming evidence that has accumulated over 40 years supporting the genetic basis of intelligence.  He stresses the most impressive evidence coming from neuroimaging that now allows us to see how IQ scores are correlated with the structure and functioning of the brain, which has been Haier's area of research.

He shows how the correlations among mental tests point to the existence of an underlying general factor of intelligence that is called g.  People who do well on one test tend to do well on other tests.  This holds for tests of reasoning, spatial ability, memory, processing speed, and vocabulary.

He also shows that these tests have great predictive validity.  High IQ scores at an early age predict educational achievement, professional success, income, and healthy aging.  He also emphasizes the importance of general intelligence for everyday life.  The complexity of everyday life is challenging, and people with low IQs are less successful in managing the challenges of life.  For example, we can compare low and high IQ groups--the low having IQs of 75-90, the high having IQs of 110-125.  People in the low group are 133 times more likely to drop out of high school, 10 times more likely to be a chronic welfare recipient, 7.5 times more likely to be incarcerated, 6.2 times more likely to live in poverty, and 3 times more likely to die in a traffic accident.  People who are not smart have a hard time navigating their way through the complex cognitive challenges of everyday life.  It really is better to be smart.

The twin and adoption studies of intelligence consistently show that genes cannot account for 100% of the variance.  So there are environmental factors involved.  But then the problem is estimating the relative contributions of genes and environment.  Different studies give different proportions, with the most common view being about 50-50.  The explanation for these different outcomes might be the age at which the twins are tested, because the heritability of IQ in identical twins increases with age--from about 30% at age 5 to over 80% starting at age 18.  So for young children environmental factors explain most of the variance, while for older children genes explain most of the variance.  That's why enhanced educational programs for young lower class children do sometimes raise their IQ scores for a few years, but then this improvement disappears as they grow older.
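The logic behind such twin-study estimates can be made concrete with Falconer's formula, the classic back-of-the-envelope method for decomposing trait variance from identical (MZ) and fraternal (DZ) twin correlations.  The sketch below is illustrative only; the correlation values are hypothetical numbers chosen to echo the age pattern described above, not figures from any particular study:

```python
# Falconer's formula: a rough decomposition of trait variance from
# twin-study correlations.  MZ twins share ~100% of their genes,
# DZ twins ~50%, so the gap between their correlations indexes
# genetic influence.

def falconer(r_mz, r_dz):
    """Estimate variance components from MZ and DZ twin correlations.

    h2 = heritability (genetic variance)
    c2 = shared (family) environment
    e2 = non-shared environment plus measurement error
    """
    h2 = 2 * (r_mz - r_dz)   # genetic share of the variance
    c2 = r_mz - h2           # MZ similarity not explained by genes
    e2 = 1 - r_mz            # whatever makes even MZ twins differ
    return h2, c2, e2

# Hypothetical IQ correlations at two ages, echoing the pattern of
# heritability rising from childhood to adulthood:
for age, r_mz, r_dz in [("age 5", 0.70, 0.55), ("age 18", 0.85, 0.45)]:
    h2, c2, e2 = falconer(r_mz, r_dz)
    print(f"{age}: h2={h2:.2f}, c2={c2:.2f}, e2={e2:.2f}")
```

With these made-up inputs, the formula yields a heritability of about 30% at age 5 and about 80% at age 18, with the shared-environment component shrinking correspondingly, which is the shape of the pattern the twin literature reports.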

That such IQ differences are rooted in our evolutionary history is indicated by the fact that other mammals also show IQ differences.  Studies of genetically diverse mice learning various kinds of tasks show a g-factor intelligence.  Mice show a bell curve of individual differences, so that some mice are innately smarter than others, as shown in their diverse learning abilities (Matzel et al., 2003, 2013).  Similarly, chimpanzees show individual variability in heritable intelligence (Hopkins et al., 2014).  We might explain this through the "social brain" hypothesis: for animals that live in complex societies, there is an evolutionary pressure favoring the cognitive ability to navigate through a complex social world.

But the most impressive recent evidence confirming the evolved biological nature of intelligence comes from improvements in the technology of neuroimaging that allow us to see the structural and functional patterns in individual human brains that are correlated with intelligence.

For centuries, scientists have tried to correlate brain size and intelligence, with the thought that bigger brains allow higher intelligence.  Now we know from many MRI studies that there is indeed a correlation between brain size and intelligence test scores, although the correlation is modest--with average correlations ranging from .22 to .40 (McDaniel, 2005).
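One way to see why correlations of that size count as "modest" is to square them: r-squared gives the share of variance in test scores statistically accounted for by brain size.  A quick illustrative calculation, using the two r values quoted above:

```python
# Variance in IQ scores statistically "explained" by brain size,
# at the low and high ends of the reported correlation range.
# The share of variance accounted for is the square of r.
for r in (0.22, 0.40):
    print(f"r = {r:.2f}  ->  variance explained = {r**2:.1%}")
```

So even at the top of that range, brain size accounts for only about a sixth of the variance in test scores, which is why brain size alone cannot predict any individual's intelligence.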

Another general conclusion from neuroimaging studies is that not all brains work in the same way.  Every individual brain is different, and the patterns differ according to age and sex.  Young brains operate differently from old brains.  And male brains operate differently from female brains.  There are differences in the density and organization of the white matter fibers that connect the areas of the brain.  There are also differences in the amount of gray matter (the clusters of neurons) in different areas of the brain.

Amazingly, these individual differences are so distinctive that fMRI imaging can identify the unique pattern of connectivity among brain areas in an individual brain as a kind of brain fingerprint.  And these brain fingerprints can predict intelligence (Finn et al., 2015).

Neuroimaging has also supported the general conclusion that intelligence is not concentrated in one part of the brain, such as the frontal lobes.  Rather, intelligence is correlated with a distributed network of different areas of the brain.  Haier has concluded that the brain areas connected with intelligence are mostly concentrated in the parietal and frontal lobes, which are associated with memory, attention, and language.  So he has defended a "Parietal-Frontal Integration Theory" of intelligence (Jung and Haier 2007).

Haier concludes that all of this research supports Herrnstein's original claim in 1971: a liberal society that removes the legal and political obstacles to social mobility will allow the biological differences in intelligence among individuals to be expressed in a class structure of meritocracy based on innate intelligence with a cognitive elite at the top.

Against this conclusion is all of the research that apparently shows that it's not genes but socioeconomic status (SES) that determines social success or failure.  The children of parents with high SES tend to be more successful than the children of parents with low SES.  The flaw in this research, however, Haier argues, is that it ignores how SES is confounded with intelligence, because SES has a strong genetic component (Lubinski, 2009; Trzaskowski et al., 2014).

To explain this point, Haier asks us to consider two alternative trains of thought.  The common train of thought about the importance of SES goes this way:
"Higher income allows upward mobility, especially the ability to move from poor environments to better ones. Better neighborhoods typically include better schools and more resources to foster children's development so that children now have many advantages.  If the children have high intelligence and greater academic and economic success, it could be concluded that higher SES was the key factor driving this chain of events."
An alternative train of thought favored by Haier and Herrnstein goes this way:
"Generally, people with higher intelligence get jobs that require more of the g-factor, and these jobs tend to pay more money.  There are many factors involved, but empirical research shows g is the single strongest predictive factor for obtaining high-paying jobs that require complex thinking.  Higher income allows upward mobility, especially the ability to move from poor environments to better ones.  This often includes better schools and more resources to foster children's development so that children now have many advantages.  If the children have high intelligence and greater academic and economic success, it could be concluded that higher parental intelligence was the key factor driving this chain of events due in large part to the strong genetic influences on intelligence" (192).
This second scenario is strengthened by the fact of assortative mating.  Over the past 60 years, very intelligent women have been able to move into high levels of advanced education and professional training--opportunities denied to women in the past.  As one result of this, many highly intelligent men and women meet in colleges and universities and marry, and then they pass on their high IQ genes to their children.  They also become "power couples" with high double-income wealth.  This is exactly the sorting out of people based on intelligence that Herrnstein foresaw.

The point here is that yes, of course, SES is an important factor in determining social and economic success; but SES includes a genetic component of innate intelligence.

This leads Haier to some disturbing conclusions that he identifies as "neuro-poverty" and "neuro-socioeconomic status."  Living in poverty is to some significant degree rooted in the neurobiology of low intelligence that is beyond anyone's control.  Similarly, living in the highest social and economic classes is to some significant degree rooted in the neurobiology of high intelligence that is beyond anyone's control.

There is one optimistic possibility, however.  Even though the neurobiology of intelligence is today "beyond anyone's control," because so far there is no proven scientific treatment for enhancing innate intelligence, Haier does foresee that sometime in the future, scientists might find ways to enhance intelligence through genetic engineering, drug therapy, or neuromicrochips.

But until that happens, we are left with the disturbing conclusion that many people lack the innate intelligence to be very successful in life through no fault of their own.  So some people do better than others in the natural genetic lottery, which is not based on merit.

So does this deny the principle of equal liberty in the Declaration of Independence?  How can people have equal rights to life, liberty, and the pursuit of happiness if in fact their place in the social class system depends to a large extent on their genetically inherited cognitive abilities?

In 1981, I took up this problem in the first conference paper that I wrote on Darwinian political theory.  It was entitled "Charles Darwin and the Declaration of Independence," and it was presented at the national convention of the American Political Science Association in Denver.  In 1984, a revised version of this paper was published as "Darwin, Aristotle, and the Biology of Human Rights" in Social Science Information (vol. 23, no. 3).

I argued that Darwinian biology can recognize that the equality of all human beings as possessing a common human nature is fully consistent with the inequality of human beings due to their different natural endowments.  The reality of biological species is such that members of the same species share a common nature despite their individual differences.  This is the modern biological justification for the Lockean claim that although human beings are naturally unequal in many respects, they are equal in certain rights by virtue of their human propensity to assert their right to pursue their interests in life.

The equality of rights in the Declaration of Independence is an equality of opportunity but not an equality of results.  Herrnstein was wrong to suggest that Jefferson wanted a classless society.  As Jefferson indicated, he hoped for a "natural aristocracy" of "virtue and talents" rather than an "artificial aristocracy" of "wealth and birth."  As Murray indicated in the last chapter of The Bell Curve ("A Place for Everyone"), this Jeffersonian "natural aristocracy" looks a lot like what he and Herrnstein see as a meritocracy.

I elaborated this last point in some of my previous posts on Murray.

Sunday, May 13, 2018

Does Watching TV Make Nietzsche's Last Man Smarter?

In my last post, I commented on Ronald Beiner's Dangerous Minds: Nietzsche, Heidegger, and the Return of the Far Right.  I challenged him to present some empirical evidence supporting Nietzsche's claim that liberalism throws everyone into the degraded and spiritless life of the "last man."  In one of his comments on the post, Beiner responded with a question: "Have you watched American TV recently?" 

Of course, this is the standard response of intellectuals who insist that the cultural degradation of bourgeois liberalism is clear in popular culture--particularly, American TV.  But where's the empirical evidence to support this assertion?  We have had experience with over 70 years of regular network television broadcasting.  If Nietzsche's "last man" critique of liberalism is correct, then we could predict that there has been a steady decline in the cognitive complexity of TV programming over these years--from dumb to dumber.  This is a testable prediction.

Since I was born in the United States in 1949, just when families were beginning to purchase television sets for the first time, I grew up during the "golden age" of network television broadcasting, so I can remember "The Honeymooners," "I Love Lucy," and "The Lone Ranger."  Channel surfing today, I occasionally jump to some reruns of these original shows on Nick At Nite.  But I don't watch them for long, because they're so boring!  If I do watch them for a while, it's only to laugh at them for how dumb they were.  Haven't we all had the same experience?  Doesn't this suggest that we have become accustomed to more recent television programming that is more entertaining for us than the first shows, because the new shows are more cognitively challenging?  If so, then either we are becoming smarter, or TV is making us smarter, or both.  And if that is so, then our culture in our liberal society is becoming smarter, which contradicts the Nietzschean last man prediction of a dumbing down culture.

This subjective impressionistic evidence can be confirmed by some objective quantifiable evidence that TV shows have been increasing in their cognitive complexity.  In 2005, Steven Johnson published his book Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter.  An excerpt from the book was published as an article in The New York Times with the title "Watching TV Makes You Smarter."  He argued that contrary to the common assumption that mass popular culture is always declining to lower standards, culture is actually becoming more cognitively demanding, as illustrated by TV programming.  He pointed out that programming on TV is increasing in its demands on our mental capacities, as indicated by increasing complexity in three elements: multiple threading, flashing arrows, and social networks.

Multiple threading refers to the multiplicity of narrative threads in a TV show.  In the 1950s, a typical episode of "Dragnet" had only one story line--crime scene, investigation, cracking of the case--with only two or three main characters.  In the 1980s, an episode of "Hill Street Blues" would have as many as 10 story lines interweaving with many primary characters; and each episode would pick up a few threads from previous episodes and leave some threads open at the end.  In contrast to "Dragnet," viewers had to mentally sort out a complex narrative structure with a complex collection of characters and a complex subject matter.  Later, shows like "The Sopranos" and "ER" became even more complex--with more simultaneous threads, more characters, and more complex subjects.

The flashing arrow is Johnson's term for what he calls narrative hand-holding.  The movie "Student Bodies" was a parody of slasher movies like "Halloween" and "Friday the 13th."  In one scene, the teenage baby sitter hears a noise, opens the door of the house, sees nothing, and then goes back into the house as the door shuts behind her.  The camera swoops in on the doorknob, and we see the door is unlocked: there's a flashing arrow on the screen and the words "Unlocked!"  That's a parody of what popular stories often do: the script inserts a character to tell the viewer some information important for the plot.  Today, popular TV shows don't rely as much on such flashing arrows, thus leaving viewers to figure out what's going on for themselves, which appeals to the mind's pleasure in solving puzzles.

The third element of growing complexity is in social networks.  Much of storytelling is about exploring the complexity of social life.  How are these characters related to one another?  What is motivating them?  Are they deceived about one another?  What are their underlying strategies?  Modern TV shows increasingly force viewers to probe ever-deeper social complexity to figure out what is going on.

Moreover, Johnson suggests, the greater cognitive complexity of TV shows today makes them more profitable, because now people can watch TV shows multiple times through reruns and see nuances that were not clear in the first viewing.  There are even fan sites on the Internet where fans can comment on the shows.  Think about "Game of Thrones," for example.  Or the Socratic comedy of "The Simpsons." (Thousands of students at the University of California at Berkeley have been introduced to the history of philosophy through a course on "The Simpsons and Philosophy.")

We might explain this through an evolutionary science of liberalism.  First, we have evolved to be storytelling animals (as Jonathan Gottschall has said), because storytelling is an evolved adaptation of the human mind for mentally simulating the complex problems of social life and imagining how best to navigate through that social complexity.  Popular culture like TV is largely entertaining storytelling that appeals to that evolved adaptation.  And in the ever-growing social complexity of a liberal pluralist society, embracing millions of individuals cooperating and competing with one another in spontaneous orders without any central planning, the storytelling becomes ever more cognitively challenging.

Another part of this is the amazing increase in average intelligence (as measured by IQ) in liberal societies over the past hundred years, which is called the Flynn Effect (named for James Flynn, who has written about it).  Apparently, liberal societies have brought increasing levels of education, and particularly the cognitive challenges of scientific education, which really has brought "Enlightenment," as people in liberal social orders have become smarter.  And this increasing intelligence has brought with it increasing moral intelligence, which Steven Pinker has identified as the "moral Flynn effect."  (I have written about this in posts here and here.)

"Have you watched American TV recently?"  Well, yes, we might answer, and we can see evidence there that the "last man" of American liberal culture is far smarter than Nietzsche predicted.

Wednesday, May 09, 2018

Nietzsche, Nazism, and the Alt-Right: Ronald Beiner's "Dangerous Minds"

            Adolf Hitler Staring at a Bust of Friedrich Nietzsche at the Nietzsche Archives

"Hail Trump!  Hail Our People!  Hail Victory!"

This was the famous exclamation of Richard Spencer at a gathering of the Alt-Right in Washington, DC, shortly after Donald Trump's electoral victory in 2016.  Paul Gottfried coined the term "Alternative Right" in 2008.  But Spencer claims to have originated the abbreviation "Alt-Right" in that year, and he has been one of the best known leaders of the Alt-Right movement as devoted to establishing what Spencer calls the "white ethnostate" for North America and Europe.  Spencer also claims to have originated the term "ethnostate," although this seems to be a variation on what Frank Salter has called the "ethnic state."  (At the bottom of this post, I've provided links to other posts on this and related topics.)

According to Spencer, this all started with Friedrich Nietzsche.  Spencer has said: "I was red-pilled by Nietzsche."  "Red-pilled" refers to a famous scene in the movie The Matrix, in which Keanu Reeves's character swallows a red pill that allows him to see that he and all of his fellow humans have been plugged into a delusional dream, and that he must free them from their dream.  So, to "red-pill" is the slang in the Alt-Right movement that refers to the moment when people see that all the ideals of liberal democracy--equality, liberty, pluralism, and peace--are delusional, and that the true reality of life is the racial and ethnic struggle for cultural dominance.  Spencer swallowed his red pill when he began reading Nietzsche as a college student at the University of Virginia; then later, as a graduate student at the University of Chicago, he began to study Leo Strauss, whom he saw as sympathetic to fascist thinking.

What he learned was that liberal egalitarian modernity was an expression of Christian slave morality as opposed to the master morality of Greek-Roman civilization, and that this slave morality was responsible for the decadence of Western culture as promoting the dehumanizing degradation of what Nietzsche called "the last man"--the man who lives an ignoble life of safe and comfortable pleasures with no aspiration for heroic achievement.  To overcome this decadence of liberalism, we need a new nobility of elite Supermen who can create an illiberal culture of pagan master morality in which the strong rule over the weak.

Now we have Ronald Beiner's new book--Dangerous Minds: Nietzsche, Heidegger, and the Return of the Far Right (University of Pennsylvania Press, 2018)--in which he traces the intellectual history that runs from Nietzsche to Martin Heidegger to fascism and Nazism and, finally, to the recent resurgence of fascism in the Alt-Right and other illiberal authoritarian movements across Europe and Russia.

Beiner's argument for the intellectual links between Nietzsche, Heidegger, Nazism, and the newly resurgent fascist authoritarianism is persuasive.  An even more carefully detailed history of Nietzsche's place in the Third Reich is given by Steven Aschheim in Chapter 8 of his book The Nietzsche Legacy in Germany, 1890-1990 (University of California Press, 1992).  As Aschheim argues, it's an empirical fact of cultural history that Nietzsche was ideologically appropriated by Hitler and the Nazis as part of the official culture of the Third Reich.  But accepting this appropriation of Nietzsche by the Nazis as a fact of cultural history does not settle the question of whether their interpretation of Nietzsche was accurate or not.

Beiner rightly argues that even if the Nazi interpretation was mistaken, it was a misinterpretation that was promoted by Nietzsche himself in his most reckless writing.  Nietzsche said that the highest human being is the Dionysian artist-philosopher or Superman who exercises his will to power by tyrannically legislating new values for all of humanity.  He said that "slavery is . . . both in the cruder and in the more subtle sense, the indispensable means of spiritual discipline and breeding" (BGE, 188).  He said that the new nobility would require "merciless annihilation of everything that was degenerating and parasitical" (Ecce Homo, "Birth of Tragedy," 4).  He declared that European democracy must ultimately transform itself into "a new and sublime development of slavery," in which the "herd animal" is enslaved to the "leader animal" (Will to Power, 954, 956).  Thus, the democratization of Europe is "an involuntary arrangement for the breeding of tyrants--taking that word in every sense, including the most spiritual" (BGE, 242).  This tyrannical rule of the artist-philosophers will require "conscious breeding experiments," "terrible means of compulsion," and even "the annihilation of millions of failures."  This is necessary for the "domination of the earth" by a "new, tremendous aristocracy," in which "the will of philosophical men of power and artist-tyrants will be made to endure for millennia," and the "breeding of a new caste to rule over Europe" will unify it into "one will" (BGE, 208, 251; Will to Power, 764, 954, 960, 964).  "What is good? Everything that heightens the feeling of power in man, the will to power, power itself. What is bad? Everything that is born of weakness. . . . The weak and the failures shall perish: first principle of our love of man.  And they shall be given every possible assistance.  What is more harmful than any vice? Active pity for all the failures and all the weak: Christianity" (The Antichrist, 2).  
There is plenty here to inspire Hitler, the Nazis, and the Alt-Right.

Moreover, as Beiner indicates, Nietzsche foresaw that this would happen.  In a letter, he wrote: "The sort of unqualified and utterly unsuitable people who may one day come to invoke my authority is a thought that fills me with dread.  Yet that is the torment of every great teacher of mankind: he knows that, given the circumstances and the accidents, he can become a disaster as well as a blessing to mankind."  Beiner asks: "Well, if Nietzsche was so terrified about this, why didn't he simply exercise more responsibility or more prudence about how he wrote?  There's no good answer to this question" (63).

But here I see the first of two weak points in Beiner's argument.  He speaks of the "insane recklessness" and "extreme lunacy" of Nietzsche's writing that attracts people like Hitler and Spencer (63).  But while Beiner sees this in the early and late writings of Nietzsche (28-34), he passes over the middle writings--particularly, Human, All Too Human and Dawn--in silence, and so he does not notice that the writings of Nietzsche's middle period do not show the "insane recklessness" and "extreme lunacy" of his other writings.

In fact, some of the Nazi writers who read Nietzsche carefully noticed that his middle writings contradicted Nazi ideology.  For example, Heinrich Hartle's Nietzsche and National Socialism (Nietzsche und der Nationalsozialismus) was an official Nazi book published by the central Nazi publishing house in 1937 and 1944.  Hartle argued that the National Socialists would have to separate those ideas in Nietzsche's books that supported Nazi ideology from those that did not; and in particular, the Nazis would have to reject the teachings in Nietzsche's middle writings that supported liberal democratic individualism rather than statist collectivist authoritarianism.

In many posts over the years, I have argued that Nietzsche's middle writings show a Darwinian aristocratic liberalism that contradicts the Dionysian aristocratic radicalism of his early and late writings, and it's only the latter that inspires the Nazis and the fascists.

In his middle writings, Nietzsche respects the freedom provided by liberal democracy, which includes freedom for "free spirits"--philosophers and scientists--to live their lives of intellectual inquiry without persecution, while also allowing the great multitude of people to live their lives free from tyrannical exploitation.  In contrast to his early and late writings, Nietzsche here sees liberal modernity as ennobling rather than degrading or dehumanizing.

Beiner ignores this, which leads him into what I see as the second weak point in his argument--he accepts the claim of Nietzsche in his later writings that liberalism necessarily leads to the decadence of the "last man," and he refuses to even consider the empirical evidence against this claim.

Beiner insists that life in liberal modernity is "profoundly dehumanizing" and "a profound contraction of the human spirit" (10).  In any liberal society, "the whole experience of life spirals down into unbearable shallowness and meaninglessness" (11).  He says that as a college student in Canada, he first read Nietzsche as an "antidote to growing up amid the banality and conformism of suburban life in North America" (16).  The reason for all this degradation of life in Canada and all other liberal societies is that liberalism's "excessive openness and the exploding of fixed horizons" creates "horizonlessness" (25, 28).  Consequently, there is "a form of life where privileged horizons, horizons that sustain a definite understanding of the point of existence, have ceased to exist" (35).  This brings "spiritlessness" and "a total extermination and uprooting of culture," so that culture as such becomes impossible (30, 34, 144).  No one in a liberal society can escape this "spiritual void," because "everyone suffers from this horizonlessness" (38, 132).  So life becomes meaningless for everyone who lives in a liberal society.  It is therefore easy to understand the popular appeal of Nietzschean fascists and Nazis who offer what Heidegger called "spiritual renewal."

So while Beiner thinks that Nietzsche's "solutions" for the problem of liberal decadence are "all nonsense or lunacy," he also thinks that Nietzsche's "cultural diagnosis" of the problem is "not nonsense" (24).  This leads to Beiner's final conclusion at the end of his book: "I don't rule out the possibility that Nietzsche and Heidegger successfully articulate aspects of spiritual or cultural vacuity in the liberal egalitarian dispensation that defines modernity.  But what they offer by way of new dispensations to supplant spiritless modernity is far worse" (134). 

Well, if the illiberal alternatives to liberalism are far worse, then doesn't that mean that liberalism is better?  But how can liberalism be better if it only promotes "spiritual or cultural vacuity"?

And what should we say about poor Professor Beiner at the University of Toronto whose whole life has been meaningless because of the "spiritlessness" of Canadian liberal society?  Not only has he been forced to live the life of the "last man," he has learned from reading Nietzsche that he is a "last man" living a despicably degraded life, and so he must suffer from self-loathing.  Or does his capacity for self-loathing show that he is not a "last man"?

I don't believe that Beiner and all of his fellow Canadians have lived meaningless or spiritless lives, because I don't believe that a liberal society like Canada forces everyone to become Nietzsche's "last man."  I see no way to settle this disagreement between me and Beiner except by looking at the factual evidence of how people live in liberal societies to see if they live well or badly.  Amazingly, however, Beiner never offers any factual evidence to support his claim that everyone in a liberal society suffers from a meaningless or spiritless life.  In this way, his rhetorical strategy is exactly the same as other recent critics of liberalism--like Steven Smith and Patrick Deneen--who cite the claims of anti-liberal cultural critics that liberal bourgeois modernity is dehumanizing, and then assume the truth of those claims without considering any of the relevant empirical evidence.

In some of my previous posts, I have surveyed the empirical evidence that the Liberal Enlightenment has promoted human progress by fostering the good character--the moral and intellectual virtues or what Deirdre McCloskey calls the "bourgeois virtues"--that promote human happiness or flourishing.  For example, one can see the correlation between the Human Freedom Index and the World Happiness Report, which shows that liberal regimes tend to be high in both freedom and happiness, and the illiberal regimes tend to be low in both freedom and happiness.

In Enlightenment Now, Steven Pinker argues for the stunning success of the Liberal Enlightenment as shown by massive factual evidence (conveyed in 73 charts of statistical data) of human progress over the past 200 years: because of liberalism today more human beings are living longer, healthier, wealthier, freer, safer, more stimulating, and generally happier lives than human beings have ever lived at any time in history.

Beiner is silent about all of this evidence for the flourishing of human life in liberal societies. 

He is also silent about the evidence of social history that denies his claim that in liberal societies, it is impossible for people to live in moral communities with "horizons that sustain a definite understanding of the point of human existence" (35).  Consider, for example, the social history of voluntary religious communities like the Amish, the Hasidic Jews, or the Mormons, who have become some of the fastest growing religious groups in the United States.  Beiner suggests that the only way to have "viable horizons" is through "legislating authoritative horizons whose only authority is the act of legislation itself" (57).  But groups like the Amish illustrate how in liberal societies moral and religious horizons arise in families and voluntary associations (churches, schools, clubs, friendships, and so on) without being coercively legislated.  In liberal societies, people can always exercise "The Benedict Option" (as Rod Dreher calls it)--they can form self-governing communities of people dedicated to some shared vision of moral or religious excellence.  The importance of such character formation for liberal political theorists is evident, for example, in texts such as John Locke's Some Thoughts Concerning Education and Adam Smith's Theory of Moral Sentiments.

The evidence of social history also shows that liberal societies provide the intellectual freedom of thought that cultivates the life of the mind in philosophy and science.  Beiner seems to deny this by agreeing with Heidegger that part of the shallowness of life in a modern liberal society is that people are distracted from plumbing the depths of the mysterious question of Being--why is there something rather than nothing?  Thus, people do not engage in the "heroic thinking" that constitutes true philosophy (70-91).  But, in fact, Heidegger's question of Being--of why or how something comes from nothing--has become a fundamental question for modern philosophy and science--particularly in response to the scientific theory of the Big Bang as the origin of everything from nothing.

Beiner is also silent about the evidence of political history that shows the spirited heroism of liberal societies.  He speaks about the emotional appeal of Hitler's heroism (130-31), but he says nothing about the liberal heroism of Winston Churchill in leading Great Britain to resist and finally defeat Hitler.

The history of liberalism is to a large extent the history of spirited resistance to tyranny and courage in war.  The Declaration of Independence was a declaration of Lockean liberalism that was also a declaration of war.  The American Civil War under the heroic leadership of Abraham Lincoln became a test of whether people in a liberal society were courageous enough to fight and die for the emancipation of slaves and a "new birth of freedom."

In Great Britain, John Stuart Mill saw Lincoln's leadership in the war as a vindication of the moral heroism of people in a free society.  In "The Contest in America" (1862), Mill wrote:
"I cannot join with those who cry Peace, peace.  I cannot wish that this war should not have been engaged in by the North . . . . War, in a good cause, is not the greatest evil which a nation can suffer.  War is an ugly thing, but not the ugliest of things: the decayed and degraded state of moral and patriotic feeling which thinks nothing worth a war, is worse.  When a people are used as mere human instruments for firing cannon or thrusting bayonets, in the service and for the selfish purposes of a master, such war degrades a people.  A war to protect other human beings against tyrannical injustice; a war to give victory to their own ideas of right and good, and which is their own war, carried on for an honest purpose by their free choice--is often the means of their regeneration.  A man who has nothing which he is willing to fight for, nothing which he cares more about than he does about his personal safety, is a miserable creature, who has no chance of being free, unless made and kept so by the exertions of better men than himself.  As long as justice and injustice have not terminated their ever renewing fight for ascendancy in the affairs of mankind, human beings must be willing, when need is, to do battle for the one against the other."
This doesn't sound like the degraded and meaningless life of the "last man."

Here are links to some of my posts that elaborate some of my points here:

Nietzsche's middle period:  here, here, here, here, here, and here

Nazi philosophers:  here and here

The Alt-Right ethnic state:  here and here

Leo Strauss and Nazism:  here and here

Patrick Deneen and the Amish:  here and here

Rod Dreher and the Benedict Option:  here

Steven Smith:  here and here

Deirdre McCloskey and the bourgeois virtues:  here, here, and here

Steven Pinker and liberal progress:  here and here

The Human Freedom Index:  here

Empirical Human Progress through the Liberal Enlightenment:  here

Heidegger's question of something from nothing:  here and here

Thursday, May 03, 2018

Nietzsche's Critique of Jordan Peterson's Nietzschean Religion

The first three YouTube videos here are short. The fourth is a compilation of videos that is longer--about 55 minutes.

Oh, I know, many of you think I have already written too much about Jordan Peterson. So you can skip this post. And I promise this will be my last one on Peterson.

These videos show Peterson presenting his interpretation of Friedrich Nietzsche's proclamation of the death of God as creating a problem for morality--particularly, the Western morality of natural rights or human rights as founded on the sacred dignity of all individuals.  Peterson claims that this shows that morality is impossible without a grounding in a transcendent religious metaphysics.  Even those who think they are scientific atheists--like Richard Dawkins and Sam Harris, for example--are actually acting out their implicit practical belief in Christian metaphysics, because they embrace a Christian morality of natural individual rights.  This shows that "we're running on the fumes of Christianity in the West."  Or to use another metaphor, we're living inside the corpse of a whale (the dead God), and there has been plenty for us to eat; but we don't realize that soon there will be nothing left for us to eat.

People like Dawkins and Harris think that their morality can be based on pure rationality--the rational science of the Enlightenment.  But in fact, as Dostoevsky shows in Crime and Punishment, there's nothing irrational about choosing to become a psychopathic murderer (like Raskolnikov): It's perfectly rational to choose to take whatever you want whenever you want it from others without any concern for their welfare, as long as you can escape punishment.  Dostoevsky is showing us that this is what we would do if we truly were atheists.

This is the Ring of Gyges argument in Plato's Republic: if I could make myself invisible, so that I could steal, cheat, and murder for my pleasure, without ever getting caught, why not?  As Dostoevsky declared: If God is dead, then everything is permitted.  This explains why Peterson thinks he has to appeal to a Nietzschean/Jungian religion--an atheistic religion--to solve the problem of morality collapsing into nihilism if there is no religiously grounded morality.

As indicated in his lectures and in Maps of Meaning (6-7), Peterson's two favorite passages from Nietzsche are from Twilight of the Idols (ix.5) and The Gay Science (sec. 125).  In the first passage, Nietzsche ridicules George Eliot and the English generally for thinking they can deny the existence of the Christian God while keeping Christian morality, without realizing that Christian morality must be a command of God--its origin is transcendental--and therefore the death of God must bring the death of Christian morality. 

The second passage is Nietzsche's first statement of his famous declaration that "God is dead."  What is notable about this passage, as Peterson points out, is how Nietzsche laments this as a disaster for humanity: "What did we do when we unchained this earth from its sun? Whither is it moving now? Whither are we moving now? Away from all suns? Are we not plunging continuously? Backward, sideward, forward, in all directions?  Is there any up or down left?  Are we not straying as through an infinite nothing?  Do we not feel the breath of empty space? Has it not become colder?  Is not night and more night coming on all the while?  Must not lanterns be lit in the morning?  Do we not hear anything yet of the noise of the grave-diggers who are burying God?  Do we not smell anything yet of God's decomposition? Gods too decompose."  Peterson dwells on Nietzsche's suggestion that the death of God means that there is no longer any up or down--without God to command what is right or wrong, there are no standards of higher or lower for us.

But then even as Peterson agrees with Nietzsche's claim that human morality depends on transcendent standards--a moral cosmology--Peterson also says that his whole position is embedded in a Darwinian evolutionary science that would seem to view human morality as founded on empirical standards--a moral anthropology.  This contradiction in Peterson's reasoning actually coincides with a contradiction in Nietzsche himself: the early and late writings show a longing for transcendence and religious redemption, while the middle writings show a Darwinian science that explains morality as rooted in evolved human nature.  (I have written about this in a series of posts from January to April 2013.)

That Peterson agrees with the middle Nietzsche in seeing morality as grounded on a Darwinian moral anthropology is clear in 12 Rules for Life.  Agreeing with my principle that the good is the desirable, Peterson writes:
"Think about it like this.  Start from the observation that we indeed desire things--even that we need them. That's human nature.  We share the experience of hunger, loneliness, thirst, sexual desire, aggression, fear, and pain.  Such things are elements of Being--primordial axiomatic elements of Being. But we must sort and organize these primordial desires, because  the world is a complex and obstinately real place.  We can't just get the one particular thing we especially just want now, along with everything else we usually want, because our desires can produce conflict with our other desires, as well as with other people, and with the world.  Thus, we must become conscious of our desires, and articulate them, and prioritize them, and arrange them into hierarchies.  That makes them sophisticated. That makes them work with each other, and with the desires of other people, and with the world.  It is in that manner that our desires elevate themselves.  It is in that manner that they organize themselves into values and become moral. Our values, our morality--they are indicators of our sophistication" (101-102).
Here Peterson seems to agree with me (and with Philippa Foot) that morality is a system of hypothetical imperatives that depend on human interests and desires.  Morality is informed desire.  The good is the desirable, and reason judges how best to satisfy the desires over a whole life, which often requires settling conflicts between desires by judging how one desire fits with others in some deliberate conception of a whole life well lived.  Hypothetical moral imperatives can be understood as following a given/if/then structure: Given what we know about our evolved human nature and our individual circumstances,  if we want to live a flourishing human life, then we must organize the satisfaction of our desires into a coherent plan of life, which requires the moral and intellectual virtues.

But then immediately after the passage just quoted, Peterson says that we need to move to a deeper level of morality to see how the "ultimate values" depend on religion, which is what Plato meant by the transcendent "Idea of the Good."  So all of our morality depends on our religious beliefs.  And if someone objects, "But I'm an atheist," Peterson will answer:
"No, you're not (and if you want to understand this, you could read Dostoevsky's Crime and Punishment, perhaps the greatest novel ever written, in which the main character, Raskolnikov, decides to take his atheism with true seriousness, commits what he has rationalized as a benevolent murder, and pays the price). You're simply not an atheist in your actions, and it is your actions that most accurately reflect your deepest beliefs--those that are implicit, embedded in your being, underneath your conscious apprehensions and articulable attitudes and surface-level self-knowledge. You can only find out what you actually believe (rather than what you think you believe) by watching how you act.  You simply don't know what you believe, before that. You are too complex to understand yourself" (103).
Although Peterson offers this as some profound insight, it's really quite ridiculous.  The argument runs like this: the only reason we don't commit murder is that we believe God commands us not to murder.  So if we believed that God was dead, we would commit murder.  Therefore, if we don't commit murder, our actions show that we are not atheists.  But then, eventually, as modern atheism becomes such a deeply felt belief that it is expressed in our actions--once we have consumed God's corpse, and there's nothing more to eat--we should expect that we will all become murderers.

If this were true, we would expect to see empirical historical evidence that religious belief is correlated with a low homicide rate, and declining religious belief is correlated with a high homicide rate.  But as we've seen in many previous posts, there is a lot of evidence for declining violence over the past centuries, with some of the steepest declines in the less religious countries. 

In fact, even Peterson cites Steven Pinker's Better Angels of Our Nature as supporting this conclusion: "The probability that a modern person, in a functional democratic country, will now kill or be killed is infinitesimally low compared to what it was in previous societies (and still is, in the unorganized and anarchic parts of the world)" (58).  Oddly, Peterson does not notice how this contradicts his prediction that the modern death of God must necessarily turn us all into murderous Raskolnikovs.

It's surprising to me that in all the commentary on Peterson that I have read, no one has pointed out this fundamental contradiction in his arguments.

There is another aspect of this fundamental contradiction.  On the one hand, Peterson insists that the domain of science as the study of objective facts is completely separated from the domain of religion as mythic storytelling about subjective values (34-35, 188).  On the other hand, he accepts the "social brain" hypothesis of evolutionary psychology as explaining the evolution of religious belief as expressing the "hyperactive agency detection device" in our brains (38-40).  Peterson doesn't recognize that this evolutionary theory of religious belief was first proposed by Darwin in The Descent of Man and by Nietzsche in Human, All Too Human (as I have indicated in posts here and here).  Nor does he recognize the contradiction in asserting that science both can and cannot study religious belief.

I have elaborated my criticisms of the claim that the death of the Christian God means the death of morality here, here, and here.

Friday, April 27, 2018

Jordan Peterson as the Nietzschean Hero of His Atheistic Personal Religion

Jordan Peterson shows a conflict in his soul between scientific atheism and religious longing.  That explains his fascination with Friedrich Nietzsche and Carl Jung, who struggled with the same conflict.  And like both Nietzsche and Jung, Peterson has tried to resolve this conflict by becoming the redeeming hero of his own atheistic mythopoetic religion.  One might say that he has succeeded in this in so far as he has attracted followers around the world--particularly young men--who say that he has saved their lives. 

Many Judeo-Christian religious believers have been impressed by the way he uses his interpretations of the Bible to convey his moral message about how young men need to grow up and take responsibility for their lives as they learn how to "walk with God."

But isn't there something deeply delusional about such atheistic religiosity--a fake religion offering a fake redemption for atheists who want religious feelings and religious morality, but without having to believe any religious doctrines, such as the existence of God?  I have raised that question in some of my posts on Nietzsche (here, here, here, here, and here)  and Roger Scruton (here and here).

Peterson's second book--12 Rules for Life: An Antidote to Chaos--became a worldwide bestseller as soon as it was published in January.  How exactly do his "12 rules for life" provide "an antidote to chaos"?  In his first book--Maps of Meaning--he explained that only the Hero can give us the antidote to chaos:
"Terrible, chaotic forces lurk behind the fa├žade of the normal world.  These forces are kept at bay by maintenance of social order.  The reign of order is insufficient, however, because order itself becomes overbearing and deadly, if allowed unregulated or permanent expression.  The actions of the hero constitute an antidote to the deadly forces of chaos, and to the tyranny of order.  The hero creates order from chaos, and reconstructs that order when necessary.  His actions simultaneously ensure that novelty remains tolerable and that security remains flexible" (91).
And who is this Hero who creates order from chaos?  Well, of course, it's Jordan Peterson!

The first epigraph for Maps of Meaning is from Jesus: "I will utter things which have been kept secret from the foundation of the world" (Matthew 13:35).  This is followed by a Preface entitled "Descensus ad Inferos,"  in which Peterson tells the story of his life and how he discovered the secret teachings that he will now reveal to the world.  This discovery required that he "descend into the underworld"--into the darkest depths of his unconscious--which was required for him to become the "revolutionary hero" (279, 457).

In the Preface, Peterson describes how as a young man he dreamed "absolutely unbearable dreams" and had terrifying visions that made him depressed, anxious, and even suicidal.  He thought he was falling into insanity.

Coming home late one night from a college drinking party, Peterson says he was "self-disgusted and angry," and then he felt compelled to paint a picture: "I sketched a harsh, crude picture of a crucified Christ--glaring and demonic--with a cobra wrapped around his naked waist, like a belt.  The picture disturbed me--struck me, despite my agnosticism, as sacrilegious. I did not know what it meant, however, or why I had painted it. Where in the world had it come from?  I hadn't paid any attention to religious ideas for years. I hid the painting under some old clothes in my closet and sat cross-legged on the floor. I put my head down.  It became obvious to me at that moment that I had not developed any real understanding of myself or of others" (xix).

Since his dreams and visions seemed religious, he started reading Jung, because he had heard that Jung had become an interpreter of religion and myth.  Although he could not understand much of what he read, he was impressed by this observation by Jung: "It must be admitted that the archetypal contents of the collective unconscious can often assume grotesque and horrible forms in dreams and fantasies, so that even the most hard-boiled rationalist is not immune from shattering nightmares and haunting fears."  This suggested an explanation for his disturbing visions of religious imagery.

I can understand what Peterson is describing, because I had a very similar experience as a college student.  I had many disturbing dreams that might have indicated a crisis of religious belief.  To interpret my dreams, I studied many of Jung's writings intensely for about a year.  I wrote out my dreams and tried to interpret them through Jungian psychology.  It was fascinating and upsetting at the same time.  But then as I saw more of the spooky occult ideas in Jung--ancient mystery religions, spiritualist communication with the dead, astrology, alchemy, and so on--I was bothered by what looked like a weird religious mishmash dressed up as science.  So I turned away.

Peterson went much deeper with his Jung studies than I ever did, and he was able to see that his picture of a crucified Christ with a snake wrapped around him was an archetype from the collective unconscious--an archetype of the Savior assimilated to the serpent that is both fascinating and terrifying as symbolizing death, judgment, and rebirth: "The ideal of the Savior necessarily implies the Judge--and a judge of the most implacable sort--because the Savior is a mythological representation of that which is ideal, and the ideal always stands in judgment over the actual.  The archetypal image of the Savior, who represents perfection or completion, is therefore terrifying in precise proportion to personal distance from the ideal" (472).

Notice what this means.  Jesus is not really the divine Son of God whose crucifixion promises divine redemption of human beings, so that they can ascend after death to eternal life in Heaven.  Rather, Jesus has become a mythic symbol of human self-redemption through heroic devotion to an ideal ethical life on Earth.

Peterson explains that the teachings of Jesus Christ "signified transition of morality from reliance on tradition to reliance on individual conscience--from rule of law to rule of spirit--from prohibition to exhortation."  "What principle is rule of spirit, rather than law, predicated upon? Respect for the innately heroic nature of man. . . . All behaviors that change history, and compel imitation, follow the same pattern--that of the divine hero, the embodiment of creative human potential. . . . it becomes possible for the creative individual to mimic, consciously incarnate, the process of world-redemption itself" (395-97).  Christ taught us to "put truth and regard for the divine in humanity above all else, and everything you need will follow. . . . that the hero must be incorporated into each individual--that everyone must partake of the essence of the savior" (398).

Peterson learned from Jung that "the central ideas of Christianity are rooted in Gnostic philosophy"--in the Gnostic secret teaching that the true inner self of every individual is divine. According to the Gospel of Thomas, a Gnostic text, Christ said that "the kingdom of heaven is spread out upon the earth, but men do not see it" (456).  For human beings to achieve heaven on earth, they must learn from the "mythological worldview, which specifically attributes divine status to the individual," and they must accept "the absolute personal responsibility imposed in consequence of recognition of the divine in man" (466).

Peterson makes it clear that in speaking of the divinity of individuals, he is not identifying divinity as some supernatural or superhuman reality.  Rather, divinity is nothing more than the creative capacity of human individuals in their subjective experience of mythic story-telling, by which they poetically create the gods (466-67). 

This echoes Nietzsche: "And how many new gods are still possible! As for myself, in whom the religious, that is to say god-forming, instinct occasionally becomes active at impossible times--how differently, how variously the divine has revealed itself to me each time!" (Will to Power, sec. 1038). So Nietzsche's famous announcement of the death of God does not necessarily mean the death of religion as such.  "It seems to me that the religious instinct is indeed in the process of growing powerfully--but the theistic satisfaction it refuses with deep suspicion" (Beyond Good and Evil, sec. 66). 

Nietzschean religion, therefore, will be an a-theistic religion--a religion without any theistic belief in the existence of God.  That's the kind of atheistic religiosity that Peterson creates in his two books and in his YouTube videos that have attracted millions of readers and viewers.

That Peterson is indeed the founder of a new atheistic religion is indicated from the very beginning of 12 Rules for Life.  In his Foreword for the book, Norman Doidge compares Peterson's teaching us his 12 Rules to Moses' bearing the tablets of the Ten Commandments.  And just as the Bible presents the Mosaic law as part of a dramatic story that illustrates the rules and makes it easier to understand them, Doidge explains, so does Peterson's book tell stories and interprets myths that illustrate and explain his 12 rules.

Peterson sees three themes in all religions--suffering, limitation, and redemption.  Life is suffering.  This must be so because of the limitations of existence--human beings are physically and emotionally fragile, and they inevitably grow old and die, and so they will always suffer.  To overcome suffering, they must be redeemed or reborn for a new life.

Since these three religious themes are Peterson's main themes, he seems to be a religious teacher, and indeed much of his teaching about these themes comes from his interpretation of religious texts like the Bible.  But any reader who looks carefully at what he says about these themes will see that he never moves beyond the purely natural reality of human experience--either the objective human experience of the physical and social world or the subjective human experience of the human mind--and he thus implicitly denies that there is any superhuman or supernatural reality of God, gods, or life after death in Heaven or Hell.  He assumes that God really is dead.

Consider what he says about limitation.  He describes his thoughts about his three-year-old son Julian.  "He's three, and cute and little and comical.  But I am also afraid for him, because he could be hurt.  If I had the power to change that, what might I do?" (12 Rules, 341).  Julian could be made of titanium.  He could have a computer-enhanced brain.  And so on.  But even if this were technically possible, it wouldn't work.
"Artificially fortifying Julian would have been the same as destroying him.  Instead of his little three-year-old self, he would be a cold, steel-hard robot.  That wouldn't be Julian.  It would be a monster.  I came to realize through such thoughts that what can be truly loved about a person is inseparable from their limitations.  Julian wouldn't have been little and cute and lovable if he wasn't also prone to illness, and loss, and pain, and anxiety.  Since I loved him a lot, I decided that he was all right the way he was, despite his fragility" (341).
Notice that Peterson says nothing about the possibility that Julian could be resurrected from death and reborn for eternal life in Heaven, where Peterson could be reunited with him.  Notice also the implication that if the natural limitations of personal identity are inseparable from what makes a person lovable, then resurrecting our loved ones for eternal life without limitations in Heaven would not satisfy us, even if it were possible.  Peterson thus implies that the poet Wallace Stevens was correct in declaring: "Death is the mother of beauty."  (This thought that immortality might be both impossible and undesirable has been developed in some posts here, here, and here.)

What Peterson says about how "existence and limitation are inextricably linked" also implies that God cannot exist.  Peterson speaks about one of his clients whose husband was dying of cancer.  She was troubled by the prospect of his death as she asked "Why?"  Peterson says that the best answer he could give her was to speak about "the tight interlinking between vulnerability and Being."  He told her an old Jewish story: Imagine a Being who is omniscient, omnipresent, and omnipotent.  What does such a Being lack? The answer? Limitation. 

Peterson explains: "If you are already everything, everywhere, always, there is nowhere to go and nothing to be.  Everything that could be already is, and everything that could happen already has.  And it is for this reason, so the story goes, that God created man.  No limitation, no story.  No story, no Being. That idea has helped me deal with the terrible fragility of Being" (343).

Peterson illustrates this thought with the story of the evolution of the comic book superhero Superman.  Over a long period, Superman's powers were expanded, until finally he became invulnerable.  But then he was boring!  "A superhero who can do anything turns out to be no hero at all. He's nothing specific, so he's nothing.  He has nothing to strive against, so he can't be admirable. Being of any reasonable sort appears to require limitation. Perhaps this is because Being requires Becoming, as well as mere static existence--and to become is to become something more, or at least something different.  That is only possible for something limited" (345).

Although he does not explicitly say so, he allows his reader to draw the conclusion: if existence and limitation are inseparable, then God--a Being without limitation--cannot exist.

Consider also what Peterson says about the religious theme of redemption. For many religious traditions, redemption points to the promise of immortality in an afterlife, and perhaps judgment leading to either eternal reward in Heaven or eternal punishment in Hell.  Peterson does speak often of Heaven and Hell.  But he clearly indicates that he has "no afterlife fantasy" (220), and therefore Heaven and Hell are purely earthly experiences in this life (see 63, 159, 109-110, 172, 190, 198, 200, 217-24, 351, 359).  By the way they live, human beings create either Heaven or Hell for themselves and others in their daily lives now.

Consequently, for Peterson, Jesus Christ becomes "the archetypal perfect man" (191), which means that Christian redemption for eternal life becomes human self-redemption for a good human life on Earth.  This is clear, for example, in this passage:
"In the Christian tradition, Christ is identified with the Logos.  The Logos is the Word of God.  That Word transformed chaos into order at the beginning of time.  In His human form, Christ sacrificed himself voluntarily to the truth, to the good, to God.  In consequence, He died and was reborn.  The Word that produces order from Chaos sacrifices everything, even itself, to God.  That single sentence, wise beyond comprehension, sums up Christianity.  Every bit of learning is a little death.  Every bit of new information challenges a previous conception, forcing it to dissolve into chaos before it can be reborn as something better.  Sometimes such deaths virtually destroy us.  In such cases, we might never recover or, if we do, we change a lot. A good friend of mine discovered that his wife of decades was having an affair.  He didn't see it coming. It plunged him into a deep depression. He descended to the underworld.  He told me, at one point, 'I always thought that people who were depressed should just shake it off. I didn't have any idea what I was talking about.'  Eventually, he returned from the depths.  In many ways, he's a new man--and, perhaps, a wiser and better man.  He lost forty pounds. He ran a marathon. He travelled to Africa and climbed Mount Kilimanjaro. He chose rebirth over descent into Hell" (223-24).
Peterson's Christian readers will like what he says in the first half of this paragraph about interpreting John 1:1: "In the beginning was the Word, and the Word was with God, and the Word was God."  But any careful reader who is an orthodox Christian will be disturbed by the second half of this paragraph, which turns the Christian doctrine of redemption by Jesus Christ into an atheistic doctrine of how human beings can symbolically imitate Christ by the way they learn from suffering how to transform themselves for living a better life.

No doubt there is practical wisdom here about developing one's character in the face of suffering so that one can live a better life, which is why Peterson's book has become an international bestseller for readers looking for a self-help manual that is also philosophically deep.  Most serious readers of 12 Rules will agree, I think, with Peterson's hope that this book can "help people understand what they already know: that the soul of the individual eternally hungers for the heroism of genuine Being, and that the willingness to take on that responsibility is identical to the decision to live a meaningful life" (xxxiv-xxxv).

But those readers attracted to this because they think it's a religious teaching--perhaps even a Biblical teaching--are deceiving themselves, and it's a deception intended by Peterson as part of his Nietzschean project for founding a new atheistic religion.

Wednesday, April 25, 2018

Jordan Peterson's Unscientific Faith in Carl Jung

The deepest flaw in Jordan Peterson's argumentation is his unscientific faith in Carl Jung. 

I speak of this as faith because I am persuaded by Richard Noll's evidence--in The Jung Cult: Origins of a Charismatic Movement (1994) and The Aryan Christ: The Secret Life of Carl Jung (1997)--that the Jungian intellectual movement is a religious cult based on Jung's charismatic authority.  And I speak of this as unscientific because neither Jung nor Peterson has presented a scientific argument based on empirical evidence to support the Jungian idea of the collective unconscious as containing archetypes that transcend personal experience.

Peterson's commitment to the Jungian cult becomes especially troubling once one notices that Peterson's religious teaching is actually a Jungian religion of atheism derived from Nietzsche's Dionysian religion.  Peterson's Nietzschean and Jungian religion will be the subject of my next post. But here I will point to his unreasonable acceptance of Jung's psychology.

The crucial influence of Jung--as well as Nietzsche--is evident in both of Peterson's books.  In Maps of Meaning, he says: "Many people--some with an outstanding academic reputation--have cautioned me against discussing Jung, warned me about even mentioning his name in the academic context" (401).  But he suggests that this scorn for Jung is based on nothing more than irrational prejudice.  "I have never met someone," he claims, "who actually understood what Jung was talking about and who was simultaneously able to provide valid criticism of his ideas" (401-402).

Amazingly, he provides no support for this claim that no one has ever offered any valid criticism of Jung's ideas.  He never even refers to any of the critics of Jung--such as Noll or Andrew Neher ("Jung's Theory of Archetypes: A Critique," Journal of Humanistic Psychology 36 (Spring 1996): 61-91).

The fundamental idea of Jung's theory of psychology is that most of our unconscious mind arises not from our personal experiences as shaped by our particular culture, but from the impersonal collective unconscious, which contains archetypes--latent concepts and images--that are biologically inherited and that represent a universal essence that transcends human experience and that is identical for all human beings.  These archetypes have evolved as the deposits in the psyche of the constantly repeated experiences in the typical situations of life beginning with the first human ancestors tens of thousands of years ago.

Jung saw evidence for this in his own experience and in the experience of other people.  For example, one of his most often cited cases is the story of the Solar Phallus Man, which Jung insisted was conclusive evidence for a collective unconscious.  In 1911, he first reported this case:
"Honegger discovered the following hallucination in an insane man (paranoid dement): The patient sees in the sun an 'upright tail' similar to an erected penis.  When he moves his head back and forth, then, too, the sun's penis sways back and forth in a like manner, and out of that wind arises.  This strange hallucination remained unintelligible to me for a long time until I became acquainted with the Mithraic Liturgy and its visions" (quoted in Noll 1994: 182).
Mithraism was a mystery religion in ancient Rome.  The Mithraic Liturgy was first discovered, Jung observed, in a Greek papyrus in Paris that was not published until after the Solar Phallus Man had his hallucination, which, Jung claimed, was therefore indisputable evidence that this man derived his vision from the collective unconscious.

In 1959, in a televised interview, Jung was asked about this case: "But how could you be sure that your patient wasn't unconsciously recalling something that somebody once told him?"  Jung answered: "Oh, no. Quite out of the question, because that thing was not known.  It was a magic papyrus in Paris, and it wasn't even published.  It was only published four years later, after I had observed it with my patient" (Noll 1994: 182).

Noll points out, however, that there were at least three accounts of the Mithraic Liturgy published prior to the hallucination of the Solar Phallus Man--in books by Johann Jakob Bachofen, Friedrich Creuzer, and Eugen Dieterich.  So it's possible that this hallucination was derived from this man's reading or hearing about this image from the Mithraic Liturgy.

In fact, Jung never conclusively ruled out the possibility that all of his cases that seemed to show archetypical mythic images and stories arising from the collective unconscious could be better explained through the cultural diffusion of myths and symbols.

Peterson says nothing about this. Nor does he say anything about Jung's strange claims in the 1930s about National Socialism being an expression of the German God Wotan.  Jung never explained how Wotan could be a universal archetype and yet a distinctively German archetype.  And since Peterson has been so concerned with explaining the evil of totalitarian movements like Nazism, it's surprising that he never reflects on Jung's apparent endorsement of Nazism as a reawakening of German pagan religion.

Thursday, April 19, 2018

Jordan Peterson on Lobster Hierarchy: A Response to P. Z. Myers' Critique

P. Z. Myers, an evolutionary biologist, has attacked Jordan Peterson's account of lobster hierarchy as utterly stupid in its ignorance of Darwinian evolutionary science. Here are the videos.  The total time for all three is about thirty minutes.  The first one is eight and a half minutes.

Peterson argues that the similarities between lobster hierarchy and human hierarchy show that human hierarchy is rooted in an evolved human nature, and therefore that it cannot be a purely cultural construction of capitalist patriarchy, as some radical feminists have claimed.

Against this, Myers raises four objections.

(1)  Hierarchies in the animal world have not evolved to be fixed and identical, as Peterson claims, because they are variable in response to variable social circumstances; and therefore human hierarchies really are social constructions, and as such they are open to change.

(2) Peterson claims that the hierarchies of lobsters and human beings are the same in being derived from a common evolutionary ancestor, but this is denied by the logic and evidence of evolutionary science, which therefore refutes his assertion that human hierarchy is biologically determined.

(3) Peterson claims that the hierarchies of lobsters and human beings are the same in being based on the same nervous system that runs on serotonin, but this is denied by the fact that the nervous systems of lobsters and humans are very different, and by the fact that serotonin serves diverse functions in different nervous systems.

(4) Against Peterson's claim that all hierarchies are simple, linear, and competitive, Myers argues that in fact they are complex and nonlinear, and they are based not just on competition but also on cooperation.

All four objections fail because they are based on a straw-man fallacy: Myers is refuting claims that Peterson has not made.

Notice that like Cathy Newman, Myers is engaged in a dominance contest with Peterson.  For Myers, an intellectual discussion like this is an opportunity to show his superiority over those with whom he disagrees, as shown by his smug insulting dismissal of Peterson: "he is a loon!"  So Myers gives us a good illustration of what Peterson identifies as one of the eight kinds of conversation--the dominance-hierarchy conversation.  This debate over the idea of hierarchy is itself a manifestation of the natural human inclination to hierarchy.

I will concede that Peterson is not always as clear and explicit as he should be in laying out the evolutionary logic and evidence for his position.  So responding to Myers' critique forces us to clarify Peterson's argument.

Contrary to what Myers asserts, Peterson does not claim that in arguing for hierarchies as natural rather than purely social constructions, he is arguing for hierarchies being absolutely fixed and identical.

This should be clear in his use of the chess analogy in the Newman interview.  Hierarchy is like a chess game: there are lots of ways to play chess, but you can't break the rules of chess and continue to play chess.  Biological nature sets the rules of the game, but within those rules, you have a lot of leeway for individual and cultural variation.

Actually, the game analogy is even more complicated than this in 12 Rules, where he emphasizes that there are "many good games" of hierarchy (87, 303).  If you're losing in one game of hierarchy, you should look for other games where you have a better chance of winning.  So, for example, if Myers is a loser in many games of hierarchy, he can always play the YouTube video game and challenge Peterson, who is one of the highest ranking players of that game. 

Liberal pluralism promotes this by allowing a great diversity of hierarchical games for people to play, instead of the oppressive order in which there is only one game with few winners and many losers.

In explaining hierarchy, Peterson observes, there is no strict separation between nature and culture, because it is an "erroneous concept" that "nature is something strictly segregated from the cultural constructs that have emerged within it."  "There is little more natural than culture" (12 Rules, 14).  Thus, hierarchies really are "cultural constructs," but it is natural for human beings to culturally construct hierarchies.  So, against the nature/nurture dichotomy, Peterson argues for what I have called "nurturing nature": while we commonly separate nature and nurture or nature and art, animal nature--including human nature--must be nurtured if it is to reach its natural completion (Darwinian Natural Right, 36-44). 

Modern biology shows that innate traits in most cases are not absolutely fixed, because the observed phenotype emerges from the complex interaction of inborn potential, developmental history, and external physical and social environments.  Hierarchy is a natural propensity for human beings, as indicated by studies showing that even babies less than a year old recognize hierarchical relationships.  Yet the full expression of that innate propensity will emerge through the life history of each individual as shaped by cultural experience.

According to Myers, Peterson infers that since both lobsters and human beings have hierarchies, the ancient common ancestor of lobsters and human beings must have been hierarchical, which shows the ancient evolutionary lineage of human hierarchy.  It is easy for Myers to ridicule this claim as unsupported by the logic and evidence of evolutionary science.

But Myers ignores Peterson's suggestion in 12 Rules that the evolution of hierarchy among animals shows convergent evolution, which is the independent evolution of similar traits in species of different evolutionary lineages, where species in similar ecological niches facing similar problems have evolved similar solutions.  So, for example, the capacity for flight has evolved independently among insects, birds, and bats because flying was a similar solution for the similar problems they faced, and not because this trait was inherited from a common ancestral species.

Although Peterson does not explicitly speak about convergent evolution, his account of the evolution of hierarchy in 12 Rules suggests convergence.  He speaks about lobsters, wrens, chickens, wolves, lizards, dolphins, and humans as very different species, and yet they have faced a similar problem in evolutionary history--fighting over territorial resources--for which hierarchy was the solution.  "Over the millennia, animals who must co-habit with others in the same territories have in consequence learned many tricks to establish dominance, while risking the least amount of possible damage" (4).  The reference here to "learning" suggests gene-culture coevolution.  And the idea that hierarchy has evolved in species of different evolutionary lineages as a similar solution to the similar problem of territorial conflicts suggests convergent evolution.  Myers says nothing about this.

Myers ridicules the idea that hierarchy among both lobsters and humans can be explained as the product of a nervous system run on the neurotransmitter serotonin.  No nervous system runs on a single neurotransmitter.  Serotonin is a simple molecule that is ubiquitous in the living world, and it functions differently in different organisms and in different nervous systems.  Serotonin is found in bananas.  Does that mean that bananas are hierarchical?

"This man is lying to you!" Myers exclaims.

In 12 Rules, Peterson supports his account of serotonin and hierarchy by citing six articles on serotonin in lobsters and one survey article on serotonin and dominance in humans and other primates (371-72, nn. 5-10, 17).  Myers doesn't explain what is wrong with these articles or with Peterson's interpretation of them.

One of the cited articles--Ziomkiewicz-Wichary (2016)--really is a good brief summary of the research.  Other articles that Peterson does not cite provide good longer summaries of the research--Watanabe and Yamamoto (2015) and Van Vugt and Tybur (2016).  Myers is silent about this research.

This research does not claim that hierarchy can be explained by the action of serotonin alone, because there are many factors that influence the evolution of hierarchy (Van Vugt and Tybur 2016).  But in males high levels of serotonin do correlate with dominant behavior, and low levels of serotonin correlate with submissive behavior.  Dominant male vervet monkeys have twice the level of serotonin as subordinate monkeys.  If the dominant monkey is removed from a group, and certain subordinate monkeys are given tryptophan, a precursor of serotonin, or fluoxetine (Prozac), which increases synaptic concentrations of serotonin, the subordinate monkeys exhibit more dominant behavior (Raleigh et al. 1984, 1991; Raleigh and McGuire 1994).

The administration of serotonin to humans has a similar effect on social dominance.  Humans who have been administered tryptophan over 12 days begin to exhibit an increase in dominant behavior (Moskowitz et al. 2001).  When citalopram (a serotonin drug) is administered to human beings, these individuals are rated as more dominant by observers, and they also increase their eye contact when interacting with strangers (Tse and Bond 2002).

The serotoninergic system affects the recognition and establishment of social dominance in three ways.  Serotonin affects the processing of facial cues, so that dominant individuals react with less anxiety to angry and fearful faces.  It affects the processing of voice cues, so that dominant individuals are less responsive to angry voices.  And it affects the mechanisms of aggression and cooperation, so that individuals can achieve dominance by first increasing affiliative behavior to establish coalitions with some individuals, and then engaging in aggressive encounters with competing individuals (Ziomkiewicz-Wichary 2016).

Regrettably, Peterson does not present this research, which would strengthen his argument.  Myers says nothing about any of this research.

Against what he takes to be Peterson's position, Myers argues that human hierarchy is not simple but complex, not linear but nonlinear, not based only on competition but also on cooperation.  Moreover, Myers insists, male dominance does not exclude female power, because female sexual selection gives females the power of mate choice.

The problem, however, is that Peterson actually agrees with all of these points.  As I have already pointed out, Peterson sees human hierarchy as complex and nonlinear, because he sees that there are "many good games" of hierarchy that people can play, particularly in societies with liberal pluralism.

Myers says that the social pyramid hierarchy of premodern times leads to social inequity and long-term instability.  Peterson agrees with this when he contrasts hierarchies based only or primarily on power and those based on competence (12 Rules, 135).

Like Myers, Peterson stresses the importance of reciprocal cooperation in which people work together for "mutual betterment" (135).  Parents need to teach their children to be cooperative and thus "make their children socially desirable" (60, 124, 143).  To be successful in society, people need to be both cooperative and competitive.  "Cooperation is for safety, security, and companionship. Competition is for personal growth and status" (337).

Peterson also agrees with Myers that female mate choice matters for empowering females.  Peterson stresses the power of "human female choosiness," which causes so much anxiety for us males.  "It is Nature as Woman who says, 'Well, bucko, you're good enough for a friend, but my experience of you so far has not indicated the suitability of your genetic material for continued propagation'" (41).

This stands behind Peterson's point with Cathy Newman that his message to young men about the need to grow up and take responsibility for their lives benefits not just men but women as well:
"If they're healthy, women don't want boys.  They want men.  They want someone to contend with; someone to grapple with.  If they're tough, they want someone tougher.  If they're smart, they want someone smarter.  They desire someone who brings to the table something they can't already provide.  This often makes it hard for tough, smart, attractive women to find mates: there just aren't that many men around who can outclass them enough to be considered desirable (who are higher, as one research publication put it, in 'income, education, self-confidence, intelligence, dominance, and social position').  The spirit that interferes when boys are trying to become men is, therefore, no more friend to woman than it is to man. . . . And if you think tough men are dangerous, wait until you see what weak men are capable of" (332).
I'll be writing a few more posts on Jordan Peterson.


Moskowitz, D. S., G. Pinard, D. C. Zuroff, L. Annable, and S. N. Young. 2001. "The Effect of Tryptophan on Social Interaction in Everyday Life: A Placebo-Controlled Study." Neuropsychopharmacology 25: 277-89.

Raleigh, M. J., M. McGuire, G. I. Brammer, and A. Yuwiler. 1984. "Social and Environmental Influences on Blood Serotonin Concentrations in Monkeys." Archives of General Psychiatry 41: 405-410.

Raleigh, M. J., M. McGuire, G. L. Brammer, D. B. Pollack, and A. Yuwiler. 1991. "Serotonergic Mechanisms Promote Dominance Acquisition in Adult Male Vervet Monkeys." Brain Research 559: 181-90.

Raleigh, M. J., and M. T. McGuire. 1994. "Serotonin, Aggression, and Violence in Vervet Monkeys." In Roger D. Masters and Michael T. McGuire, eds., The Neurotransmitter Revolution: Serotonin, Social Behavior, and the Law, 129-45. Carbondale: Southern Illinois University Press.

Tse, W. S., and A. J. Bond. 2002. "Serotonergic Intervention Affects Both Social Dominance and Affiliative Behaviour." Psychopharmacology 161: 324-30.

van Vugt, Mark, and Joshua M. Tybur. 2016. "The Evolutionary Foundations of Status Hierarchy." In David M. Buss, ed., The Handbook of Evolutionary Psychology, 2: 788-809. Hoboken, NJ: John Wiley & Sons.

Watanabe, Noriya, and Miyuki Yamamoto. 2015. "Neural Mechanisms of Social Dominance." Frontiers in Neuroscience 9 (June), article 154, 1-14. Available online.

Ziomkiewicz-Wichary, Ania. 2016. "Serotonin and Dominance." In T. K. Shackelford and V. A. Weekes-Shackelford, eds., Encyclopedia of Evolutionary Psychological Science. Cham, Switzerland: Springer International.  Available online.