Friday, November 01, 2019

New book available! David Kaiser, A Life in History

Mount Greylock Books LLC has published my autobiography as an historian, A Life in History.  Long-time readers who want to find out how the author of this blog became the historian he is will find information about the book in a new blog.

My talk at the Harvard Coop last May 28 about A Life in History can be viewed here.  Enjoy! An interesting radio interview with a Denver talk show host about the book can be streamed or downloaded here.

The book can be ordered here.
I look forward to seeing your reactions. For the time being I am pinning this post. Thanks in any case to all of you for your faithful support.

Check below for more recent posts.

Saturday, July 20, 2019

Postmodernism 101

A Postmodernist Primer and its Implications for Our Time

Last February, the Resource Center Team of the Office of Diversity and Inclusion at Amherst College released a “Common Language Guide,” a series of definitions reflecting the ideologies that dominate many campuses today, to serve as “a guide to common, shared language around identity.” Conservative students at Amherst immediately passed the document to right-wing media outlets, a flap ensued, and the office withdrew the document, denying that it “represent[s] an official position of the college or an expectation that everyone on campus should use any particular language or share a point of view,” while claiming that it did illustrate “the ways in which many people at Amherst and beyond understand their own identities.”

The document is still widely available online.  It illustrates the enormous campus power of diversity bureaucracies at most of our major institutions, where they increasingly claim the right to critique course content and cross-examine faculty about things they have said in class. It also reveals a great deal about how contemporary academics think and how influential their thinking has become.  Although it looks simply like a set of definitions, many of the definitions have a political and moral content as well as a simply descriptive one.  And because the young people who populate our major media outlets, our artistic communities, and the Democratic Party have usually passed through leading colleges and universities, the ideas it embodies have worked their way into the American mainstream, with, in my opinion, tragic consequences.  I have never discovered a relatively short text that lays out the fundamentals of this ideology clearly and concisely, but the Amherst guide manages to do that without being systematic about it.  A number of key principles that are seldom if ever stated emerge from the definitions that the office clearly wanted its student body to accept.

The postmodern ideology that the common language guide embodies comes from several twentieth century thinkers, led by the Frenchman Michel Foucault, and revolves around a particular idea of power.  It does not see power primarily as physical force, but rather as something expressed, above all, through language.  Thus, the guide’s definition of power reads as follows:

“1. The ability to name or define.
“2. The ability to decide.
“3. The ability to set the rule, standard or policy.
“4. The ability to change the rule, standard or policy to serve your needs, wants or desires.
“5. The ability to influence decision makers to make choices in favor of your cause, issue or concern (YWCA) [sic]. Power can show up materially and immaterially, and in various domains, including: personal, social, institutional, and structural.”

The list begins with the ability “to name or define” because this ideology thinks that defining reality conveys more power than controlling it.  This does indeed reverse the traditional view of Enlightenment thought, that language is designed to describe the real world, not to create it.  We shall return later to the critical question of why this view has become so popular in the last three decades.  Meanwhile, I note that power can be “immaterial” as well as material, that is, it doesn’t necessarily imply physical coercion, or greater wealth, or authority over a certain sphere of human activity, but can merely be a supposed ability to control how people think.

The definition of “oppression,” on the previous page, identifies the holders of power, which belongs not to particular individuals, but to groups.

“A system for gaining, exercising and maintaining structural and institutional power for the benefit of a limited dominant group. An inequitable system where a select few hold material and social power and marginalized groups are coerced to participate in the system against their best interests. Oppression exists on the individual, interpersonal, institutional and ideological levels. There is no such thing as reverse oppression, because oppression is rooted in institutional power.”

The definition of “racism” identifies the group that holds power more specifically.

“A system of advantage and disadvantage based on the socially constructed category of ‘race’ and the idea of white racial superiority and black racial inferiority. Specifically within the United States, racism refers to white racial prejudice and power used to advantage white people over indigenous people, black people and people of color(IBPOC) and has been made possible by the historic and present-day unequal distribution of resources. Racism is enacted on multiple levels—institutional, interpersonal, individual and ideological—and can exist both consciously and unconsciously. Unconscious or covert racism is often hidden and not recognized as racial discrimination, whereas overt racism refers to conscious attitudes and intentions to harm and discriminate against IBPOC. Both covert and overt racism are forms of violence and are rooted in the idea of white supremacy.”

In an age when any of us can send in a swab and receive a breakdown of the tribal and racial origins of our particular genetic inheritance, one cannot help but be a bit surprised by the assumption that “race” is nothing but a socially constructed category, even if one believes, as I do, that the intellectual endowments of all racial groups are comparable.  More shocking, however, is the extraordinarily America-centric idea that racism only involves beliefs in white superiority over black people (although the definition immediately includes other “people of color”—that is, nonwhites—among its victims).  As a matter of historical fact, racism, whether defined as prejudice or as systematic oppression, has existed all over the world since the beginning of human history.  Many Asians remain convinced today not only that Asian civilizations are superior to others, but also that certain Asian peoples such as Chinese or Japanese are superior to other Asian peoples.  Both American Indian tribes and African tribes often regarded each other with the deepest hostility as well.  But here, racism connects only to “the idea of white supremacy,” and the rest of humanity receives a pass.  In the same way, while the guide defines misogyny as “A type of gender-based oppression founded in the belief that women are inferior to and must remain subordinate to men,” “misandry,” the parallel prejudice of women against men, does not appear in it.  This is because, as another entry on “reverse oppression” explains, “women cannot be ‘just as sexist as men,’ because they do not hold political, economic and institutional power.”

Gender, indeed, plays a much more important role in the guide than race.  Here is the definition of “male privilege”:

“A group of unearned cultural, legal, social and institutional rights extended to cisgender men based on their assigned-sex and gender. Cisgender men have access to institutional power, make the rules, control the resources and are assumed capable. Masculinity, as enacted by cisgender men, is universalized and viewed as the normative gender. Cisgender men are often unaware of their differential treatment (see Fragile Masculinity). While trans men, masculine-of-center women and nonbinary folks have access to benefits based on their proximity to hegemonic masculinity (see above definition), male privilege is reserved for cisgender men. This is particularly true for white cisgender men.”

The term “cisgender men” (like “cisgender women”) refers to the “assignment” of male or female to newborns, based on whether they have a penis on the one hand or a vagina on the other.  We shall return to this concept shortly.  Trans men refer to biological females (my term) who have declared themselves to be men, while “nonbinary folks,” a critical concept, refer to people who refuse to be defined as men or women, for reasons that we shall explore later.  I know, of course, that humanity includes a very small number of people born with indeterminate sex organs, and another small number who have always felt that they did not belong in the body they were born in, but the new view of gender goes way beyond them, as we shall see.  The historical ignorance of this definition, which assumes that all males, and particularly white males, have “access to institutional power, make the rules, control the resources and are assumed capable,” boggles the mind.  The vast majority of men around the world have never fit that description and do not now; they have struggled to eke out a stable existence on the best terms they can.  Later we shall see how this extraordinary view could have emerged and become so influential.  This definition, interestingly, also seems to claim that nonwhite men have more privilege and power than white women by virtue of their “gender assigned at birth,” although not as much as white men.

While I had already been familiar with much of the language and thinking behind the guide for years, it truly opened my eyes about gender issues, and particularly about the increasing numbers of young people who refuse to accept pronouns like “he” or “she” and claim non-traditional gender identities.  I had assumed that they felt a disconnect between their physical selves and their self-image, but the guide suggests something more is involved.  Many, including the authors of the guide, are rejecting traditional gender terminology not on emotional grounds, but on political ones.  This emerges very clearly from the definition of the “gender binary”:

“A socially constructed gender system in which gender is classified into two distinct and opposite categories. These gender categories are both narrowly defined and disconnected from one another. They are strictly enforced through rigid gender roles and expectations. Further, there is a hierarchy inherent to the classification, in which one gender, men/boys/masculinity, has access to power and privilege and the other, women/girls/femininity, is marginalized and oppressed. These classifications are seen as immutable; those assigned male at birth should identify as men and embody masculinity, and those assigned female at birth should identify as women and embody femininity. This binary system excludes nonbinary, genderqueer and gender-nonconforming individuals. All people are harmed by the gender binary system, but your place within the system determines the degree and quality of harm.  The gender binary is weaponized through conquest, colonization and continued occupation of indigenous people’s lands. The gender binary system is inherently violent and foregrounds all gender-based oppression.”

In other words, hospital personnel don’t put M or F on birth certificates simply to identify different biological types, but rather to segregate infants into the critical social categories of oppressor and oppressed, for which the terms “man” and “woman” are synonyms.  The penultimate sentence also suggests that the creation, and maintenance, of those categories is responsible for war, conquest, and racism (see above).  Those who choose to live outside the “gender binary” are not simply courageous iconoclasts; they are the only people in our society, it would seem, who want to escape from this traditional system of oppression.  Lest any readers think that I am overstating my case here, let me add the guide’s definition of “nonbinary”:

“An identity term for a person who identifies outside of the gender binary. A person whose beautiful existence transcends reductive binary constructs and works to annihilate gender and gender-based oppression once and for all.”  Rather than a minority that deserves our tolerance and respect, nonbinary folk emerge as the vanguard of the revolution that will lead us into a new utopia.

I turn now to some of the political implications of this world view.

The Enlightenment, the intellectual movement that dominated the western world from the late 18th until the late 20th centuries before starting to give way to the ideas embodied in the Amherst guide, took numbers and statistics very seriously.  It got into the habit of identifying species, or buildings, or systems of government, by the features which they had in common.  Eventually the Enlightenment gave equal political rights to larger and larger numbers of people, and entrusted the choice of political leaders to electoral majorities.  All these features of Enlightenment thought and institutions gave more weight to the most common attributes of human beings, and of other animals and plants, and of various distinct kinds of institutions, when they attempted to describe them.  Democratic politics in particular tend to favor the thoughts and feelings of the average or median individual, and political leaders, to take one key example, have a better chance of being elected if they endorse, and at least seem to embody, the values of the largest number of their voters.

The ideology of the Amherst guide stands this methodology on its head, because it denies certain realities of human existence.  Here is the very significant definition of “Cissexism”:

“The system of belief that cisgender individuals are the privileged class and are more natural, normal or acceptable than transgender, genderqueer, nonbinary and/or gender-nonconforming people.”

“Cisgender” individuals, to repeat, accept the definition of their gender that medical personnel put on their birth certificate.  They constitute well over 90% of the population.  While I believe that individuals who reject that definition, like all other individuals, deserve equal rights, that statistic makes “cisgender individuals” normal insofar as the term does in fact describe almost the entire population.  By normal I do not mean morally superior or praiseworthy, but simply enormously more common.  One could also make a strong argument that there is something natural about the tendency to identify as a man or a woman, given the frequency with which members of the human species have done so.  But to the new campus ideologues, the word “natural” always appears in quotes to indicate that it is an imposed category, and numbers mean less than nothing.  Indeed, as we have seen, the views of the overwhelming majority of “cisgender” individuals deserve less consideration, since they are collaborators in a system of oppression, the system that, in this view, “assigned” them a given gender, and thus a particular status, at birth.  This attitude towards statistical reality also emerges from the definition of “People of Color”: “An umbrella term for any individual belonging to a racially minoritized group.”  One does not belong to a minority by virtue of comparative numbers, but because the dominant culture has designated one’s group as a minority, hence the new verb, “minoritize.”  The postmodern movement has fought the idea of any particular person’s or group’s experience as “typical.”  I wonder whether a modern democracy can function without some such idea to bring us all together—combined, of course, with an equal respect for the rights of those who fall outside it.

Another idea runs through all these definitions:  that only the oppressed are truly virtuous.  I think that that idea has found its way into the Democratic Party, which is deeply influenced by what happens on campus.  The only virtuous men, according to the Amherst guide, are those who embody “healthy masculinity,” who “work in solidarity with marginalized gender identities to end gender-based oppression. They have an understanding of how their masculinity is impactful, and do the work of healing, undoing and preventing harm.”  The current controversy over the “squad” of four Democratic women in the House of Representatives—Alexandria Ocasio-Cortez, Rashida Tlaib, Ilhan Omar, and Ayanna Pressley—stems in part from the belief that their views deserve more weight because they are “women of color.”  The election of more women has become a good in itself among Democratic activists, whether to make total numbers within Congress more equal, or to give women more of a voice, or to give female children and adolescents more inspiration.  Many of these activists also favor writing off the votes of the white working class, which in their eyes has proven itself to be hopelessly racist and oppressive.  We shall find in the coming year how many of our fellow citizens accept these views.

Two other definitions from the guide have also found their way into our politics.  The guide defines “equality” as follows:

        “Treating everyone exactly the same. An equality emphasis often ignores historical and structural factors that benefit some social groups/communities and harm other social groups/communities.”

         The definition of “colorblindness” elaborates on this view:

         “The ideology that believes the best way to end racial discrimination is through treating individuals the same, regardless of race, culture and ethnicity. This belief, however, ignores historical and structural factors that benefit white people and disadvantage indigenous, black and all other people of color. ‘Colorblindness’ does nothing [sic] to address inequity, since it does not acknowledge the impacts of institutional and systemic racism on people of color.”

        Although modern democracies, like other known forms of government, have never treated everyone exactly equally, they have made progress in that direction, and I do not think they can continue to function if they abandon the ideal of equal treatment as their goal.

The guide also included a definition of “Legal/Illegal”—one that does not include the word “law”:

         “Highly racialized term to describe a person’s presence in a nation without government-issued immigration status. Not an appropriate noun or adjective to describe an individual. Often misused to designate certain undocumented members of a society (specifically people of color) to deny their contributions, right to exist and recognition as people within certain national boundaries.”

In a recent Democratic debate a number of candidates appeared to accept this definition, in practice, when they called for decriminalizing entry into the United States without permission and for giving illegal immigrants access to health care.  What history and current events both tell us is that immigrants, like everyone else, need legal status—rather than simply the moral glow that comes from life in a relatively poor region—to assure them of basic rights.  As a matter of fact, a whole new school of legal thought, critical legal studies, tends to argue that the whole Anglo-American legal tradition was just another way to enshrine the power of straight white males, ignoring that without it, no one will be safe.

I turn now to the paradoxical relationship of the new ideology to western civilization.

While neither reason nor science was confined to Europe in the ancient and medieval worlds, both acquired an unprecedented influence within Europe and its settler colonies during the 18th and 19th centuries.  In the political realm, reason and science decreased the influence of religion in politics, tried to rationalize government to serve the people, and spread the idea of equal political rights and equal citizenship for all.  The Enlightenment also created the modern university, an institution dedicated to the use of reason, not religion, to explain the world.  Those ideas and institutions spread around the world in the 19th and 20th centuries, both by example and because of European colonialism.  Nations like Japan and Turkey concluded that they had to adopt some western ideas and institutions to compete against the west and maintain their own sovereignty.  Communism—an offshoot of the Enlightenment—became a potent revolutionary ideology in Russia, China, and Vietnam.  African peoples introduced to ideas of equal rights by colonial powers demanded those rights for themselves.  Many other empires, of course, had spread their values and influence over large parts of the world in centuries past, but the Europeans, for whatever reason, did so most successfully.  And yes, millions of people inside and outside of Europe and North America concluded that that showed the superiority of western civilization.

The Amherst guide bluntly denies that this historical development was a good one.  Here is its definition of “Eurocentrism”:

“A worldview that is biased towards European thought, history, knowledge, institutions, peoples and culture, often favoring efforts of colonization and development specific to countries in the Global North while dismissing the benefits and advantages of the thought, culture and history rooted elsewhere. Often used to refer also to Western-centrism, which is inclusive of non-European countries in the Global North.”

I see irony here, because this whole postmodern ideology could never have emerged from anywhere but at the heart of western civilization, which gave the world the idea of equal political rights and successively extended that idea to new economic and social groups, to all races, and to women as well as men.  Yet some postmodernists have now repudiated that idea as a snare and a delusion.  This is how the distinguished historian Joan Wallach Scott, who has worked for decades at Princeton’s Institute for Advanced Study, could actually come to argue, in a recent book, that the Enlightenment, not the Muslim religion, has had the worst impact on women’s rights in the Muslim world.[1] That, however, offers a clue as to how this remarkable world view, so utterly at odds with both historical and contemporary realities, could have become so popular.  And here, in another irony, the postmodernists have something to tell us.

Reality, they constantly teach us, depends on one’s perspective, which in turn depends on one’s race, gender, and sexual orientation.  (The Amherst guide has remarkably little to say about class.)  The postmodern perspective did, I think, resonate with a particular group of individuals who began to emerge about half a century ago:  well-off women and nonwhites attempting to enter academia and the professions in the United States, and probably in certain European countries as well.  What did they find?

They found a world where men unquestionably held nearly all the power, and where a good many of them (but never all, by any means) refused to take women seriously.  Some of these men also exploited their position to try to secure sexual favors.  In addition, these women—like their male counterparts, for the most part—found themselves in highly competitive environments where employment and promotion outcomes often had little relationship to ability and performance.  Faced with this daunting situation, some women easily concluded that the workplace (especially the academic workplace) was a male conspiracy and nothing more.  And that view became the basis of a good deal of scholarship, the kind of scholarship that led ultimately to the production of the Amherst guide.

The great flaw of contemporary academics—a flaw not confined to any race or gender—is to confuse their reality with reality in the rest of the world, even though they actually live in an environment every bit as separated from the real world as a medieval monastery.  And as a matter of fact, some postmodern ideas describe academia far more accurately than they do the real world.  In academia, language does matter more than reality.  One’s status frequently depends on adopting the right views, using the right jargon, and attacking the right enemies.  Very few people in academia have the discipline and patience to evaluate work on its merits.  The right to define what is important does determine a great deal in the academy, including who gets hired and who does not.  And groups can much more easily impose “hegemony,” as defined by the Amherst guide—“The imposition of dominant group ideology onto everyone in society”—on a campus than in the world at large.  Hegemony, the guide continues, “makes it difficult to escape or resist ‘believing in’ this dominant ideology; thus social control is achieved through conditioning rather than physical force or intimidation.” That is exactly what the Amherst guide was designed to do.

Like so many other intellectual movements, the postmodernist ideology, or “political correctness,” aroused a good deal of attention in the major media when it first became a force on campus in the 1990s.  It tended to fade from view over the next twenty years, but it meanwhile achieved almost complete hegemony on most of our campuses.  Now its impact has emerged on the national scene in the media, the arts, and politics.  Whatever Democrat is nominated next year will almost certainly have made a number of rhetorical and policy concessions to it.  Donald Trump, meanwhile, will do everything he can—which is a lot—to make the election a referendum on the gulf between the new ideology and traditional values.  The voters will decide.


[1] See the review of Scott’s book, Sex and Secularism, by Laura Kipnis in the New York Review of Books, May 24, 2018.

Saturday, July 13, 2019

Why democracy is in trouble - a different view

For well over a century, the idea of modern democracy as the superior and only legitimate form of government has reigned unchallenged in the English-speaking world and much of the rest of the West.  In the second half of the twentieth century, after democracy had defeated Fascism and contained Communism, it also seemed to be spreading around much of the third world as well.  Then came the collapse of Communism and the brief illusion that liberal democracy had swept all before it.

Now, thirty years later, the picture looks very different.  Liberal democracy has failed to take hold in most of eastern Europe, especially in Hungary and Poland.  It has given way to authoritarian rule in Russia and much of the rest of the former USSR, and China is not evolving towards it either.  Countries such as Israel, Turkey and India, which embraced at least the forms of secular democracy during the 20th century, are moving towards religious nationalism.  Countries such as the Philippines and Brazil have elected authoritarian rulers with no respect for democratic norms.  And the two nations that did the most to spread the democratic model, the United States and Great Britain, present pitiful spectacles of paralyzed governments and polarized electorates.  A boisterous demagogue heads the US government and another is poised to take over in Britain as well.  Such movements are also gaining ground in some of the British dominions.  Populists also hold power in Italy, the German government is deeply divided, and France, the only major country in which one party rules, has not lined up behind its government either.  Why has this happened?

Democracy, I would argue, thrived and spread to the extent that it did in the twentieth century for several reasons.  One was the purely intellectual idea of self-government and equal rights, which in the 18th, 19th and 20th centuries brought down the idea of privileged orders and hereditary rule in country after country.  That is the idea which so many of our educated classes still cling to, even though its application, in recent decades, has not met the needs of a good many of our citizens.  A second reason--once again, in the second half of the twentieth century--was that the victory of democratic Great Britain and the United States in the Second World War gave democracy a certain worldwide legitimacy. (Ironically, in some of the world, the victory of the USSR did the same for Communism.)  But the other reason, the one that we have in my opinion lost sight of, was that democracies had managed to accomplish so much, in so many ways, by mobilizing their society's resources.  Not merely the beauty of their ideals, but also the record of their achievements, inspired confidence.

Many of these accomplishments occurred in the field of international conflict.  The multipolar world of the 19th and early 20th centuries required all major states to maintain large armies and navies.  Young men in every major nation eventually were conscripted in peace as well as in war, until the great turning point of the Vietnam War in the late 1960s and early 1970s.  Yet that was not all.  In those same centuries, the major nations were expanding their rule and influence overseas.  They were building modern infrastructure for transportation and communication at home.  Many built and maintained public educational systems.  The United States in the 1960s went a step further, and sent men to the moon.  In response to the great economic and political catastrophe of the Great Depression, governments became employers of last resort, and regulated capital markets to stop speculative excesses.  In Europe, where political failure had brought about the catastrophe of the two world wars, the project of a united Europe brought many governments together.  Citizens across the income distribution paid higher taxes, in many cases, than they do today, but many really felt part of critical common enterprises in which they could take genuine pride.

These conditions, of course, carried dangers of their own with them.  The well-organized industrial states of the first half of the twentieth century fought wars on a new and destructive scale.  In the Second World War, many millions died in death camps, in firebombed cities, and on the battlefield.  The development of nuclear weapons and ballistic missiles threatened the complete annihilation of the human race.  A certain uniformity of dress, custom and values prevailed across the industrialized world.  And thus, it seems, a great revolt, led by the generation born in the wake of the Second World War, became inevitable, and burst forth in the late 1960s and early 1970s.  Its consequences are still with us.

I am not going to try to trace the steady erosion of loyalty and common purpose that has marked the last few decades.  Western governments still play a huge role in their citizens' lives, but the nature of that role has changed.  They provide critical financial support for many of their citizens, particularly the elderly, and many of them use taxes to provide health care for their whole population.  But they have allowed globalization to usurp their role as economic planners, and have often failed to cope successfully with its consequences for their people.  Another huge change reflects the behavior of the inhabitants of the industrialized nations.  Their birth rates have fallen very significantly, creating labor shortages that only new waves of immigration could solve.  But without the kind of common enterprises that the twentieth century featured--including great wars--the new immigrants, it seems to me, have had much more trouble assimilating.  In many countries, including the United States, large numbers of them do not even enjoy the right to vote.

The decline of print media, I think, also plays a role in the decline of democracy.  Modern societies are enormously complex.  Understanding them demands a great deal of journalists, who have to bring facts and their significance to the attention of the public, and of citizens, who need to devote time and energy to reading and thinking.  Neither television nor social media can fill the gap left by the decline of serious journalism.  Instead, they appeal to tribal and ideological loyalties, and spend many hours on sensational scandals of a kind that older generations tried to keep out of politics.  That is the only reason, it seems to me, that Donald Trump, who so obviously lacks the knowledge and intellectual ability to be President, could have reached the White House.  Too many voters no longer care about those qualities at all.  Another culprit is my own profession of history, which began to conclude, in the wake of Vietnam, that the whole idea of a national history was simply a snare and a delusion designed to keep certain groups in power.  When everyone's individual story becomes equally important, there is no longer room for the larger story that can bind us all.

Our economic inequality has now, it seems to me, become self-sustaining, and I don't expect it to be reversed any time soon.  Yet if our governments cannot increase economic justice, they could still show some capacity to solve problems such as infrastructure and health care that involve us all.  The government could also find a sustainable mix of solutions to the immigration crisis.  Such measures will not make everyone happy in our fractured landscape, but they could once again make us feel that we share certain common enterprises, and that we can make them succeed.  That, I think, is now the necessary first step to any real renewal of democracy.

Saturday, July 06, 2019

The gerrymander decision and the future of democracy

This week the Roberts court, by its customary 5-4 partisan majority, refused to affirm two separate lower court decisions that had invalidated state redistricting plans on the grounds that they were designed to secure unfair partisan outcomes.  The cases offered a perfect opportunity for a non-partisan decision, since they involved a North Carolina plan that ensured Republican dominance and a Maryland one designed to favor the Democrats.  Nonetheless, the Supreme Court majority ruled that the courts lack the power to intervene to prevent this kind of gerrymander.  That will ensure a new round of gerrymanders in various states--although by no means all states--after we hold the census next year and re-allocate Congressional districts.

Over the years I have found myself drifting further away from partisans on both sides of our great political divide, and this case is no exception.  I think that the decision was wrong, on balance, and I think that Justice Kagan made a careful and powerful argument to show why it was wrong, parallel, in its way, to the excellent, fact-based argument that Justice Ginsburg made in support of the Affordable Care Act.  Yet I can also see some merit in Justice Roberts' argument that while partisan gerrymandering may indeed be a big problem, the federal courts are not the place in which to try to solve it.  And thus, I think it's at least possible that during the next twenty years or so, the decision could revive our democracy, in important ways, at the state level and in Congress.  Let me explain.

In his opinion, Roberts muddied the waters, I think, by claiming that the objection to partisan gerrymandering by the plaintiffs in the original cases that reached the court came from a desire to protect the rights of political parties.  Those parties, he claimed, argued that a plan like the North Carolina one, which could give the Republicans 11 of 13 seats even if they won only 50% or so of the total vote, treated them unfairly.  But parties, he argued in effect, have no standing under the Constitution, and he is right.  That is not, however, the point.  The problem with these plans is that they diluted, to put it mildly, the rights of Democratic voters in North Carolina and Republicans in Maryland to have their votes heard.  The right to vote for Congress loses its effect if one finds oneself in a district packed with members of the other party.  As Roberts had to admit, the federal courts have indeed ruled against certain forms of racial gerrymandering--those designed to distribute black voters so widely that they will find it very difficult to elect candidates of their choice.  Roberts ruled, however, that while one cannot do this to a person because they are black, one can do it to them because they happen to be Democrats or Republicans--a result which I find quite astonishing.

Roberts made two major arguments against affirming the lower court judgments. First, he claimed, it would be impossible to devise a rule stating what exactly constituted excessively partisan gerrymandering.  The Constitution, he noted, certainly does not mandate proportional representation for the two major parties in Congress.  Kagan demolished that argument in the most impressive part of her opinion.  The lower courts, she pointed out, had managed to do just that.  All parties to this controversy are now using computer programs to draw districts, and the states generally do lay down some general mandates about how redistricting is supposed to be done.  Using such a program, one can easily generate 1000 different plans for North Carolina, say, that respect that state's non-partisan guidelines.  One can then estimate the results that each of those plans will produce, and grade those plans according to how closely those results reflect the total vote for the two parties in the state.  One need not try to insist on the plan that produces the most perfect match--that is, the plan coming closest to proportional representation--but one could certainly rule out the 33% of plans (let us say) that most clearly favor the Republicans on the one hand, and the 33% that most clearly favor the Democrats on the other, on the grounds that they deprived too many voters of their 14th Amendment right to equal protection.  Such statistical tests, as even Roberts had to admit, have found their way into Supreme Court decisions in the past, including the antitrust case that broke up the ALCOA aluminum company, as I recall, on the grounds that its extraordinary market share made it, ipso facto, a monopoly banned by the Sherman Act.
And the court could easily have endorsed such a test in this case since both the Democrats in Maryland and the Republicans in North Carolina stated their motivations with such extraordinary frankness, leaving no doubt whatever that they simply wanted to increase their party's representation, period.  (For those who are interested, the North Carolina Republicans distorted the will of their voters more, but the Democrats in Maryland, one could argue, were in a way just as greedy, since they weren't content with a 6-2 edge in their Congressional delegation, but went through complicated redistricting to get it up to 7-1.)
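The outlier test Kagan describes lends itself to a short sketch.  The snippet below is purely illustrative and drawn from neither opinion: it assumes a toy model in which each plan is simply a list of per-district Democratic vote shares, ranks a computer-generated ensemble of plans by the seats one party would win, and flags the plans in the extreme thirds of that distribution.

```python
import random

def seats_won(plan):
    # Count districts where the (simulated) Democratic vote share tops 50%.
    return sum(1 for share in plan if share > 0.5)

def classify_plans(plans, tail=1/3):
    """Rank an ensemble of plans by partisan lean and flag the extremes.

    Each plan is a list of per-district Democratic vote shares.  Plans in
    the top or bottom `tail` fraction, measured by Democratic seats won,
    are flagged as presumptive partisan outliers.
    """
    ranked = sorted(plans, key=seats_won)
    cut = int(len(ranked) * tail)
    return {
        "pro_republican_outliers": ranked[:cut],
        "acceptable": ranked[cut:len(ranked) - cut],
        "pro_democratic_outliers": ranked[len(ranked) - cut:],
    }

# A toy ensemble: 1000 random 13-district "plans" for an evenly divided state.
random.seed(0)
ensemble = [[random.gauss(0.5, 0.08) for _ in range(13)] for _ in range(1000)]
result = classify_plans(ensemble)
```

In real litigation the ensemble would come from a redistricting simulator that respects a state's actual geography and neutral criteria; the point here is only the shape of the statistical argument.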

Yet Roberts's second argument carried some weight for me.  The Constitution, he notes, states very specifically who is to arrange the election of members of Congress.  It gives that power to the legislatures of the states, while also reserving to the U.S. Congress the power to make such regulations as it deems appropriate.  Two democratically elected bodies, in other words, have the responsibility to ensure fair elections.

The North Carolina and Maryland cases came to the Supreme Court because the legislatures of those states had abused that power so clearly.  Yet two other courses of action could have reversed their decisions.  The legislatures of those states could abandon partisan gerrymandering and set up nonpartisan commissions to recommend new districts.  This idea is not a fantasy: eight states, ranging from deep red to deep blue, have already done just that: Alaska, Arizona, California, Colorado, Idaho, Michigan, Montana, and Washington.  And the Constitution seems to allow Congress to mandate such a procedure for the whole country, should it choose to do so.  

Roberts' decision could indeed hasten such developments.  The new census and a new round of redistricting lie just around the corner, and his ruling will encourage partisan majorities in just about every state legislature to make the most partisan decisions that they can.  That prospect could set off a backlash that would create more state commissions to do the job, and even, conceivably, lead to Congressional action, which I believe the House of Representatives has already taken.  I also think Roberts has a point in one broader respect.  Extreme partisanship has hopelessly deadlocked our politics--and I agree that the federal courts can't solve that problem for us.  Our democracy simply won't work until and unless we find enough common ground to solve some problems together.  Like the very likely reversal of Roe v. Wade, this decision should encourage us all to focus more on the ballot box and our legislatures, the arenas in which true democracy is supposed to function.

Sunday, June 30, 2019

The stakes for 2020

We are nearing the climax of the fourth great crisis in American national life.  Unlike the first three--1774-94, 1860-68, and 1929-45--it has not involved a great national enterprise that mobilized economic resources on behalf of a moral cause.  It is a crisis of fragmentation and contested authority, marked (like the Civil War crisis) by a steady growth of corporate power and a decline of civic virtue.  It is also a crisis of values, and next year's election will inevitably validate one set of values over another.  I do not think that anyone at this time can tell what is going to happen, but the first two Democratic debates have made me more pessimistic about what to expect next year, and for the next decade or so.

Those debates, and particularly the second one, confirmed for me that the kind of left wing activism that I first saw in action more than 50 years ago in college has now become mainstream.  As I remarked during a discussion with four young activists at my 50th reunion--a discussion that I shall link in a day or two here--it is characterized above all by a moral approach to the world.  What is evil should not be, what is right should automatically prevail.  That means, for instance, that "undocumented"--that is, illegal--immigrants should receive guaranteed health care, because all people should receive it.  Since immigrants are poor, largely nonwhite, and largely from the third world, we should decriminalize their status at once.  That, in fact, seems to be more important to most of today's Democratic candidates than giving the 11 or 20 million illegal immigrants now living in the United States a path to citizenship that would allow them to vote--an idea we did not hear about during the debate.  I well remember how in the 1960s my contemporaries treated legal issues as the older generation's way of keeping an unjust order in place, ignoring what humankind had learned over many centuries--that legal procedures are the price of civilization, and thus, clearly necessary to any meaningful long-term reform.  

The latest flurry of attacks against Joe Biden also echoes the late 1960s contempt for establishment politicians and compromises.  Biden never praised segregationists like James Eastland; he simply spoke with nostalgia about a time when men and women of entirely different views could treat each other courteously and work together on some issues.  But the very idea of treating such people (now Republicans) with anything but contempt, much less acknowledging that we might have to find common ground with them on some issues, never occurs to many liberals today.  Hillary Clinton illustrated that with her comment about Trump's "basket of deplorables" in the 2016 campaign.  The same kind of feelings lead to calls from some Democrats to ignore the evil white working class, riddled with "white supremacist" beliefs, and reward women and/or nonwhites by pitching the Democratic appeal specifically to them and putting one or more of them on the ticket.  Should such a strategy succeed, the new millennium will be at hand; should it fail, it will merely confirm the essential patriarchal, racist wickedness of the United States of America.  Liberals cling easily to these views since so many of them work in education or journalism, where almost no one challenges them.  They are waiting for the rest of the country to catch up to them, and some are gambling that the advent of younger generations and changing demographics will lead to that result.  That is possible.

On the other hand, the Republican Party--led by a TV-bred demagogue, but essentially the party of big business in general and the energy industry in particular--has entrenched its power in the Senate, in various state legislatures, and above all in the court system.  The 5-4 decision allowing and indeed encouraging gerrymandering confirms that Republican judges will use their power to help their party, and leave voters at the mercy of state governments.  The Trump administration is dismantling parts of the federal government and trying to kill others (including much of the State Department) by refusing to fill key positions.  It's the home of ideologues on many issues, including education policy, immigration policy, and policy towards the Middle East.  Such people have no trouble working with an incompetent President because he gives them what they want.  Should a Democrat like Elizabeth Warren get into power, the court system may well stand in the way of meaningful economic reform, as it did from the 1870s until the late 1930s.  Today's Republican Party--like the Republican Party from 1876 to 1896--lacks a real national majority, and has won the popular vote only once in the last seven national elections.  But it might gain votes next year because of the very low unemployment rate, or because Democrats are misreading the views of some of the new electoral constituencies, or because the spectacle of the Democratic contest for the nomination turns voters off, as happened to Republicans in 1964 and Democrats in 1972.  The Republicans, in any case, have unity and discipline, which the Democrats distrust almost on principle. Politics is war by other means, and in war, unity and discipline count for more than having a just cause.

I continue to believe, as I did in 2010 (see last week's post), that the chance to reverse our economic course was lost in the first two years of the Obama administration and will not return any time soon.  I think our real task now is to re-establish some tradition of honest, responsible government that can address problems like immigration within a serious legal framework instead of an emotional one, devote serious resources to critical problems like infrastructure and climate change, and pursue sensible foreign policies.  Yet we live in an age where politicians no longer make their name by performing effective public service.  We will probably have to start rebuilding that tradition at the state and local level, as the New York Democratic party seems to have done, in a few important key respects, last year.  And to create any broader national consensus, we might have to trade truly effective border control and restrictions on immigration for a path to citizenship for those who already live here.  After the Civil War, our politics remained crippled by partisanship for several decades.  That will be our fate too, I fear, unless we can find a unifying figure and a new political vocabulary.

Saturday, June 22, 2019

Another blast from the past - July 2010

I returned from 10 days in France yesterday.  Here is a second blast from the past--the moment, in July 2010, that I realized that Barack Obama was not going to reverse the direction the nation was taking.

Monday, July 05, 2010

The Regeneracy may not be televised

William Strauss and Neil Howe, the authors of Generations and The Fourth Turning, grew up, as I did, in the shadow of the Depression, the New Deal, and the Second World War. As they explained to a group of their acolytes in the late 1990s, they began early in that decade to write a book about American generations, focusing on what each of them had contributed to our national life. Both had been involved in government for about a decade, and both had lived through the cultural cataclysm of the 1960s and early 1970s. But their critical discovery, Bill explained, occurred when they were studying the first half of the nineteenth century, when control of national politics passed successively from the Republicans (Jefferson, Hamilton, Madison, and Monroe) to the Compromisers (Jackson, Daniel Webster, Henry Clay), and thence to the Transcendentals (Lincoln, Jefferson Davis, Sumner, John Brown, and the rest of the Southern fire-eaters) who brought about the Civil War. Suddenly they recognized the remarkable similarities between three pairs of generations: the Republicans and the GIs (the Presidents from Kennedy through Bush I), whose lives had been shaped by the previous crises; the Compromisers and the Silent Generation, who remembered those crises from their childhoods and sought to moderate emerging conflicts; and the Transcendentals and their own generation, the Boomers, all focused upon throwing out the old and bringing on the new. A new theory of history was born--and they began predicting a new crisis era, set to begin around 2010.

Crises of this type represent the death of the old order and the birth of a new one. The two most inspiring in American history were the late-eighteenth century crisis that gave us the Revolution and the Constitution, and the Depression and the New Deal, which culminated in the Second World War and the creation of the welfare state. The Civil War, as they recognized, had much less of a legacy, failing even to solve the racial problem that had brought it about. It is now clear that their prediction of a crisis was right on the money in both the economic and political spheres--but it seems increasingly likely, I am sorry to say, that we are not going to experience a rebirth or regeneracy comparable to that of the 1780s-90s or the 1930s-40s. The hopes that so many of us shared for a New Deal are retreating further every day, and while I am not yet entirely giving up, my head tells me that we are indeed headed for a new age of corporate supremacy parallel to the 1890s.

Today's New York Times gives a typical example of the reasons for my despair. Earmarks, we all know, are detested by all and sundry (except those who receive them), and the Congress has passed new regulations against them, specifically forbidding their award to private businesses. No sooner was this rule passed, however, than Congressmen and private companies found a way around it. They are busily founding non-profits that will control the money and pass it on to the very same private firms that will do the work involved. Nothing, in short, is going to change. In the same way, the new financial reform bill, now nearing passage, will not substantially reduce trade in derivatives or force the big banks to stop trading on their own account. Even its consumer protection provisions contain loopholes. Reducing the influence of money on our politics seems as futile a task as civil service reform or railroad regulation in the 1870s--and that leads me to my next, even more controversial point.

Back in the 1990s Strauss and Howe made another prediction: a member of our own Boom generation would lead us in a new world, like the Transcendental Lincoln and the Missionary Franklin Roosevelt. When 9/11 occurred--only 72 years after the beginning of the last crisis in 1929--we all held our breaths to see if it might indeed be the beginning of the crisis, or, as they called it, "Fourth Turning." When George W. Bush failed to unite the United States most of us concluded that it was not. But now, I am not so sure--because it seems that George Bush did far more to put the United States on a different path, both at home and abroad, than Barack Obama will be able to do. Let us look, as Al Smith used to say, at the record.

Abroad, George W. Bush abandoned most of the principles that had governed our parents' foreign policies. He denounced a critical arms control treaty, the one that had banned ABMs, and began deploying missiles that still have not been proven to work. The Obama Administration has modified his plans, but it has not abandoned them. He invaded Afghanistan and Iraq on the grounds that we could not allow Al Qaeda to have safe havens, and we remain in Iraq while escalating our presence in Afghanistan, even though it is not clear that any of this has made us more secure. These wars have enormously raised the prestige of the military in American life for the first time since the early 1960s. In the Middle East Bush told Israel it could keep any territory it settled in a peace agreement, and the Obama Administration backed down from its first attempt to challenge that position. President Obama initially tried to recast our relations with the Muslim world but he has stuck, essentially, to the same policies, and individual Muslims (usually ones who had lived in the US and even become US citizens) have carried out a few terrorist attacks. Should one of those succeed on a fairly large scale we have no idea what the consequences might be.

At home, the reckless pursuit of deregulation by every Administration from Reagan through George W. Bush gave us the financial crisis of 2008--but before Bush left office, Henry Paulson, it is now clear, had managed to make sure that all the banks' losses on derivatives would largely be made good through the huge bailout of AIG. Most importantly, the Bush tax cuts destroyed the surplus that Bush inherited and recreated the permanent deficit so dear to the heart of Ronald Reagan. That, combined with conservative fiscal orthodoxy which Obama seems reluctant to challenge, has crippled the government's response to the highest sustained unemployment since the 1930s. The Obama stimulus stopped the job loss but was not big enough to reverse it, and now it is coming to an end. The Republicans are fighting even modest moves like another extension of unemployment benefits--so far, at least, successfully. They seem certain to gain seats in both the House and Senate this fall, which will make any radical economic moves impossible.

Perhaps we were wrong; perhaps the crisis did begin with 9/11. Certainly George W. Bush took advantage of the shift in the national mood to move forward on a great many fronts, and his work has proven lasting. What is happening now is by no means all his fault. The Democratic Party effectively abandoned New Deal principles years ago--Bill Clinton, in fact, bragged about doing so. Now a Democratic Administration has very little to offer to the millions of new unemployed. They may not become enthusiastic Republicans, but they will not be enthusiastic Democrats, either--even though the younger voters among them are closer to the Democrats on social issues.

The politics of the Gilded Age were dominated by money. They were much more hotly contested than most people realize. U. S. Grant won two terms by huge majorities, but the next five elections--from 1876 through 1892--were all extremely close, all close enough to be decided by shifting a single state. The Democrats should have regained the White House in 1876 and did so in 1884 and 1892. Our politics may be similarly contested for the rest of my lifetime, since no government will be strong enough, it seems, to embark upon the kind of great crusade at home or abroad that will create a new consensus.

All this has enormous consequences for the Millennial generation (born 1982-2002?), whom Strauss and Howe expected to be the new GIs. Such, it seems, is not after all their destiny, since no Boomer leadership is going to enroll them either in massive public works programs or in a crusade abroad. Like the GIs in the 1930s, they will be preoccupied for a long time with finding work and setting up families. Their idealism and willingness to tackle problems may still do a lot of good, but mostly, it seems, at a local level and on a relatively small scale. In the same way that the GIs did so much to undo prejudice between religions and even between the races, the Millennials will finally break down prejudice based on sexual orientation, and they will probably begin a move away from strong religious belief. But for a variety of reasons, which I hope to explore in months and years to come, it seems that no one alive today is likely to see any kind of replay of New Deal America.

Friday, June 14, 2019

Blasts from the Past (I)

I have decided to spend a couple of weeks reposting some material from the distant past.  This piece, entitled "George W. Bush: Man of the Sixties," first appeared on October 21, 2004, in the midst of a presidential election campaign.  It certainly identified many of the issues that I have been focusing on ever since.

President Bush likes to contrast himself and his policies with the 1960s. “We’re changing the culture of America,” he says, “from one that says, ‘If it feels good, do it,’ and, ‘If you’ve got a problem, blame somebody else,’ to a culture in which each of us understands we’re responsible for the decisions we make.” (When Dick Cheney used the language of the 1960s in the face of an opposition U.S. Senator and defended himself because he “felt better,” the irony got less attention than it deserved.) Culturally, of course, the President rejects the sexual liberation of his youth, and portrays himself as a reformed sinner. Politically, as a conservative, pro-war Republican whose father had campaigned against the Civil Rights Act of 1964, he was certainly out of step on the Yale campus of 1964-68. All this is, however, entirely misleading—and the country, particularly its younger voters, should try to understand exactly who and what they are voting for before the election. George Bush and his Administration actually represent the worst of the late 1960s—a terrifying certainty determined to repudiate the past, disrupt the present, and risk the future for an ideological ideal. His certainty is not merely, as Ron Suskind argued in last Sunday's New York Times, a question of his faith—it is all too characteristic of his entire generation.

As George W. Bush’s college years drew to a close, the most visible political faction on most campuses was the Students for a Democratic Society, which took over the main Administration building, provoked a police bust, and temporarily halted instruction at my own school, Harvard, in the spring of 1969. They were distinguished more than anything else by a complete rejection of everything our parents stood for. In their eyes, the Cold War’s “defense of freedom” was greedy imperialism; civil rights laws simply masked enduring American economic racism; marriage and family were outdated bourgeois conventions; and democracy was a sham. They and they alone knew good from evil, and they had less than nothing to learn from the past. Even within their own ranks, they had contempt for democratic processes. In April of that memorable year, an SDS meeting turned down a proposal to occupy University Hall by a vote of about two to one—but the next day, the losing minority faction undertook the occupation anyway, dragging their colleagues (and eventually most of the student body) in their wake.

A similar omniscient spirit has dominated the Bush Administration from the day it took office. One by one, the achievements of our parents’ generation—who occupied the White House from John F. Kennedy through George H. W. Bush—have been gleefully tossed aside: the ABM Treaty, the rigid separation of Church and State, overtime protection for workers, environmental protection, and especially the spirit of compromise and civic responsibility that allowed Republicans and Democrats to work together for the good of the country from the 1950s through the 1980s. In foreign policy they have even repudiated, in effect, the NATO Alliance and the United Nations. Events in the fall of 2002 were particularly revealing. Prodded by Colin Powell, who remembers the 1950s, the Administration sought a second Security Council resolution to authorize war against Iraq, but when they found they had only two other votes on their side, they simply disregarded the opinion of the world in the same way that the SDS disregarded the majority vote the night before the occupation of University Hall. Meanwhile, our Boomer-crafted new National Security Strategy gives the United States both the right and the duty to decide what nations shall possess what weapons, and summarily to remove hostile regimes. My Harvard classmate Elliott Abrams opposed SDS’s attempt to rule Harvard University according to their lights, but he is now enthusiastically doing his part to assure that he and his Administration colleagues rule the whole world in the same way.

Other memories from the Vietnam era come to me these days. One Saturday afternoon in 1970, I sat in a packed Harvard Square theater watching Sam Peckinpah’s The Wild Bunch. Midway through the movie, William Holden (himself a member of what we now call “The Greatest Generation”) tried to explain to his fellow gang members why Robert Ryan was now working for the other side. “He gave his word,” Holden said, speaking for an older America. “It’s not whether you keep your word!” one of his companions shouted. “It’s who you give it to!” The audience went crazy with delight. Isn’t that the same spirit in which the Bush White House has patronized the scurrilous, baseless campaign of the Swift Boat veterans? John Kerry is on the wrong side; therefore, he can’t be a war hero. And such is the partisanship of our times that even Bob Dole and George H. W. Bush have joined this campaign—although John McCain, significantly, refuses to do so.

Reality, of course, is a casualty of classic Baby Boomer thought. SDS members truly believed in 1969 that workers and students were going to overturn the established order—because it was right. In the same way, George W. Bush, in defiance of mountains of evidence that Iraq is disintegrating and that our intervention has reduced our standing in the Arab world to new lows, repeats that Iraq is on its way to a democratic transformation that will spread through the region. Freedom, he explains, is the Almighty’s gift to every man and woman on this planet—an homily which leaves a calmer observer wondering why the Almighty has been so stingy about bestowing it in so much of the world for so many centuries, or whether the President believes that he is fighting Satan’s evil presence on earth. 

Caught between ideology and reality, the Administration constantly resorts to Orwellian language. A loss of jobs becomes economic progress, less health care means more, opening national forests to logging becomes “The Healthy Forests Initiative,” and so on. In the same way, the SDS explained to us that dictatorship of the proletariat was the only true democracy. And the Administration cares nothing about federalism, because federalism could stand in its way. In 1960, when Kennedy and Nixon debated federal aid to education, Nixon argued that federal money would eventually mean federal control. Now a new Republican generation is using federal money to discredit and weaken public education through the No Child Left Behind Act. 

The Bush Administration and its supporters are usually less obvious than their left wing contemporaries were about their repudiation of our parents’ works, but the other day, Grover Norquist—the anti-tax activist who has bragged about his close relations with the White House for four years—let the cat out of the bag in an interview with a Spanish newspaper. The Weekly Standard has printed quotes from the tape of the interview. Here is how Norquist assessed the coming election.

"And we've had four more years pass where the age cohort that is most Democratic and most pro-statist, are those people who turned 21 years of age between 1932 and 1952--Great Depression, New Deal, World War II--Social Security, the draft--all that stuff. That age cohort is now between the ages of 70 and 90 years old, and every year 2 million of them die. So 8 million people from that age cohort have passed away since the last election; that means, net, maybe 1 million Democrats have disappeared… 

"This is an age cohort that voted for a draft before the war started, and allowed the draft to continue for 25 years after the war was over. Their idea of the legitimate role of the state is radically different than anything previous generations knew, or subsequent generations. . . . Very un-American. Very unusual for America. The reaction to Great Depression, World War II, and so on: Centralization--not as much centralization as the rest of the world got, but much more than is usual in America. We've spent a lot of time dismantling some of that and moving away from that level of regimentation: getting rid of the draft . . . "

Norquist, a younger Baby Boomer, has actually hit the nail on the head. The twenty million men we drafted to win the Second World War (a conflict he apparently regrets) deserved, and got, their countrymen’s reward, in the form of the GI bill, 4% mortgages, generous Social Security benefits, and real pensions. Franklin Roosevelt, Harry Truman and Dwight Eisenhower confirmed the government’s responsibility for their well-being and that of their families. Such policies have now become “un-American” as the Bush Administration leads us towards its New Jerusalem—really a new Gilded Age. Norquist is actually exalting the collapse of civic virtue and mutual responsibility that he has helped to promote during his political career. Younger Americans should understand one thing: our current leadership is impervious to facts. Ultimately, like so many of my contemporaries, they care less about any specific changes they make at home or abroad than about simply proving to their own satisfaction that they are right and everyone else is wrong. They have already left the nation and the world a dangerous legacy.

Sunday, June 09, 2019

Witnessing history

Two weeks ago I attended the 50th reunion of the infamous Harvard Class of '69, of which I was a member.   This was as always an emotional experience, since many of my friends then remain my best friends today, and two of them stayed with me for the week.  But it had historical interest as well.  One classmate had collected a book of reminiscences and thoughts about the Vietnam War, to which I contributed the original draft of the last few pages of American Tragedy.  A great deal of the discussion at other symposia dealt with the activism of that era, which many classmates seemed as proud of as ever.  The most striking session, however, was one featuring five contemporary campus activists, a mixture of undergraduates and grad students.  The session was recorded, and I hope to be able to share it with readers soon.  It showed, more than anything else, just how influential the Boomer style of protest that debuted during our college years had become.

The students focused on several particular issues. One young woman was especially interested in "unraveling [I think that was the word she used] the rape culture on campus," and another, a grad student, complained that Harvard, in negotiations with grad students over their working conditions, did not want to include a commitment to punish those accused of sexual assault because it was afraid of lawsuits from the accused.  Several students also focused on university divestment from the fossil fuel industry (which would require a complete change in the way the university manages its endowment, a subject about which I and other classmates have been agitating for 15 years now) and from the "prison-industrial complex," and one pleaded with us not to give Harvard any money until it took those steps.  At least one of them also attacked capitalism, which she linked to patriarchy, racism, and xenophobia.  One activist emphasized the need to "imagine" new human arrangements that would be free of these defects.  Several of them used a good deal of ideological jargon, as the SDS was wont to do 50 years ago. I do not mean to dismiss their concerns by any means--I share some of them--but am simply trying to report as neutrally as possible.

I had been designated by the organizers to say a few words, and I hadn't prepared anything, preferring to wait and hear what there was to comment on.  I began by saying that I could see a very direct line from the activism of my own class to what I had heard that day.  First of all, both had a strong moral tone, arguing, about one issue or another, "This is wrong: therefore it must not be."  One of the panelists nodded at that.  Secondly, it seemed to me that today's activists, like my contemporaries, saw themselves standing outside, and in opposition to, a corrupt system.  Thirdly, a great deal of the activism, in both cases, was directed against Harvard itself.  In those days that led to a successful attack upon the presence of ROTC on campus because ROTC was implicated in the Vietnam war.  Lastly, I said that I thought that activists might have more success by appealing to their fellow Americans as citizens, rather than as members of particular demographic groups.  I did not use the phrase "identity politics," but one of the panelists did, defending such an approach, when she took an opportunity to reply to me.  

All this got me thinking, not for the first time, about the atmosphere on campus nowadays, and back in the late 1960s as well.  I think some things have gone wrong with higher education since it expanded so rapidly in the wake of the Second World War, and I want to speculate about what it might be, with particular reference to what is happening in elite institutions.  To shed more light on it I want to quote, once again, from one of the Boom generation's founding documents: a speech in the fall of 1964 by Berkeley student activist Mario Savio, who had worked the previous summer as a civil rights activist in Mississippi, in the campaign in which three civil rights workers were killed.

"Last summer I went to Mississippi to join the struggle there for civil rights. This fall I am engaged in another phase of the same struggle, this time in Berkeley. The two battlefields may seem quite different to some observers, but this is not the case. The same rights are at stake in both places -- the right to participate as citizens in democratic society and the right to due process of law. Further, it is a struggle against the same enemy. In Mississippi an autocratic and powerful minority rules, through organized violence, to suppress the vast, virtually powerless majority. In California, the privileged minority manipulates the university bureaucracy to suppress the students' political expression. That "respectable" bureaucracy masks the financial plutocrats; that impersonal bureaucracy is the efficient enemy in a "Brave New World."
"In our free-speech fight at the University of California, we have come up against what may emerge as the greatest problem of our nation -- depersonalized, unresponsive bureaucracy. We have encountered the organized status quo in Mississippi, but it is the same in Berkeley. Here we find it impossible usually to meet with anyone but secretaries. Beyond that, we find functionaries who cannot make policy but can only hide behind the rules. We have discovered total lack of response on the part of the policy makers. To grasp a situation which is truly Kafkaesque, it is necessary to understand the bureaucratic mentality. And we have learned quite a bit about it this fall, more outside the classroom than in. 

"As bureaucrat, an administrator believes that nothing new happens. He occupies an a-historical point of view. In September, to get the attention of this bureaucracy which had issued arbitrary edicts suppressing student political expression and refused to discuss its action, we held a sit-in on the campus. We sat around a police car and kept it immobilized for over thirty-two hours. At last, the administrative bureaucracy agreed to negotiate. But instead, on the following Monday, we discovered that a committee had been appointed, in accordance with usual regulations, to resolve the dispute. Our attempt to convince any of the administrators that an event had occurred, that something new had happened, failed. They saw this simply as something to be handled by normal university procedures. 

"The same is true of all bureaucracies. They begin as tools, means to certain legitimate goals, and they end up feeding their own existence. The conception that bureaucrats have is that history has in fact come to an end. No events can occur now that the Second World War is over which can change American society substantially. We proceed by standard procedures as we are. 

"The most crucial problems facing the United States today are the problem of automation and the problem of racial injustice. Most people who will be put out of jobs by machines will not accept an end to events, this historical plateau, as the point beyond which no change occurs. Negroes will not accept an end to history here. All of us must refuse to accept history's final judgment that in America there is no place in society for people whose skins are dark. On campus students are not about to accept it as fact that the university has ceased evolving and is in its final state of perfection, that students and faculty are respectively raw material and employees, or that the university is to be autocratically run by unresponsive bureaucrats. 

"Here is the real contradiction: the bureaucrats hold history as ended. As a result significant parts of the population both on campus and off are dispossessed and these dispossessed are not about to accept this a-historical point of view. It is out of this that the conflict has occurred with the university bureaucracy and will continue to occur until that bureaucracy becomes responsive or until it is clear the university cannot function."

In a post more than five years ago I tried to explain how Berkeley undergraduates, who were receiving, free of charge, an extraordinary education far better than any available on any campus today, could come to identify with the terrorized, poverty-stricken black population of Mississippi.  I won't repeat that analysis here--anyone interested can check it out for themselves--but I will say this: Savio was identifying at least one real issue.  Modern society can't exist without powerful bureaucracies that do their best (and never without some failures) to administer impartial rules.  They are, literally, the price of civilization.  Yet at the same time, they inevitably provoke negative reactions from much of the human race, particularly when young.  In college I learned about a parallel youth rebellion against the bureaucracy of the French Third Republic from Stanley Hoffmann, and such a rebellion eventually brought down Communism in the Soviet Union.  The whole Trump movement is also a rebellion against a bureaucracy, the "Deep State," which economic interests have resented for the better part of a century.  And perhaps campus rebellions periodically play such a role in our lives because young people in our modern age simply become too frustrated at endlessly having to meet the demands of adult bureaucracies.

 I do not know if my own Boom generation was the first to have a majority of its members attend college, but it certainly sent a higher proportion to college than any previous generation had.  The experience of the Second World War and the GI bill had firmly established higher education as the chief path into the middle and upper middle classes, and more and more people were taking advantage of it, instead of going to work full time at 18 and marrying and having children within a few more years.  A great many young people, however--such as myself--go to college at 18 with a great deal of unfinished emotional business.  Once there, a lot of pent up feelings towards their families can find some other outlet. They have a lot of time on their hands.  They have to resolve questions relating to their future place in the world, and their sexuality.  Some, like me, feel right at home in the classroom and the library; others do not.  It becomes very easy for many of them to focus on the shortcomings of the older generation in general and the institution they are attending in particular.  Among my generation, many were moved not only to reject the values of the universities and colleges they found, but to go into academia themselves and change them.  Older grad students and faculty from the Silent generation encouraged this process.

Now my contemporaries' rebellion took off, of course, because of our parents' generation's catastrophic mistake, the Vietnam War.  That certainly proved that our elders were fallible (though hardly more so than those of many other great nations, some of whom made much bigger mistakes), and it also forced all the young men into a confrontation with the government over military service.  But today's students, I think, face a much less certain future and a series of frustrations much worse than what we had to cope with, partly because they have been going on longer.

The competition for places in our top institutions, to begin with, is much worse now than it was then.  The college-age population has swollen much more rapidly than the number of highly regarded schools.  The proportion of women competing for those places is also much higher.  Today's undergrads have gotten where they are by meeting an endless series of tests administered by parents, teachers, coaches, music teachers, and heaven knows who else.  When I saw the first Hunger Games movie, I was struck by its message: that life was a series of dangerous, often fatal tests, imposed upon young people by their elders for the elders' entertainment.  The extraordinary popularity of the books in the series suggests that that message really resonated among the Millennial generation.  They have other problems in college:  many of them are borrowing tens of thousands of dollars for their education.  Many of them emerge unable to find a job that will enable them to live in a major metropolitan area, much less marry and start a family.   And last but not least, my generation, at least, had access to an extraordinary education such as I have described in A Life in History.  We also had to spend a LOT of time doing schoolwork to perform adequately.  Today's students don't get nearly as stimulating an intellectual diet, and their workload is a fraction of what ours was.  

Like Mario Savio and the mostly white, middle-class students who responded to his speech, today's undergrads identify--one might say compulsively--with what we now call marginalized groups in society.  The representatives of those groups who now attend elite schools in much greater numbers tend to dominate campus controversies, and the opinion pages of student newspapers.  Meanwhile, the problem of defining one's self as a sexual being, and deciding what that definition will mean for one's life, remains.  A small but growing number of today's students are taking a more radical approach to it by rejecting traditional gender roles and definitions.  The concept of "non-binary" individuals crossed into the mainstream just last weekend, when a long article in the New York Times Magazine discussed it almost entirely uncritically.  

As Liah Greenfeld has pointed out in the book I discussed here some months ago, modern society forces us all to define ourselves in almost every way. This is a terrible burden, felt most intensely in youth.  We have made it much harder.  Neither our political system, nor the humanities departments of our universities and colleges, provide the kind of common anchor to our experience that they tried to give in earlier periods.  It is not surprising that so many bright students become obsessed with the evils of the university itself, while others eagerly work to cash in their ticket to the economic elite.