Steve Harris

Suffer the Children

3/29/2024

When Jesus said: “Suffer the little children…to come unto me.” (Matthew 19:14), he (or rather the 17C authors of the King James Version of the Bible) was using a now archaic meaning for “suffer.” In our modern diction, we would say “allow.” However, what we do as a 21C society is make them suffer (in the modern sense of the term). It is horrible and hypocritical and not historically unusual.

I have noted elsewhere the anomie and resentment that are deeply embedded in young people today: the sense not only of being dealt a tough hand (e.g. Covid), but of facing considerable disappointment in terms of career and housing opportunities, as well as a looming climate catastrophe. On top of this, they recognize the profound dysfunction of our political culture, which makes efforts to address these substantive challenges seem futile.

There is considerable evidence that parents allowing (“suffering”) their children to immerse themselves in smartphones and, especially, social media is a principal cause of the current wave of disaffection/alienation and broader psychological distress with which many young people contend. Indirect peer pressure (children insisting on access cuz all their friends have it) seems to overwhelm whatever prudent and “common sense” response parents might have. Our society seems to have no problem in saying that certain brain-altering agents (tobacco/alcohol/cannabis) are off-limits for those who (as a class) have an insufficient capability of managing themselves. We have been slow to catch up to technology and add screens/social media to this list. It is not clear whether those whose youth has been smartphone-dominated (i.e. born since 2000) can recover, nor how many more will be allowed to harm themselves in this way.

Juxtaposed against these woes is a cultural mythology (hardly unique to the modern US) of cherishing our progeny. It’s built on a profound genetically-rooted set of practices. Pretty much every parent makes sacrifices (often heroic) to support and protect their children who are especially vulnerable to the vicissitudes of life. Limited knowledge prior to the 20C often made childhood a minefield of illness and death. For millennia, financial necessity meant marshaling all family resources to such a degree that the sharp limitation of child labor is a marker of social and economic progress. When we look at the health, education (formal and practical), and life choices which the modern world provides to our youth, there is no doubt that they are, as a group, better off than those of a century or more ago (even if a sharp family-wealth gradient remains).

What, then, are we to make of a society which sends those children to schools where “active shooter” drills are commonplace? What, then, are we to make of a society where schools have given up their standards and allow (suffer?) their children to pass classes and graduate high school without basic educational capabilities? Some of these are peculiarly US issues, but the increase in climate-related illness/hunger/risks is world-wide. There the dangers are less immediate, but pretty much universal in impact; and mitigation/reduction efforts founder on the short-term economic claims of the generations who have already profited from modern capitalism’s exploitation of the globe. What are we to make of an immensely wealthy America which allows a set of breakthrough Covid-driven child support programs to founder? One where schools in general and child-support systems of many kinds are chronically under-funded?

There are valid points (e.g., liberty, limited government) underlying all the arguments which lead to the demise of these practical improvements in the lives of children. No one is—on the face of it—against children, health, and opportunity; it’s just that the cost of providing them is (apparently) too high and something has to give. Such arguments around the need for compromise and trade-offs are normal in political discourse. When they are made, for example, by competing industries around the need to regulate or benefit one such group to the detriment of the other, we may have one policy preference or another. But children have few lobbyists and lawyers and the resulting economic benefit flows to embedded/voting/older generations.

If it weren’t so predictable, we could marvel at those who want the State to intervene in support of the “rights of the unborn” but are unwilling to support such intervention on behalf of the health and education of the “recently-born.” We could look at the champions of “family values,” a phrase which has become a euphemism for protecting/insulating parents from the values and evolving beliefs of their children. We could look at those who demand the liberty to carry dangerous weapons to such a degree that the liberty of others to be free from the fear of dangerous weapons is shoved into a closet.

Globally, issues of health and hunger are much more dire than in the rich “West.” Children there live in the double shadow of being children and of living in poorer countries (and “of color”), which, combined, makes them nigh invisible to most with wealth. Pretty much every crisis we read about, be it political, economic, or cultural, is a more dire crisis for children.

Little of this is new, historically speaking. But the current situation seems more acute because of the combination of threats and lost-opportunities which stand in sharp contrast to the amassed wealth (both individual and societal) which could be deployed. We cannot blame the children when we seem to actually care so little about them.


No Straight Lines

3/22/2024

As the dullest electoral primary season in recent history is effectively over (but for the breathless media reports and the widespread hand-wringing culminating in the national conventions/infomercials), real attention has disproportionately shifted to the geriatric general election (with its own breathless media reports and widespread hand-wringing). In particular, I know a fair number of folks who are distraught (not unreasonably) at the prospect of the return of he-who-shall-not-be-named.

With that perspective in mind, I have been wondering why this prospective disaster-in-the-making is different from some other historical precedents.

We rarely can predict the future with any degree of confidence. Looming disasters/slow-motion train wrecks, including the US Civil War, WWI, and WWII, were amply foreseen, but only vaguely or conceptually. The natures of those calamities, not to mention their outcomes, were in line with the views of only a few prognosticators; even more so, their aftermaths. In each case, horrors, shocks, and destruction set the stage for periods of amazing growth and activity. History has no straight lines.

In my work as a teacher of the history of democracy, I use many examples of the erratic course of “progress.” The promise of the post US Civil War amendments for Black people was repeatedly dashed by the deep inertia of racism legally, politically, and culturally. Weimar Germany morphed into Nazi Germany. The French Revolution was, famously, a decades-long ping-pong match between radicals (of various flavors), moderates (of various flavors), and monarchists (ditto). Even to take a current situation, there’s a potential future history of 21C US politics in which the 2016-2020 era is a fluke, washed out of significance within a decade. (Yes, there’s also a scenario in which the Biden era is seen as the last flailing gasp of mainstream politics as the country/world swirls around the drain). We just don’t know.

I’ve been doing some research lately about what has to be one of the most remarkable flip-flops of political structure/culture (certainly in European history, if not globally): the mid-17C Civil War which led to a British republic.

Frustration with a politically clumsy King (Charles I) led to armed conflicts in 1640s England, Scotland, and Ireland (not yet formally combined into the United Kingdom) between Royalist forces and a Puritan-led Parliament. Disputes mainly centered on an overbearing and increasingly authoritarian state, taxation without consent, and a bewildering variety of religious differences. (Voltaire later remarked, with a pithy double-back-handed compliment, that “England has 42 religions, but only two sauces.”) Parliament raised an army to fight the King, but lost control of this “New Model” Army, which became a political force in its own right, eventually leading to the capture and execution of the King (1649). Oliver Cromwell, a minor provincial figure who had risen to command of the Parliamentary Army, became “Lord Protector,” but Parliament was pretty ineffective and he (and, briefly, his son) ruled until things fell apart and, by popular demand, the monarchy was reinstated in 1660 under Charles II.

Charles II and his brother James II ruled until 1688, when the latter’s pro-Catholic policies proved too much for the English and he fled to France. This “Glorious Revolution” led to the establishment of a constitutional monarchy with a dominant Parliament that looks a lot like the structure of British government through the 20C.

What do I take from this (much-abbreviated) story?

First, the see-sawing of power between conflicting philosophies and parties and structures. The French Revolution, which began a century later, produced its own version of this swinging back-and-forth. Both were dramatic illustrations of the futility of long-term prediction and the power of contingency. There were times of great turmoil and danger, and then things settled down. Virtually every party and outlook had, at various times, reasons to rejoice and feel triumphant, only to soon find themselves with the short end of the stick, politically speaking.

Second, the road to our current political culture (a global version of liberal democracy) was far from assured. It could easily have been truncated (if Charles I and James II had been more moderate, they could have maintained a vibrant monarchy for some time) or, alternatively, have been radically accelerated. There were active groups in the late 1640s that proposed structures of power and principles that would have been recognizable to American and French Revolutionaries a century-and-a-half later. Indeed, it was the “republicans” (Cromwell et al.) who did more to suppress these proposals than the royalists of the day.

So, in terms of our current predicament, there is reason to fear, but not likely—on the history—reason to despair. Things may turn out well or badly—but we have no idea today.

It is fair to ask, beyond this framing, whether the nature of modernity, and in particular, legal/social structures, the disintermediation of social media, and the sticky nature of the degradation of the global climate, makes such analogies from history irrelevant. Could well be.

Still, there is no basis to crumble into a small puddle. The Levelers of mid-17C England would have seen their ideas of voting power and equality embodied by most countries in the world by the 20C. The world’s most powerful hereditary power structure is a small, bleak country in East Asia and the Bourbons, Windsors, etc. are consigned to shadow plays of performance art. After Weimar fell, the “thousand-year-Reich” lasted only a bit more than one decade. This is not to dismiss the darkness that occurred during the down-cycles. Nor is there any reason to think that democracy is assured by some simplistic reading of “human progress.”

There is work to be done.

Moral High Ground

3/15/2024

The history of geopolitics is an unending tale of “rise-and-fall”s. The king of the hill in the 19C is scrambling for position in the 20C. Different Chinese dynasties held a range of positions in east Asia over thousands of years. Moghuls in India were replaced by the Brits in the 18C. The Spanish took over from the Aztecs and the Incas in the 16C, but their empire in the Americas had collapsed by the early 19C. There is much to be told about the whys and hows, patterns to be traced and exceptions noted.

This history is usually a story of power—primarily military/coercive and economic (sometimes also coercive)—and, according to the Realist school of international relations, that’s all there is. Notions of morality and ideologies, central to the Idealist school of understanding how countries actually behave, don’t count for much (however much one might aspire to morality in international affairs).

Of course, from a historical perspective, the nature of those ideas and ideals has changed over time: you can’t understand European politics from the 12C through the 19C without looking at the internal structures and changing configurations of Christianity during that era. Ditto the variations of Islam across North Africa through Central Asia from the time of Muhammad. For us in the 21C, the ideological sprawl of modernity is everywhere: Communism, Fascism, Liberalism, Nationalism, democracy, etc. For example, the conceptualization of the individual and ideas of international cooperation spawned in the 19C have played a global role, not least in the concept of human rights.

When combined with mass media and “public opinion,” there is motivational power—a “soft” power—in these ideas; at least to such a degree that virtually all countries pay at least lip service to the ideas of democracy and law and (dare I say it) morality. Yes, there are many variations and exceptions, and such “soft power” won’t win a war on its own, but what passes for global public opinion has had some alignment, and securing its approbation is attractive even to the most craven (although this gets pretty faint at the extreme end).

As with other forms of power, this public esteem/consideration/acquiescence can be spent. It’s usually not as dramatic as a ship being blown up, or an IMF deal being reneged upon, but over time the reputation of states that gives them credence can also fade.

Probably the clearest example of this phenomenon lately is Israel. Here is a country that was established at the behest of European/Western countries in the aftermath of WWII, in recognition of the horrific crimes committed against the Jewish people in the Holocaust. Tainted, to be sure, from the beginning by the impact of its establishment on other peoples who had also been living on the same land, Israel nonetheless garnered wide respect through its alignment with Western democracies, its material success, and its embrace of Jews leaving less friendly places around the world, as well as accommodation for the actions it has taken vis-à-vis other groups in the Middle East. Notwithstanding the history that it stands on, Israel’s more recent pattern of aggressive behavior and settler imperialism has done much to erase its accrued goodwill. The latest excesses in Gaza have created a moral deficit that will take decades to offset.

A second example is South Africa, which gained substantial moral credit from its decades-long struggle for majority rule and democracy (now marking its 30th anniversary), embodied in the person and leadership of Nelson Mandela. Its leadership since then has been spotty and prone to several varieties of corruption that often plague countries without deep practice in national community and civil society. Despite its regional (economic) influence and the lack of competitors in Africa for leadership, it risks leaving Mandela to history and becoming rather ordinary.

In the 20C, the Soviet Union initially garnered great credit for its (nominal) commitment to socialism and its success in modernizing its country (the costs and oppressions were helpfully hidden). It spent its moral capital crushing Poland, Hungary, and Czechoslovakia and had few friends when its economic contradictions finally undermined its political culture and its empire.

Finally, our own United States, which has claimed a mantle of democracy for over two centuries and which was noted over the 20C for its commitment to global improvement, most particularly during the two world wars, but also for its attention to democracy and human rights outside of wartime, is also facing a reckoning. To be sure, our efforts towards the world have always been colored by the promotion of our own interests and a string of excesses in the projection of power. For all those shortcomings, the US has likely done more good for others (both individuals and groups) than any other modern country. Still, memories are short (and getting shorter) and there’s no small sense of “what have you done for me lately?” in the perception of 21C America in the world.

It is natural that both the power structures and the wider population within these countries (and there are others) have drunk deeply of the stories and characterizations that reflect this moral high ground. It becomes part of the national psyche, but that only makes the erosion more easily seen from the outside. It comes as a great shock to learn that one is not universally loved/respected. But the conflicts, compromises, and shortcomings inherent in the exercise of power (of all flavors) are more visible externally and the harms they cause are appropriately counted against whatever moral “assets” have been accumulated.

Countries that claim the “moral high ground” in terms of international stature, competition, and conflict seek to reap both tangible and “soft” benefits by holding themselves out as role models. They often seek to redefine morality so that their actions and beliefs place them on that “high ground” as well. Some countries, on the other hand, don’t care. But those that trade in this “moral economy” need to remember that even the loftiest and most robust enterprises can go bankrupt if they don’t watch their balance sheet.


Froth and Fundamentals

3/8/2024

One of the delights of spending a lot of time with young people is that I get sharply different perspectives on the world. It’s especially good when I can get myself to disengage from my professorial lecturing mode and listen to what they see and what I can get them to say.

A minor example occurred a few weeks ago when we were talking about current social relationships among college students. I heard a term for a particular personal attitude and wrote it down on the board as “dog-like,” only to see smirking eyes, which one of the group explained by saying that the term was “dawg-like.” My point has less to do with the specific situation or definition (you are welcome to google it, but the references to its use 10+ years ago as a complimentary and endearing term seem to be already outdated) than with the personal distance of perceiving social change. Not having raised kids, I didn’t go through this process from the elder perspective (although I was part of the younger end of the “generation gap” back in the day). At this point, with students who could be my grandchildren (many born 2000-2005), the “gap” is wide and widening.

Different mores, styles, and language are commonplaces of modern life, now amplified by a technology-based acceleration in the normal pace of change in how we live. As a historian, I am always on the lookout for ways in which “change over time” manifests.

From a historical distance, it is a challenging exercise to parse incremental social developments and see which ones—which often garnered a lot of attention at the time—fall into one of three characterizations: 1) the shock of the new, 2) the adjustments of transition, and 3) more long-lasting alterations. The first two are, pretty much by definition, transitory, but it is usually nigh unto impossible to sort them out while we are in the middle of things.

The impact of AI is a case in point: great brouhaha a year ago, followed by various degrees of cautionary tales and, simultaneously, wider changes in current practices and further improvements. It seems clear that—whatever “AI” is—it’s still way too amorphous to define, much less assess. Much the same could be said of the longer-lasting, more encompassing terms “computer age” or “information age.” Telling my students about how I used “punch tape” to “code” a “time-share” computer some decades back produced glazed eyes, but fifty years ago it was as wondrous as Siri (introduced in 2011) or the app du jour of 2024. Indeed, good arguments could be made (simultaneously) that 1) there’s nothing profoundly different about AI; it’s just souped-up computer processing and 2) we are only at the start of the “AI revolution” which will fundamentally change the idea of what it is to be human. How many of the current debates will have faded in ten years (or fifty) as archaic or trivial transitional questions (how many outdated connector cables do you have in your closet?); and how many will be seen as fundamental framings of the way the 21C worked out?

At another point, I asked some students about their use of social media and was not surprised to learn that few use Facebook (apparently now principally the preserve of “older” folks). But I was quite intrigued to hear that their use of social media generally is going down considerably. The main reason is the proliferation of ads, which makes it harder both to find postings from people with whom one has a “real” connection and, reciprocally, to ensure that one’s own contacts will actually see one’s postings.

Whether these changes will be long-lasting we will discover in due course. But the very question of whether either the “rise” of social media or the current drawing-back from its use among younger people is a blip or a trend highlights the current uncertainty of these phenomena. It also makes clear that much of the surrounding breathless excitement over the past 15 years has been created without any sense of durability or significance. Stated simply, we don’t know what’s going to stick, alter, or fade.

Similar points could be made about drug use patterns, personal savings rates, gender identity, or indeed, the importance of a college education. There was a time when jazz music was an important part of the US music scene, but now it seems to be waning and the demographics may consign it to a place not far from folk music as part of our popular culture. There was a time when all “businessmen” wore neckties every day, and now I just hang on to a fraction of my collection for use at major social events. Bumps are not trends, and trends rarely last; while we’re in the middle, we can’t tell what is a “secular” change and what is a fad, or what is the “norm” and what is the anomaly.

Thus, the benefit (to me) of engaging with a distant generation in the classroom and even more distant generations in my research and lecturing; each provides a critical corrective to getting stuck in my own cultural cul-de-sac, imagining that a person’s range of experiences—whether my own, my students’, or those of people in the past—has some fixed meaning and effect.


Alias

3/1/2024

There was a time (starting in the late 19C) when the terms “moron” and “imbecile” were considered scientific and enlightened; an improvement over the then commonly used “lunatic” or “mad.” As is often the case, such terms gradually acquired derogatory connotations and were consigned to the dust-bin of history, at least in official channels and the “better” parts of society. For a while, “mental retardation” was the preferred terminology, but in 2010, the official US government term was changed to “intellectual disability.”

Similar paths have been followed by a variety of ethnic and “racial” groups. The early 20C saw the establishment of the National Association for the Advancement of Colored People (1909), a term that by mid-century seemed outdated and burdened with considerable social baggage. At various stages, colored, negro, black, afro-american, and african-american have all had their vogue. Names were often changed because of the negative connotations that had accumulated over time and a desire for a fresh and positive designation for a group that sought a clear identity. Of course, some terms have been so heavily weighted with animus that they are de facto barred in common social discourse (e.g., the “N-word”).

In the last decade or so, “people of color” has gained in currency; sometimes as a larger grouping of all non-“White” peoples, sometimes as a grouping of all such peoples who are neither “Black” nor “White.” (As I have noted elsewhere, this is a particularly US issue, since in most of the world “White” people are the exception. Not to mention that “White” people aren’t white, and whatever the color of their skin might be called, it’s a color; so the whole discourse is heavily skewed by the history of global epistemological power.)

The linguistic history of labels for people with various physical or mental limitations or abnormalities is also the story of the interaction of medical/scientific nomenclature and theories with changing social perceptions; most recently seen in the differentiation of “disabled people” from “people with disabilities.”

At this point, I should acknowledge that I have not had the experience/perspective of any of these groups (i.e., I am “privileged”). I also believe that people should be able to choose the labels by which they wish to be addressed/referred to.

Historically speaking, the repeated restructuring of such terminology frames seems to be unique to the last century and a half (i.e., a product of modernity) and, I suspect, has contributed to the social confusion/disorientation that many have experienced.

The underlying issue, however, is not the labels by which groups are identified. It is the animus or disparagement with which such groups are regarded. The (repeated) efforts to change people’s minds by campaigning to change the socially acceptable language which they use seem to me to be at best ineffective, and sometimes counterproductive.

There is some research in linguistics which suggests that the language people use reflects, or even shapes, their epistemology. Inuit people have dozens of words for snow, while in the 21C US, we have dozens of words for salty, crunchy, plant-based foods. So, changing language perhaps can change minds. I certainly agree that mustering the forces of linguistics for social justice makes sense but, at the same time, let’s not pretend that new labels will give us much purchase for deeper social change. Indeed, I can’t help but suspect that arguments over labels and nomenclature are but a substitute for a meaningful dialogue (and the hard psychological work) about the feelings behind the words. Instead, we argue whether “person of color” means something different than “colored person,” or “deaf” means something different than “hearing impaired.”

There might be some (moral?) improvement in the shift from “disabled person” to “person with disabilities.” Is it better to refer to someone as “fat,” “obese,” “zaftig,” “weight-challenged,” or “full-figured”? I understand the effort not to define such an individual by a particular aspect of their lives. However, there seems to be some degree of awkwardness about this change. It also runs against a general preference for using the “active voice” (perhaps as part of an effort to designate these individuals as passive subjects of the condition of their bodies). There is also something off-putting and distancing about this mode of construction; it carries a “scientific” and analytic connotation that seems disconnected from the realities of social relations. Perhaps, at the margin, this kind of language is OK, but it’s still no substitute for figuring out ways to stop derogating people for aspects of their lives that 1) they cannot change and 2) have no moral significance.

In my own practice, I try to actually not use such adjectives at all. I find that the patterns of speech which I developed over the years, especially in referring to others as, e.g., the “young black woman in my class,” would be better if I just referred to that person as “a student.” Most of the time, identifying adjectives don’t really add anything to the point of my talking about them; and, where it’s relevant, I can add the appropriate adjective back in.

This requires some effort to be conscious of what I’m saying; an effort beyond that called for in changing underlying attitudes of prejudice, avoiding terms designed to incite animosity and resentment, and actually seeing people for who they are.

    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but this is my contribution, my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
