Steve Harris

Little Brother is Watching

3/25/2022

George Orwell crystallized the image of the omnipresent state with the creation of “Big Brother” in his 1949 novel 1984. Since then, amazing advances in technology have made it clear that our privacy is more a function of the resources and priorities of those who would watch than of any legal/social constraints. In other words, if the NSA wanted to watch you, they could and they would.

Movies and video ads make clear that minute cameras and microphones on ever-smaller drones can get to lots of places and see/hear/record events. Who can doubt that a large percentage of people who eagerly live much of their lives with ear buds virtually glued in place will also quickly move to technologies embedded in various nooks and crannies of their heads for both input and output?

We’re already well along. Police body-cams and increasingly ordinary video cameras in public spaces have exponentially increased the amount of “film” available to law enforcement. Google Glass and virtual/augmented-reality headgear are bringing these kinds of capabilities even closer to our brains, seeing what we see and hearing what we hear. The “internet of things” will ride on ubiquitous Wi-Fi and other networks to send that information to any data warehouse instantly. AIs will enable sorting and retrieval of particular incidents.

Still, my concern today is not with the State crushing personal privacy aided by the onslaught of technology. After all, most of us have precious little that would be of interest to spymasters or even to Mark Zuckerberg and other would-be moguls of our techno-future. Rather, I’d like to consider the implications on a smaller scale: How will the presence of (seemingly) ubiquitous recording devices affect day-to-day interactions in stores and offices, comings-and-goings in homes and cars, and conversations with families and friends?

Historians are quite aware of the fallacies of memory. Indeed, anyone who has ever played a round of the “telephone game” knows that even instant repetition is fraught. Memories suffer not only from wholly unintentional omissions and limited perspectives (increasing over time), but also from unconscious desires to construct a friendlier, more self-supporting, and more coherent past.

Now (or in a not-very-distant future), we will be able to reach that defining goal of modern history: to find out what “actually happened.”
  • Did little Kim start the fight with their sibling Robin over the use of some critical toy, or vice versa?
  • Did a certain spouse wink at an ‘ex’ at a recent party, as was later debated in the car on the ride home?
  • What was the expression on Rosemary Woods’ face when she famously erased 18½ minutes of the Nixon tapes?

Will family life become more peaceful? Will law enforcement become more mundane?

I suspect that each of us often takes refuge in the ambiguity and irretrievability of the precise past. A little fudging has undoubtedly preserved many relationships, not least our relationships with ourselves. We may find out whether, as Jack Nicholson’s character posited in A Few Good Men, “you can’t handle the truth.”

Are we ready for the “truth”? How will the awareness of an “objective” record affect how we behave? Will little Kim stop picking fights with Robin? Will cocktail-party flirting cease? If the case of George Floyd is any indication, it has taken some time for police body cameras to change police habits and patterns. For ordinary folks, in ordinary situations, it will likely take longer. But perhaps we will internalize the presence of this cyber Jiminy Cricket who, perched on our shoulder, regularly reminds us to “let your conscience be your guide.”

From another perspective, the ubiquity and constant nature of such surveillance might well further modify how we record and process information. Why take notes in class or meetings when you can easily call up an actual recording (or have an AI do it for you)? Fascinating work has been done on the impact of writing on the social practices of memory, which became less central to accessing the past. It’s easy to imagine that, as techno-memories become more commonplace, we won’t tax our organic memories nearly as much. The rap against older people that their memories are failing may lose some of its resonance, since the quality of their internal neural networks won’t matter as much.

For historians, one of the fundamental challenges of research has been finding enough original material to discern what “actually happened” in the situation under study. Did anyone actually argue with Napoleon about launching his ill-fated adventure into Russia? Who really stabbed Caesar? Who was the first to make a salad named after him? The problems generally get worse the further back we go. In the electronic age, historians are already getting swamped with thousands of emails and, soon, hours of video. We will have to turn to AI historians to parse through all the material (there aren’t enough underpaid grad students to throw at this).

It may be that, as with many aspects of human social/psychological adaptation, making any of these fundamental behavioral adjustments will take generations. The transitions, especially inter-generational, promise to be somewhere between “interesting” and problematic. In any event, this revolution will be televised.



The Limits to Growth

3/18/2022

God tells us, according to Genesis 1:28, to “be fruitful and multiply; fill the earth and subdue it.”
Well, we’ve done a hell of a job ever since, especially lately. Indeed, one might say that growth is a deeply-rooted human epistemology. For a long time, it was a nice concept with few practical consequences. It’s now driving us off a cliff.

For millennia, families would have lots of kids (far more than the “replacement rate”). Land was cheap, many children died very young, and who didn’t want progeny to carry on the family name and perpetuate the family genes?

What we now call the “Anthropocene era,” the geological moniker for the period of time in which human intervention into natural processes became significant, began roughly 400 years ago. Since then, human population has soared (see my piece on “Pop Culture”).

Also about that time, Europeans began to recognize the relationship between the number of people in a country and its military prowess. Ditto for economic strength. And countries began to consciously foster both, in order to enhance their national/regal/imperial glory. [This awareness and policy may have happened elsewhere as well, but I haven’t done the research.] Growth thereby became a matter of inter-national competition.

By the 19C, the rise of the science of economics and the political pressures of democracy added further rationales for growth: profits and domestic peace. The profit part was pretty straightforward: bigger groups could mine and grow more stuff, and then make and sell more things; and the owners/investors could build nicer houses. Competition between countries spurred (European) imperialism and the take-over of most of the world in one way or another.

The democracy connection is a bit more subtle. As ordinary folks began to be more aware of their political power in the aftermath of the French and American revolutions, ruling classes (who, unamazingly, combined political and economic power) had all sorts of reasons for maintaining their position/privileges. Growth as a doctrine answered multiple needs. It provided a basis for more/bigger activities whose profitability was directly beneficial to those groups and provided a public theory of opportunity for those well down the socio-economic ladder.

Since (the implicit theory ran) no one would (should!) expect the rich to have less, the only other way for the “rest” to get more was for economic growth to produce more wealth and thereby provide a means by which the “rest” could advance their standards of living (even if those new comforts were always trailing the luxuries of that era’s “1%”). It’s a sort of macro-economic version of “trickle-down” economics. Other than its disingenuousness, exploitation of most of humanity, and ongoing inequality at both the national and international levels, this “growth” thing looked like it was just the ticket to wealth, domestic peace, and a stable international structure.

But, as with most good things, they can be overdone: one glass of champagne is divine, the second is excellent, the third is fine, and after that, it’s a blur.

Growth, in terms of both population and economics, is running into a wall. The planet can’t support even the amount of economic activity necessary for all 8 billion of us to have the standard of living “enjoyed” by those at the US poverty level (currently about $13k/person) without those at the top getting a haircut. For everyone to live at current average rich-country standards, we would need to “grow” the world economy to more than twice its current size. Needless to say, in the midst of a slight constraint on planetary output imposed by global warming and environmental disruption, this isn’t likely. [Just to do the math: 8B people × $30k OECD average income = $240T; current global GDP ≈ $94T.]
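To make the back-of-envelope arithmetic explicit, here is a minimal sketch in Python using the round numbers above; the population, income, and GDP figures are this essay’s approximations, not precise statistics.

```python
# Rough check of the growth arithmetic, using the essay's round numbers.
population = 8e9            # ~8 billion people
oecd_avg_income = 30_000    # ~$30k per person per year (OECD average)
us_poverty_line = 13_000    # ~$13k per person per year (US poverty level)
current_gdp = 94e12         # ~$94T current global GDP

for label, income in [("OECD average", oecd_avg_income),
                      ("US poverty level", us_poverty_line)]:
    needed = population * income
    print(f"Everyone at {label}: ${needed / 1e12:.0f}T needed "
          f"({needed / current_gdp:.2f}x current global output)")
```

Run as written, this prints roughly $240T (about 2.5 times current output) for the OECD-average scenario and about $104T for the poverty-level scenario, which already exceeds today’s output; hence the “haircut” point above.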

If our current trajectory isn’t tenable, then we need to figure out a new way of thinking about wealth, growth, and economics in general. Early in the 20C, the Bauhaus architect Mies van der Rohe said “Less is more” as the justification for a minimalist aesthetic. Modern minimalism was the result. There’s something to be said for a simpler line in design (though it, too, can be taken too far). Various groups (the Amish, hippies, Marie Kondo) have urged simpler lifestyles over the years. Malthus (1798) warned that there were real limits on the population of human societies. Many of us remember Paul Ehrlich’s 1968 apocalyptic (and a bit overhyped) warning, “The Population Bomb.” In 1972, the Club of Rome famously published a more sober precatory report on the modern global economy entitled “The Limits to Growth.” All have been downplayed or pooh-poohed by conventional thinking. The consequences of the necessary change are fearful to contemplate. It’s difficult to conceive of most people I know voluntarily living on $30k (the OECD average), much less at the US poverty level ($13k). So, we have all implicitly bought into this “growth” and inequality model.

Stanford historian Walter Scheidel has persuasively argued that the only way economic inequality has been alleviated on a broad level is by disasters, especially wars. This makes sense, since the rich have the most to lose and, in extremis, their property is less protected. This historical perspective is not pretty, but it reinforces the attractiveness of the growth epistemology as well as hinting at the likely outcome of the upcoming climate disaster.

In sum, there are only a few factors driving a solution to the global mathematical equation: 1) fewer people, 2) lower standards of living, 3) reduced economic inequality, or 4) technological solutions that increase the economic capacity of our planet to support our species. The debate about what share of the burden each of these factors will bear will underpin the geopolitics of the 21C, as well as decisions at societal and personal levels.
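For what it’s worth, the same back-of-envelope sketch can be extended to show these four levers schematically; the function and numbers below are illustrative assumptions, not data, and lever 3 (redistribution) changes who consumes the total rather than the total itself, so it appears as slack for the other levers rather than as a term of its own.

```python
# Schematic version of the "global equation": total demand must fit capacity.
def feasible(population: float, avg_standard: float, capacity: float) -> bool:
    """True if this population at this living standard fits planetary capacity."""
    return population * avg_standard <= capacity

CAPACITY = 94e12  # rough current global output, USD/yr

print(feasible(8e9, 13_000, CAPACITY))        # lever 2 alone: False (104T > 94T)
print(feasible(7e9, 13_000, CAPACITY))        # levers 1 + 2: True (91T <= 94T)
print(feasible(8e9, 30_000, 2.6 * CAPACITY))  # lever 4 (more capacity): True
```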

I was once given the sage advice: “Never confuse your net worth with your self-worth.”  It’s good counsel. If we give our species a chance at a choice over the next few decades, we will have to sort this out.


This Means War

3/11/2022

The startling re-emergence of war in the consciousness of Western elites in the past few weeks has occasioned all sorts of sympathy and support for the Ukrainians, who are fighting and killing on behalf of ideas (freedom, independence) that the readers of this series presumably share.

This particular conflict has seen some well-established military tropes, including tank columns (prominent since WWII) and guerrilla conflict (active since the early 19C). It has also seen some emerging techno-tactics, such as drone strikes and cyber warfare, and the notable application of economic “sanctions.”

These last two are distinctive in the “annals of warfare” because they don’t involve killing/wounding/capturing the enemy, but are directed at the civilian population and the communications and logistical infrastructure of the military. Still, while the technologies involved might be quite 21C, the principles of leveraging something other than swords and guns to coerce/defeat some other country are hardly new.

One of the first major foreign policy crises of the young American republic, in the 1790s and 1800s, required the US to navigate between its former foe and its former ally as Britain and France engaged in the series of conflicts known as the French Revolutionary and Napoleonic Wars. President Jefferson imposed an embargo on American overseas trade, aimed chiefly at Britain, in 1807. Napoleon had already worked to construct the “Continental System,” under which mainland Europe (at least the parts that he controlled) would also cut off trade with Britain. All this didn’t work out great for the US, and our unpleasant War of 1812 resulted.

Of course, the use of blockades to cut off supplies to embattled ports had been a well-established mechanism of war since the ancient Greeks and a regular part of warfare since the 18C. One of the reasons we know this is that the subject of blockades was regularly addressed in early modern treatises on international law (Hugo Grotius’s work of 1625 being the most famous). This was all part of an effort to think through the nature of war and peace and to establish rules of “civilized” behavior for both situations.

In a world where all economic relations were physical, physical blockades (ships intercepting other countries’ shipping) or military sieges were the sensible means of depriving the “enemy” of economic support. Now, in a world of services and data and digital money, we have “sanctions”: formal seizure of assets and denial of licenses to do business with the enemy country and its citizens, including access to shared systems of commerce and finance such as the SWIFT interbank messaging network. By one count, sanctions have been imposed over 1,100 times by various countries since WWII.

All well and good, but the imposition of these sanctions and the concurrent (and usually secret) use of cyber attacks (again, non-physical disruptions of enemy property and systems) raise the question of whether they constitute “war.” As with most concepts, “war” can mean many things. Political scientists have compiled long lists of events of mass violence according to certain criteria, in order to demonstrate certain patterns of war in history. For most folks, looking back in history is more visceral: war is marked by big, deadly, or long-lasting military conflicts with familiar names, often landmarks in national, regional, or global history.

In this context, we can also note that the long-term effort of “international lawyers” to demarcate war from peace is rather artificial. The effort to impose rationality and humanity on states’ exercise of power is a testament either to a noble perseverance to improve the human condition or to futility.

The same is true in the vague treatment of war in the US constitutional system, where Congress is given the sole power to “declare” war, but no one doubts the effective power of the President to do all sorts of things (physical, economic, electronic) that attack/harm an “enemy” or protect the country. This well-fudged line has plenty of precedents, including (just from recent US history) the “police action” in Korea, the Gulf of Tonkin resolution (1964), and the authorizations for US military actions in Afghanistan and Iraq early in the 21C.

The Russian invasion of Ukraine certainly meets all the usual criteria. Whether the actions of the US and its allies also constitute “war,” however, remains to be seen. Putin warns the West that its current steps “threaten” war, raising the spectre of nuclear retaliation. NATO makes clear that its core principle of mutual self-defense (Article 5) has not been triggered, since Ukraine is not a member; but since virtually all NATO/EU states have been shipping weapons to the Ukrainian army, providing intelligence, and imposing a wide array of economic sanctions on Russia, it’s not clear exactly why we’re not already “at war.” So far, it hasn’t behooved Putin to invoke the term, but otherwise, “a rose is a rose is a rose.”

Looking back at the Afghanistan and Iraq “wars,” we can see the presence of US/NATO troops and the full range of overt military activities and structures; I guess we were at war, even without an official Congressional declaration. Both places also saw a wide array of unofficial “war” activities (via the CIA, various contractors (mercenaries), etc.). The “war on terror,” of which the Afghan conflict was at least nominally a part, highlights that in the modern world, non-state actors can trigger or become the recipients of state military action.

This line has many antecedents in the guerrilla actions of past centuries. Indeed, the official nature of “war” is closely tied to the rise of the “state” and its claim to monopolize the use of force in society (both domestic and international). War became an activity in which only states could engage; other military actions and, indeed, all non-military actions were defined (with help from the international lawyers) as outside the meaning of “war.”

We can only hope that, by whatever name, the killing stops and this situation gets no closer to whatever definition of “war” leads to more death and destruction.


Construction Zone

3/4/2022

Earlier this year, I wrote about nationalism in history, and today I want to expand on that topic to highlight its artificiality. In other words, there are not really any such things as nations; they are social constructs, cobbled together out of distorted traditions, bad genetics, and human desires to belong to a group. You don’t need Putin’s mythology of the essential unity of Great Russians, Little Russians (Ukrainians), and White Russians (from Belarus) to see this or its impact in the real world. If nationalism hadn’t been such an essential concept in global political development, with continuing repercussions, this artificiality wouldn’t be such a big deal. But this emperor has no clothes.

As I wrote last time, nationalism essentially arose in the 19C as a way-station on the road to globalization. As human societies became more interconnected, secular, and democratic, traditional local identities were being squeezed, and regional elites leveraged this angst to regroup power structures on a larger scale. Nationalism was seen as a forward-thinking ideology.

And, as is common in human societies, innovation becomes conventional and then reactionary (yeah, I know, a good historian shouldn’t spout such gross generalizations, but…). By the late 20C, those local elites, whether in the West or in post-colonial contexts, were well entrenched in their larger zones of influence, but the onward push of technology and globalization increasingly shifted the scale and scope of economics and culture to a larger plane. Good burghers, members of the local textile guild, hated it when Bismarck consolidated Germany in the late 19C, as did, mutatis mutandis, zamindars (tax collectors) in pre-Raj India. Their descendants are now adherents of the Alternative für Deutschland or the increasingly-intolerant Hindu-driven BJP in India; not too concerned about preserving local culture, but much more focused on preventing globalizing integration and assimilation and the dissolution of national culture.

All this nationalist clamoring is anthropological/historical nonsense. Ultimately, we all trace our lineage back to Africa (east Africa, probably Kenya or Tanzania). At the same time that Trumpians are concerned about a few years of Salvadoreans migrating to the US, we should remember that it took humans several hundred thousand years to spread around the globe. (So much for human terroir!) A combination of contingency and environment created different cultures and languages. And the movement of peoples didn’t stop there: waves of migration, voluntary or forced, have recurred ever since. These changes were sufficiently long ago and slow enough that we have to look closely to trace their components. Ethno-linguistics and anthropological DNA mapping can tell us a lot about what kinds of mongrels we all are.

The only difference in claims to “nationality” is timing. After all, according to the “birthers” (from the controversy 13 years ago), Obama was born in Kenya and came here in the ’60s, whereas all the rest of us “good, patriotic ‘Muricans” actually came from Kenya in various stages over the past 150,000 years. Indeed, we are all African-Americans.

The thing about nationalists is that they have taken a snapshot of a group of people from a certain period and declared “WE are the [fill-in-the-blank] people!” Wherever you were when the photo was taken is what counts. It wouldn’t matter if “your people” lived on the banks of the Vistula or the Rhone for hundreds of years, if you moved to Bavaria before the 19C, you could count yourself as German from then on (Roma and Jews excepted).

The descendants of Roman citizens from 2100 years ago fought against their brethren (for example, as French against the English) from 1000 years ago until 120 years ago. Scandinavian nobility shuffled kingdoms between Denmark, Norway, and Sweden for hundreds of years. The British have been ruled by Germans: the Houses of Hanover and Saxe-Coburg (the latter until it took a friendlier, English name during WWI). The hodge-podge of Hispano-indigenous blends in South America didn’t claim/discover/invent nationality until they were throwing the Spanish out in the 19C. Most African chiefdoms didn’t understand or care about this European-imported ideology until they were forced into colonial administrative structures (or split in two by a line drawn by the British/French/German/Portuguese) in the late 19C. Most current African political crises are functions of disputes between such traditional groups being fought out within the forced framework of Eurocartography. The fact that few African polities fit the “nation” model is more an indictment of the model than of the peoples on whom it was imposed.

This “freeze-frame” approach to nationality was reinforced in most cases by the construction of culture around the identity. The language of the largest city in a “nation” leveraged the rise of the mass press to become the standardized national tongue, while local dialects and languages (Breton, Gaelic, Bavarian, Cantonese, Tamil, etc.) were pushed to the sidelines. New traditions were invented: national holidays and parades, symbols and ideas such as Scottish tartans (early 19C) and Bastille Day in France (invented in 1880 to solidify support for the Third Republic in the aftermath of being beaten a few years earlier by the Germans). We can see the latest manifestation of all this in the differing Russian and Ukrainian perspectives on nationality and nationalism.

[You can read about this process in these excellent history books: Benedict Anderson, Imagined Communities; Eric Hobsbawm & Terence Ranger (eds.), The Invention of Tradition; and Eugen Weber, Peasants into Frenchmen, among others.]

So, here in the 21C, nationalism morphs into a reactionary ideology in the face of a tide of globalization. We have shifted our identities broadly from the local to the regional/national, and we cling to them as migration, trade, and instantly-communicated culture make geography less relevant. Just as national elites emerged in the 19C to create or co-opt sensibilities, so do 21C cosmopolitan elites act, move, and think on a global plane (yes, the “jet set”). Despite the media chatter about the revival of nationalism, it seems difficult to imagine a relapse into autarky (but so, too, people thought just before WWI).

Still, nationalism retains a powerful comfort, the sense of tribe and identity, in the face of the cold, ineluctable momentum of modern capitalism. As with other stories, sometimes the facts get pushed to the side and mythology holds sway until, eventually, it is punctured and collapses.

    Condemned to Repeat It:
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but here is my version of it: “Anyone who hears Santayana's quote is condemned to repeat it.”
