Steve Harris
  • Condemned to Repeat It

Bemji Calling

6/2/2023

0 Comments

 

A couple of years ago (040921), I wrote about my interest in and relationship with the country and people of Bhutan. I closed that piece expressing the hope of returning. This week, having just returned, I want to talk about Bemji, a village near Trongsa in central Bhutan, where we visited the family and home village of our long-time friend, Karma Nima.

Most of the time in my four trips here has been spent either in towns or the city (the capital, Thimphu, is the only place with more than 25,000 people) or trekking in the countryside. So, spending a day and night in a small village (100+ people) gives another perspective. Nowadays, everything is pretty much connected or connectable—via phone, internet, or satellite—so there is no more illusion of rustic isolation. Even ten years ago, I had to ask our trekking guide to get off his cell phone so we could try to imagine that we had escaped from modernity amid the country’s natural beauty. Still, Bemji is a 45-minute drive from the main road (even if mostly “paved”), so going to or from it is not a daily event.

Small towns/villages/hamlets still run to a local rhythm, even if they are well connected. They are a microcosm of what Bhutan itself is trying to do on a larger scale: maintain the hand-made tapestry of life amid the digital onslaught. Our friend Karma arranged for us to stay in the village temple in a room just off the main shrine. About 20 locals gathered to meet us for dinner there, including a formal presentation of locally-made arak to drink, followed by dancing (Facebook video link available for a small fee!). The next morning we had a private prayer ceremony and, due to an auspicious coincidence, the neighboring monastery was hosting a special ceremony of unwrapping a model of Zangtopelri (Guru Rinpoche’s heavenly palace). We got to meet Karma’s 93-year-old uncle and innumerable cousins (by both blood and affection). Then, Karma arranged a demonstration of an ancient Bon (pre-Buddhist) warrior dance, which was great fun.

The vibe of the village reminded me of a trip we had taken twenty years ago this week to a small town in the Marche region of Italy, where our friend Ezio gave us a tour of his small home town. The walk took several hours, because Ezio couldn’t get more than half a block without running into someone he knew who insisted on catching up. Karma is less effusive than Ezio, but the sense of human connectedness was the same. It’s a sense I rarely find in the US, between metropolitan frames of life (83% of us live in metro areas) and the ease with which most of my friends/acquaintances have relocated, often multiple times, around the country. But, since a real sense of “people in place” seems to take at least two generations, it’s not so clear how modern American cities can replicate this aspect of “community.”

This is my fourth trip to Bhutan over 25 years. Beyond the usual eye-openers of travel and the chance to establish more extensive relationships with some individuals, my time here has given me a chance to see change over time in a way not possible at home, where compounding daily/weekly/monthly events of change become a blur and are hard to see at a distance. I’ve been going to London for fifty years, and have seen great changes there, too. But they seem incremental as well. The process of modernity, which, in the West has taken us about five centuries, is compressed here to less than fifty years. Of course, that is the great challenge which Bhutanese culture is wrestling with.

While I enjoy and marvel at the chance to observe this, I am also acutely aware of the risk of falling in with modernity’s arrogant stance towards those less “advanced.” Given the widespread evidence that Western modernity has gone off the rails, no small amount of cultural humility is in order.

One of the principal reasons for my trip has been to check in on two projects on which I’ve been working for a few years: democracy education and tree planting. The latter is going great guns: we are on target to plant 190,000 trees this year as part of a five-year, million-tree project. Bhutan is the only carbon-negative country in the world, and I hope that our project will keep pushing the frontiers of what is possible to fight what I see as our civilization’s greatest threat. I’ll be talking about democracy in Bhutan in an upcoming posting.

Our first day in Bhutan, a bit bleary from travel, we were given a tour of the new JSW Law School, a remarkable campus high in the hills above Paro. The Dean explained how he had led the design and construction of the school and, perhaps more importantly, of a new curriculum to train a new generation of Bhutanese leaders: a combination of law and humanities (this is a college-level, not (yet) graduate-level program); a combination of Buddhist tradition, current Bhutanese law, and modern international law/relations. It was an inspiring example of a rooted modernity; exactly the kind of innovation that moves a traditional culture forward. It is also, from Bemji, the other end of Bhutan’s engagement with modernity.

The village of Bemji is fading. It had well over 300 people scattered across its valley when Karma was growing up in the 1970s and ’80s. As has happened in countless other countries over the past few hundred years, the lure of cities and the opportunities of the world are eroding the local culture and weakening the roots of tradition. Karma told us that it’s been a challenge to find someone to learn to lead the Bon warrior ceremony, now that the current leader’s dancing is not so sprightly anymore. Bemji is calling, but now it’s calling via FaceTime.

We left Bemji on Saturday and got back to SF on Tuesday; a testament to the wonders of global travel. We had a raft of great experiences, both personal and cultural (both religious and modern). We’re already planning a return in a few years. Stay tuned.


Some Legacies of Empire

5/25/2023

0 Comments

 
I’ve been doing some work lately on the process by which the British Empire disaggregated in the mid-20C (mostly from 1945–71). During that time over forty new countries achieved their formal “independence” from the “mother country” (this list excludes the “settler” colonies: Canada, Australia, New Zealand, and South Africa, where the descendants of European settlers had been dominant since at least early in the 19C and which were more-or-less recognized as independent states by 1931). All of these other newly-independent states started out as republics or constitutional monarchies, generally under the rubric of the “British Commonwealth of Nations.” Since then, however, almost all have struggled to maintain a democratic form of government, with a variety of coups, strongmen (or women), authoritarian regimes, and corruption.

It’s not just the British; the same could be said of the countries from the other formal European empires which ended in the 20C: Austro-Hungarian, German, Ottoman, French, Dutch, Belgian, American, Russian (in no particular order).

Even beyond a comparative framework, this is not a surprise. As I have noted previously, democracy is hard; and even those places which claim the strongest traditions and practices have faced some serious challenges of late. Most such places took many generations to construct a society and an economy that enabled democratic political institutions to emerge.

The British, it seems, understood this. As late as the 1940s, the general sense was that it would take even the most developed African and Caribbean countries a couple of generations to be “ready” for independence. (Understanding and evaluating why the imperial masters had not done much about this situation is a separate point.) A variety of factors intervened, including local demands (some violent), pressures from the two superpowers (both the US and the USSR were at least to some degree opposed to other countries’ empires), and British moral and economic exhaustion. Things moved much faster than projected, and by the 1970s the unraveling of the Empire had become mundane.

A lot of the pressure for independence came from local communities and leaders who were reacting to British condescension/paternalism/oppression. Whether their societies were “ready” to be independent from a British perspective and by the standards of the West generally, wasn’t really the issue. Indeed, any claims to “objectivity” on the part of imperial bureaucrats had to be dismissed, at least for domestic political reasons and (usually) personal ambitions of the new leadership. Moreover, the sorry state of European culture in the aftermath of the slaughter of WWI, Nazi brutality, and the treatment of colonial peoples generally undermined the moral foundations for any claim that white Euros had a better sense of what it took to be a proper modern state and society.

On top of that was some appreciation that, regardless of their value as models in the abstract, the European states and their international system were a product of their particular geography and history, which was not at all an obvious basis for comparison with the African and Asian cultures which had seen little engagement with modernity until the 19C. This manifested in efforts to articulate a “third way” (between Western liberal democracy and Soviet state-centered systems). But whether confined to theoretical explorations or expressed in the practice of government and state organization, nothing seemed to take hold with any solidity in terms of a political culture that could support sustained economic development. There are lots of reasons for this dead-end, but I’m not here to point fingers.

Indeed, I suspect that the historical contingency of western liberal democracy/capitalism and its outcome made the emergence of a democratic culture (however “liberal” or “socialist”) in these former colonies pretty much a non-starter. Regardless of the viability or value of this particular model, it does seem that few if any of these new “countries” were coherent enough to establish a stable and beneficial political culture (which I take to be the baseline for any viable state).

The historical focus of my research project is on how the Euro imperialists thought—at the time and from their own perspective—they could go about promoting democratic norms and forms as these colonies barreled towards and into independence on the world stage. In particular, I’m trying to understand how the Brits got Ghana, Malaysia, Sudan, and other places ready to be international actors—with foreign policies, ambassadors, and a sense of the world.

There seems to be a lot of historical analysis of the “constitutional” issue at the end of empire; i.e. how was power transferred and what domestic political structures could be built to “receive” independence and govern the new country. I suspect that there are a bunch of studies of how the Kenyans took over their railroads, or the Nigerians built an army, or Jamaicans managed trade. I’m looking in another corner, but not just at the perspective of bureaucracy, administration, and protocol. I’m also interested in whether and how the British—whether through their Colonial Office, the emerging Commonwealth Office, or the Foreign Office—sought to help these folks see the world.

After all, Gambians had been involved in running ports, and folks in Botswana had been dealing with agricultural issues for a while (even if under imperial supervision). But, the conduct of foreign policy had been always reserved to the British and—unlike a host of “ordinary” domestic issues—this had been handled from London on an Empire/Commonwealth-wide basis. There was scant room for (or interest in) even the most sophisticated colonials to participate in the UN or be part of an embassy staff.

And yet, sometimes with a little engagement with international organizations or the emerging “non-aligned” movement of countries dancing between the US and USSR, sometimes with seemingly little warning, a host of countries were expected to deal with the broad and dynamic scope of the world.

The condescension from London notwithstanding, it was a tall order. Likely made only marginally easier by tidbits of training once it became clear that independence was actually going to happen.

So, I’m going to dig into this set of questions and circumstances and see if I can do what Ranke said was the historian’s job: find out what actually happened.


Human

5/19/2023

2 Comments

 
It’s getting harder to be human. Whether you count “humanness” from when we elbowed out the Neanderthals and Denisovans (35–40 millennia ago) or, more recently, from Socrates, Shakespeare, or the philosophes of the Enlightenment, we’ve had a pretty good run. Perhaps, as I have argued vis-à-vis capitalism and the environment, too good a run. But perhaps we’re at the threshold of something new.

It’s not so clear what this newness will consist of, much less whether it’s “better” than the humanity we’ve been doing for a while now. Indeed, it’s pretty much impossible to evaluate, even if we had a clear picture of the future. Its strangeness makes our mentalité obsolete, and our resentment of this newness skews our ability to understand, much less “objectively” assess, what life might be like, or to take a stance on whether the new is even “human.”

There are many possible perspectives on what it is to be human: physical appearance, sexual reproducibility, social connection, legal status, percentage of standard DNA, some combination of intelligence and language, the presence of a “soul,” and a gaggle of attributes/abilities we might look for, such as curiosity, tool-use, poetry, and an affinity for ice cream.

Virtually every one of these aspects can now be accomplished by some (human) creation/invention; so “human” might better be defined as some collection of multiple features, not just a single one. But what to include on the list? Is there one aspect that is really critical?

Bio-engineering, which encompasses in vitro fertilization, cloning, CRISPR gene-splicing techniques, sophisticated prosthetics, and organ replacement pretty much leaves only the brain as a part or function of the human body that cannot foreseeably be constructed/replicated/replaced (so far!).

Chatbots can write creditable poetry, music, and literature; comparable devices can paint and sculpt. If these creations cause “humans” to respond emotionally and intellectually, what does it matter that they were midwifed out of a computer using a rich set of inputs from prior “human” experience? What else is “art”? Wouldn’t a rose planted and tended by a robo-gardener smell as sweet?

The challenge is not confined to pseudo-Shakespearean sonnets. Big data-driven replication of everything we do is foreseeable. From coding to design, to teaching, to legislating, barista-ing, and taxi-driving, homo faber (man the maker) is, sooner or later, set to be superseded.

In 1993, Vernor Vinge, mathematician and SciFi writer, wrote an essay called “The Coming Technological Singularity.” He said: “Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.” [Hmmm… thirty years: 1993–2023.] The broad concept was picked up by Ray Kurzweil and more broadly popularized in his 2005 book, The Singularity Is Near, in which he lashed together trends in computing, bioengineering, nanotechnology, and robotics/AI to forecast the end of the biological era of humanity.

If we are outdone in terms of world-changing techne, if we are losing the Darwinian competition for reproducibility/survivability, what’s left? And is this next stage “bad”? Creation for self-satisfaction seems solipsistic. Similarly narrow are those “utopias”: intellectual idylls and environmental paradises that seem more defined and valued by projecting what we (as a species) have thought would be great for thousands of years than by any “objective” standard of idealization and progress. Indeed, one might ask (with Gödel) whether we are even capable of defining what “human” is, since we are, by definition, in the middle of it.

A hundred years before Vinge, in The Time Machine (1895), H.G. Wells imagined humans diverging into two species: the dark, mechanistic, and oppressed Morlocks and the ethereal and effete Eloi. The former seem useless and the latter seem pointless. So, Wells could see that fin-de-siècle humans couldn’t make sense of what such a future might hold, even if he could see that it would be radically different. I’m not sure we’ve really advanced the ball much since then. Our dystopias depict the mass of humanity teetering on the brink of de-evolution, while some (usually small) percentage live a glamorous life; a combination of lotus-eaters and videogamers/virtual-realityists.

If so, we had better figure out something else; some other angle on “human” existence. Is it enlightenment (somewhere between Kantian and Buddhist) at either a personal or societal level? Or, perhaps, our collective job is just to keep the species going until we can figure out something better along the way? After all, as I noted last week, enlightenment is a “process” and there’s no reason to think we have cracked the code quite yet. Who knows what we might discover/figure out/invent. Indeed, to take a more optimistic slant, if we’ve come this far in the last 100,000 years, we might get to some real answers in the next several billion years before the Universe shuts down. It might take us a while, but it could keep us busy in the meantime.


Growing Up

5/12/2023

0 Comments

 
I’m a big fan of enlightenment; both the historical era/phenomenon and in terms of personal development (from a certain angle, of course, they’re the same). Historians generally refer to the (French/European) Enlightenment to encompass a robust list of thinkers who flourished and influenced European culture in the 18C. This list usually features Voltaire, Rousseau, Montesquieu, Diderot, and, among non-Frenchmen: Hume, Adam Smith, Benjamin Franklin, and Immanuel Kant.

In the middle of all this flourishing of ideas, Kant responded to an inquiry: “What is Enlightenment?” with a brief essay in 1784. He characterized enlightenment as “man’s emergence from his self-incurred immaturity,” adding “Immaturity is the inability to use one’s own understanding without the guidance of another. This immaturity is self-incurred if its cause is not lack of understanding, but lack of resolution and courage to use it without the guidance of another.” He noted that “Laziness and cowardice are the reasons why so many…gladly remain immature for life…. It is so convenient to be immature.”

A brief scan around the world 240 years later shows pretty clearly that enlightenment, using Kant’s description, is very much a process and not an event (certainly not a singular historical event). There are those who would argue that, technology and stacks of scientific “knowledge” notwithstanding, we are only marginally more “enlightened” than were the denizens of the 18C and that “progress” (the self-proclaimed goal of many thinkers of that era) remains pretty thin on the ground (at least of the moral variety). Kant didn’t know much about technological development (pre-industrial revolution, and all); but he thought that, with ‘enlightening’ leadership and the discarding of blind adherence to religiously-constrained understandings of the world, we humans would move forward.

Kant’s use of the concept of “maturity” shows that he understood the parallel between individual human development and the path of the species. Eighty years after Kant, the English philosopher Herbert Spencer advanced the idea that the education of the individual echoed the evolution of humanity towards modern civilization. I don’t know if this idea works with what we understand these days of psychological development (or of what we understand of the “civilizing process” for that matter). Still, I think there’s something significant here.

Beyond the accumulation of data/memories and the laying down of thought processes, individual human maturity does seem to imply some changes in attitude towards the world. There is the accommodation of the presence of others in the world, including an extensive line of thought seeing humans as social animals, often with an ability to embody morality and altruism. I think there is also a notable tendency for people, as they “grow up” to move past instant gratification and think in the longer term [N.B. I am definitely not making any blanket statements here that either individuals or the species always do so.]

I suspect that it has something to do with the capacity to imagine the future. As we move through and past the constant change of adolescence and approach the (relative) stability of adulthood, we can more easily see ourselves as adults (partners/parents/etc.). Our own parents shift from being part idealized role models and/or semi-adversarial bosses to people whose roles and behaviors we can increasingly (if we’re lucky) understand. Being socialized into increasingly institutionalized settings reinforces the ability to contextualize ourselves in larger frameworks. At a practical level, this shows up in planning (college, budgets, careers, relationships).

So much for Steve’s theory of individual development. I doubt Kant had this in mind when he wrote about human maturity, but I think the point is the same. The question he might raise is whether this broadening of perspective is, at the species level, the meaning of enlightenment.

Our societies are increasingly filled with programs for long-term financial support and health care. We invest not only in businesses but in infrastructure. We imagine the future in detail, and we scenarioize, plan, and speculate about it. Our demographics, particularly our increased longevity, make us each personally more aware of what the world will be like decades hence (our selfish gene has to be more future-oriented than it used to be).

Another way of asking the question of where we are as a species, enlightenment-wise, is to take a look at how much we feel responsible for the future. There are plenty of people and groups of people/countries who still fulfill the role that Kant described: laziness and cowardice make it easy to be immature. Altruism is especially hard when you’re not already top dog (either in capitalistic or geopolitical terms). Competition (again, of either variety) provides a compelling distraction from self-care and long-term perspectives.

Sometimes, people don’t grow up until they hit a wall or get a great shock. Kant was optimistic about enlightenment continuing, even incrementally. However, he (and the rest of the philosophes listed above) didn’t have much sense of the mind-boggling effectiveness of the technological branch that enlightenment from religion and the ancien régime enabled. They likely thought that there would be plenty of time to continue our incremental progress and that we would, in due course “grow up.” We’ll see….


Half a Baby in the Hand...

5/6/2023

2 Comments

 
In the Bible (1 Kings 3:16-28), Solomon famously challenged two women claiming to be the mother of an infant to cut the baby in half as an equitable split if they could not agree on who the actual mother was. The real mother demurred, preferring her son to live, even if not with her.

Such clarity and integrity (by both Solomon and the mother) is wonderful for such teaching stories, but finds rare application in the real world of modern geopolitical diplomacy. What is more common is hard-fought negotiation, both between the governments involved, each wrapped up in recursive theories of power and appearance, and within each state, as various bureaucratic and political groups demand priority for their particular concerns. In the last 150 years or so, the lawyers have gotten heavily involved, leading to a lot more verbiage. One example of strategic clarity can be found on a napkin on which Winston Churchill scrawled out the allocation of influence between the British and the Russians over the various countries of central Europe in the aftermath of WWII. Stalin famously looked over Churchill’s proposal and put a big check mark on it and, as they say, “that was that.”

This all happened at a bilateral summit in late 1944 in Moscow. A year later, after V-E Day, it was implemented and, a year after that, Churchill went on to decry the impact of what he was the first to call the “Iron Curtain” which had “descended” over Europe. (Unlike the mother standing before Solomon, he was quick to disown his parentage.)

I mention all this because, regardless of the outcome of the current Ukrainian offensive, it is unlikely that Ukraine will retake all the territory which Russia has seized since 2014 (including Crimea). And yet, for many reasons, the war must come to an end.

Of course, everyone involved—the fighters, their allies, and the commentariat—is busy taking a position on what the shape of the peace should be; almost all of which is more a reflection of each one’s desired public perception at this point in time or an expression of hopes and fears, rather than an indicator of the ultimate shape of the peace agreement. As is seemingly inevitable in these circumstances, all sorts of historical precedents are being trotted out (nationalist claims, geopolitical history, the ever-popular appeasement trope, noble claims of the priority of freedom, etc.), almost none of which tells us anything about how to handle this situation.

While I am a supporter of Ukraine’s efforts, I am sufficiently far from the front so as not to claim the right to an opinion on what President Zelenskyy should do. Regardless of his decision, the second-guessing will go on for some years.

Of all the scenarios being bandied about, it’s hard to envision a wholesale collapse of the Putin regime, followed by an abject apology and a fully compensatory peace treaty. The last time the Russians sued for peace was in 1917 when, in the aftermath of the Bolshevik Revolution, Lenin agreed to give up 34% of the population and thousands of square miles to Germany, Austria-Hungary, and Turkey and their allies. It didn’t end well, and it laid down a precedent the echoes of which would be hard for any 21C Russian government to avoid. It’s a nice dream, nonetheless.

Much more likely is some armistice and the acknowledgment by Ukraine that Russia will control some of the territory heretofore considered Ukrainian. I say “heretofore” advisedly, since the borders in this part of the world have been pretty fluid over the past several hundred years and their relative stability from 1921 (after the Russian Civil War) until recently has been unusual and deceptive. Indeed, Crimea was part of Russia until 1954, when it was transferred to the Ukrainian SSR.

All of which is to say that facts on the ground in the 2020s will determine where borders are drawn; not history, not ideals, not “principles.” It will be interesting to see if Zelenskyy (or Putin for that matter) has the moral clarity of the mother in the Biblical story. Or, put another way, the geopolitical clarity of Churchill, who wrote off much of central Europe in 1944, understanding that there was no means by which—principles and sacrifice notwithstanding—Britain or the US was in a position to do anything about Russian dominance of that region; and he salvaged what he could.

“Realpolitik” has a bad reputation for disregarding morality, justice, affinity, and principle. But teeth-gnashing and hand-wringing don’t win wars (otherwise MSNBC and Fox would conquer the world). I suspect that Russia will, in fact, collapse, but later rather than sooner; a victim, as Lenin would have said, of its internal contradictions. There will be further ebb and flow of power and control over this region (as most others) over the longer term. Sweden and Lithuania were once great powers in this part of the world. Russia has had an extraordinary run of 300 years at the upper levels of the global pecking order. Few these days pay attention to the decline of Sweden or Lithuania. Chinese historians of the 22C will likely pay similar levels of attention to Russia of the early 21C.


The Arc of the Universe

4/28/2023

0 Comments

 
“The arc of the moral universe… bends towards justice.”

So said the abolitionist Theodore Parker in the run-up to the Civil War; the phrase was thereafter adapted by Martin Luther King and Barack Obama, among others, to provide encouragement, patience, and inspiration in the US’ ongoing struggle with racism. They are fine words and a noble sentiment, but there is little evidence that the universe cares about (or even knows about) the idea of justice or morality.

Parker and King were deeply steeped in Christian thought and scripture, but there is nothing of God, Jesus, or the Gospels in this phrasing. Rather it is a secularization of the traditional Christian theology in which long-suffering humans are assured that their pains and sacrifices will be redeemed in the end. When Jesus reappears for the Second Coming, surely justice and morality will guide Him in His judgment of men.

When, just a few years earlier, Marx characterized religion as the “opiate of the masses,” he was critiquing this effort (not just of Christianity) to assuage the pain of everyday lives in the world of the 19C. From his perspective, it was a distraction that kept the downtrodden working people of the world from seeing their true condition and rising up in revolution (which Marx’s view of history saw as inevitable). Marx went on to propose another utopian scheme (commonly called communism) which proved (relatively quickly (~140 years)) to be more challenging to implement in the real world than he had hoped. As practiced by communist regimes in the 20C and 21C, Marxism was deployed as an opiate itself; a justification for sacrifice for the advance of socialism. We’re still waiting.

Another variant on this theme can be seen in the idea of a continuing and fundamental “progress” which came to maturity in the aftermath of the Enlightenment (Condorcet, 1795) and which animated much of European “civilization” across the “long 19C” until it fell off the cliff in WWI. While the aim of this “progress” was not some fixed event, as the Christian version would have it, directionally they were similar. However, the universe knows as little of “progress” as it does of justice, as both WWI and recent bumps in the world have shown.

Indeed, my biggest problem with the whole concept is not the tenuousness of “progress” or the ambiguity of “justice,” but rather placing the “universe” (or its more terrestrial manifestation: Western Civilization) as the motive force for either. Placing humans as the recipients and beneficiaries of this cosmological condition not only does us a disservice, but contributes to a passivity that is more than sufficiently prevalent in our species.

I have a debate with a friend about whether the human condition is improvable (i.e., whether “human nature” is immutable). He doesn’t think so. I, on the other hand, see (or desperately want to see) slight improvement over time. I just think that any such “progress” is not so much a matter of a few decades, but rather millennia. Perhaps I’m deceiving myself; let me know when we get to 3023, we may have enough of a track record by then to tell.

Those who urge a long-term confidence in progress/justice do well to urge us not to get too caught up in the bumps and slides of everyday life (sometimes, it’s one step forward and two back); counseling (at least implicitly) that we can overcome set-backs and regain our course. The other problem with relying on a long-term perspective (i.e., cosmological time-frames) is that the emergence of justice may run into the wall of technological catastrophe (environmental, nuclear, or otherwise). Stated differently, we may get bounced too far off course to regain the curve of our “arc” and recover (and it will be up to the cockroaches, as the successors to humanity, to devise their own version of justice).

There’s no reason not to wish/hope/believe in the inspiration of Parker, King, and Obama, but there’s also no reason to sit around and wait for the universe to take care of things on its own; sometimes, too, we need to actively step up and ensure we don’t go over a cliff in the meantime, while keeping our eyes on the prize.



Let's Talk Turkey

4/21/2023

Turkey has been much in the news of late; most recently because of the horrendous damage and death from the earthquake in a country with limited public resources and dubious enforcement of preventative building codes. There’s also been some coverage of the Presidential election next month in which Erdogan will seek (yet) another term; falling into the classic pattern of self-proclaimed indispensable leaders who build overlarge palaces and drive their countries into the ground over time. Erdogan is notable for his peculiar approach to inflation (currently over 50%/year): cutting interest rates and stimulating the economy even further… he’s a real head-scratcher.

But even before that, he garnered attention for the elegant dance he was doing on the international stage, juggling between Iran, Russia, and the West with regard to Israel, Syria, and Ukraine. He actually seemed to be pulling off a delicate balance which is pretty impressive given the players and the multidynamic situations Turkey faces.

Of course, Turkey/the Ottoman Empire has had long experience in being at the crux of regional imbroglios, with feet in many camps. For hundreds of years, while still ruled by the Byzantine Empire left over from the glory days of Rome, it was the juncture between a still-slow/sleepy/medieval Europe and the “mysterious Orient.”  (Oh, the empire was the site of the Crusades, too! “It’s always good to beat up on a Saracen!”) When the Ottomans swept in during the 15C, their new blocking position put pressure on Spain and Portugal to seek new trading connections to Cathay and Cipangu (Japan), thus launching the globalization of European empires which culminated in the 19C.  Meanwhile, the Ottomans stretched their own empire directly into Europe, famously laying siege to Vienna in both the 16C and 17C and occupying diminishing chunks of SE Europe into the 20C.

When added to their formidable presence in the Mediterranean during this time, “the Turk” (as the Sultan was called by (resentful/envious/fearful and generally ignorant) Europeans) was a major player in regional geopolitics, butting heads with Russia, Poland, France, Spain, various Habsburgs, and a gaggle of others. Indeed, no small part of the coherence of “Christian Europe” emerged out of the differentiation/hatred/fear of the Muslims, of whom the Ottoman Empire was the largest manifestation. By the 19C, the Ottoman Empire was overstretched from the Atlantic to the Persian Gulf. Still, it had to be taken seriously geopolitically and, as “Europe” was taking shape diplomatically in the aftermath of Napoleon, the Ottomans were included in the club of major (European) states even if they were considerably different from a cultural/ethnic perspective. This tension was exacerbated by the economic/technological progress of the industrial revolution, which accelerated the development of global power by Western European countries and left the Ottomans—with little by way of resources or cultural tools—trailing (…badly). In addition, by this time, the Ottoman Empire, like those of the Russians and the Austro-Hungarians, faced the centrifugal forces of nationalism. It was not for nothing that by the middle of the 19C, Turkey was often referred to as the “sick man” of Europe. By late in the century, the principal (European) powers were engaged in long-term jockeying over picking off pieces of the declining behemoth.

Despite efforts at reform, Turkey was unable to keep up. This led (as in Russia) to a combination of resentment vis-à-vis the Western Powers and an urgent desire to emulate them. Even the “Young Turk” revolt of 1908, which sought to modernize the (now clearly diminished) country, couldn’t figure out how to mobilize nationalistic energy across a disparate empire. That failure, on top of Western interventions, contributed to the bloody efforts to rid the country of Armenians and Greeks in an attempt at some sort of religious and ethnic coherence (which echoes today in its relationship with the Kurds).

A bad choice in WWI (siding with Germany and Austria-Hungary against Russia and Britain and France) didn’t help. Still, by the post-WWII era, geopolitics trumped religious differences and a cultural/economic mismatch. In order to flank the Soviet Union, Turkey was brought into NATO and, eventually, even had the prospect of joining the EU dangled in front of it (never to be fulfilled).

So, now, in the 21C, it remains caught betwixt and between. A bit modern and a bit traditional/underdeveloped. A bit secular and a bit Muslim. A big regional country, but caught between Russia and the West, between the West and Iran (and other Muslim countries). Too big to ignore (84M people, slightly bigger than Germany), but not big enough to throw its weight around more than a couple of hundred kilometers from its borders.

There is no easy path here, even if some stability (post-Erdogan) emerges. And, as much as I dislike his authoritarianism, his disrespect for human rights, and his terrible grasp of economics, Erdogan has managed to keep Turkey in the thick of the global mix. I hope he loses, but his successor will have serious challenges in repairing the domestic economy and society and in maintaining the country’s balance on the world stage.

I’ve been to Turkey three times over the past forty years. I’d be happy to go again. The people are immensely friendly, the food is delicious, the cultures are rich, the history is deep (9,000-year-old cities, Greece, Rome, Byzantium…).  It’s a great place to see the diversity of paths in the world all in one place. The Hagia Sophia was a cathedral in Constantinople, a grand mosque under the Ottomans, a museum in modern Istanbul (and, apparently, just reconverted into a mosque). It was the largest cathedral in the world for nearly a thousand years (our Pentagon has only been around for 75). The Turks will get past the earthquake, they will get past Erdogan, and they will be richly redolent for some time to come.



The Fallacies of Instant History

4/14/2023

There’s a difference between current events/journalism and history. When I teach my course in recent European history, I stop about the year 2000 (with a coda on Brexit). The last twenty-ish years are too recent for historians to get much perspective. As I have previously quoted E.H. Carr: “History is a dialogue between the Past and the Present,” and until there is a decent-enough interval of time, the past and the present are too close to each other to have much to say. So much of what splashes across our daily feeds/consciousness becomes—in due course—ephemera. To be sure, there is history to be mined from ephemera, but it requires a delicate touch to extract those items that still resonate with meaning after twenty or a hundred or 250 years; and the historian who does so must start with a particular frame of reference or they’re just picking up flotsam on the beach.

So, it was with some skepticism that I saw a piece in the NYT recently under the headline “100 Years from Now, This is What We’ll Say Got Us Through the Pandemic.”

I needn’t have worried that the authors made any serious attempt at history (the title notwithstanding). The article comprised short pieces from 17 “cultural critics” on “Pop Culture Moments that Define the Covid Era.” As if.

As if the “Covid Era” is how people of the 22C will see the last four years. It’s certainly possible, of course, but given the geopolitical crisis in Ukraine, the threat of war in Taiwan, the precariousness of global democracy, and the nigh unavoidable environmental cataclysm, it’s not clear that Covid will make the cut as the defining phenomenon of the 2020s.

As if eras are, in any event, definable by cultural “moments.” Even in our media-crazed times, eras tend to be defined by geopolitics and economics. In the list of “big things” in 1939, “Gone with the Wind” and “The Wizard of Oz,” the first NCAA Men’s Basketball Tournament, the birth of “Batman,” and the opening of LaGuardia Airport don’t make much of a dent in the start of the European phase of WWII. Ditto for the 1929 births of “Popeye” and “Tintin,” the opening of MoMA in New York, and the invention of the game of “Bingo,” all similarly second-tier to the start of the Great Depression.

As if “pop culture” tells us much of lasting import. I will readily admit to not being tuned into “pop” culture (I was wholly (and blissfully) unaware of most of the NYT list of 17 “must see/do” items).  I am particularly ready to take a stand, nonetheless, that any list that includes any Kardashian or the eighth version of some video game is not worth my time to consider. More fundamentally, however, the nature of “pop” culture is to be popular and (almost by definition) ephemeral. The article’s claim of significance is belied by the mental concentration that would be required for us to recall even half a dozen significant “pop” cultural icons of 2003. If it doesn’t mean much after twenty years, then a century seems like quite a stretch.

As if a list of seventeen items can claim to broadly represent the state of US culture during this time. I am aware of the standing of Taylor Swift, for example. But as with her predecessors, Celine Dion, Cher, Barbra Streisand, et al., her appeal only goes so far. This list includes but one book (versus seven streaming video shows), nothing from Hispanic culture, barely a nod to the heartland or to spontaneous events (other than Will Smith’s slap of Chris Rock at the Oscars); nothing about serious culture, nothing about anybody outside the US of A. It’s a highly selective, metro/media-skewed sample.

As if these are the things that “got us through the Pandemic.” There is certainly inspiration and solace and insight to be had from a society’s culture in times of death and trial. For ordinary folks, the fear, disruption, and pain were countered by perseverance, creativity, and mutual support.  Where is any mention of the outpouring of support for front-line health care workers by banging pots at 7pm? Where is the awe at the speed with which vaccines appeared?  Where are the manifestations of heroism and hunkering-down? Unfortunately, the list trivializes the trauma, the courage, the science, and the humanity of the Covid experience.

As if there’s any sense of history. Per the Carr quote above, we have virtually no idea what people a century hence will think about us or how they will characterize our age. As an exercise in constructing a virtual “time-capsule,” this is much more about self-reflection; i.e., it tells us much more about how this particular group of “cultural critics” saw their peculiar slice of the world in 2023 than it will tell historians about what will seem historically significant. Indeed, I suspect I would be quite surprised, even as a 22C cultural historian of the US in the 21C, if more than a couple of the seventeen items still stood out.

Was it the stuff on this list that “got us through the Pandemic?” Rather to the contrary, one could argue that these items were more noise and distraction than aid and comfort. Given our sorry national record on “listening to the science” and the unwillingness of millions to make voluntary sacrifices in the pursuit of “freedom” and “normalcy,” our “getting through” isn’t much to celebrate.

Depending on how you count it, almost 1.5M folks in the US died because of the pandemic, part of 20M+ globally. They, their families, and the many millions more who suffered serious repercussions—of health or livelihood—didn’t “get through.” Historians of the 22C will assess whether that’s more meaningful than a few yuks on some Netflix dating show.


Knowledge and Liberty

4/7/2023

I’ve been reading a bunch of Locke and Rousseau lately. My course on the history of revolution emphasizes their role as intellectual godfathers of the American and French Revolutions, respectively. Both wrestle with the question of why people get together in organized societies, and both make use of an idea called “the state of nature” (although they use the concept a bit differently). There is a core idea, however: when people agree to live in society, they give up some aspect of their unbridled freedom to do as they wish; they (implicitly) acknowledge that they will forego doing things which harm another person (that’s the theory, in any case). In other words, their liberty goes only so far as it doesn’t infringe on the liberty (life, liberty, property) of another member of society.

From one perspective, this is not all that far from the basic moral principle (with plenty of Judeo-Christian expressions): “Do unto others as you would have them do unto you.” Kant said much the same thing as a premise of his moral philosophy (which he called the “categorical imperative”).

Now it’s no coincidence that these ideas were being bandied about in the 17/18C at the same time as the so-called “scientific revolution” was gathering speed. The rational analysis of the way the world and nature work was closely connected to the rational analysis of how humans and societies work.

Here’s the problem: the more we know about how the world works, the more we understand the ramifications of human actions/omissions, the more we see how those actions can infringe on the lives/liberties/property of others, and, therefore, the more we are obliged to constrain our actions (i.e., reduce our liberty).

Thus, the discovery of knowledge (e.g., about pollution, or psychological distress, or market pricing mechanisms) has to lead to a loss of liberty. The addition of laws about sulfur dioxide emissions, hate crimes, or price-fixing is a tangible manifestation of the increase of knowledge, all of which goes a long way towards explaining why a mere ten commandments are wholly insufficient for a modern society.

This tension is not novel; human societies have been developing more and more constraints for millennia. Even ten commandments were not enough for Hammurabi, whose code (from about 3,700 years ago) ran to about 300 laws. Debates about freedom burgeoned in the 17/18C, in which state intrusion into individual liberty was consistently decried; sometimes in opposition to despotic/absolutist monarchical power, sometimes against more modern modes of the “State” (whether democratic or dictatorial). The most frequent concern was with taxation, though constraints on trade were also a popular subject of attack. Typically, those with property are better positioned in society to participate in the political process, so much is heard of infringements on “private” economic power. These groups have been (as Marx said) in control of the “means of production,” so they are also concerned with the state’s infringement of their liberty to operate their businesses. Thus, the consistent string of attacks on the “regulatory state.”

So, where does the knowledge part fit in? First, as the basis of an expanding set of “regulatory state” rules limiting the actions of those with power in society in order to protect the individual liberties of those affected by that unbridled power. Labor, antitrust, and environmental regulations are important parts of this category. Other incumbents (e.g., professionals from lawyers to hairdressers) also push for limits on entry (i.e., on competition), nominally to protect innocent consumers. All of this arises because we have plausible theories of cause and effect (e.g., OSHA-like rules to prevent black lung disease, prevention of monopolies which raise prices, and mandatory composting ordinances to reduce the stress on nature from landfills). Yes, it’s true that the Code of Federal Regulations now runs almost 200,000 pages (about a 9-fold increase since 1960). The question I’m raising here is to what degree this increase is a function of changing levels of understanding of how private actions affect others (as compared with changing political outlooks (e.g., “liberalism’s” pro-active approach to addressing societal challenges)).

The second way in which there is an inverse correlation between knowledge and liberty approaches the same phenomenon from another angle. The premise of modern liberalism is that the state needs to actively support the abilities of all individuals to fulfill their goals (consistent with the usual caveats). So, the same increase in understanding of cause and effect—in terms of both the world/nature and humans/society—means that we now understand that there is more for the state to do to foster that aspect of liberty (“positive liberty,” as Isaiah Berlin called it). This means not only more protections (the flip side of the regulatory state noted above), but also increased intervention in the basic economic arrangements of society, i.e., redistributive economics and progressive taxation.

None of this analytic framework which I have just laid out tells us very much about exactly where lines should be drawn in particular situations. There are still justifiable concerns about overreaching bureaucracies and stifling of individual initiative. There are still plenty of reasons to be concerned about economic inequality and the inability of the market (or should I say the unwillingness of those with economic power to alter fundamental market parameters) to recognize and incorporate into its pricing mechanisms the real effects of many human actions. There is still a need for ethical debate and the outcome, as I say, in any particular case is not so clear.

Nonetheless, as a historian, I can’t help but wonder about how the continuing increase of knowledge (of causes and effects) will alter our mix of liberties. We can’t “unknow” the implications of our actions. We can, however, decide that some degree of these now-visible harms is an acceptable trade-off for the less visible harms of the loss of liberties. Such an exercise in subtle political engineering might be beyond the capabilities of our current political culture. So we end up with less effective policies, less liberty, and a gnawing sense of unease that we’re off track.


March Madness

3/31/2023

There are many excesses of capitalism. Indeed, one of the central problems with the whole concept (i.e., a money-based epistemology) is that there are no inherent limits or balancing values. Unless some cultural/moral value can be quantified and incorporated into the market, it gets ignored. Education, especially at the college level, has already been well caught up in the whirlwind of career and earnings. Humanity (both the group and the value) and the Humanities (plural of the latter) seem to be fighting a rear-guard action.

Another aspect of the capitalist perversion of education can be found in college sports. The friendly and inherently meaningless competitions of youth are increasingly packaged and quasi-professionalized. The values that would justify the otherwise nonsensical pursuit of various sized and shaped balls—camaraderie, cooperation, perseverance, self-discipline, and “sportspersonship”—have taken a back-row seat (on a bus!). To be sure, colleges still mouth the mantras of noble aspiration, but they put their money/time/prestige into luxury-box-equipped arenas, coaching salaries and training palaces. Chatbot tells me that the average Division 1 college basketball head coach is paid $2.7M and the average college professor is paid $80,000 (a ratio of 34:1). Something is clearly out of whack.

I was recently discussing the surprising outcomes (so far) in the NCAA Men’s Basketball Tournament with a friend of mine, who is a fully registered fan of many sports (particularly baseball and basketball). He explained that since it is now legit (as of 2021) for college athletes to be paid for endorsements, they are incentivized to play for teams and in cities where they can best leverage their “brand” and bring home pretty substantial (6-figure) bucks. The NCAA has even made it easier for them to transfer from school to school, thus boosting the game of the U of Miami. The whole thing is a recipe for corruption. (I offer no explanation for Princeton’s Cinderella act or other upsets during the tournament.)

Now, I do not begrudge young athletes “cashing in” on their abilities. The line between “professional” and “amateur” athletes is hopelessly blurred. Rather, my concern is with the remnants of integrity to be found in the halls of academe. Under the rubric of “competition,” such student-athletes are scouted, recruited, subsidized, tutored, and graduated on the backs of a wave of resources that dwarfs those available to the average student.

And for what? The prestige of being an also-ran in a tournament of the top 68 college teams in the country (i.e., one of the 67 teams that didn’t win; that’s over 98% of them). A tournament whose results, particularly for these “also-rans,” fall after a year or two into the nether reaches of Wikipedia.

In this critique, I understand and dismiss the arguments that “all the extra money comes from outside fans/donors” and that colleges “profit” from college athletics financially (as well as in terms of prestige). These rationales are prime evidence of submission to capitalist mentalités from institutions that are increasingly struggling to deliver their prime objective: productive and responsible adults/citizens. These colleges all have well-honed machines that take all manner of government research grants and shave off 5-10-15% as “overhead,” funds which go into the general university budgets. Why not a 25% slice of all athletic donations to support actual education?

Why not a cap on college coaches’ salaries at three times the average salary of full professors or the average of the top five academic administrators in the university?

Why not a limit on recruiting expenditures? There are plenty of paid and alumni talent scouts out there. Why should some coach from Texas be scouting in California (or vice-versa)? The talent will rise; it can just rise locally.

Why not limit the number of athletic scholarships and put the resources to academically capable but needy students? Or, at least, have the scholarships available only to students from the home state of the institution? What would be lost to society if players played for the schools in their home states?

Such steps might have a marginal effect on the big athletic programs around the country, but their enactment would be a useful signal of purpose and values. This is especially true for public universities (who generally have the biggest programs and the smaller academic endowments).

More radically, universities could just drop their programs in the sports that have the biggest professional leagues (basketball and football). Right now, these (very) profitable businesses are getting all their talent developed for them at virtually no cost. Let’s have them set up “minor leagues” as baseball does. They could even keep the same uniforms and pay the universities rent on their stadiums. In fact, just to keep it simple, each university could sell its “franchise” lock/stock/barrel to the NFL/NBA. The operations could remain as they are, but wouldn’t be formally part of the University. Coaches and athletes would be paid market rates, but the fiction of “scholar-athletes” could be dispensed with.

In the end, it’s not clear to me why intramural sports or friendly club competitions between schools in a region couldn’t achieve the same level of benefit to the students in terms of camaraderie, cooperation, perseverance, self-discipline, and “sportspersonship” without all the empty hoopla.


    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but this is my contribution to my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
