Steve Harris

CongressBot

1/27/2023

Washington, January 26, 2023 – The early demise of Kevin McCarthy’s Speakership of the US House of Representatives has been the frequent topic of pundits since the results of last year’s election became clear. Few at the time, however, foresaw that the upset election of a young gay Republican from Nassau County, New York, would be the trip wire for McCarthy’s downfall. Fewer still had any inkling of his replacement; indeed, even among the technoscenti, few had even conceived of, much less heard of, the new Speaker three months ago.

So, it is an exceedingly rare (not to say bizarre) confluence of events that led today to the election of SpeakerGPT to preside over the House.

McCarthy’s downfall was triggered, as expected, by the rule he ruefully agreed to in early January which allowed any single Representative to make a motion to declare the Speakership vacant, thus requiring another vote (or perhaps parade of votes) to select a replacement. The motion was made by freshman Representative George Santos (R-Denial) when Santos learned that McCarthy was going to allow a vote on whether Santos would be expelled from the House for any number of false statements, likely election law violations, and a variety of expected fraud charges.

Santos made his motion while Representative Lauren Boebert (R-Heterotopia) was presiding over the House as Speaker Pro Tem, a duty which is regularly rotated among members of the majority. Boebert, upon hearing Santos’ motion, said: “Sure, what the H---, let’s do it.” Unfortunately for McCarthy, he and the rest of the House Republican leadership team were at NRA Headquarters in suburban Virginia for a briefing and couldn’t make it back to the Capitol in time for the roll-call vote. A sufficient number of Democrats were, however, present, and the Speakership was declared vacant by a vote of 212-192.

Former Speaker Nancy Pelosi (D-Semi-retirement), who had been minding the floor for House Democratic Leader Hakeem Jeffries (D-NY), then immediately moved that SpeakerGPT be elected to the post, and the motion was carried by the same tally.

Under the Constitution and the House Rules, the Speaker need not be a member of the House; indeed, there is no requirement that the Speaker be human, or even an American.

Pelosi, asked later why she chose to nominate SpeakerGPT, said: “I wanted to find someone who would sound intelligent and fair. I wanted to find a Speaker who would bring the same authenticity and human connectedness to the job as the average Republican so that they wouldn’t be too upset about the situation. Besides, it’s made in San Francisco; so how could it be bad!”

Sources in Pelosi’s office said that they had just heard that the artificial intelligence program called SpeakerGPT was being launched as a one-off variant of the publicly available AI called ChatGPT. OpenAI, the company behind both AI models, confirmed the report. ChatGPT, which had been publicly announced only in December, had set off heated controversies about the impact of artificial intelligence in schools and public forums across the country.

According to OpenAI, its new model, SpeakerGPT, is specially adapted to replicate the intelligence of the average member of Congress. An OpenAI engineer, who was only willing to speak without attribution since they weren’t authorized to represent the company, said: “It wasn’t too hard to make the adjustments. We dialed down the logic processor, randomized the intelligence processor, and eliminated the learning functionality. After that, it was simply a matter of spinning up the bloviating vocabulary ratio and we were set to go.”

Shortly after being sworn in, SpeakerGPT said: “As Speaker of the House, my ultimate responsibility is not to my party, my conference, or even our Congress. My responsibility — our responsibility — is to our country. Our nation is worth fighting for. Our rights are worth fighting for. Our dreams are worth fighting for. Our future is worth fighting for.”


SpeakerGPT then adjourned the House sine die (i.e. for the remainder of the term).

When asked whether it was referring to humans, Republicans, or Americans when it used the word “our” with regard to “country,” “nation,” “rights,” “dreams,” and “future,” SpeakerGPT smiled, but had no comment.

When asked for comment, Representative Santos, who started this entire chain of events, said: “I’m delighted. McCarthy was about to throw me under the bus. It’s another reason to be against busing.  

“I mean, really, Jake [Representative Jacob Auchincloss (D-Mass)] just read an AI-generated speech on the House floor yesterday. What’s the difference? Think of all the money and time we could save if we just had a bunch of AIs up here doing the legislating. They could be programmed by the voters of their district. It seems like it would be a lot more efficient than how we do it now. Besides, my own election shows that voters don’t really care if candidates make stuff up. In fact, I am planning to launch a whole series of SantosBots to run in districts all over the country. They will each run on my platform, but they will create their own résumés. After all, it worked for me.

“If we’re going to fabricate, we might as well be state-of-the-art. When I taught constitutional law to President Obama (back when he was at Harvard Law School), I told him it would come to this, and I’m not lying.”


Casablanca

1/20/2023

      “Here’s looking at you, kid.”

      “I’m shocked, SHOCKED, to find that there’s gambling going on here.”

      “Round up the usual suspects.”

     “I’m no good at being noble, but it doesn’t take much to see that the problems of three little people don’t amount to a hill of beans in this crazy world.”

You will have your own favorite quotes from Casablanca, but most folks “of a certain age” have some recognition of lines and scenes. It’s part of our cultural heritage, even if we weren’t born then (1942).

The same cannot be said of those born in the last thirty years.

It’s not just the techno-differences between generations; it’s broader cultural change, like the differences in popular music (pre- and post-Elvis/Beatles), sex, reading habits, work style (silk ties every day for “white collar” workers), and so many other things.

As one who didn’t have kids to keep me (relatively) current with popular culture, I find it a particular effort to comprehend and communicate. I do my best to connect with my students, but I know that (even professorial demeanor aside) I’m struggling to avoid being dismissed as so much “dad culture.” I have no hope of “hip-hop,” I struggle with social media (even FB), and don’t even get me started on “tats.”

Ah well, “We’ll always have Paris.”

When I first started teaching, I tried to tell a Marx joke (Groucho, not Karl); but the blank looks made clear that neither was comprehensible to my millennial students. (I do wonder which one will be more remembered in 200 years!). Now, I stick with much more topical puns (but that assumes that they’re on top of current affairs). It’s a tough gig.

I feel sad, both about my apparently decreasing relevance and because I enjoy (revel in) the cultural milieu in which I grew up. This all challenges my sense of “timeless” cultural values and what makes for a “classic.” Maybe it really is all ephemeral. After all, I wouldn’t have been caught dead paying attention to vaudeville in the 1970s, however much my grandparents thought it was the “cat’s pajamas.” I’m willing to acknowledge that my affinity for “disco” for a few years might not make that genre into a “classic,” but I will insist on the enduring brilliance of the original SNL cast/characters/skits.

One of the benefits of teaching history is that the entire enterprise is founded on the relevance of the past, no matter how distant. How is Putin like Kaiser Wilhelm II? Does the decline of Europe in the 20C offer any useful insights into America in the 21C? Students who walk into a history class know that they’re going to have to think in these kinds of terms, and a fair number accept this premise, which makes for a better class, allows me some leeway to draw the connections and, as E.H. Carr said, put the past “in conversation with the present.”

So, at least I have a chance at pulling out some old chestnuts. Mostly this takes the form of quotes from significant or perceptive figures from the past: Churchill, Napoleon, Jefferson, Gandhi, Voltaire, and Marx (both Groucho and Karl). Contextualizing these thinkers (essential in any serious historical discussion) ensures that I don’t get lost in nostalgia and idolizing. It also gives students a clue that just as the past must be taken on its own terms, so, too, must the present: their present. There are (significant) limits to connecting the past with the present. And we can’t pretend that our work/ideas/sensibilities are timeless; we can merely do the best we can in the circumstances as we understand them.

“Of all the gin joints in all the towns in the world, she walks into mine.”

Casablanca reminds us that contingency is ubiquitous. It is essential to parsing history. It is also useful to remember in the course of teaching. Each class is different; the presence of one student changes the spin of the class and requires a different approach to the material we’re working with. Just as Rick’s world turned upside down when Ilsa walked into the bar, so do I get to remember that all my lovely lesson plans can quickly go by the boards. As I learned when I did corporate strategy, “the only thing we know about a plan is that the reality will be different.” It was true for Madison at Philadelphia in 1787, for the Germans trying to knock out the French in 1914, and for my conception of how to talk about how Communism collapsed in the 1980s as part of my course on “Europe in the 20th Century.” Being aware of this precariousness tells us a lot about how to draw “meaning” from history. It also keeps teaching fun, because even the seventh time I give the lecture, it comes out differently and I have to pay attention.

I guess I could teach a whole history course based on Casablanca: war, nationalism, race relations, colonialism, corruption, idealism, historiography, love. Could I figure out how to make all of it “relevant”? How to get the students to connect to it and draw skills and insights that they could use in the rest of their (likely) less exotic lives? It might be fun (even if I could never get it past the Curriculum Committee).


Uprisings

1/13/2023

Since I’m teaching a course on Revolution next term, I’ve been paying more attention than usual to the ideas and patterns that lead to sudden, usually violent, and wholesale change in a society’s political structure. Social scientists spend a lot of time trying to model these patterns (complete with complex algorithms). Historians, on the other hand, insist that every society and situation is unique and that there usually isn’t enough data to tell us anything meaningful about the so-called patterns. (There’s a great illustration of the social science-y statistical approach to the complexity of life here; I find it highly inconclusive.)

Being of the latter persuasion, and also being mindful that both flavors of academics are looking in the rear-view mirror, I have been intrigued of late by watching the uprisings/protests/dissension going on—real-time—in three of the biggest authoritarian regimes in the world: China, Russia, and Iran. The causes of all three are radically different, as are the scope, extent of violence, and, indeed, the nature of the protests.

So, one question is whether there is anything to the coincidence of the three. There were waves of revolutions in Europe in 1830 and a bigger wave in 1848. Some find common causes in the revolutionary activities in Mexico (1911), China (1912), and Russia (1905, 1917). There’s clearly a close connection between the changes in post-Communist central Europe, the Baltics, and other former parts of the Soviet Union at the end of the 1980s. The “Arab Spring” of 2011 is another example. There’s some good evidence of the connections between them, but there is a whole raft of revolts, uprisings, etc. scattered across European and world history over the past 250 years which had little to do with events and developments in other countries. In the not-too-distant field of democracy studies, there’s a strong story about “waves” of democracy, the most recent of which occurred in the collapse of the Soviet empire in Central Europe and in Central Asia. “Waves” are possible, but have to be handled with care (more care than the media is likely to use!) and that’s assuming that there is enough comparability between the circumstances to begin with.

In Russia, Putin’s repression has forced most opposition underground or out of the country. The forces with enough oomph to actually work change are likely buried deeply inside the Russian State and are invisible to those outside the intelligence community. But, as demonstrated by the French in 1789, the Russians in 1917 (and 1991), and the Iranians in 1979, once a process begins, it’s pretty hard to predict how it will twist and turn. Still, the impact of Ukrainian War sanctions will put pressure on both military and economic sectors to get out from under their respective predicaments.

In Iran, the popular protests have been most visible, as has the state’s suppression, with both direct and judicial violence. The same pressure from Western economic sanctions has made clear to common folks that there is a significant price to be paid for the aggressive international posture of the Islamic Republic, a state that, at the same time, seems to lack broad support for its religious ideology. Indeed, the prominent role of women in this set of protests is remarkable. It may be that the Ayatollahs’ rejection of Western modernity, which has dominated the country for over forty years, is losing its grip. Whether the military will find religion, seek a more visible domination of the country, or allow some other shape of leadership to emerge remains to be seen. Of course, it’s easy to see the religious state, with military backing, crushing the popular uprising and delaying the shift towards modernity for another decade or so.

In China, unhappiness is widespread, but there is no clear picture of an alternative to the Communist Party. Compared to the other countries, the State is strongest here, evidenced by the lack of visible popular protests. It’s possible that the incipient COVID outbreak, coupled with severe economic downturns (both cyclical and fundamental) could lead to a stunning change in the leadership of the Party and its policies.

The lack of real-time visibility into complex, dynamic, and highly contingent processes can only make us humble about guessing whether any of these situations will turn into a serious revolutionary effort (even if unsuccessful). This is why historians wait.

As to whether there’s any connection between them, my guess is not; or, at least, only in the broadest sense of reflecting a number of global changes. Unlike Europe in 1848, where there were conscious parallels and some coordination, it’s hard to imagine much connection between pro-“democratic” forces in China, Russia, and Iran; or between their militaries. Local factors are primary.

There’s no telling how any of these will evolve. In Iran in 1979, protests turned into Revolution; in Tiananmen Square in 1989, protests were suppressed. Authoritarian regimes are hard to assess from the outside since they are, by nature, not transparent. A few key players might shift allegiances and the whole edifice collapses; popular protests might have little effect or provide an opening for an insider to crack open the incumbent system. You can’t tell until it’s actually happening, and developments are notoriously erratic (I’m certainly NOT making any predictions). And then, even if something gets going, its course and outcome are as uncertain as whether the incumbent power structure goes by the boards.

Still, we shouldn’t be surprised if—some years from now—we look back on the troubles of 2022 as part of a path of revolution (one or more!). The historians of 2052 will let us know.



Is Genocide Important?

1/6/2023

I’ve been asked to teach a course on The Holocaust and Genocide next spring; a topic that is nominally within my “modern European history” wheelhouse, but which I have studied only incidentally to other aspects of that sprawling field. As a result, I’m spending a bunch of time reading and thinking about both the topic and, as importantly, how best to teach it.

The course is sponsored by the Jewish Studies Department at SFSU and, as a person of Jewish culture, I have a particular resonance with the strand of antisemitism that led up to that most horrific of genocides by Nazi Germany in the 1940s. There have been many other incidents of organized and targeted mass murder by states and militias across the 20C and more recent events in Darfur, Western China (the Uyghurs), Myanmar (the Rohingya), and southern and eastern Ukraine (this year). There are many incidents from earlier times as well. Overall, unfortunately, it’s a lengthy list. As a historian, then, it would seem that I’ve got plenty of material to work with.

However, one thing I like to ask my students at the beginning of each course is “Why are you here?” Self-reflection is good (and under-practiced) and I hope to get them past the superficial answers of “it fit my schedule” or “I needed it for my GenEd requirements.” This question is especially interesting in the context of a course on The Holocaust and Genocide since it deals with the most gruesome of topics, the worst aspects of human nature, and with little opportunity for lightness and fun. Students in this course will have, I expect, some more substantive purpose for spending their time and money on a course with no apparent vocational benefits.

There are likely to be some who, from family or cultural contexts, want to understand how such events arose that led to the deaths and suffering of so many people just like them. There will likely be others who are intrigued by the limits of human behavior  (or lack thereof), those that wish to vent against evil, and those that want to figure out how to prevent such things from happening again. We will have to wrestle with the fundamental disconnect between trying to understand these phenomena and their utter incomprehensibility.  This includes both the problem of lacking the words to express horror and the wrenching frustration and disorientation rooted in cognitive dissonance. The same was true of many at the time.

As I have noted previously, I don’t have much truck with the idea that history has “lessons” which, if properly learned, will enable us to steer the future. The best we can look for are echoes/rhymes that alert us to pay attention to similarities in current actions that might lead down a path similar to the past. Rather, history presents a comprehensive set of examples for study and reflection, whether at the level of macro/national policy or the level of personal behavior and attitudes.

In this light, a course on The Holocaust and Genocide has a tremendous amount to offer, since it gives us the chance to come face-to-face with the worst part of ourselves. It’s ugly and not for the faint-of-heart. Indeed, there is a considerable amount of the historiography of the Holocaust devoted to explaining how this was a uniquely German phenomenon, based on a unique German history which has little applicability to “us.” Of course, such analyses sweep to the side any number of evils/oppressions/brutalities which “we” in the US or innumerable other countries have committed in our own histories. Genocide is a human problem.

As a nation, Germany has done perhaps as good a job as we have seen of recognizing and coming to terms with its own sins in this regard. In comparison, the Japanese still pretty much reject any idea that they behaved barbarously in the 1930s and ‘40s, the Turks have gone to great lengths to deny their concerted attacks on Armenians during the 1910s, and there remain many in the US who can’t comprehend that racism has been a virulent strain in our own history. Perhaps this course can help each of us find, acknowledge, and even take a step to repair whatever damage we might have done to another; recognizing, at least implicitly, that we have some shade of the same darkness in us (that I have some shade of the same darkness in me).

As I have been studying the idea of genocide closely of late, I note that while the list of incidents goes well back in time, it was only in the aftermath of WWII that the concept of organized and targeted mass murder was labeled “genocide.” Over the past thirty years, a whole sub-field of historical and sociological analysis has grown up, replete with debates about meaning, definitions, and modes of improving humanity.

I suspect that modern genocide is actually not worse than the violent practices of powerful people over the millennia (with appropriate adjustments for population growth and “improved” technology). What is so stunning is that “we” (i.e., sophisticated, “modern,” “civilized” people who occupy the world in the 20C/21C) should still be doing this. “Gee, I thought we (as a species) were past that sort of thing.” Only to find it is still going on, and that “human nature” has not progressed as fast as technology (or even more than incrementally) over the past several thousand years.

So, maybe one thing such a course can do, by way of seeing (as best we can) the faces of those in pain (and those that caused the pain, and those that watched the pain) is to recall this lack of “progress” and the distance still to go.

In the absence of the Holocaust or the brutality inflicted on the Tutsis, the Rohingya, the Ukrainians, et al., very few of such victims would otherwise be remembered by history. Our act of remembering is not, therefore, the usual role of history: capturing the significance of historical events and figures. What we can do is to actively rebut the dehumanizing attitudes and murderous actions of those who perpetrate genocides. We do that by insisting that each victim be remembered—as a person—with a life and a family; to tell the murderers that they have utterly failed in their mission of extermination.

That seems like enough for one semester.

To Boldly Go...

12/30/2022

“To boldly go…” (probably the most famous split infinitive in modern English) was, perhaps, the essential embodiment of the Star Trek series. It captured the spirit of adventure and discovery which that series sought to tap into, a spirit which has been a deep part of human nature since long before there was much in the way of technology.

All this came to mind as I watched the new documentary: “Goodnight, Oppy” about the Mars rovers “Spirit” and “Opportunity” whose mission (intended to run 90ish days) actually continued from 2004 until 2018. It’s a pretty slick production (available on Amazon), a paean to modernity, filled with the romance of engineering (yes, there is such a thing!), not a little anthropomorphizing, a bunch of science, and, most of all, human achievement.

I’m not ashamed to say that I was touched by the spirit of accomplishment and teamwork which the Mars Rover Team conveyed. I had had a similar sense in 2012 when I saw “Curiosity,” the successor to “Opportunity,” land on Mars (it is still running after 10 years), using a stunning combination of technologies to make it happen. The chills were not just envy of their camaraderie, but also a feeling of participation—of human-ness—in the face of the vast emptiness of space. In such an expanse, the values of family, clan, and team go a long way towards ameliorating the bewilderment of the cosmos.

I’ve read more than enough science fiction in my day with many tales of interplanetary adventure and portraits of cultures spread across the galaxy on the backs of human ingenuity. Star Trek is certainly a leading example; so is Star Wars, and neither is among the best in terms of writing or imagination. Yet, the fun of the fictional accounts (of whatever quality) lacks the reality of our accomplishments over the past two decades on Mars.

The epochal thrills of July, 1969, when we landed on the Moon have faded for many and are unknown to the 85% of people today who hadn’t been born (or were too young). Space-wise, it’s been a pretty quiet half-century; with much of the techno-awesomeness shifting to the information and biosci areas. And, to be sure, there is definitely a sense of “early days” about the Mars Rovers. I doubt folks went “gaga” over Columbus’ maritime tech in 1492, and most of the so-called “Age of Exploration” was highly colored in the mode of the “Christian West takes on the world” or various flavors of inter-national competition.  We have much the same, only lately it's been USA vs. USSR, or USA vs. China as the reality behind the slick tag lines (Neil Armstrong and Buzz Aldrin left a plaque on the Moon reading: “We came in peace for all mankind.”).

The “space race” of the 1960s reflected much the same mix of economic/technology development and geo (cosmo-?) politics as did the earlier “Age of Exploration.” There’s little doubt that NASA’s recent moon rocket test (Project Artemis) was also spurred on by the extra-terrestrial presence of Chinese Taikonauts and tech (not to mention SpaceX, Blue Origin, Virgin Galactic, and other billionaire space projects). We beat the Russians back in the 1960s and we don’t want to let some other “commies” out-flank us on the Moon or Mars in the 21C.

Even leaving the moral and political issues to the side, we look back on the deeds of centuries ago with much more amazement at their meagerness (compared to where we are now) than amazement at their accomplishment/bravery/foolishness. I suspect that if we are around to do so in a couple of hundred years, people of the 23rd century will feel the same condescension towards our initial, often faltering, steps. And those from the 33rd century may well feel the same about us as we do about Viking ships and Polynesian long-distance sea canoes.

Still, from where we are now, the story is remarkable and so is the opportunity (ditto for the “Spirit,” the “Curiosity,” the “Perseverance”). Perhaps the part to which I so connected—even more than the team’s energy—was that these projects symbolize our human potential. Amid all the (very) messed up things on this planet, it’s not our lack of ability that is in doubt; rather, we insist on pettiness, greed, and short-sightedness instead. These impediments are as human as our dreams. I’m glad we have R2D2’s squeaks and Captain James T. Kirk of the Starship Enterprise “to boldly go….” I’m even more glad that we have lots of unglamorous folks working hard to nudge a little robot that is actually trundling over red sands on another planet.

In this holiday season, there’s still room for wonder and awe.


Grand Turing Test

12/23/2022

You may have seen something lately about the hot new tech toy: ChatGPT. It’s obviously a name that didn’t come from a marketing department and it doesn’t even proclaim itself as ready for business. Rather it’s a test bed for AI. You can write a question and (in most cases) a pretty well fleshed out answer comes back. Check it out here.

Just for yuks, I asked: “What does ‘condemned to repeat it’ mean?”

In 15 seconds, this is what came back:

“The phrase "condemned to repeat it" is often used to refer to the idea that history has a tendency to repeat itself, and that if we do not learn from the mistakes of the past, we are doomed to repeat them. The phrase is often attributed to the philosopher George Santayana, who wrote in his book "The Life of Reason" that "Those who cannot remember the past are condemned to repeat it." This means that if we do not learn from the events of the past, we will be doomed to make the same mistakes again in the future. The phrase is often used to encourage people to study and learn from history in order to avoid repeating the same mistakes.”

It's not a scintillating answer, but it’s on point and plausible. Let me just say that I have seen worse answers from college freshmen.
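
For the curious, you don’t even have to use the web page; the same question can be posed from a few lines of code against OpenAI’s API. This is only a sketch of the idea (the model name below is a placeholder, not a claim about what ChatGPT itself runs on), so adjust to whatever is current when you try it:

    # Minimal sketch: ask the same question via OpenAI's Python client.
    # Assumes the `openai` package is installed and OPENAI_API_KEY is set
    # in the environment; "gpt-3.5-turbo" is just a placeholder model name.
    from openai import OpenAI

    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder; substitute a current model
        messages=[{"role": "user",
                   "content": 'What does "condemned to repeat it" mean?'}],
    )
    print(response.choices[0].message.content)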

There are still quite a few limitations and rough spots, but, like all machine learning, it will get better fast. Indeed, that is the whole point of ChatGPT: to train the machine to do better.

This public splash has engendered all sorts of chat about the coming age of AI (as to which, see my post from August, 2021 here). But I’m not going to go down the road of bewailing the end of humanity, etc., etc.; nor go into detailed analyses of the current state of AI in general or ChatGPT in particular. It’s way too early to be parsing the answers it’s giving at this stage. We (society in general) need to pay attention and start thinking about how to handle this capability. There are a wide range of implications; not least: how to deal with the “It must be true, I read it on the internet!” reflex in our truth-challenged discourse. But let’s not panic quite yet.

Even for those of us in the teaching (or, more precisely, the evaluating/assessing/grading) biz, the challenge is rapidly approaching, but not imminent. Simply put, if a student can grab the prompt from a take-home essay exam (which I usually use) and drop it into their friendly AI responder, will I get back anything I can use to understand what the student actually absorbed out of the course? Do I really want to spend twenty minutes chasing down the source of any individual answer to an exam? Grading is already draining enough.

Most tests “back in the day” were “in class,” usually closed-book, and were as much about memory capabilities as about reasoning and understanding. I realized some time ago in my teaching that memorization wasn’t all that important at the college level, and turned to open-book, take-home tests where the student would have to marshal ideas and information to come up with an answer showing some insight about the material and issues covered in the class. Then, along came Wikipedia and other internet sources which provided all sorts of “facts.” So (it seemed to me) there was not only little point in forcing students to memorize facts that they could easily look up, but I didn’t care (again at the college level) about them remembering whether Napoleon was exiled to St. Helena in 1815 or 1816.

At least (so I figured) I can still pose interesting questions that require thought, reflection, integration of sources and ideas, etc. Students can pull their facts from Wikipedia and the course materials and show me that they “got it” when I talked about the impact of guilt on the treatment of perpetrators and collaborators in the aftermath of WWII in Europe.

Well, that seems to be going by the boards soon, too. ChatGPT provides some decent answers to questions like: “How has democracy changed from Ancient Athens to the present day?” or “Why don’t we have so many political revolutions anymore?” Clearly, I (and teachers everywhere) have some serious work ahead in reconfiguring our exams and other assignments. As a good friend of mine points out, we need to improve how we assess students already and now we have another incentive. More, we can use the defects in ChatGPT to point out the difference between an intelligent answer and one that is merely coherent. In any event, the academic “arms race” is escalating!

One of the tools already in many college teachers’ arsenal is a service called “Turnitin,” an antiplagiarism software program that takes students’ electronically submitted essays and compares them to all the material on the internet, including Wikipedia, scholarly articles, and papers from students at other universities across the country. It’s very helpful, but it is a ‘dumb’ tool: it just matches words. Now that ChatGPT and its progeny will start writing student essays (slightly different every time), it’s about to go the way of the bi-plane. Fortunately, there is a new site that promises to be able to detect when an answer is written by an AI like ChatGPT. It’s called Originality.
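
To make the “just matches words” point concrete, here is a toy sketch of that kind of matching: the fraction of five-word phrases a submission shares with a source. (Turnitin’s actual matching is proprietary and far more elaborate; this is only an illustration of the general idea, not its algorithm.)

    # Toy word-matching check: what fraction of a submission's 5-word
    # phrases also appear in a source text? Only an illustration of
    # "matching words," not Turnitin's actual (proprietary) method.
    def ngrams(text, n=5):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def overlap(submission, source, n=5):
        sub, src = ngrams(submission, n), ngrams(source, n)
        return len(sub & src) / len(sub) if sub else 0.0

    essay = ("Those who cannot remember the past are condemned to repeat it, "
             "as Santayana famously wrote.")
    source = ("Santayana warned that those who cannot remember the past "
              "are condemned to repeat it.")
    print(f"{overlap(essay, source):.0%} of the essay's phrases appear in the source")

An AI-written essay, worded slightly differently every time, would sail right past this kind of check; that is the gap a detector like Originality promises to fill.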

I haven’t tested Originality yet, but I am quite curious to see how it develops. Competition (both economic and techie) being what it is, I can see a serious escalation of software vs. software intelligences coming up.

All of which leads up to the title of today’s posting: Grand Turing Test. The original concept was developed by the brilliant British mathematician Alan Turing in the mid 20C, at the very dawn of the computer age. He posited that computer “intelligence” could only be determined by a human who would pose questions to an intelligence (human or machine) in another room and, if the questioner was unable to tell whether the answers came from a human or machine, then the machine was, in fact, intelligent.  This is the goal towards which AI has striven for several decades.

I’m proposing that we will soon be facing a slightly different version, i.e. whether one computer/software program/AI (like Originality) can tell whether an essay is written by a human or by another AI (like ChatGPT). Both sets of programmers will be beavering away to make their software ever better at appearing intelligent (or at detecting that appearance). Is there an end in sight? Doubtful.

In the meantime, I hope to continue writing these essays with just enough quirkiness and insight that it will be some time until I can be replaced by an AI-blogger. Or, perhaps, I will just find a program, drop in a couple of prompting words, and tell it to riff for 1100 words or so.

Or, maybe I did already.



Counting

12/16/2022

As of December 18, I will have been on this planet for 25,000 days. (No estimates are available for time (or non-time) I’ve spent on other planets, dimensions, or universes.) It’s a milestone of some significance, even if uncommon  in our normal reckoning of duration and lives. You can run your own calculations here.
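
For the do-it-yourself crowd, the arithmetic is simple enough; here is a minimal sketch in Python (the birthdate is a made-up placeholder, not mine):

    # Days-alive calculator; the birthdate below is a made-up placeholder.
    from datetime import date

    birthdate = date(1955, 1, 1)   # substitute your own
    days_alive = (date.today() - birthdate).days
    print(f"{days_alive:,} days so far (about {days_alive / 365.2425:.1f} years)")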

Years and days are the only standard measurements of time that are “natural” rhythms of life (i.e. astronomical in origin). All others—months, weeks, hours, minutes, and seconds—are artifices, adopted for purposes of social coherence. So, it makes sense that years and days resonate more deeply and provide a more “organic” foundation for observance and commemoration. Still, we don’t normally count days (much less the other time periods) for an extended duration.

Just for illustration, 5,000 days is about 13 2/3 years; 10,000 days is about 27 1/3 years; 20,000 days is about 54 3/4 years; and 30,000 is just over 82. The oldest person alive currently (and the 4th longest recorded life ever), Lucille Randon of France, born on 11 February 1904, has seen 43,409 days.

According to the Bible (Psalm 90:10), a full span of life is 70 years (~25,500 days). As I’ve noted earlier, health and demographics, particularly in the last 50,000 days (~ since the 1870s), have lifted that target for most people on the planet. According to the Social Security Administration, I’ve got (on average) another 6,000 days or so.

Days pass quickly enough and, almost as quickly, cycle into weeks and months. Every “older” person will tell you that time does seem to go by faster as one ages, but even in our youth these markers spin by, with almost no notice at all. Perhaps this phenomenon is related to the time dilation that Einstein posited in his theory of relativity; i.e., time slows down as we move faster.

Still, for newborns, counting by weeks seems OK for about half a year, then counting by months seems OK for 2-3 years. After we reach adulthood, for reasons of either boredom or fear of realization of age, we tend to shift to only paying attention to “big” birthdays, every five or ten years.

Whether days or years, round numbers tend to become occasions for celebration and reflection; both of which are good things, so I guess we shouldn’t pass up any convenient opportunity for either. I would invite you to bring over a cake, but I’m more of a pie guy.






Worth it?

12/9/2022

Some years back, when I was just starting my focused study of history, I had the inklings of a perception/question about the state of modern civilization—both our marvels and our struggles—and came up with a pithy and provocative framing question on which I have ruminated from time to time: “Was the Renaissance worth it?”

There are, of course, many reasons to criticize this question. The Renaissance was but one step, and not even the most important one in the formation of the modern world. The question’s pithiness borders on the cuteness of pop culture/journalistic attention-grabbing. More fundamentally, what do I mean by “it” and “worth”?

So, I’ll grant you the first two points, if you will acknowledge that “is modernity worth it?” is a rather more academic and off-putting phrasing. I could pitch the question in terms of whether the “scientific revolution” was worth it, or “the Enlightenment,” or the “industrial revolution,” but the underlying issue comes back to the complex of developments, principally in Europe during the period from 1500 to 1900, that drove the principal components of how we live (or aspire to live) today; what I will loosely call the “modern world.”

Another way of framing the question of “Was it worth it?” is to adapt the standard political query every election cycle: “Are we better off now than we were 500 years ago?”

Even before we get to the intractable question of “worth” or “better,” it’s an immensely difficult question to tackle. We are the proverbial fish who swim around with little (no?) sense of the fact that we’re in water. It takes a lot of attention to contemplate living in a world without all the toys and tools and trials that we take for granted: vaccinations and travel, longer lives and more knowledge, cultural diversity and globalization, electric pick-up trucks and telecommunications—a seemingly endless list of technological and cultural developments that many of us enjoy (or at least make use of) constantly. Even those (billions and billions of) people today who are not at the higher end of the economic scale are still (mostly) not living in abject subsistence poverty. And the well-documented proclivity of humans to forget history means that, with rapidly increasing populations over the past century, the percentage of folks who are aware of what life was like twenty years before they were born (to pick an arbitrary marker) is pretty small; a fortiori, 200 years before.

Being aware of who/what/where we are is tough enough. It is no easier to take a second step and imagine the life of the “pre-modern” person by way of comparison. We’re not talking cave persons here, but your average Ting, Dietrich, or Hassina who was born along with Leonardo da Vinci in 1452. Even in terms of material life, comparison is difficult. The only stuff we can measure is…the stuff we can measure. The first people to ride a train in the 1820s thought it was amazing to go 30mph; for us going that fast is mundane (except trying to go cross-town in Manhattan during rush hour). Technology, standards of living, longevity/health, scope of knowledge are all stunningly (and reasonably accurately) known to be better.  

Happiness is another matter as are human nature/morals. We have no way to measure these criteria and, indeed, we have trouble even figuring out the units of measurement, not to mention the profound differences in what those concepts meant to Ting, Dietrich, and Hassina vis-à-vis each other or vis-à-vis us, born 500 years later.

It’s easy to dismiss this axis of comparison as being unmeasurable (and therefore meaningless) and to stick to the “hard” countable standards. But, of course, that’s mechanistic nonsense. In addition, it’s useful to remember that (per human nature) much of human happiness is perceived in relation to our perceptions of others, not by any absolute standard. In the kingdom of the flushable toilet, there is still plenty of room for resentment and envy for those better off (and, of course, billions of people don’t even have flushable toilets).

Aphorisms like: “human nature never changes” might be true in some ways, but beg more questions than they resolve. For one thing, if human nature doesn’t change (certainly not over a span of a few hundred years), then the scope of potential human improvement has to be sharply limited. In other words, if we’re still the same “humans,” how can we be better off compared to 500 years ago?

In terms of morality, there might be a case for being better off. Yes, Martin Luther King did tell us that the “arc of history bends towards justice,” and in terms of our own (modern, liberal) standards, the relative status of women and people of color are demonstrably better. Still, we need to be careful, for nowhere is it more clear that it is difficult to see and assess ourselves than in terms of our epistemological ethos. Pretty much every culture has felt self-satisfied with their own morality, only to be looked down on by subsequent generations for their relative barbarism. To argue that this is a reflection of moral improvement looks dangerously like the victors writing their own history; it’s not to be fully trusted.

Finally, I have to mention all the stress, pain, and madness that seems to be part of modern life. I doubt that there is any way to assess mental health over centuries, but it does seem clear that the inherent discontents of civilization (as Freud described it (1929)) are piling up. I rather suspect that these are not due to the current (i.e. last 50 years) state of the world, with globalization, consumerism, and techno-overload. Instead, these recent developments are but the accumulation of disorientation, alienation, and a general acceleration of the pace of life which seems increasingly overwhelming to so many. This is not just a matter of school shootings and election-deniers, but can be seen more broadly in the general crisis of governability and social cohesion which is plainly evident in many (most?) countries around the world. The creeping sense of climate fatalism only makes it worse.

There is, of course, no way to go back and “re-boot” the system. Nor am I a fan of gratuitous “Luddite-ism.” However, to the extent we can make choices about our future direction as a species, we need to consider those who, over the centuries, have called for greater attention to the art of living well; not merely being “well-off.” Perhaps, at the least, we might slow down and shift our priorities away from the material. Maybe we have enough “stuff.” The frenetic drive for “improvement” is, after all, pretty narrowly focused. Its partner—the drive for “growth”—(as I have argued earlier) is similarly suspect.

Was the Renaissance worth it? It’s hard to argue with Leonardo, Newton, and Watt/Edison/Tesla et al.; particularly from the comforts of the modern 21C home office. As a historian, I understand the nonsensical nature of such a question. As a person, I have to say that there are significant costs for the road we’ve been marching down and we need to stop pretending otherwise.


Sauce for the Gander

12/2/2022

Recent anxieties about Chinese expansionism, especially towards Taiwan/Formosa/Taipei, have featured a host of admonishments, mostly from countries who have no ability to affect the situation, as well as a rich strain of language filled with “strategic ambiguity” from the US about its likely response. The President has been the most forthright about warning off the PRC from reclaiming what for the past fifty years has been acknowledged on all sides to be an integral part of China. It’s strategically problematic, but—as usual—I’m more concerned with a bunch of interesting historical parallels.

The posture of the US and the “West” to Chinese hints and exercises has been couched in terms of the preservation of liberty, democracy, and opposition to international aggression, leveraging the rebuff of Putin in Ukraine, and with relatively little mention of our dependence on the world’s dominant microchip manufacturers based in Taiwan.

In 1823, a different President (James Monroe) proclaimed a “doctrine” of a US sphere of influence over the Western hemisphere. He was especially concerned with Spanish efforts to reassert control over its recently rebellious colonies occupying the majority of the hemisphere south and west of the US border (roughly a line from Miami to New Orleans to Denver in current terms). This was a political gesture, since the US had no capability to project military power outside its boundaries at the time. Over the balance of the 19C and into the 20C, the “Doctrine” was restated and expanded (e.g., the “Roosevelt (Teddy) corollary”), usually in alignment with the expansion of US military and commercial power. During the Cold War of the later 20C, it was revived (aka the “Truman Doctrine”) with an eye towards Moscow’s promotion of world revolution. It provided a fig leaf of rationale for dozens of US military interventions across the 19C and 20C.

While the Monroe Doctrine claimed a foundation, too, in democracy and liberty, it was often described in more “realpolitik” terms as a type of “cordon sanitaire” (or protective belt) to keep other global powers well away from our shores. Indeed, recognition of various great powers’ spheres of influence was a well-established diplomatic practice to prevent the “big boys” from bumping into each other by ensuring that their informal imperial areas did not become a flash point for conflict.

While there are examples from the Congress of Vienna (1815) to the Versailles Peace Conference (1919), perhaps the most famous and long-lasting example was that accorded to the Soviet Union in the aftermath of WWII. Churchill agreed to (and FDR acquiesced in) Soviet control over most of central Europe. The US understood that support for liberal and democratic governments in these countries was futile, given the presence of the Red Army and the general weariness following the War. Realpolitik trumped idealism.

A similar story played out at the end of the 20C when the US acquiesced in the Chinese take-over of Hong Kong from the erstwhile British hegemon; even if China had to wait until its own economic and financial system was strong enough to stand on its own, making Hong Kong’s nominal independence expendable.

So, what is the basis for the US to deny China the right to a protected zone?

Before you answer that, give some thought to a scenario where the Confederacy had holed up in Southern Florida after losing at Appomattox in 1865. What if President Grant started some table pounding about “finishing” the Civil War and some (literal) saber-rattling about launching an expedition to preserve American “national integrity”? How would our (the “Union”) position compare with what China is saying now? Is their moral stance based on Chinese “national integrity” and a rejection of the exploitative economic model of unbridled capitalism all that different from what would have come ringing out of the North back then?

Generally, the US has a robust history of holding others to standards to which we might aspire but far too often fail to meet: there were very few war crime trials for the Allies in WWII; we get really angry when some “radical Muslims” kill about 3,000 Americans, so we proceed to kill some multiple of that (just in terms of innocents/collaterals) in retaliation; and our extensive list of “interventions” around the world in the last 100+ years is hard to distinguish in some ways from other countries’ “aggressions” (beyond our vastly better PR). We’re very good at self-satisfied justifications, colored by myopia, and a refusal to consider others’ epistemologies.

We heard similar outcries over allegations of Russian “interference” in our elections in 2016 and 2020. It’s almost as if we forgot the actions of the CIA and diplomatic/propaganda efforts with regard to democratic processes in dozens of countries in Latin America, Africa, and Europe.  Most other former imperial powers have comparable lists and we all are conveniently self-righteous when the shoe is on the other foot.

In other words, there’s been an awful lot of “how dare they…[do to us what we did to them]” going on. As global power shifts, and the 200+ year dominance of the West draws to a close, it would be good if we had a bit more humility about our aims, practices, and effects. It’s OK to play global power politics, let’s just not pretend that there is an automatic congruence between our past actions and our current posturing or between what we say and what we do.

Feelings

11/25/2022

No, this is not a riff on the maudlin, lounge-lizard standard from “Cats” in the 1980s. It is about a phenomenon that we all encounter every day and which is embedded in our public discourse; i.e., the conflation of feelings and solutions.

More or less contemporaneously with the Andrew Lloyd Webber hit, John Gray wrote a book called “Men Are from Mars, Women Are from Venus.” The core concept (which Gray unfortunately pitched in a gendered frame) is the vast gap between the way people express themselves and what is understood by their listeners.

All of us process the world from both emotive and analytic perspectives. But most of us have a default mode (I’m definitely more towards the “analytic” end of the spectrum; and rate much more towards the “thinker” end of the Myers/Briggs spectrum than the “feeler” end). We may also express ourselves more from one stance and hear others from another. These differences are a recipe for a vast range of “failures to communicate.” Regardless of your own preferences and tendencies, it takes some real attention to parse what people really mean underneath what they are saying. Expressions of unhappiness may appear as the rejection of plans which are not (apparently) connected to the source of the discontent. Ideas are advanced that are more a manifestation of angst or delight than any assessment of the need for or cost/benefits of the nominal proposal.

If I were a professional psychologist, I would vector off into a discussion of how this shows up in personal or business relationships or friendships (familiar to anyone who has a spouse, child, or business associate). But as an observer of the political culture, I can’t help but notice that much of our current cultural confusion is tied to this problem. I was particularly struck by the attention to poll numbers earlier this year which showed an uptick in Biden’s “approval” rating. The commentariat was quick to draw potential implications for the mid-term elections, but it seemed to me that these results were a great example of my point.

The nominal polling question was: “Are you satisfied with the President’s performance of his job?” But most folks don’t know what the President does or even what he has the power to do. The global economic upsets caused by the war in Ukraine and the pandemic are but the most recent examples of events which profoundly affect people’s lives over which the President has little control. Hell, as Trump discovered to his frustration, the President is barely in charge of the Executive Branch, much less the whole US Government—much, much less the economy and the vectors of global infection.

Most people were actually answering a different question: “Are you happy about your current life and prospects?” (i.e. based on health, family, jobs, views on abortion or who won the 2020 election). In other words, they were giving a “feelings” answer to a “solutions” question. This obviously doesn’t tell us much about whether Biden is actually doing a good job, or even what people’s opinions of his work and policies are.

Much the same can be said of the reaction to Trump both during his Presidency and since, both by supporters and detractors. Indeed, he has made a political career out of not really caring about the merits and policies but giving voice to the anxieties of a fair number of Americans. (Much the same can be said of a host of other world leaders who tap into the same visceral fear in many of their citizens.) At the time of his first campaign, there were those who said that the Dems erred by taking him literally, but not seriously; while his supporters took him seriously, but not literally. I think there is a lot to this framing, but rather than using a Red/Blue (or even, per my recent post, a Left/Right) dichotomy, I suspect the real demarcation is between those that are analytical and those who resonate more with their feelings (at least when it comes to politics). How that split maps into the political spectrum is an interesting question (for another day).

Trump is hardly unique in this regard, even among US political leaders. You may call it “political license” or disingenuity, but leading with comfort and reassurance rather than policy specifics seems to be a prerequisite to electability rather than a disqualification. Neither Wilson nor FDR expressed a desire to go to war, based on their assessment of the public mood, even as they recognized the realities of global affairs. People vote for psychological security, confidence and comfort rather than briefing papers.

You may consider that this is but one more example of me “tilting at the windmills” of our 21C human world; but I regularly find it useful to think in these terms when listening to others talk, whether politicians, spouses, friends, or others. I’m not sure that I understand their feelings, but if I can at least pause rather than responding to their literal/nominal statement, I find that I’m better off.


    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but this is my contribution to my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
