Steve Harris

Calculators

7/25/2025


A long, long time ago, when my early science fiction reading was focused on Asimov, Bradbury, and Clarke, I read a bunch of Asimov stories about what were then unimaginably powerful computers. One was called “The Feeling of Power” (originally published in 1958) and another was “The Last Question” (originally published in 1956). 60+ years later, technology has finally caught up with imagination and both stories shed light on our current challenges in coping with AI.

[I have to confess that when I was trying to track them down and re-read them for this blog post (50+ years from when I read them initially), I realized that the plot lines I thought I remembered were completely different from what Asimov actually wrote; testimony to the creativity and elusiveness of human memory!]

In “The Last Question,” there is a highly advanced computer intelligence called “Multivac.” Across the millennia of the storyline, Multivac is regularly asked how the dying universe might be saved (can entropy be reversed?) and always replies that there is insufficient data for a meaningful answer. Eventually all human intelligence (we don’t worry about bodies anymore) is uploaded and Multivac amalgamates all human knowledge. Now god-like, it provides and implements the answer.

Now, I can’t predict the extent of AI capabilities five years hence, much less after several millennia, but this story highlights for me its essential limit: AI can’t be more than the sum of its parts. It’s a very deep and fast compilation of what we (think we) know and it is already accelerating the accumulation of knowledge; but it can’t do philosophy, it can’t get to “TRUTH.” It’s a product of science and science is a human and social process of moving towards knowledge and approximating truth. Ditto for art. AI will produce beautiful pictures and moving literary passages (not to mention more succinct business memos), but will it capture the ineffable? 

If AI capabilities reach an asymptotic approximation of truth and beauty, this raises, in turn, two questions. First, if truth and beauty are merely human constructs, shouldn’t an AI product be able to achieve that level of capability? Second, even if AIs can only get close, isn’t that sufficient for most people, most of the time? After all, even disregarding the current trends of ‘dumbing down’ and reduced reading, most folks don’t read Shakespeare and Tolstoy very much. Murder mysteries (and SciFi) are fun and “Squid Game” is Netflix’s most popular show. It’s not hard to imagine entirely AI-designed and AI-produced video series in the next decade. The upshot is that it’s hard to see the limits of AI and, as importantly, leaving a small sliver of ineffability for whatever might be called “true” art and philosophy won’t really matter; much as we all ignore the difference between the Newtonian physics we live with comfortably and effectively every day and the “true” Einsteinian/quantum physics that actually describes the universe.

“The Feeling of Power” is, nominally, a quite different tale. An interstellar human society is at war and, in its dependence on computers, has forgotten how to do math. One guy rediscovers how to do it, and his archaic techniques (well known to any pocket-protected engineer trained in the 20C) are used to win the war.

The question implicit in this story actually predates computers. Asimov lived to see electronic calculators (remember those!) become widespread by the late 20C. I remember that by then people were wondering whether students were losing the ability to do math by hand. Computational power has exponentially accelerated this concern, even before the arrival of AI. Ten years ago, if I had put “cube root of 193” into a Google search bar, it likely would have directed me to a webpage that included a table with a long list of such results. Today, AI-driven search just pops the answer back (btw, it’s 5.779….). Who needs to do math anymore?
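If you would rather not take the search bar’s word for it, here is a minimal illustrative sketch in Python (the function name is just my own) showing both routes to the answer: the one-keystroke modern shortcut and the successive approximation (Newton’s method) that a pencil-and-paper engineer would recognize.

    # Two ways to get the cube root of 193: the calculator way and the by-hand way.

    def cube_root_newton(n, tolerance=1e-9):
        """Approximate the cube root of n by Newton's method, i.e., repeatedly
        refining a guess -- the sort of arithmetic you could do on paper."""
        guess = n / 3.0  # any rough starting point will do
        while abs(guess**3 - n) > tolerance:
            guess -= (guess**3 - n) / (3 * guess**2)
        return guess

    print(193 ** (1 / 3))          # the one-keystroke answer: ~5.779
    print(cube_root_newton(193))   # the same answer, earned the slow way

Either way, the machine is happy to spare us the effort.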

All this points to the fragility of human societies and our increasing dependence not only on the technical infrastructure that makes these answers immediately available (e.g., wifi, cellular networks, electrical grids) but also on external substitutes for the underlying mathematical and research capabilities themselves (e.g., search engines, YouTube “how-to” videos). Even before AI, it was increasingly clear that my college students didn’t really research their papers; search engines had become a crutch for thinking through both the nature of the problem they were writing about and the logical methods of finding materials that addressed the substantive questions.

A few years ago, there was a flurry of concern about the risk of North Korea or some terrorist group setting off an electromagnetic pulse (“EMP”) that would fry computers and the electricity network over a wide swath of the US. If you think living without immigrants is a big economic and social problem, imagine going without electricity for a month and facing a year-long backlog on replacing your computer. Pretty much everything in our infrastructure would collapse.

The likely ubiquity of AI capabilities will increasingly coddle us and enable us to live very nicely without learning and discovering very much. Even without the network crashing, the road to the world described in Asimov’s “Feeling of Power” seems likely to put us at considerable risk of not being able to function independently. In the story, the human ability to calculate missile trajectories by hand is what wins the war. [For a similar tale, Arthur C. Clarke’s “Into the Comet” (1960) relies on the use of an abacus on a spaceship to avoid being stranded when the computer burns out.] We need not reach such rarefied scenarios to recognize that an element of human psychology and culture that we’ve taken for granted for centuries could easily fade away.

In her groundbreaking historical analysis “The Art of Memory” (1966), Frances Yates studied how humans had trained their memories to retain information and culture for millennia and how this art was affected by the rise of print and literacy. As we developed this artificial means of storing information and stories on increasingly available paper, our interest in and ability to recall began to fade in the early modern era. It now looks like we’re in the process of doing something similar. Pencil and paper—and the ability to think through problems—are increasingly at risk.



Crises

7/18/2025


It’s easy to feel overwhelmed these days. We seem to have run out of metaphors for the extreme fear/upset/disorientation facing individuals, our society, and humanity. Breathless media coverage of the appalling political/legal action du jour contributes a lot to the sense that everything is spinning out of control. The water may, in fact, be swirling in the bowl, on its way down the drain. If so, there’s likely little to be done beyond gathering myself up, standing firmly, and connecting to those who mean the most to me.

If, however, there is another chapter to this story, thinking clearly about the challenges facing us and figuring out a plausible substantive response seems to me necessary. In this case, the advice noted above is equally applicable, at least as a start.

In order to actually do something useful, it’s important to figure out which challenges present the most significant threats to our individual and collective hopes. Unfortunately, given the inevitability of limited resources, this means triage. Working ferociously on the seventh-biggest problem is not a recipe for our continuation and success. This means that some injustices need to be ignored, some stupidities should be left to molder on the sidelines, and some lives will be lost. For example, a recent piece in the NYT argued that litigators defending rights who lose in the lower courts should perhaps NOT appeal to the current Supreme Court: they’re not likely to win there, a local loss is not as bad as a national one, and the resources could be put to better use elsewhere.

Each person will have their own list of priorities, but not building such a list only ensures that effort will be wasted. Each list has to be based on hard thinking, a firm ethical outlook, and a clear set of parameters.

While I believe that the country and the world need radical change, I hope that this can be accomplished without wholesale calamity and that some mode of coherent human society can be maintained. The threshold threats to civilization thus seem to me to be 1) climate change, 2) AI, 3) nuclear/biological disasters through war or terrorism, and 4) the general collapse of the national communities that are the foundation of democratic values (both within individual countries and as the support for some kind of international system).

My priorities are based on the likelihood of mass death and destruction and the amount of time it would take to recover from these particular crises. In other words, existential crises take priority over the merely godawful, horrible, appalling or self-inflicted.

With due regard for the interrelationship of these phenomena, and especially mindful that any of the first three could lead to the fourth, I am prepared to defer action on competing with China, defending Ukraine, any number of civil wars and famines in the world, the “war on drugs,” the disruptions of immigration, unfairness in the tax code, etc. etc. In other words, if we don’t “fix” the top items on my (your) list, it won’t matter much if HWSNBN lines his family’s pockets, or Israel gets over its current spasm of excessive violence.

Of course, I’m not in a position to do much about most of these to any significant degree; so, to some extent, my statements can be characterized as posturing or as an “academic” exercise. But since I can only be responsible for myself and my own attitudes and actions, perhaps that is all that I (you) can do.

Looking historically, it’s difficult to come up with any plausible precedents for our current global predicament. Prior to the modern era, the world was sufficiently disconnected that a crisis in one country or region had limited effects elsewhere. Similarly, the accumulation of capabilities in our modern technologies is recent enough that, until now, no threat could have had such a wide and deep impact, nor was there any means of doing much about one. We can look at the fall of the Roman Empire as having only a regional impact, even if it took a thousand years to reboot European civilization. The most likely candidate for the leading global human disaster is the Black Death of the 14C. Somewhere in the neighborhood of a quarter to a third of humanity died as a result, and that too took a couple of centuries to recover from.

This gives us some perspective on the combination of WWI/Great Depression/WWII (probably the greatest disruption of the modern era). The prospect of nuclear war that has loomed over the world for the past eight decades is arguably greater, but we have dodged that bullet so far. The threats to global human society from my four horsemen are of a greater order of magnitude than those we faced a hundred years ago. All the rest of the crises in human society (insert your list of wars, revolutions, pandemics, etc. here) filled the news reports and fill the history books but are otherwise (relatively) small beer.

Franklin Roosevelt, who, as much as anyone, helped turn the tide on the early-20C set of crises, famously said: “The only thing we have to fear—is fear itself.” It’s true at the social level and at the personal level as well. Falling into the abyss of despair is a pretty sure route to the bottom. Roosevelt was endowed, in the words of Oliver Wendell Holmes, with a “second-class intellect, but a first-class temperament.” We were fortunate that things turned out as well as they did. In our own time, we have chosen a political leadership with a second-class intellect and a third (fourth?)-class temperament. Globally, the alternatives are only relatively better.

So it is to my own temperament that I must turn; and you to yours.



1453

7/11/2025

Picking any single year in history as a focal point for interpreting trends and developments that spread over decades is always a fraught proposition. It’s easy to pick years that have become famous as “turning points” (e.g., 1914 as the start of WWI, 1968 as a year of domestic political upset in the US, France, and Czechoslovakia). History, however, happens every day; and every year is filled with milestones in trends and events from which we can characterize cultures and eras. The exercise works as long as we don’t overload any particular event with significance and remember that we’re just taking an illustrative snapshot.

A couple of years ago, I developed a world history teaching unit around the year 1905, connecting events in Japan, British India, Russia, and England as markers of the nature of modernity. As years go, 1905 isn’t an especially well-known one, which was an important part of why I chose it.

In my current course on Early Modern Europe, I will be highlighting the years 1453 and 1776 as bookend years for this period and geography. I’ll talk about 1776 in a few weeks. Today, I’ll focus on 1453 and the challenges of periodization.

Before I get into the specifics of why I think 1453 is an especially significant year (or, at least, one that is especially interesting to talk about), I should clarify that “periodization” is the term historians use to describe how they pick the beginning and end of a particular period. It’s an important part of the process of defining the scope of a book or a course. It’s pretty much a non-issue if you’re talking about the Song dynasty or the Roosevelt Presidency, but periodization gets contentious when historians debate the start or end dates of periods whose significance depends in part on whether certain events or developments are to be included, e.g., the “Scientific Revolution” or World War II. In the latter case, some start with the beginning of WWI in 1914, some pick the rise of Hitler in 1933, some choose the start of the military confrontation between Japan and China in the 1930s, some go with the 1939 start of war in Europe, and others, focusing on the US experience, start with Pearl Harbor in 1941. It all depends on what the historian’s argument is about the nature and significance of the origins of that war. In other words, periodization is an integral part of historical interpretation.

In the case of Early Modern Europe, there are extensive debates about both the start and end points. Most college course catalogs use 1500 as the demarcation between ancient and modern, mostly because it’s a nice round number and there are any number of reasonable candidate events that together can be used as the starting/ending point. 1500 makes more sense for European-focused history (e.g., “Western Civ”) than for world history and, increasingly, instructors have stretched the starting point of the latter (“modern”) period earlier and earlier.

In my newest course, I’ll be using 1453 as the demarcation. There are several events that make it useful as a “hook,” even if these events and developments slop over into surrounding years and decades. First, there’s Gutenberg and his Bible. The development of printing and its spread across Europe and then the rest of the world changed just about everything and, in many senses, wasn’t matched for impact until the computer/AI era in which we are now immersed. Printing and the (democratizing) distribution of information were essential parts of the rapid development of science, the spread of Protestantism, the plausibility of democracy as a political form, and the restructuring of societies and economies around the world.

Second, a couple of days before Gutenberg “dropped” his new Bible, the English lost a battle to the French at Castillon. This effectively marked the end of the Hundred Years’ War and closed out a chapter of the overlapping interests, lands, and battles that had occupied the nobility of those two cultures (neither was quite yet a coherent country) off and on since William sailed from Normandy and took over England in 1066. The end of the war enabled each to focus more at home, although they continued to fight regularly into the 19C. It certainly didn’t mark the end of European warfare either, but the political structure of Europe went off on a radically different tangent than it would have if the English had won (a sort of “anti-Brexit,” if you will).

Third, 1453 was also the year that the Ottoman Turks finally conquered Constantinople, solidifying their control over the Eastern Mediterranean region and, importantly, pushing Europeans to develop other routes to the riches of Asia and the “East.” The series of expeditions and explorations launched by the Portuguese, the Spanish, and, later, the English and French connected the world for the first time and provided the platform for overseas empires, trade, and the concentration of wealth that is only now being rebalanced.

My fourth event for 1453 was less dramatic. The Italian artist Donatello completed a remarkable statue in Padua (the equestrian monument to the condottiere Gattamelata) that was a hallmark of the emerging style: it combined innovative technology with an aesthetic that deliberately echoed classical Roman works. The “Renaissance” is one of those periods over which historians argue endlessly about periodization, but this is a convenient starting point for that profound cultural development.

Not bad for one year (actually, just a couple of months in the summer). 

Was this set of events a coincidence? There isn’t much to connect them. The trick is to see whether there were some deeper movements in history of which these apparently disparate items are merely the ones that popped to the surface. Regardless, they provide a focal point for the huge heap of people, events, and developments that constitute history. Since (whether we’re talking books or classes) we have to put some limits on how much time we take up for our audience and some thematic starting points for discussion, I’ll be using 1453.


Res Publica

7/4/2025

Medium-high on the list of egregious abuses of power by the current administration are its cavalier commercial self-dealing and its blithe conflation of personal interest and benefit (financial and psychological) with the good of the nation. This type of corruption is hardly new; many (most?) people in power find ways—consciously or not—to line their pockets, help their friends and families, or set up indirect quid-pro-quos to the detriment of those they rule.

Notions of communal benefit are just as long-standing, and are captured today in the concepts of fiduciary duty, altruism, and public service. However, they have struggled against greed and egotism all along. For thousands of years, when governments were merely extensions of the family power that dominated localities and, over the centuries, regions and then countries (i.e., monarchies and aristocracies), rulers’ responsibilities to God (in various configurations) were rarely troubled by the idea that there was any duty to ordinary folks. Much of the development of the modern state in Europe in the 16C-19C was built in conjunction with emerging ideas (e.g., Locke’s liberalism, Montesquieu, Frederick the Great’s “Anti-Machiavel”) that the state was intended to serve the people at large. (Such a notion, I hasten to add, was slightly distinct from the democratic theory that the people at large should rule or choose their rulers.) Indeed, the very idea that the “state” should be seen as an entity separate from the ruler took quite a while to take hold (as evidenced by the famous declaration attributed to Louis XIV: “L’État, c’est moi” [“I am the state”]).

Indeed, the boundary between “public” and “private” has always been muddled. Kickbacks, “golden shares” in nominally private enterprises, bribes, etc. were pretty much the order of the day and only gradually came to be seen as improper. Public officials were expected to be personally paid for performing their official duties. Public offices were sold to the highest bidder, who then had to recoup his investment. In 18C France, the right to collect taxes in various regions was sold off, and tax collection was a business opportunity.

Under the modern mode of government, such practices still occur, but less frequently, and they are usually deemed improper and criminal. They range from a cop being slipped a C-note to ignore a speeding violation, to customs officials lining their pockets in exchange for failing to inspect a shipping container, to selling subscriptions to personally owned social media channels that are then used to make official proclamations and statements, to the sale of government assets to “friends” for a song (see any number of Russian oligarchs). The recent DOGE effort to expunge “waste, fraud, and abuse” from government operations was led by a leading government contractor (among his other attributes) and would be laughably hypocritical if it weren’t for the damage it engendered to the beneficiaries of government programs and the livelihoods of government employees.

Our modern democratic theory posits a government whose purpose, Jefferson said, is to enable citizens to enjoy “life, liberty, and the pursuit of happiness.” What we call “democracy” is concerned with the means by which this power is directed and the rule-makers and implementers are chosen. The vehicle for making this happen is a “republic” (i.e., a clearly demarcated “state” or government as the crystallization of our society). The separation of this entity from private interests is essential and is embodied in such concepts as the “rule of law.” A republic is inherently different from a monarchy, in which government is merely an organized administration of society for the benefit of the monarch/aristocracy. All this is an adaptation of Socrates, Aristotle, Cicero, et al.

The crux is the recognition of the “res publica” [Latin for the “public thing”], the political community that is distinct from individual private interests and is more than the sum of those interests. This means that we acknowledge and value a shared life together; that, to enhance that life for all of our group, we have imagined a locus of the common good. The special value we put on this embodiment makes its abuse by corruption worse than ordinary fraud or theft.

Of course, the existence of the res publica does not mean (Marxist utopianism notwithstanding) that there are no private interests. In addition, the inherent problems with the excess concentration of power mean that there is public benefit in dispersing such power across multiple entities and individuals. Government bureaucracies (like other bureaucracies) get stultified and inefficient. In short, there are plenty of reasons for a “private sector” and for the state to implement some of its activities through arrangements with companies, NGOs, and individuals.

Precisely where those lines should be drawn is a complex and dynamic set of decisions. Should a state build its own naval ships or hire a private shipyard to do so? Should the care and feeding of the needy be handled by private, religious, or charitable groups, or should a government build public housing? Historically, these have been contentious issues, and as the scope of government activities has expanded (usually as a product of increased democracy), they have multiplied. There are no simple answers at either the policy or the implementation level for these and the myriad other functions a government might undertake. There is no room for simple bright lines here.

So, while we have to carry on with the practical work of governing, we also have to strive to protect this notion of the “res publica.” In our modern semantic salad, we have tossed together “democracy,” “republic,” “constitution,” “government,” “state,” “law,” and a bunch of other sliced fruits and vegetables. There are lots of ways in which this salad can go bad or be corrupted. Decay spreads easily, so we can’t protect democracy without dealing with the breakdown of the rule of law or the abuse of the public interest (and vice-versa).

On this 249th anniversary of the publication of Jefferson’s powerful expression of our national aspiration, it’s too bad we’re being led by crooks (and ignorant crooks at that).


    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but here is my own version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
