<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:wfw="http://wellformedweb.org/CommentAPI/" xmlns:dc="http://purl.org/dc/elements/1.1/" >

<channel><title><![CDATA[Steve Harris - Condemned to Repeat It]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it]]></link><description><![CDATA[Condemned to Repeat It]]></description><pubDate>Fri, 03 Apr 2026 08:50:56 -0700</pubDate><generator>Weebly</generator><item><title><![CDATA[A Revolution?]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/a-revolution]]></link><comments><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/a-revolution#comments]]></comments><pubDate>Fri, 03 Apr 2026 14:26:14 GMT</pubDate><category><![CDATA[Uncategorized]]></category><guid isPermaLink="false">http://www.steveharris.net/condemned-to-repeat-it/a-revolution</guid><description><![CDATA[Last week, as I was sending out my uncheerful assessment of the state of international law, a good friend passed along a recent posting from the invaluable Heather Cox Richardson about the fundamental reshuffling of the global order currently underway as part of the present administration&rsquo;s general program/chaos. My friend asked if this constituted a revolution. My bottom line: it&rsquo;s too early to tell.The Richardson piece draws on recent statements by the Foreign Minister of Singapore [...] ]]></description><content:encoded><![CDATA[<div class="paragraph" style="text-align:left;"><font size="4">Last week, as I was sending out my uncheerful assessment of the state of international law, a good friend passed along a recent posting from the invaluable Heather Cox Richardson about the fundamental reshuffling of the global order currently underway as part of the present administration&rsquo;s general program/chaos. My friend asked if this constituted a revolution. 
My bottom line: it&rsquo;s too early to tell.<br /><br />The <a href="https://heathercoxrichardson.substack.com/p/march-26-2026" target="_blank">Richardson piece </a>draws on recent statements by the Foreign Minister of Singapore and other developments&mdash;Iran, Viktor Orban in Hungary, the erosion/demolition of US constitutional controls&mdash;to sketch an ominous picture of both the domestic and international scenes. I&rsquo;ve commented on many aspects of this situation, the vast majority of which are somewhere between troubling and horrific. So, from certain perspectives, things look bleak. Does that make for a revolution? Let&rsquo;s look at the domestic side this week.<br /><br />If we take the loose, popular definition of revolution as a big, quick, dramatic change, then yes. But many historians feel obliged to take a longer-term perspective. Modern political revolutions might well be dated from the English Civil War (1640s-50s) and the Glorious Revolution (1689). Since then, whether something counts as a revolution depends in part on when you&rsquo;re asking the question.&nbsp;<br /><br />Even the &ldquo;American Revolution&rdquo; (which, arguably, merely replaced one set of rich white guys with another at the top of a small peripheral country) has been the subject of debate as to when the &ldquo;revolution&rdquo; occurred. Benjamin Rush argued that the Revolution continued after the War had been won, but Thomas Jefferson said the Revolution had already been completed by the issuing of the Declaration of Independence.<br /><br />The Great French Revolution went through four different regimes before Napoleon and then reverted to the Bourbon Monarchy in 1815. Important French historians argue that the Revolution was not completed until the 1880s. The Russian Revolution, too, went through multiple stages and directions. 
If you had asked whether there was a revolution happening at various stages, you might well have gotten a different answer. So, a snapshot taken in April 2026 might look pretty inaccurate by September or by 2029.<br /><br />All this potted history tells us is that you can&rsquo;t tell what&rsquo;s going on while it&rsquo;s happening, much less have any sense of what the outcome will be. Indeed, it&rsquo;s hard to find any historical evidence for a revolution ending up anywhere near what most revolutionaries thought they were starting when they were starting it. In general, the pressure of historical inertia and the complex dynamics of current events quickly and sharply skew the &ldquo;best laid plans.&rdquo;&nbsp;<br /><br />Revolutions arise through a confluence of events, trends, and personalities. Once they get past the stage of throwing out the old regime, revolutionary coalitions usually fracture, cracked apart by circumstances that require compromises and leave any pre-existing ideological program severely frayed if not in shambles. Lenin flip-flopped on basic principles of socialism once he was steering the ship. Factionalism and egomania (e.g., self-proclaimed &lsquo;guardians&rsquo; of the revolutionary spirit) usually make a hash of any coherent program.&nbsp;<br /><br />Now that we have established a firm foundation of uncertainty, we can turn to the question of our leading &ldquo;revolutionary.&rdquo; The orange-haired one is a charismatic leader of the first order, but he is no ideologue. He has surrounded himself with, and channeled the views of, a coterie of folks whose combination of smarts, sycophancy, and smarm have given him a set of policies more notable for their drama and disruption of norms than their ability to move the nation towards their self-proclaimed vision. There are definitely revolutionaries among them: Bannon, Miller, Vought; but they are all derivative of him and lack their own power base. Most of the team is just along for the ride. 
This is actually fortunate; he would be more dangerous if he were actually interested in constructing a new version of the US rather than self-enrichment and self-aggrandizement.&nbsp;<br /><br />I&rsquo;m not a psychologist (even if I am married to one), but you may consider the following definition from Wikipedia:</font><ul><li><font size="4">Attention deficit hyperactivity disorder is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, impulsivity, and emotional dysregulation that are excessive and pervasive&hellip;. ADHD symptoms arise from executive dysfunction.&nbsp;</font></li></ul><br /><font size="4">I have argued previously (<a href="http://www.steveharris.net/condemned-to-repeat-it/samson" target="_blank">Samson, 030725</a>) that his endemic short-termism won&rsquo;t move the country much past the phase of turmoil. Combined with his age and apparent cognitive decline (&ldquo;Sleepy Don&rdquo;), this will leave the country&rsquo;s direction wide open in a few years. Still, we can&rsquo;t deny his short-term impact. 
Domestically, previously settled constitutional and political norms are being tossed aside at several levels.</font><ul><li><font size="4">Notions of comity and incrementalism that have characterized our political life for two hundred years are being ignored.&nbsp;</font></li><li><font size="4">Institutional safeguards embodying the concept of the separation of powers are becoming meaningless, principally due to the lack of backbone shown by Republican members of Congress.&nbsp;</font></li><li><font size="4">The liberal/progressive project of constitutional change via judicial decisions that built much of the jurisprudence over the past 75 years has proven reversible.</font></li><li><font size="4">There is also a host of policy changes radically altering the scope and direction of federal government activities across the board, from rights to support programs to budget priorities.</font></li></ul><br /><font size="4">Globally, the situation is much the same.&nbsp;<br /><br />As in most revolutionary situations, there are a lot of problems with the incumbent regime. I have little hope that the Democrats as currently constituted are capable of addressing the real problems the country and the world face. A couple of months ago (<a href="http://www.steveharris.net/condemned-to-repeat-it/a-poisoned-chalice" target="_blank">A Poisoned Chalice, 020626</a>) I suggested that the best that could be hoped for from the next center-left administration was to stanch the bleeding and stabilize the patient.<br /><br />In sum, while we might be able to sketch several (more or less dire) scenarios for the future, we can have little confidence in any of them, regardless of the outcome of the next election, not to mention any number of geopolitical, climatic, or economic contingencies. Could we be in the middle of a &ldquo;revolution&rdquo;? 
Sure. But we&rsquo;re at least ten years too early to tell (and likely at least 25).&nbsp;<br /><br />History offers few examples of rapid cultural change. Societies evolve; change takes time to digest; what happens in capitals may not show up in the ordinary life of the hinterlands for a while. Most revolutions are futile. Resist evil, but remember to breathe.</font><br /><br /></div>]]></content:encoded></item><item><title><![CDATA[International Law]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/international-law]]></link><comments><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/international-law#comments]]></comments><pubDate>Fri, 27 Mar 2026 15:32:45 GMT</pubDate><category><![CDATA[Uncategorized]]></category><guid isPermaLink="false">http://www.steveharris.net/condemned-to-repeat-it/international-law</guid><description><![CDATA[With the US rampaging around the world with air strikes and assassinations, one doesn&rsquo;t hear too much about international law these days. Our current Administration premises its policies on realpolitik, relegating international law and morality to the proverbial dustbin of history. I look on this with mixed feelings, born of a lengthy engagement with international law and a greater degree of self-reflection.Some decades ago, I was a big fanboy of international law. In both college and law [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font size="4">With the US rampaging around the world with air strikes and assassinations, one doesn&rsquo;t hear too much about international law these days. Our current Administration premises its policies on realpolitik, relegating international law and morality to the proverbial dustbin of history. I look on this with mixed feelings, born of a lengthy engagement with international law and a greater degree of self-reflection.<br /><br />Some decades ago, I was a big fanboy of international law. 
In both college and law school, it was a major part of my academic career. I was even a student member of the American Society of International Law and the founding editor of the Michigan Journal of International Law. As a junior lawyer, although I shifted my focus to telecommunications, I continued to work on international telecommunications regulation issues for a while. When I got back to school, I chose to write my dissertation on the history of arbitration: the peaceful settlement of disputes between countries from the late 18C through WWI. I also wrote an article on certain aspects of British treaty practice in Africa in the late 19C.<br /><br />Of course, during that time&mdash;stretching from the 1970s to the 2010s&mdash;the nature of international law changed, as did the ways in which we study its history. Along the way, I&rsquo;ve come to take a more critical stance toward international law than I did as a youth. Back then, I subscribed to the idealistic view of international law as a vehicle for incremental global progress towards peace and the rule of law. It was still a work in progress, but progress was being made, both in terms of principles and general adherence and in terms of the construction of more extensive sets of rules and regulations governing international trade and other activities. Now I see the latter portion as continuing to progress, but the grand vision is looking pretty faded.<br /><br />One of the essential problems with the general public perception of international law has to do with the word &ldquo;law.&rdquo; Most of the time, we think of traffic laws, criminal laws, corporate law, etc. These are all within a domestic context and are enforced by the relevant sovereign government (e.g. Colorado, Canada). This type of law &ldquo;works&rdquo; because it is generally accepted by the people subject to it, usually complied with, and enforced by the government through police and courts. 
It&rsquo;s part of the social contract we have all implicitly signed as members of a particular society. In this context, it&rsquo;s OK that Coloradans drive on the right side of the road and the British drive on the left, or that a will in Delaware requires two witnesses, but in the Netherlands, you have to have two witnesses plus a notary. Differences in national rules and behavior are entirely acceptable.&nbsp;<br /><br />International law, on the other hand, isn&rsquo;t really &ldquo;law.&rdquo; That is to say, there&rsquo;s nobody to enforce it. Countries (which are the subjects of international law, just as citizens are the subjects of Colorado law) haven&rsquo;t signed any &ldquo;social contract&rdquo; by which they agree to accept and abide by the rules enacted by the UN or WTO or similar groups. Even if a country signs a treaty, there&rsquo;s nobody to enforce it once it&rsquo;s breached. For example, a few years ago, an international arbitral tribunal (constituted under the UN Convention on the Law of the Sea) ruled that China&rsquo;s claim of control over much of the South China Sea was unfounded. Now, China had signed the underlying treaty, but there&rsquo;s nobody to &ldquo;legally&rdquo; make them dismantle their bases on the contested islets.<br /><br />In this way, international &ldquo;law&rdquo; has been (since its modern inception in the 16C) aspirational. There&rsquo;s lots of cajoling going on, and bad press if you break the rules, but not much else. Even the relatively recent development of international criminal courts (starting at Nuremberg in 1945) still has a highly selective impact. As a practical matter, it&rsquo;s only the losers in war who get tried.<br /><br />This reveals a fundamental problem with law in the somewhat &ldquo;anarchic society&rdquo; of states: it&rsquo;s highly political, and its impact is often a function of power. This has been illustrated by historians working on the development of international law across the past five centuries. 
What we call international law is almost exclusively the product of a small group of European thinkers who were trying, in the context of Christian Europe, to defend &ldquo;civilization&rdquo; and promote rules of behavior to which they wanted countries to adhere. They had no power, just ideas. Moreover, they wrote their aspirational rules with a bald disregard for those people, countries, and cultures outside of Europe. War and slavery were acceptable, even if &ldquo;laws of war&rdquo; were written to try to make war a touch less barbaric. In other words, it was selective, distorted, and used to justify oppressive behavior by Europeans as they bestrode the world.<br /><br />History, they say, is written by the victors. So, too, is the law, whether domestic or international. That is to say, law is a function of power, and the most powerful are almost always subject to the least amount of legal constraint. One of the reasons for the relative demise of international law at the broad principled level (as compared to the relatively well-functioning administrative/regulatory level) is that the most powerful country has fended off its advances. 
The US usually dismisses constraints on its behavior as &ldquo;political&rdquo; (a defense which does, in all honesty, have more than a germ of truth), but much of the defense is no more sophisticated than &ldquo;Don&rsquo;t gotta, don&rsquo;t wanna.&rdquo; China&rsquo;s refusal to subject itself to the Western-rooted system is similar.&nbsp;<br /><br />In a world where the US (increasingly baldly) acts on a worldview grounded in power and declines even to pay lip service to morality in international affairs, there is less and less reason for the 190+ other countries in the world to act differently.&nbsp; For centuries, international law was at least allowed to serve as the (often hypocritically honored) standard of behavior around the world; even that is now in danger.&nbsp;</font><br></div>]]></content:encoded></item><item><title><![CDATA[Seven Guineas]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/seven-guineas]]></link><comments><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/seven-guineas#comments]]></comments><pubDate>Fri, 20 Mar 2026 15:07:14 GMT</pubDate><category><![CDATA[Uncategorized]]></category><guid isPermaLink="false">http://www.steveharris.net/condemned-to-repeat-it/seven-guineas</guid><description><![CDATA[I&rsquo;m just starting work on a course about &ldquo;Modern Empires&rdquo; to be offered this summer. In poking around this vast topic, it struck me that there is some significance behind the etymology of a term that arises in multiple locations and contexts: Guinea. As a place name it appears to show up in six different countries on three continents (Guinea, Guinea-Bissau, Equatorial Guinea (all in Africa), Guyana and French Guiana (South America), and New Guinea (Oceania)), and a currency. An [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font size="4">I&rsquo;m just starting work on a course about &ldquo;Modern Empires&rdquo; to be offered this summer. 
In poking around this vast topic, it struck me that there is some significance behind the etymology of a term that arises in multiple locations and contexts: Guinea. As a place name it appears to show up in six different countries on three continents (Guinea, Guinea-Bissau, Equatorial Guinea (all in Africa), Guyana and French Guiana (South America), and New Guinea (Oceania)), and in a currency. And therein lies a tale.<br /><br />&ldquo;Guinea&rdquo; first entered European awareness as a Portuguese adaptation of the term &ldquo;guineus,&rdquo; their way of referring to the &ldquo;black&rdquo; Africans (as distinguished from the lighter-skinned Berbers and Arabs of North Africa). The Berbers themselves used the term &ldquo;Ghinawen,&rdquo; meaning &ldquo;the burnt people.&rdquo; Or it might be tied back to the important trading town of Djenn&eacute; (now in Mali). In any event, the term came to be applied to the entire region as well as to the nearby portion of the Atlantic.&nbsp;</font><br></div>  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0;margin-right:0;text-align:center"> <a> <img src="http://www.steveharris.net/uploads/3/2/0/9/32095583/picture1_orig.png" alt="Picture" style="width:auto;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>  <div class="paragraph"><font size="4">As they hop-scotched down the coast, swapping territories and seeing which ones would be economically viable and militarily defensible, Europeans ended up with a hodge-podge of territories, paying no attention to existing ethnic groupings, chiefdoms, and empires, either in terms of organization or naming. What is now Guinea was French, what is now Guinea-Bissau was Portuguese, and what is now Equatorial Guinea was, sequentially, Portuguese, Spanish, British, and finally Spanish. 
Each attained independence as part of the mid-20C wave of decolonization.<br /><br />Over on the other side of the Atlantic, Europeans (all the usual suspects) were equally active. While the Spanish and Portuguese took over most of South America, the French, British, and Dutch grabbed relatively small chunks of the coast just to the north of Brazil in the 16C and 17C.&nbsp; They each referred to their territories as Guiana, a name NOT derived from their African activities, but from an indigenous word meaning &ldquo;land of many waters,&rdquo; a reference to the many streams which flow into the Atlantic there. British Guiana became Guyana upon independence, Suriname (the former Dutch Guiana) took a new name, and French Guiana is still a part of France.</font><br></div>  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0;margin-right:0;text-align:center"> <a> <img src="http://www.steveharris.net/uploads/3/2/0/9/32095583/picture2_orig.jpg" alt="Picture" style="width:auto;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>  <div class="paragraph"><font size="4"><br /><br />New Guinea was the name applied by Spanish explorers in 1545 to the island which the locals called &ldquo;Papua&rdquo;. Ynigo Ortiz de Retez used &ldquo;Guinea&rdquo; since he saw a resemblance between the locals and west Africans. Later, control over the island came to be shared by the British, German, and Dutch empires. Today, the western half is part of Indonesia (the imperial successor to the Dutch), while the British and Germans did a deal in the 1880s to split the eastern half. The Brits passed their piece on to the Australians during WWI. The Australians added the German northeast quarter in the aftermath of WWI, and granted Papua New Guinea independence in 1975.&nbsp;<br /><br />The British have many names for their money. The &ldquo;Guinea&rdquo; was an actual coin minted from the middle of the 17C to early in the 19C, and was originally worth one pound. 
It got its name from the source of much of the gold that was used in the minting: the &ldquo;Guinea&rdquo; region of West Africa. The term lived on as an informal unit of account (eventually fixed at 21 shillings) throughout the 20C, long after the coin itself was superseded. Indeed, long after British attention to the lesser portions of their African empire had waned, this linguistic remnant continued in everyday culture.<br /><br />(Btw, &ldquo;guinea pigs&rdquo; come from western South America and are thus not &ldquo;Guinean&rdquo; (or &ldquo;Guyanan&rdquo;) (or, for that matter, pigs). Nonetheless, the term may well have emerged into European consciousness via their transshipment from the Guyana region on the Atlantic Coast, thus acquiring that (distorted) nomenclature.)<br /><br />What can we take away from these linguistic connections and coincidences? First, they show the interconnectedness of the various European imperial projects. In both Africa and South America, the first empire in place inspired the follow-ons, both in terms of seeking commercial and proselytizing opportunities and in nomenclature. They also show the modern impacts of decisions made centuries ago by captains and explorers.&nbsp;<br /><br />Place names (and other words) affect how we see the world. Referring to the United States of America as &ldquo;America&rdquo; leaves out the other 34 countries on the two continents (and, of course, just referring to us as &ldquo;The&rdquo; United States omits other countries with similar names, including the United States of Mexico, Brazil, and half-a-dozen defunct polities). 
Calling the islands in the Caribbean the (West) &ldquo;Indies&rdquo; is good evidence that Columbus and other European explorers of the late 15C were headed for East and South Asia when they stumbled across the various land masses of the Western Hemisphere.&nbsp;<br /><br />In our case here, the Spanish explorer who brought a conception of darker-skinned people from Africa across two oceans to an island near Australia was expressing the importance Europeans placed on the difference in the color of peoples&rsquo; skin as a defining attribute, even where, genetically and culturally, African &ldquo;Guineans&rdquo; were likely more closely related to Europeans than to the &ldquo;Guinean&rdquo; people of the Pacific. This definition by difference highlights that what we call &ldquo;race&rdquo; was an embedded part of European world-views long before &ldquo;scientific&rdquo; racism blossomed in the 19C.<br /><br />So, too, was the lust for gold that spurred many explorers and exploiters. Naming a significant piece of your domestic coinage after a region itself named for the color of the locals&rsquo; skin says much about British imperial culture.&nbsp;<br /><br />The plethora of &ldquo;Guineas&rdquo; today owes much to the power of European imperial practices. We can never know what names would have emerged if local powers had developed on their own terms instead of being run over by the white Christians from the northwest corner of Afro-EurAsia. Nor can we, even as we recognize the awfulness of the behavior often visited upon them, do more than take a stab at the broader questions of alternate history: What would the rest of the world have looked like if Europeans didn&rsquo;t or couldn&rsquo;t exercise the power they did across the planet from the 15C through the 20C? Who would have been &ldquo;better off&rdquo; and who worse? 
(Especially since our very sense of &ldquo;better&rdquo; and &ldquo;worse&rdquo; is derived from that same dominant European culture.)<br /><br /></font></div>]]></content:encoded></item><item><title><![CDATA[Democracy and Complexity]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/democracy-and-complexity]]></link><comments><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/democracy-and-complexity#comments]]></comments><pubDate>Fri, 13 Mar 2026 15:48:40 GMT</pubDate><category><![CDATA[Uncategorized]]></category><guid isPermaLink="false">http://www.steveharris.net/condemned-to-repeat-it/democracy-and-complexity</guid><description><![CDATA[We&rsquo;ve come a long way from the citizen assemblies of ancient Athens. Some towns in New England still make decisions by getting all the local citizenry together, but that doesn&rsquo;t work with larger groups of people. The solution&mdash;more-or-less standardized since Madison and the gang in Philadelphia (1787)&mdash;is &ldquo;representative democracy.&rdquo; Having the masses choose which members of the elite would make decisions for the whole society was a way to keep power in the (rela [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font size="4">We&rsquo;ve come a long way from the citizen assemblies of ancient Athens. Some towns in New England still make decisions by getting all the local citizenry together, but that doesn&rsquo;t work with larger groups of people. 
The solution&mdash;more-or-less standardized since Madison and the gang in Philadelphia (1787)&mdash;is &ldquo;representative democracy.&rdquo; Having the masses choose which members of the elite would make decisions for the whole society was a way to keep power in the (relatively trustworthy and reliable) hands of those with a mix of education and wealth, while providing a means for the demos to express itself and retain nominal oversight of the process and tools of government.<br /><br />Many things have changed in the last 250 years in the US and around the world: in terms of the condition of the mass electorate, we have industrialization, an information tsunami, mass education and culture, and a whole lot more people. While the precise spread between elites and masses has varied over time and across different cultures, the fundamental differential remains. (It&rsquo;s one of the discrepancies which our current idealization of democracy buries.) In terms of the world, these same factors, particularly the nature and extent of technologies (high and low, material and cultural), have made for a much more complex environment. On top of these phenomenological changes, our knowledge levels (or, at least, our beliefs about what we know) about causation and effects have drastically increased the difficulty of making decisions. Science and experience have made us more aware of the broad implications and longer-term effects of any nominally straightforward policy option. Whether dealing with tariffs, vaccines, immigration, or AI regulation, in toting up the pluses and minuses of any potential policy decision we have a lot more entries. Keeping track of them, then weighing them, and sorting through the trade-offs has become a much more difficult process. We can see this in the construction of bureaucracies and administrative regimes (e.g. 
taxes, Medicare, education policy, and various schemes of discrimination and preference).&nbsp;<br /><br />What challenges does this increased complexity of life raise for the practice of democracy?<br /><br />The theory of modern democracy is that informed citizens should deliberate and select representatives who devote sufficient time and intellect to comprehend the issues and resolve the political questions that are inherent in the differing beliefs, interests, and priorities of any large group of people. Elected representatives should, in theory, become experts in sorting policy choices as well as in interpreting the values of their constituency and applying them to those policy choices. That&rsquo;s &ldquo;normal&rdquo; politics. However, there are problems with this theory both in terms of the represented and the representatives.<br /><br />First, the levels of complexity and detail are so great that even most elected representatives (especially at the state and federal levels) can&rsquo;t begin to cope and, de facto, delegate their decisions to their party leaders, their colleagues (vote-swapping), their staff members, or the implementing bureaucracy. Each presents its own particular problems and risks of corruption and distortion, but in essence, our representatives are working through representatives, most of whom aren&rsquo;t elected. This was not so much of an issue in the 18C or early 19C, but has swamped the process since the mid-20C.<br /><br />However, the more fundamental concern is that the electorate, for all its increased education and access to information, can&rsquo;t make more than a broadly directional choice when it comes to electing representatives. We&rsquo;re all different, but there&rsquo;s only a limited set of possible representatives to choose from. Politics and policy intersect in different mixes; political parties kinda help with alignment, but present their own complications and corruptions. 
The result, in our over-saturated media age, is decision by sound-bite, charisma, and money.<br /><br />Representative democracy thus looks inherently problematic, and even more so these days. Unfortunately, neither of the two most popular solutions is a real improvement. The more well-established alternative is the popular referendum, allowed in about half the states. As we in California know all too well, this method of popular participation is messy and subject to the same money/media distortions as other modes. One principal problem is that ordinary voters are called on to read through and understand an extensive statutory implementation of some policy scheme. By the time you winnow the electorate down to those who have the capability to work through these challenging questions (e.g. tax codes, environmental regulation, social benefit eligibility), are interested enough in public policy, and have the economic wherewithal to devote the necessary study hours, you&rsquo;d end up with a fairly skewed (and self-selected) group of citizens; hardly a representative body.<br /><br />The same problem undercuts the various proposals to have &ldquo;ordinary&rdquo; citizens, usually selected at random, participate either as part of regularly-established legislatures or as a stand-alone &ldquo;citizens&rsquo; assembly&rdquo; which would have a role in the legislative process. 
It&rsquo;s nice symbolism, but I suspect this innovation would merely transfer power to the staffs to explain things and the bureaucracies to implement them.<br /><br />I have been thinking about this issue in the context of an upcoming lecture marking the tenth anniversary of Brexit, the 2016 decision which propelled the United Kingdom out of the European Union (pretty much a disaster on all fronts, and of which more in a coming posting).<br />British voters were presented with a nice short &ldquo;yea-or-nay&rdquo; question for an immensely complicated situation about which there was no clear understanding of what was likely to ensue.<br /><br />Still, I think there is a role for referenda, as long as they are relatively straightforward and high-level AND subject to either implementation by the legislature or (unlike Brexit) a further ratification of a final detailed plan. Capping referenda questions at 100 words would likely provide broad policy direction without pretending that the electorate reads the third subclause of the fourteenth section of the proposed law.<br /><br />Not a great overall solution to an increasing problem of democratic governance in the 21C, but worth a shot. We need other ideas of how to balance the complexity of the real world with ensuring all the ordinary folks can help steer the ship.</font><br></div>]]></content:encoded></item><item><title><![CDATA[One Perspective on Capitalism]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/one-perspective-on-capitalism]]></link><comments><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/one-perspective-on-capitalism#comments]]></comments><pubDate>Fri, 06 Mar 2026 16:33:42 GMT</pubDate><category><![CDATA[Uncategorized]]></category><guid isPermaLink="false">http://www.steveharris.net/condemned-to-repeat-it/one-perspective-on-capitalism</guid><description><![CDATA[I recently bought a book (no news there!) 
and faced a panoply of choices as to price, delivery speed, format, and book quality. Multiple websites provided a great demonstration of the ubiquity of capitalism at work in modern life. Consumers&rsquo; preferences are dissected and products designed to meet those particularities. So, it&rsquo;s no small irony that the book in question was Sven Beckert&rsquo;s new doorstop (1000+ pages) providing a comprehensive history of &ldquo;Capitalism.&rdq [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font size="4">I recently bought a book (no news there!) and faced a panoply of choices as to price, delivery speed, format, and book quality. Multiple websites provided a great demonstration of the ubiquity of capitalism at work in modern life. Consumers&rsquo; preferences are dissected and products designed to meet those particularities. So, it&rsquo;s no small irony that the book in question was Sven Beckert&rsquo;s new doorstop (1000+ pages) providing a comprehensive history of &ldquo;Capitalism.&rdquo; As we said when I was in the law biz: &ldquo;res ipsa loquitur&rdquo; (&ldquo;the thing speaks for itself&rdquo;).&nbsp;&nbsp;<br /><br />As a work of history, &ldquo;Capitalism&rdquo; is well thought-through and remarkable in its research, if rather too heavy for the casual reader. Beckert shows that capitalism was a global phenomenon, drawing on practices and experiences far beyond the usual &ldquo;it&rsquo;s all about Europeans&rdquo; (including the US) framework. He also shows that it has deep roots, extending far earlier than the usual early-modern/industrial revolution/robber barons/globalization storyline. After all, as I have noted elsewhere, greed and profit are hardly modern inventions. 
He does a good job, as well, in blowing up the myth of &ldquo;laissez-faire,&rdquo; the idea that large businesses have developed apart from and in spite of governmental activity.<br /><br />Beckert applies a phenomenological focus; i.e., he concentrates on the actual practice of &ldquo;capitalists.&rdquo; Marx merits fewer than 30 mentions and other theorists (pro and con) are similarly sidelined.&nbsp; I would have preferred a more inclusive approach, but his is a legitimate choice and his story benefits from its grounding in the real world. My bigger concerns are that 1) he doesn&rsquo;t pin down the definition of the concept he&rsquo;s writing about, and 2) he doesn&rsquo;t wrestle with how the capitalist mentality spread and swamped other values-based cultural systems. There is, to be sure, a reference to capitalists&rsquo; focus on markets/commodities/money, but there is something in modern commercial practice that is different from the mindset of traders a thousand-or-two years ago, and he doesn&rsquo;t grab on to it.&nbsp;<br /><br />This, to me, is the central issue. Capitalism has been a principal strand in the story of the modern world, whether economic, political, or ideological. Historians in general, however, are loath to take on psychological changes, however fundamental they might be. There&rsquo;s good reason for this, since the evidence is sparse and largely inferential and there&rsquo;s always a risk of self-projection. 
And yet, without understanding or at least suggesting some ways in which historical actors were motivated, we can&rsquo;t come close to understanding how history came about.<br /><br />As I said a few weeks ago (Capitalism and Me, 012326), I see capitalism as a culture (i.e., a socio-economic-epistemic system) in which we define ourselves and evaluate others and determine how to act across our lives principally from an economic perspective: morals are secondary to money. In contrast, the practices and institutions which manifest this mentality are what Beckert is talking about.<br /><br />It is important not to make moral judgments about these institutions per se. Lord Acton famously observed that &ldquo;power tends to corrupt and absolute power corrupts absolutely.&rdquo; In a similar vein, St. Paul found money to be the &ldquo;root of all evil.&rdquo; They were, however, both wrong. A critic of Acton saw that power itself was not the corrupting source, but rather the vehicle by which human corruption was revealed. Similarly, we can see that money, too, is just the means by which evil is exercised. The fault, in other (Shakespeare&rsquo;s) words, is not &ldquo;in our stars&rdquo; (or our wallets or our ability to affect others), but &ldquo;in ourselves.&rdquo; The economic power which the practice of capitalism concentrates in &ldquo;capitalists&rdquo; merely allows them to demonstrate their moral (dis-)abilities and their unwillingness to face the complexities. (These fundamentals of human nature are essential to parsing the semantic soup that surrounds the term &ldquo;capitalism.&rdquo; I will have more to say in a later posting about the different ways in which that term and its cognates, &ldquo;capital&rdquo; and &ldquo;capitalist,&rdquo; are used.)<br /><br />It&rsquo;s not difficult to see the practice of &ldquo;capitalism&rdquo; as the dominant economic system of the modern era. 
As with other human activities, its significance is the product of the confluence of means, motive, and opportunity. The motivation of capitalists, in my framing, is based in the deeply-rooted nature of humans&mdash;a desire for security (both physical and psychological) and the many ways in which that desire is overextended, as most pithily captured in the traditional deadly sins of greed, envy, gluttony, and pride. The laissez-faire mythos may have some nuggets of truth, but it&rsquo;s mostly about the desire of &ldquo;capitalists&rdquo; to claim all the credit for the work of many; in other words: ego (also not a new story).<br /><br />So, from a historical perspective, the rise of practical capitalism is more a function of the means and opportunity, which are largely exogenous factors, and which we can parse into three angles. First, institutions and practices: banks, corporations, trading networks, advertising, governmental actions, etc. Second, new technologies which have created new productivity and economies of scale and scope (not least in terms of transportation and communications).&nbsp; Third, population growth and density which have created large numbers of consumers, thereby providing the demand which pays prices well above (the new, lower) marginal cost of goods (i.e., more profitability). These factors are the normal materials of historical analysis and we can trace their manifestation, at both the personal and societal levels. But we can&rsquo;t forget that without the psychological urge, there would be no capitalism at all.<br /><br />Beckert focuses on these exogenous factors, principally the first and second. I&rsquo;ve got a bunch more reading lined up on that score and related topics; perhaps someone has taken on this moral/psychological angle in historical perspective. 
I&rsquo;ll be back with more on the semantics, the significance, and the solutions&mdash;in due course.</font><br></div>]]></content:encoded></item><item><title><![CDATA[Change, Transition, and Crisis]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/change-transition-and-crisis]]></link><comments><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/change-transition-and-crisis#comments]]></comments><pubDate>Fri, 27 Feb 2026 15:29:07 GMT</pubDate><category><![CDATA[Uncategorized]]></category><guid isPermaLink="false">http://www.steveharris.net/condemned-to-repeat-it/change-transition-and-crisis</guid><description><![CDATA[In our current bleak days, it&rsquo;s easy to forget that throughout history there&rsquo;s always a crisis going on. Those in the middle of a specific crisis (that would be us!) often lose perspective amid the pressures and urgencies of the moment. Indeed, I might suggest that we can see that crises are often (usually?) the product of overfocusing on those urgencies and putting off dealing with the important underlying issues until the pressures build up and they crater into their own urgent cri [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font size="4">In our current bleak days, it&rsquo;s easy to forget that throughout history there&rsquo;s always a crisis going on. Those in the middle of a specific crisis (that would be us!) often lose perspective amid the pressures and urgencies of the moment. Indeed, I might suggest that we can see that crises are often (usually?) the product of overfocusing on those urgencies and putting off dealing with the important underlying issues until the pressures build up and they crater into their own urgent crisis down the road.<br /><br />Even if I am wrong in my causal analysis, it&rsquo;s hard to argue against the notion that crises arise due to the accumulation of change, often sprinkled with some dramatic event which manifests those changes. 
Since History can be characterized as the study of change over time and since Historians are not above (over-)dramatizing their work, it&rsquo;s no wonder that &ldquo;crisis&rdquo; appears frequently in History book titles.<br /><br />There&rsquo;s the &ldquo;July Crisis&rdquo; (summer of 1914 leading up to WWI), the &ldquo;General Crisis of the 17th Century,&rdquo; Marx&rsquo;s ongoing &ldquo;crisis of capitalism,&rdquo; just to name a few and not to mention the innumerable localized or brief crises scattered about, usually of a geopolitical or economic nature. We have &ldquo;constitutional crises&rdquo; (Dred Scott, Nixon, Trump) and the Brits have &ldquo;cabinet crises.&rdquo; It&rsquo;s hard to tell the difference between a &ldquo;problem&rdquo; and a &ldquo;crisis;&rdquo; so much so that I suspect using the term &ldquo;crisis&rdquo; is just a way to claim attention, either contemporaneously or historically.&nbsp;<br /><br />Most History books that aren&rsquo;t &ldquo;crisis&rdquo; centered focus rather on &ldquo;transitions.&rdquo; How many times have I read: &ldquo;It was a time of transition from [A] to [B]&rdquo;? As if. As if there&rsquo;s not always some transition going on. The nature of history (small h) is that change is always happening: from the Mughals to the Raj, from sail to steam, from foragers to growers, from search engines to AI. As Johan Goudsblom suggested, pretty much all of history can be summed up as: First, nobody has xx, then some folks have xx, then everybody has xx. Despite our idealization of the past captured in some historical &ldquo;snapshot,&rdquo; in fact (with the possible exception of &ldquo;revolutions&rdquo;) there is no stability from which any particular change is a remarkable difference. 
And, of course, few of these transitions have much in the way of a clearly demarcated starting point or ending.&nbsp;<br /><br />Perhaps crises are merely the crux points of transitions.<br /><br />It&rsquo;s also worth noting that much of this crisis/transition sensibility comes from elites who have the ability to observe this level of change. There are many (most?) whose lives are precarious and in a constant state of crisis; but they don&rsquo;t write books or blogs.<br /><br />In any event, characterizing some change as either a crisis or a transition is, likely as not, merely a rhetorical device. Sometimes, these are useful, as when a historian puts a new frame of interpretation on events, such as the shift from plain old cell phones to &ldquo;smart&rdquo; phones or the shift in political alignment among Southern whites from the Democrats to the GOP in the aftermath of the mid-20C Civil Rights movement. Sometimes, however, putting &ldquo;crisis&rdquo; in the title is just a way to sell books.<br /><br />The insightful historian Adam Tooze (2022) characterizes our current era as being in a &ldquo;polycrisis,&rdquo; highlighting the multiple overlapping issues that seem to be coming to a head in the 2020s. It&rsquo;s not a terrible word (and certainly preferable to the overused &ldquo;perfect storm&rdquo; metaphor), but as with the 17C or the 1930s (just to pick two examples), any good crisis worth its name always includes multiple components and angles.&nbsp;&nbsp;<br /><br />One important perspective that arises from looking at the history of crises is that whoever comes out the other end figures out how to make do and eventually we get to us in the present day. In other words, however much contemporaries bewail their particular circumstances as the end of the world, it isn&rsquo;t. And, as noted from the beginning of this blog series, there are few lessons to be taken from these events/developments and fewer occasions for judgmentalism. 
If the Russians had deployed a bit more sophisticated military planning in 1914, the &ldquo;July Crisis&rdquo; would likely have ended up as a relatively forgettable third Balkan War rather than a continental conflagration. While we know the damage that resulted in the event, we don&rsquo;t know how the world would have been along the line of that alternate history; so we can&rsquo;t say which was worse or start blaming anyone for what we ended up with. Crises that get resolved get forgotten and subsumed into the flow of history.<br /><br />Perhaps it&rsquo;s a fundamental human addiction to adrenalin that makes it so attractive/effective/necessary to hype things up and overdramatize the mundane. Perhaps parts of us secretly want to (as the Chinese curse has it) &ldquo;live in interesting times.&rdquo; On the other hand, it&rsquo;s hard to blame those (myself included on some scores) who try to bang the drum loudly when they see peril coming and unattended to. We revert to whatever devices lie at hand, including rhetoric, to rouse the sleeping populace even at the risk of overusing and devaluing the language to the point of moral exhaustion. Then again, Paul Revere would not likely have cried out to Lexington and Concord that &ldquo;the British crisis is coming.&rdquo;<br /><br />The real test of a crisis is what we do with it. 
As Winston Churchill (perhaps apocryphally) and Rahm Emanuel (certainly) said: &ldquo;Never let a good crisis go to waste.&rdquo; Still, historically, most crises do; folks muddle through until a bunch more changes pile up, we go through a transition or two and then walk into the next crisis.&nbsp; Those that then try to seize the day are called &ldquo;revolutionaries.&rdquo; They, too, almost always end up in a crisis of their own soon enough.<br></font></div>]]></content:encoded></item><item><title><![CDATA[Degrees of Indistinction]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/degrees-of-indistinction]]></link><comments><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/degrees-of-indistinction#comments]]></comments><pubDate>Fri, 20 Feb 2026 15:42:06 GMT</pubDate><category><![CDATA[Uncategorized]]></category><guid isPermaLink="false">http://www.steveharris.net/condemned-to-repeat-it/degrees-of-indistinction</guid><description><![CDATA[The State of California once led the country in establishing a vision for higher education and deploying resources that made college education available to millions of its residents and fostered world-class research across dozens of disciplines. Its latest move shows that this mid-20C spirit has faded, eviscerating the meaning of the degrees which are granted at both the high school and college levels. The institutions which were charged with realizing the original vision have fallen prey to self [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font size="4">The State of California once led the country in establishing a vision for higher education and deploying resources that made college education available to millions of its residents and fostered world-class research across dozens of disciplines. Its latest move shows that this mid-20C spirit has faded, eviscerating the meaning of the degrees which are granted at both the high school and college levels. 
The institutions which were charged with realizing the original vision have fallen prey to self-preservation and sclerosis.<br /><br />Specifically, a new law requires most California State University (&ldquo;CSU&rdquo;) campuses to offer admission to any California high school student graduating with at least a 2.5 grade point average (i.e., C+/B-). Now (according to ChatGPT), the average high school GPA has inflated dramatically over the past 40 years. In the 1980s, it was under 2.4; it is now well over 3.0. This means that about 70% of graduates have at least a 2.5 GPA; i.e., below-average students are now encouraged to go to college.<br /><br />[I have to acknowledge that as a graduate in the 1970s from an upper-middle class environment with lots of academic support and resources, I am part of the incumbent elite in this story.]<br /><br />The combination of these facts and the new law raises several questions:<br /><br />1) What is the purpose of a college education?<br />2) How does this law help students?<br />3) How does this law help legislators and the CSU system?<br />4) Why would most students who didn&rsquo;t do all that well in high school want to go to college?<br /><br />The post-WWII expansion of higher education was seen as an important means of building a strong middle class in the US, along with facilitating the growth of the economy (especially in the service and corporate sectors). A college degree was a mark of distinction and was well compensated in the employment histories of those who achieved it. While less than 2% of the population had a college degree in 1900, by 1960 this grew to over 7% and to over 16% by 1980. Since then, the rate has again more than doubled to about 38%.<br /><br />The marginal benefit of a college degree, therefore, had to fade over time. The mantra of &ldquo;to get a good job, get a good education,&rdquo; was tremendously effective in engaging students (and their parents) to steer their focus in this direction. 
But the surge in degrees necessarily means that they&rsquo;re no longer so distinctive and economically valuable.<br /><br />At the same time, broad social changes across the country throughout the second half of the 20C expanded the likely pool of college students and the social necessity of expanding college access. I don&rsquo;t think it&rsquo;s accurate to characterize the continued push for degrees to those aspiring to increased socio-economic status as some sort of deception or manipulation by elites, but there is some irony in the fact that this &ldquo;democratization&rdquo; of college degrees has coincided with their relative reduction in economic value.<br /><br />For decades, colleges have deployed resources, techniques, and programs to enhance and accelerate graduation rates for their students. Much of this effort responds to the relative decline in students&rsquo; capabilities (for a variety of reasons) at the time they enter college. Overall, these efforts were a modest success&mdash;at least on their own terms. The new California law continues this trend. It builds on an embedded belief that the purpose of a college is to produce college graduates. Clear thinking about the meaning and purpose of becoming one of those graduates is, however, harder to find.<br /><br />I&rsquo;ve talked before about the vocationalization of college education: the focus on job training and the diminution of the liberal arts. This process has developed in tandem with the &ldquo;industrialization&rdquo; of college education: turning colleges into assembly lines for the production of graduates with degrees.&nbsp;<br /><br />The new law now encourages high school graduates of below-average performance to go to college where they will likely struggle even more than they did in high school, to spend five years or so and about $100,000 (plus living expenses) to get a degree that will get them into the middle of the mass of job seekers.&nbsp;<br /><br />Wow! 
Sounds like a great deal.&nbsp;<br /><br />That would be tough enough in ordinary times, but the incipient AI-induced upheaval in the entry-level white collar job market undermines any confidence that this traditional scenario will continue. (See <a href="http://www.steveharris.net/condemned-to-repeat-it/morlocks" target="_blank">012425, Morlocks,</a> for a comment on the long-term societal consequences).<br /><br />Besides providing more &ldquo;opportunity&rdquo; for young Californians, the new law also seeks to help those college campuses which have faced enrollment declines in the last several years. [This includes SF State, where enrollment drops led to my &ldquo;retirement&rdquo; in 2024 (so one might think I would benefit from this law).] In other words, let&rsquo;s artificially prop up demand for a service and institution that can&rsquo;t cut it in the marketplace.&nbsp;<br /><br />This is the sort of legislation that generates a nice-sounding press release and campaign PR for its supporters (it passed unanimously in both houses). It&rsquo;s no wonder that most voters have little enthusiasm for our democratic institutions. Elected officials are more concerned with sound bites and demonstrating &ldquo;action&rdquo; rather than actually thinking about what would be useful to young people and candidly re-assessing the nature and purpose of the public university systems and the extensive expenditures we all make to support them.&nbsp;<br /><br />I&rsquo;ve spoken before about the limited utility of analogizing from history, mostly in geopolitical contexts. 
It&rsquo;s especially true here; we have no good sense of how this will play out, but the changes are likely to be deep and wide.<br /><br /></font></div>]]></content:encoded></item><item><title><![CDATA[The End of the World]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/the-end-of-the-world]]></link><comments><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/the-end-of-the-world#comments]]></comments><pubDate>Fri, 13 Feb 2026 15:58:46 GMT</pubDate><category><![CDATA[Uncategorized]]></category><guid isPermaLink="false">http://www.steveharris.net/condemned-to-repeat-it/the-end-of-the-world</guid><description><![CDATA[In my recent comment on Bill Gates&rsquo; piece on climate (110725), I criticized his dismissive remark that the climate crisis was not the &ldquo;end of civilization.&rdquo; I pointed out how, for some individual victims and societies, it&mdash;literally&mdash;is. There are important civilizational questions about how we and those to come will choose to memorialize those whose lives or cultures were cut off in this way. The Museum of Climatically-Extinguished Cultures and Creatures is likely to [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font size="4">In my recent comment on Bill Gates&rsquo; piece on climate (<a href="http://www.steveharris.net/condemned-to-repeat-it/on-gates-and-climate" target="_blank">110725</a>), I criticized his dismissive remark that the climate crisis was not the &ldquo;end of civilization.&rdquo; I pointed out how, for some individual victims and societies, it&mdash;literally&mdash;is. There are important civilizational questions about how we and those to come will choose to memorialize those whose lives or cultures were cut off in this way. 
The Museum of Climatically-Extinguished Cultures and Creatures is likely to be pretty crowded by the 22C.<br /><br />Still, while my life is not likely to come to an end due to climate change, and civilization in general is not likely to collapse by 2039 (my planning horizon according to the Social Security Administration&rsquo;s life expectancy tables), this prospective doom got me wrestling with why and how I might think about those whose lives will continue into 2040 and beyond and how I should conduct myself in the meantime. In other words, since civilization will have come to an end as far as I&rsquo;m concerned, why should I care about those left behind?<br /><br />Philosophy has proposed and history has demonstrated several answers to this question. Just to put a bit of structure on this issue, we might divvy up our impact into these categories: the personal memories of those who will continue on, our biological progeny, physical manifestations of our existence, and cultural traces of our impact on the world. We might also distinguish between the impact/legacy of the few prominent people in the world and the vast remainder of us. Finally, we should note that many folks have lived based on their expected status/treatment in some shape of an afterlife (aka Heaven, Hell, Nirvana, Valhalla). However, since I don&rsquo;t believe in an afterlife, doing stuff now to get credit in the hereafter seems futile (as well as a bit tawdry).&nbsp;<br /><br />Based on how people behave, many folks want to be remembered by those still around or to come years and centuries post-mortem. And most are, at least by family and friends. But memory is a fickle thing: not only does it fade over the years, but by the time memories are passed down to succeeding generations, distortions are inevitable. Very, very few are likely to be remembered by anyone in any meaningful way more than 50-70 years after their own passing. 
Maybe that&rsquo;s as far ahead as we can imagine, so we don&rsquo;t care about our more distant legacies. The memories of the 5 billion (+/-) people who died over the course of the 20C are going quickly and in 30-50 years what will be left? My niece and nephew were in their teens when my mother passed. In fifty years, they will be in their 80s with faint wisps of her (either directly or through their father) remaining. I have a friend who is into genealogy and has reconstructed some of his family lines back for several centuries. It seems personally satisfying to him, but those long past have left only a name and a few traces of themselves. With all due respect to Ancestry.com et al., I don&rsquo;t aspire to be an entry on a long list compiled by a greatx8 niece in the later 22C.<br /><br />In terms of cultural remembrance, very few leave something behind. Wikipedia includes just over 2M biographical entries and maybe ten times that number have some sort of bio bit somewhere. Out of 100B people who ever lived, those are not very good odds of being remembered in this way. Donating money will get you a building or a pew, but as with the above examples, the names of those that do will become only words with no meaning behind them all too soon (ditto for street names). A few folks will be memorialized in published articles or their archives dug up by Historians or click-bait chasers. Of course, with big bucks or some luck, you can play in the big leagues: sainthood and multiple churches, universities (unless your money came from enslavement), or cities (Charlestown), states (Penn-sylvania), countries (the Philippines, Bolivia); but that&rsquo;s likely not more than a few thousand all in. Scientists and doctors have a nice racket going in having diseases and natural phenomena named after them (the Humboldt Current, Higgs&rsquo; boson, and Alzheimer&rsquo;s disease all come to mind). 
The pinnacle is likely the Taj Mahal: prominence at the less than one-in-a-billion level.&nbsp; All-in-all, for ordinary folks, we will likely get swallowed up in the maw of time. Even if some electronic record remains, who will look for it or do anything with it? (In this regard, access to some record of mine by an AI/Borg as it hoovers up items from the past for some college essay in 2076 doesn&rsquo;t really seem to count!)<br /><br />In sum, if we can&rsquo;t count on being remembered in any meaningful way beyond a generation or two, the purpose of memorialization is more likely for the self-satisfaction of those who seek to be remembered; in other words: ego. If you think you&rsquo;re an exception, see Shelley&rsquo;s poem: <a href="https://www.poetryfoundation.org/poems/46565/ozymandias" target="_blank">Ozymandias</a> (1818).&nbsp;<br /><br />So, if personal, name-and-likeness legacy is a racket, is there anything I can leave behind? I think so, but it&rsquo;s not likely to be identifiable. I used to say, when I was teaching in college, that my impact on my students was more likely to be in things they remembered or ways of thinking they learned even if they couldn&rsquo;t remember my name, or even how they might have learned them originally. I suspect the same is true elsewhere, in terms of family, social, or cultural influence (or even charitable donations).&nbsp;<br /><br />If the end of the world (objectively) coincides with the end of the world (subjectively), then it won&rsquo;t really matter. If the objective world carries on, who knows what direction things might take? 
Preserving the possibilities for future discovery is about as far as I can see. That may be enough, and my belief that I helped it to do so is valuable to me and sufficient as a motivation.&nbsp;</font><br></div>]]></content:encoded></item><item><title><![CDATA[A Poisoned Chalice]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/a-poisoned-chalice]]></link><comments><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/a-poisoned-chalice#comments]]></comments><pubDate>Fri, 06 Feb 2026 15:53:20 GMT</pubDate><category><![CDATA[Uncategorized]]></category><guid isPermaLink="false">http://www.steveharris.net/condemned-to-repeat-it/a-poisoned-chalice</guid><description><![CDATA[If it weren&rsquo;t for all the mayhem and pain likely to come in the meantime, it would be a blessing that we have 2 ? years until the sturm-und-drang of the next Presidential election. Of course, in our perverse media/money-driven system, the major potential candidates are already getting organized and positioning themselves. Even without the particularly dysfunctional nature of our political parties, it&rsquo;s not likely that we will see a candidate who is really ready to tackle the fundamental  [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font size="4">If it weren&rsquo;t for all the mayhem and pain likely to come in the meantime, it would be a blessing that we have 2 ? years until the sturm-und-drang of the next Presidential election. Of course, in our perverse media/money-driven system, the major potential candidates are already getting organized and positioning themselves. Even without the particularly dysfunctional nature of our political parties, it&rsquo;s not likely that we will see a candidate who is really ready to tackle the fundamental issues facing the country. Indeed, whoever wins is likely to fail badly.<br /><br />Of course, I have zero confidence in the likely GOP standard bearers. 
If you multiply the imagination, compassion, and integrity of any of Vance, Rubio, Ron DeSantis, or Ted Cruz, you might well get a negative number. Anyone else on that side with the capability of being President has either bowed out of our Trumpian-dominated public life or has a sufficiently low profile as to have no chance of serious consideration. As I have noted recently, the entire party seems wrapped up in issues and images of the past which, combined with the remnants of its &ldquo;small government&rdquo; philosophy, ensures a passive approach to the dire challenges ahead.<br /><br />There are half a dozen Dem Governors in the mix (Newsom, Shapiro, Beshear, Pritzker, Wes Moore, Whitmer) plus Pete Buttigieg. They all score much higher on the imagination, compassion, and integrity combination, but face a comparable internecine (&ldquo;progressives&rdquo; vs. &ldquo;moderates&rdquo;) drag within their &ldquo;party.&rdquo; Once elected, they would also have to deal with reconstructing the federal government, in terms of both personnel and policy, in the aftermath of the current evisceration. Even with my suggestion of an accelerated remediation (see my proposed EAGER Act, 052325), much of their first term would be spent getting systems and programs back to the ground floor from the current sub-basement level and dealing with the &ldquo;normal&rdquo; range of issues and crises.&nbsp;<br /><br />Getting Congress to act, even if it had modest Dem majorities in both houses, presents another perennial hurdle to meaningful action. After all, at the end of the day, the Dems are only marginally more cohesive and effective than the GOP. They have their own share of personality squabbles, infighting, and inertia. They will also be distracted by the shiny toys of power and the opportunity to go after Trump and his many corruptions; as well as &ldquo;preventing&rdquo; his abuses from recurring. 
These are worthy targets in the abstract, but when establishing the priorities for taking care of the country, they have to be relatively low on the list.<br /><br />The prospect of making fundamental changes will also run into the electorate&rsquo;s unwillingness to recognize root causes and bite the short-term bullets necessary for long-term improvement. Indeed, the one-word summary of today&rsquo;s popular concern is &ldquo;affordability.&rdquo; However, the economic data show that this isn&rsquo;t really a problem for most of the (middle class to well-off) folks currently complaining. There is something much deeper going on, and it&rsquo;s not susceptible to quick fixes. This includes the loss of the &ldquo;American Dream&rdquo; (some version of &ldquo;Ozzie and Harriet&rdquo;), uncertainty about our place in the world (aka &ldquo;globalization&rdquo;), and a loss of confidence in society&rsquo;s and government&rsquo;s ability to maintain coherence and progress.<br /><br />Even if addressing those historical concerns were feasible, it wouldn&rsquo;t reach the underlying problems that demand prompt and radical action: climate, inequality, housing, and the imminent disruption of our workforce and demographics by AI.<br /><br />There are, of course, well-articulated proposals to deal with this list (except for AI, where no one has any idea what to do). They require, however, a degree of radicalness that is alien to our self-satisfied and incremental political culture. New tax structures can generate much of the necessary revenue for comprehensive health care, housing, and basic income. Climate change can be addressed. It is far more a matter of political will than of developing solutions.<br /><br />The best model of breaking out of these doldrums in the US is the famous &ldquo;100 days&rdquo; of the first term of FDR&rsquo;s administration in 1933. 
Huge Congressional majorities and a widely-recognized major economic crisis enabled some radical thinking to take hold. There is a cautionary note in this tale, however. A conservative Supreme Court struck down many components of FDR&rsquo;s program, and it&rsquo;s not at all clear how effective those moves were in ultimately providing an exit ramp from the Great Depression.&nbsp;<br /><br />All in all, the chances that the US will be in better shape in 2032 than in 2028 are, therefore, not so great. Indeed, things might well be worse given the amount of damage that is currently being done (and I haven&rsquo;t even touched on international complications yet). So, if a moderately progressive administration comes in, they&rsquo;re not likely to look very successful four years on. Even if there is great success in fixing the current damage, rebuilding institutions, and laying the foundations for the solutions to long-term problems, that administration is not likely to be able to give much of an answer to the perennial question of electoral politics: &ldquo;Are you better off now than you were four years ago?&rdquo;&nbsp;<br /><br />Given the fickle nature of the electorate, a further zig-zag is quite plausible. Indeed, the volatility of this zigzagging is part of what makes more extreme parties and leaders increasingly popular. This is especially visible in Europe. There are structural problems, to be sure: the difficulty of enacting programs with demonstrable effects within a single term. Added to this is the fundamental nature of the problems facing the country and the difficulty of devising solutions. The electorate, however, has&mdash;even in the best and most deliberate times&mdash;little patience for considering these constraints. 
The upshot is that whoever wins faces the specter of failure and subsequent rejection.</font><br /><br /></div>]]></content:encoded></item><item><title><![CDATA[An Old College Try]]></title><link><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/an-old-college-try]]></link><comments><![CDATA[http://www.steveharris.net/condemned-to-repeat-it/an-old-college-try#comments]]></comments><pubDate>Fri, 30 Jan 2026 16:32:30 GMT</pubDate><category><![CDATA[Uncategorized]]></category><guid isPermaLink="false">http://www.steveharris.net/condemned-to-repeat-it/an-old-college-try</guid><description><![CDATA[Everybody loves to beat up on the Electoral College. It&rsquo;s anti-democratic in multiple dimensions, it&rsquo;s subject to abuse, and it seems archaic. Here&rsquo;s a way to make an Electoral College relevant, useful, and constructive in the 21C. We have to start with the original purpose of the Electoral College, which was both sensible and intentionally anti-democratic. In the 18C world of limited communications and partial literacy, with a predominantly agrarian population, it&rsquo;s hard  [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font size="4">Everybody loves to beat up on the Electoral College. It&rsquo;s anti-democratic in multiple dimensions, it&rsquo;s subject to abuse, and it seems archaic. Here&rsquo;s a way to make an Electoral College relevant, useful, and constructive in the 21C.<br /><br />We have to start with the original purpose of the Electoral College, which was both sensible and intentionally anti-democratic. In the 18C world of limited communications and partial literacy, with a predominantly agrarian population, it&rsquo;s hard to see how most voters (even if they were heads-of-household) could have a sense of those capable of being the leader of the country. 
All the parts of the process with which we are familiar&mdash;declarations of candidacy, position statements and platforms, live campaigning&mdash;had yet to be invented. How could a farmer in the Virginia Tidewater region be expected to know much beyond the name of a John Adams or George Clinton? In such a world, only the political elites in each state could be expected to have a knowledge of the individuals and the issues which the country would face. That this accorded with Madison&rsquo;s aversion to pure democracy and a preference for keeping decision-making in the hands of &ldquo;men of affairs&rdquo; is no surprise either; but this was not merely blind elitism.<br /><br />A lot has happened since then, including the emergence of political parties, the vast, if gradual, extension of the franchise, an increase in literacy, and the omnipresence of the media, now in its unfettered &ldquo;social media&rdquo; phase.&nbsp;<br /><br />We can&rsquo;t deny the fundamentally dysfunctional nature of our current Presidential election process. Even if a precise parsing of the connections between the historical developments and the current state of things is a fool&rsquo;s errand, it&rsquo;s arguable that the shift to Presidential candidate selection through popular-vote primaries was a significant step in this sad evolution. In our media-saturated and polarized political culture, we have prioritized fund-raising, sound-bites, and the ridiculous spectacle of states jockeying for who gets to vote early in the process. There&rsquo;s a laundry list of problems, and I won&rsquo;t rehearse them all here. 
They&rsquo;re neatly captured in the point that the talents and capabilities necessary to govern are rather different from those necessary to get elected, as evidenced by the many politicians who have withdrawn from public life and the quality of those who remain.<br /><br />If we credit the College&rsquo;s original goals of balancing untrammeled democracy and producing a President (now far more important and powerful than Washington and his immediate heirs) who is capable of intelligent leadership of the country, then we may need to consider some radical approaches. It would not sit well in a political culture with an essential democratic premise to have a few folks make the final and unreviewable choice. Still, I suggest that we change both the way we choose the College and its role in the overall Presidential process.<br /><br />The second part first: the role of the new College would NOT be to select the final winner of the Presidential election. Rather, they would select four finalists from which the general electorate would select the winner.&nbsp;<br /><br />They would do so through a process designed to offer a set of sensible choices to the public. The College would be named at the beginning of July. They would meet for a week at the end of August and announce their choices on September 1. I would seal them off in some resort for a week, something like a Papal Conclave. Each Elector would place four names on the ballot. A series of preliminary votes would winnow that large list down to sixteen. Each of these folks would then be interviewed by three (randomly selected) Electors for half an hour, and the videos of these interviews shared with the entire group. Then, through a ranked-choice final ballot, the four public candidates would emerge. Those four names would be put on the public&rsquo;s General Election ballot and (after a mercifully short two-month campaign period) voters would rank their choices among the four in order. 
Standard ranked-choice tallying would then determine the winner.&nbsp;<br /><br />While we&rsquo;re at it, let&rsquo;s choose an electoral college in a different way. Let&rsquo;s split the country into fifty equally-sized districts (each containing 6-7 million people), each district to elect one person. No one who has held elective state or federal office within the prior five years can run. That way, no one who&rsquo;s in the middle of the political process, with axes to grind and horses to trade and immediate IOUs to cash, would be involved in the selection process. Then I would add the Secretaries of State, Defense, and Treasury and the Attorney-General from each of the two prior administrations, four retired Senators chosen by the existing Senate, four retired Representatives chosen by the existing House, and four retired judges chosen by the Supreme Court.&nbsp;<br /><br />That totals seventy people, which seems a large enough group. It would include a range of political, administrative, and judicial experience, but not enough to dominate the process. This is a far cry from the &ldquo;back-room,&rdquo; cigar-chomping caricature of the pre-primary era method of candidate selection. The voice of the electorate would retain its central role, but it would be tempered by the voices of experience and judgment. The requirement to nominate multiple candidates would ensure that a range of capable persons would be considered. Who knows, we might get a poet, professor, or seasoned executive in the mix.<br /><br />Only a few Presidential candidates over the past fifty years have inspired actual passion. Instead, most Presidents have been chosen primarily because they&rsquo;re marginally better than the &ldquo;other guy.&rdquo;&nbsp; They have been products of a process that favors campaign skills over the ability to govern and lead. Perhaps we can do better.</font><br /><br /></div>]]></content:encoded></item></channel></rss>