Steve Harris

Vox Polluli

11/24/2023

0 Comments

 
The acceleration of the pace of change in modern society has been recognized for over 150 years. Whether, as a broad matter, this has been an improvement in the human condition is certainly debatable. In our own era, this acceleration has been particularly noticeable in the area of communications and information flows. Again, while there are many benefits, there are also profound and less visible costs to be paid.

In particular, the process of democratic deliberation and decision-making has been disrupted by a combination of technologies and distorting information flows, often abetted or created by mass media. That process was underway well before the advent of the internet, and it has been shifted into overdrive by technological capabilities driven by market forces, with only glancing consideration of the fabric and aspirational values of our society. I have taken plenty of shots at the media generally in previous postings, so I want to target one particular angle here: public opinion polling. Thus, not “Vox Populi” (the “voice of the people”), a traditional formulation of the basis of democratic culture, but “Vox Polluli,” a Latin-abusing neologism for looking to the polls as the basis of democratic culture.

Modern polling dates from the 1930s (the Literary Digest’s famous survey called the 1936 election for Alf Landon over FDR, while George Gallup’s newer sampling methods correctly picked the winner), connected to the rise of modern advertising/marketing/consumerism of the early 20C. Today it’s a whole little industry of its own, with academic studies, dozens of polling organizations, and extensive media coverage of policy and political issues. Technology has advanced from “please return the postcard with your opinions” to live, real-time assessments of Presidential debates and speeches.

“Public Opinion” (as apparently discovered and authoritatively articulated by such polling) is regularly reported on and seems to be relied upon as a basis for public policy decision-making by elected officials. There are several problems with this:

First, polls are simplistic and life is complicated. Generic expressions of broad philosophical principles are of little use in diagnosing problems or the real-world crafting of policy.

Second, few members of the public spend much time understanding even the first-level specifics of tax policy or education expenditures, much less the extensive complexities that each area entails. In a world of eight-second sound bites, the thought of more than one percent of the population taking half an hour to understand the mechanisms of trade relations with China is, to be mild, highly speculative.

Third, poll responses are often/mostly driven by ‘feelings,’ not facts. Presidential approval ratings, for example, are usually more a function of economic sentiment and psychological security than an assessment of what the “Leader of the Free World” du jour is actually doing or is capable of doing. Indeed, there is a good argument that pollees (i.e., the people being polled) more-or-less consciously use polls for this purpose (i.e., as a “venting” mechanism rather than as a substantive expression of preferences for policies or candidates).

Fourth, things change—events, negotiations, compromises all happen too fast for most of us to keep up with.

All of these are, in effect, arguments for intelligent representative government with policy decisions made by folks who are chosen to spend the bulk of their time sorting through options and coming to conclusions about desired outcomes. In other words, to whatever extent direct democracy might have worked in Athens 2400 years ago, or in a New Hampshire Town Meeting today, it’s wholly inadequate for the modern world and groups of more than a few thousand. This is the same rationale for avoiding plebiscites on policy (e.g. referenda and public votes on detailed legislative public initiatives).

Our current polling culture short-circuits the process of democratic representation by providing instantaneous answers which are then supposed to “guide” policy makers. Bad questions, bad answers, bad information; even if we had good legislators/officials, what could go wrong…? Of course, the media is less concerned with quality policy than with “news,” even if it’s meaningless; the “blah-blah” mouthings of innumerable candidates telling us to “trust the judgment of the ‘American People’” notwithstanding.

The media’s counterpart to the public opinion survey (wearing its coat of statistical validation) is the apparently non-scientific spectrum-coverage article, which takes quotes and views from a full range of opinion. It’s of no more value than “some people like green and others prefer pink;” but it does enable the news outlet to assure the public that it is listening and presenting everybody’s point of view, without apparent bias or spin (or value).

The upshot of this aspect of our political culture is to reinforce the corrupting influence of money by ensuring that those elected can claim to be “representing the people” by listening to the polls, rather than doing their more fundamental job of leadership and public education: explaining why they (and not their PACs) have taken the stances that they have, why the complexities and compromises inherent in any democratic political system have worked in practice, and why simplistic thinking doesn’t help anyone.

On top of this is the obsession with speed and “breaking news,” best exemplified in the reporting of “exit polls,” so that the apparent winner of an election can be designated a full 12 hours or so ahead of when results would otherwise be available. More media filler, more non-news; Pavlov would be proud.

As with most of modernity, there’s not much use in seeking to put the polling genie back in the bottle. It would be great if there were fewer and slower results. I’m not sure what public purpose is served by pre-election information on where the candidates stand, nor by post-election information on what “the people” “think.” Is it too much to hope that media outlets stop feeding the adrenalin junkies and give due (i.e., less) attention to such matters? I suspect it is. Every country, as Joseph de Maistre said (1811), “has the government it deserves.” We have the media and political culture we deserve, alas.


Timeframes

11/17/2023

0 Comments

 
A recent piece in the NYT reported that a combination of astronomical, geological, and atmospheric developments will render the planet uninhabitable by mammals—in about 250 million years. Ah well, it’s been fun….

Looking further out, scientists have predicted the burning out of the Sun (~7.6 billion years) and, of course, the ultimate heat death of the universe in 100,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 years (more or less). On the more immediate side, the familiar apocalypses (nuclear war, climate change, massive pandemic, and Trumpism) are regularly cited as likely causes of the end of humanity. Some have predicted that the Boston Red Sox or Chicago Cubs winning the World Series was a sure sign of the end of days; others view the Detroit Lions as an NFL contender in the same light. More religious types have looked to the “Rapture,” or other modes of the Second Coming of Jesus. Other deep cultural traditions have their own versions of the end.

We—both as individuals and societies—seem to have some difficulty in comprehending such big numbers and long periods. As a species (maybe as with any living thing) we’re highly focused on the “here-and-now.” At one level there’s a correlation between survival and immediate threats, so this primacy of presentism makes sense. Yet, I can’t but wonder if one of the benefits (purposes?) of a larger brain and human consciousness is the ability to think ahead. After we’re pretty sure no sabre-toothed tiger is nearby and that we have enough food safely stored for a few days, we can extend our perspectives.

Modern folks increasingly have such confidence in short-to-medium-term survivability and can afford to commit an increasing portion of their attention to longer-term issues: saving for college or a home, planning for retirement, or preparing for climate change. As one moves up the ranks in military or commercial organizations, more time is spent thinking about “the strategic” or the “long-term,” leaving the tactical and day-to-day to those lower down the organizational ladder.

Historically speaking, “modern” folks seem to have a different sense of time than pre-moderns, for several reasons. First, our awareness of time is a function of our awareness of change. Traditional societies faced considerably less change than we have seen in the last 250 years. As a result, the “future” has become meaningful as a concept, since it has become increasingly apparent as something different from the past and present. Second, modern societies have (generally speaking) increasingly mastered short-term survivability and can spare some bandwidth for a longer-term future. Third, the emergence of modern historical practice has made us aware of the length, complexity, and change of the past and opened up the prospect of the reciprocal: the future. Fourth, our greater individual longevity means that, as compared with a few hundred years ago, the prospects and conditions of our (potential) lives multiple decades hence actually could have some meaning (not so if your life expectancy is only 45). Finally, 19C science—especially geology and evolution—has forced us to come to terms with the vastness of time. Charles Lyell’s Principles of Geology (1830) showed that the planet was millions of years old (not a few thousand, as clerics had inferred from Scripture). This created a chronological space for Darwin and his theory of evolution, showing that the emergence of species could actually occur, one genetic modification at a time, since there were now enough generations to accommodate the development of human variability, for example; something that would not have been feasible, without divine direction, in the few thousand years since Adam.

Modern rationality has also insisted on counting and specificity. So traditional stories of end times—of whatever culture and nature—can’t be so open-ended: floating out there as something that would happen…sometime. Mythos doesn’t fit well—stylistically—with spreadsheets. This has led to scientifically-grounded projections which are far more bounded, even if not precise to the last detail (we’ve only got 7.6B years with the Sun, not 8B).

So, not only has our time horizon expanded, but the mentality with which we contemplate what lies ahead has changed too. The acceleration of change in the 20C—whether in terms of technology, geopolitics, or culture—has brought with it (among other things) an expectation of further accelerating change. This makes the future inherently—and consciously—different from the present and increases the interest in what’s coming next. When we add in a dollop of an Enlightenment-stimulated sense of human power and control, it’s no wonder that the 20C saw a burgeoning of “futures studies,” scenarios, and efforts to at least conceive of potential future vectors of development: possibilities which could be planned for.

Planning connected the present with the future. In contrast, ancient and traditional modes of envisioning the future—the Second Coming, the Kaliyuga, the Mayan Long Count calendar—all existed “out there” somewhere in the indeterminate future; not in the present, but not really on any calendar that people could comprehend. Railroads and naval fleets, on the other hand, required plans, with schedules and budgets. With the incremental advance of technologies as a model, the arrival of the future could be projected as emerging, piece by piece, out of the present. It became immanent (of and in the world), no longer transcendent (dropping out of the skies without much human agency).

Much the same can be said of avowedly fictive futures. Utopias from Plato (4C B.C.) to More (16C) existed away from reality; indeed, that was their point. Modern “science fiction” wrestled with societies based on the present-plus, Verne and Wells being the pioneers here.

The combination produced an extension of the culture of the present, one that reached ever further forward, so that the realm of the imaginable, and of the implementable, grew from merely the present to years and decades ahead.

Our world today is filled not just with the present, but with this future: extrapolations of current trends, whether through literary imagination or statistical models. The premodern world didn’t conceive of itself in this way. Its future was preconscious, dominated by the here-and-now. We are willing to contemplate a span of years ahead as something that is integral to who we are now, something which we have some chance of steering; even if we never actually know what will happen.



Ma non troppo

11/10/2023

2 Comments

 
The problem with most of the rhetoric uttered in times of stress (and these days, is there much else?) is that it’s good for rousing people, exercising their adrenals and brain chemistry, and flinging them into action for some cause or another. Outrage, insult, doom: we must all push hard against these incipient evils. On the other hand, it’s not good for governing, solving problems, or living together.

My title today, “Ma non troppo,” is an Italian musical term typically affixed to a composer’s tempo direction, telling the player how briskly or languorously the piece is to be played; as in “allegro, ma non troppo,” or “lively, but not too much.” It’s a delightful phrase with useful application far beyond the recital hall: telling me not to get carried away; to be focused on my target, but to remain conscious of my context at the same time.

There is much to be said for capitalism, socialism, individualism, cohesive group identification, social justice, rule of law, democracy, governmental effectiveness, national security, individual rights, promoting moral standards at home and abroad, fiscal rectitude, self-defense, respect for authority, a sense of aspiration, incrementalism, liberty, equality, fraternity, a responsibility for the future, a responsibility to the past, human rights, communal responsibilities, faith, science, basic quality of life, environmentalism and, indeed, hope [did I leave anything out?].

All are good, but “ma non troppo.”

I’ve found that it’s a good practice when in a confrontational situation to try to construct a plausible rationale, and to identify the omissions/blind spots, for each side: landlords and tenants, Palestinians and Israelis, advocates for a universal basic income and advocates for lower taxes, those who want to choose gender identities different from traditional appearances and those who have decades of habit embedded in reacting to others by those appearances, etc., etc.

I’ve found it’s a good practice not to presume malicious or insulting intent. Not that there isn’t often reason for such a belief, but presuming it without assessment doesn’t generally get me where I want to go. Indeed, I suspect that well over ninety-five percent of what’s bad in the world is due to negligence, loss of attention, and (especially) incompetence; evil and malice are pretty rare.

I’ve found that binary thinking, simplistic categorization, painting people and ideas as either black or white—period—is usually laziness, arrogance, blindness, or anger on my part.

I’ve found that being a victim of some crime or evil doesn’t make a person incapable of criminal or other evil actions and to merely recite their victimhood as a justification rather than assessing their own actions is disingenuous.

One of the downsides of despotism/authoritarianism is that such regimes’ insecurity/arrogance usually means that they can’t tolerate consideration of alternatives or constraints or balance. Lenin found this out in the early 1920s when, despite the then-new triumph of Marxist doctrine, it was necessary to carve out market-oriented exceptions if people were to be fed. Mao didn’t learn the lesson, and millions starved in China in the late 1950s and early 1960s.

Unbridled [fill-in-the-blank with any of the items from the list in the third paragraph] rarely works. This is mostly due to the inherent distance from theory to practice and the complexities of having lots of people with different views and priorities living together. Liberty and initiative have brought many benefits to the modern world, but we read every day about the excesses of 21C oligarchs/billionaires who throw lavish parties while millions starve. Each is its own mini, privatized version of a self-serving authoritarian regime. Socialism for the public good is noble, too; but is also subject to corruption, arrogance, and bureaucracy.

One of my favorite examples has to do with the level of taxation on the rich. Any attempt to raise funds for public benefits is met with pained cries of those who insist that a heavier tax burden will suppress investment and initiative, that entrepreneurs will be deterred because they won’t be able to make as much money and society will suffer the loss of innovation and competition. Yet few entrepreneurs I know or know of would work less hard due to a higher tax rate. They’re motivated by their own ideas, their own energy, and their own drive for recognition and success. They “keep score” with money, to be sure; but if a steeper tax bill meant that all their competitors also ended up with a bit lower net worth, the rankings would still be the same. So, capitalism…sure, but ma non troppo.

Self-defense is another example. The doctrine that a person’s home is their “castle,” defensible with weapons, is a plausible theory of criminal defense. Pushing that idea out into the streets via the “stand your ground” theory might be seen as an incremental extension. But it runs into other people’s liberty and security.

So, let’s dial it back a bit, let’s not push things to (past?) their logical limits. Let’s leave the last ten percent of every idea off the table. Abortion/women’s rights, capitalism/socialism, free speech, the mare’s nest of the Middle East, US/China, etc., etc.

In a Supreme Court case (whose name I can’t recall) on the question of due process under the 14th Amendment, Justice Benjamin Cardozo described the decision point as “implicit in the concept of ordered liberty.” It’s not a bad phrase, even if it’s overwhelmingly ambiguous (more a signal of the difficulty of balancing principles than a useful predictor of what the Constitution allows). Order is good; so is liberty. They often (usually? always?) clash. Ma non troppo is more elegant.


Out of Control

11/3/2023

2 Comments

 
I like to think that I’m in control of myself. I rather take pride in my rationality and ability to solve problems; it’s pretty central to my self-image. So, I don’t know if two recent incidents constitute a karmic telegram to stop kidding myself. (Remember telegrams? The last physical, as opposed to karmic, one was delivered about ten years ago, apparently.)

Of course, any sense of control is an illusion, and often a dangerous one. The ability to “go wild” seems to have all manner of positive psychological and physical benefits (at least in doses and with some limits); as evidenced by popular dance music for centuries. Alcohol, tobacco, and other drugs are much to the same end. Regardless, the illusion has provided me with no small sense of self-satisfaction, even if part of me can also acknowledge the costs. And beyond satisfaction, a sense of security, both situational and ethical. So, on to recent history….

Incident #1: Last month, I was doing some yard work (man-of-the-land that I am!) when I apparently disturbed a ground nest of yellowjackets (wasps), which swarmed me instinctively. Before my “normal,” control-predilected self was aware of this, my amygdalic brain started flailing my arms—foolishly, I later learned—and propelling my legs away quite rapidly. A few seconds later (real time; or an extended period, as it seemed in the moment), I was in the house with—mercifully—only four stings on my hands and wrists. By the time I had dashed to the computer to look up appropriate remedies, grabbed the appropriate creams, and dunked my hands into ice water, I had caught my breath and realized that my flailing had left my glasses out in the yard at the spot of the initial onslaught.

For the next several hours, I felt drained physically. Mentally, I didn’t feel scared (I did retrieve my glasses), but a touch wary and with a definite preference for “hunkering down.” I spent some time observing myself.  I guess I don’t fire off the brain chemicals and short-circuit my normal, well-processed thought processes very often. In fact, I can’t remember the last time I reacted as instantaneously/intensely. As a result, it was strange to recognize the guy who moved through this situation in this way. I don’t regret acting the way I did; not that “I” had much control over what I did. So, both in the moment and in the aftermath, some quite apparent demonstrations of Steve not being “in control.”

Incident #2: Almost a week later, my wife started to feel increasingly bad: fatigue, aches, respiratory inflammation. We had, for three and a half years, avoided being caught by the COVID bug, but our days of innocence were gone. I followed about two days later. Fortunately, for both epidemiological and pharmacological reasons, we only had a few days of being miserable and have both more-or-less returned to normal health.

Nonetheless, my two-ish days of moderate misery (spaciness, comprehensive body aches, a bad sore throat, and occasional chills/fever) were, for me, remarkable. I’ve been quite fortunate to have avoided acute illness over my life. Other than a couple of out-patient procedures, a light-to-moderate set of cold/flu infections, and an increasing prevalence of age-appropriate chronic physical conditions, I have been pretty healthy.

COVID presented in me in a manner similar to colds/flu, but more severe. Since I’ve had colds/flu since I was a kid, at one level it wasn’t remarkable. And yet…even though the chances of severe complications were small, it was different. It was new. Or perhaps I just looked at it (i.e., me with “it”) differently. I was regularly aware of struggling to clear my head, to wake up from my (more frequent) sleeping, and deciding (repeatedly) that I didn’t have to, or want to. When sitting at my desk, I was “just fine” to sit there vacantly and not do much (if any) work (once I had emailed my students with the revised class schedules for the week).

I didn’t have a chronic condition, but I could see that I could very easily feel the same way indefinitely. I got to wondering what it would be like if, at some point, for any number of reasons or conditions, my limited acuity and attention (…and self-control) became my “new normal”: a possibly terminal, if indefinite, state. What if the reduced sense of connection with the world (my characteristic interests in ideas and affairs, my role in managing my life) was “as good as it got”? Perhaps I would mind; perhaps I would be upset with my new, smaller world; but perhaps that’s just the current me standing up now, when—by definition—that Steve wouldn’t be present anymore.

I’m not sure how to characterize how I feel about such a prospect. Not “scared,” certainly not “resigned to it;” aware, as I say, that any idea of such a future is more projection than prediction. It is all well-and-good to declaim: “Rage, rage against the dying of the light.” But that presumes a certain level of synapses and energy levels to spark such rage. A noble dream, but not everyone’s reality.

So, to return to where I started, this mild-to-moderate COVID bout gave me a second taste (and a hint of a third) of not being in control of myself in the way I am used to thinking I am. One was due to a hyped-up system, the other to a spaced-out processor. I take from these two (+) situations an appreciation of how much I rely on my constructed sense of myself, the fragility of that control, and a question of whether to lean on it as much as I have. Or, as T.S. Eliot asked (and the Allman Brothers affirmed): “Do I dare to eat a peach?”


    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but this is my contribution to my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."

