Steve Harris

Frozen

1/31/2025


It’s pretty commonplace to describe the modern era (let’s just say—for discussion’s sake—the last 250 years) as a time of widespread and accelerating change. The Industrial “Revolution” spurred deep shifts in how people lived and worked. The French Revolution and its progeny spurred deep shifts in how people saw themselves politically and organized their societies accordingly. In our own lifetimes, technology is practically tripping over itself with new things and new ways of doing them, disrupting industries and lives as never before.

So, naturally, I’d like to take a contrary stance. I’d like to argue that, in fact, the modern era is marked by as much sclerosis and resistance to change as it is by “new and improved” and other manifestations of “progress.” In a way, this framing is a counterpart to my recent comments on revolutions. Perhaps a geologic analogy would help: a “locked fault” is a boundary between two tectonic plates that doesn’t shift because there’s too much friction ‘sticking’ the plates together. Eventually, the underlying stress builds up and a bigger-than-expected earthquake results when all that stored energy is finally released. Social patterns and relationships aren’t too different. Decades of pressure for women’s suffrage from the mid-19C on made only marginal progress until the shock of WWI loosened the political process here and in Europe. Similarly, pressure on European metropoles for the independence of their global empires accumulated for decades until the changes wrought by WWII finally cracked imperial control, and a deluge of new states emerged in the following decades.

Part of the story of progress and modernization that has become conventional historical wisdom highlights the pace and extent of change as part of some self-satisfied triumphalism (never mind that we’ve fallen over the cliff before and our current pace looks alarming in several directions). Such stories downplay how much doesn’t change or, at least, changes much more slowly than we might expect. For example, despite a moderate toning-down of its powers in 1911 (yes, 114 years ago), the Brits still have a House of Lords, which only this year stopped seating folks who inherited titles/wealth from some daring military commander in the 16C, etc. Royal families—European, Thai, several spots in Africa—still wield power, attention, and subsidies. Even many nominal republics around the world are led by virtual or actual “Presidents-for-life” or de facto dynasties (the recently-departed Bashar al-Assad being only one notable example).

More fundamentally, a lot of our institutions (not just organizations) seem to be stuck. I’ve previously talked about how the Dems/GOP are off in their own shared fantasia (we’ll see if the returning resident of 1600 Pennsylvania Ave. can really knock them both off their stoop!). Ditto our venerated US Constitution. Ditto our “belief” systems. Doctrinal trivialities aside (and I mean you, Protestants), how many centuries (millennia?) has it been since we had a significant new religion in the world?

I’ve spoken in several contexts about the dysfunction of our global system of states, which seems to suck up any effort at a different model of socio-political organization. The configuration of the U.N. is an exemplar. No one thinks of Britain and France as top-level global powers (and they arguably haven’t been since they leveraged Great Power inertia to get their seats in 1945, or since the Suez fiasco of 1956), but there they sit with veto powers in the U.N. Security Council and no real mechanism to get them off (comparable to the rights of Delaware and Wyoming to have the same clout in the Senate as California and Florida). Meanwhile, any number of regional powers and relative economic heavyweights are left among the also-rans. At the same time, the Marshall Islands, Monaco, and Grenada (combined population under 200k) have the same vote in the General Assembly as India, Indonesia, and Nigeria (combined population: ~2B!). Incumbents cling to power and bureaucratic processes make dislodging them problematic.

Across the US, few statutes and regulations have sunset provisions, so they live on, without review or updating, leaving the formal structures of most societies dragging around the legal remnants of the past. The current wildfire disasters in LA have highlighted the encrusted hodge-podge of governmental districts and jurisdictions established long before that metropolitan area became the 2nd largest in the US.

While bureaucracy, prioritizing rationalization and efficiency, has been considered a hallmark of modernity since Max Weber in the early 20C, its mentalité has (as he predicted) inexorably grown. Rules of behavior, reflecting originally sensible principles, have become outdated burdens. Just look at the rules for home building in most developed parts of the world (the less-developed parts of the world could probably use a bit more rule-based bureaucracy in their construction/zoning sectors, however). NIMBY-ism is a great example of how incumbents preserve their power/position. Rules-upon-rules, intermixed with a litigious environment and legislative processes overweighted with conflicting lobbyists, both cement the rules and deter any thought of adaptation (much less radical re-think). The result is a set of regulatory schemes—health care, telecommunications/internet, energy usage, securities, taxation, etc.—that make living in the real world of today exceedingly difficult.

The private sector is generally better at adaptation since market pressures force change. The fortunes of the leading 20C industrial companies (e.g., GM, IBM) and of financial institutions after the 2008-9 debacle are ample evidence of this. In between, the NGO/non-profit sector seems less adaptable since it lacks both the competitive and the institutional pressures to change.

One aspect of the modern era is the way in which tensions between legal/governmental/bureaucratic stultification and the dynamic pace of technological and market developments have played out in terms of societal benefits and costs. If we hold off on imbibing the mythology (especially regarding the wonders of modern technology and free markets) and on assuming that either one is—per se—beneficial or detrimental, then perhaps we can avoid the political moralizing and merely recognize that there are multiple ways to evaluate modern society.

In either case, we need to watch out not only for being overwhelmed by the apparently incessant pace of change, but also for the icebergs of tradition that chill the prospects of moving our society forward.


Morlocks

1/24/2025

H.G. Wells spun out many ideas in his (rather short) novel “The Time Machine” (1895) (also depicted in a great movie version (1960) and a decent recent streaming version). One of the most memorable is the division of the human race (about 800,000 years in the future) into two species: the Eloi, effete lotus-eaters whose every need is supplied by the planet, and the Morlocks. This latter group lived underground in clearly crude and atavistic societies, but were far more energetic and engaged in life.

It's a morality tale, to be sure, extrapolating from late Victorian ideas of racial superiority and class division. Wells leaves a fair amount of ambiguity as to which group is worse off. However, the idea of divergent human evolution has been fleshed out in various subsequent dystopias, often with some Musk-esque group of elites blasting off and leaving the bulk of humanity bereft on a dying planet.

On a more immediate basis, critics of capitalism have, for more than a century, warned of profound dysfunctions in human societies stemming from the exploitation of the masses by the wealthy. The burgeoning of modern Western (capitalist) society across the 20C (indeed, its very survival) has thus arguably depended on the willingness and ability of those elites in control of the economy and the state to accommodate a sufficiently robust middle class, one that has tamped down revolutionary impulses while allowing the “one percent” to continue on their ever-upward climb to become “Masters of the Universe.”

The development of the computer era has deeply threatened this model by undercutting the need for lower-skilled jobs and the value of an increasing number of the better-paying jobs of the “middling sort” of human societies. And, while several billion people around the world aspire to the living standards of the bottom 20% of those in the West, for those who are part of the West’s “middling sorts,” the slide toward economic distress and irrelevance looks ominous. Some of the more perceptive recognize this existential threat and are the principal drivers of the general crisis of ungovernability of which I have regularly written (unfortunately, they have often had to resort to dubious “populists” to get themselves heard, to the general detriment of all).

It's a nice question of historiography as to when the “computer age” began (e.g., IBM’s Mark I (1944), ENIAC (1945), general commercial mainframes (1960s), PCs (1980s), the World Wide Web (1989)). All this technological development has been continuously accompanied by concerns over the effects of these technologies on society and the work force in particular; stories about the threat of “automation” date back half a century or more. It seems pretty clear that the dramatic introduction of generative AI in the last two years marks a major ratcheting-up of computer capabilities, and the impacts on the work force are much more immediate now than with earlier technology milestones.

Leaving aside the (not inconsiderable) issues of robots running amok or vastly increased electricity demand, the social implications of millions of folks losing their (relatively) well-paying (if generally lower-to-middle level) jobs over the next decade are going to be important (and, of course, were wholly ignored in the recent political campaigns). These pressures will join broader demographic trends (especially increasing average age), general overpopulation in Africa and Asia, accumulated overconsumption and underinvestment (aka “deficits” and national debt), and reduced educational capability (itself accelerated by overreliance on AI by students and institutions).

This is likely to happen faster than other techno-driven social changes. I have written about AI before, but I just had a stunning experience of the power of AI to analyze, digest, and articulate a set of texts. It’s a new app from Google called NotebookLM. You can check it out in a 15-minute podcast it created in a couple of minutes from a 7000-word lecture text I wrote a few years ago (link below).

The abstract the AI created looks a lot like what we have come to expect from generative AI, but the dialogue in the podcast reaches a whole new level of coherence and conversational expression of the material it was based on (not to mention taking a few liberties with the information/arguments that were inputted). As a professor, I see a job threat; but more importantly, I see students outsourcing even more of the learning/digesting/analyzing/creating activities which we have come to expect as the essence of the educational process. As we professors try to get them to “think different,” these tools will enable them to farm out intellectual development and remain relatively unchanged by their educational experience. Who’s going to hire them and pay them the “middle class” salary necessary to support a suburban lifestyle? What incentives will they have in our nihilistic age to figure out how to have a decent standard of living? (or even a sense of style!)

Capitalism and geopolitics ensure that there’s no way to slow down, much less cap, AI capabilities and deployment. All manner of menial, service, and simple production jobs are at risk; it’s not just Uber drivers, coders, customer “care” agents, and hamburger slingers. Politicians are wholly at sea on what to do. In short, it’s going to be a bloody mess. Look for a revival of the Luddites. Some of the roads forward lead to “revolution” (on which see my recent postings); but even if we muddle through, the prospect of significant portions of the population increasingly “underemployed” is a dire one. Politically, the country has tolerated this bifurcation of “classes” as long as it's been confined to “minority” and rural communities domestically and to the bulk of the global population (i.e., Asia, Africa, Latin America). When it hits the “heartland” (i.e., suburbia), it could get ugly; if not politically, then at least economically and aesthetically. If you think there is energy around “MAGA” now, wait until the distance to American “Greatness” is farther than it is now.

In Wells’ tale, the Morlocks and Eloi are caricatures. On current trends, it won’t take 800,000 years to get there.



2525

1/17/2025

While many can recall the distinctive rock music hits of the late 1960s (e.g., “Hey Jude” or “To Sir, with Love”), only a select few can dredge up any sense of the mid-1969 hit by the (otherwise entirely forgettable) duo of Zager and Evans: “In the Year 2525.” In an era in which lyrics were often light on semantics (e.g., “In-A-Gadda-Da-Vida” (1968) or the Archies’ classic “Sugar, Sugar” (1969)), the song’s narrative was pretty meaty, projecting the future course of human history; certainly a rarity among the froth of popular culture.

The original “Star Trek” series had just completed its run, with a reasonably optimistic take on the 23rd century, but the view encapsulated in the song (which ran through the year 9595) was definitely darker. They thus nicely straddle the range of utopia/dystopia themes which runs back to the 17C in European secular culture (and considerably further back if you count religious projections and eschatologies). They were both part of a burgeoning of future-focused writing, both fictional and “serious,” which accompanied the surge of energy in the West in the decades following WWII. Economic growth from 1945 to the mid-’70s joined with technological developments and a brash confidence epitomized by JFK’s (successful) lunar challenge.

Still, as we know now, many of those predictions and much of that confidence were—to put it mildly—premature. Many aspects of technology continued to leap ahead (although few had any sense of the information/computer age by 1970), but many stalled. Manned space flight stopped pushing the “boldly go where no one has gone before” envelope in the 1970s, and even now the Artemis project to return to the Moon after more than 50 years is seeing continued delays. The AT&T Picturephone, a highlight of the 1964 World’s Fair, didn’t really show up as a mass-market tool until the COVID pandemic “Zoom’ed” it forward 55 years later. And, of course, we’re still waiting on those famous flying cars.

All further evidence that neither historians nor anyone else is good at predicting the future. Most of what falls into the categories of forecasting, “visions,” and predicting is really just about projecting the emotional mind-set of the present—positive or negative—rather than any particular insight about what will actually happen. Sometimes someone gets it right (much like the proverbial broken clock that’s accurate twice a day (at least that proverb worked in an analog clock environment; a broken digital device isn’t even accurate that often!)), but that’s down to luck and ordinary probabilities.

The “Space Age” of the 1950s-70s spawned all manner of science-fictional accounts of aliens, planets, and techno-wonders (I still have quite a collection!). As noted above, the futures predicted in the ’60s covered quite a range and can’t tell us too much about that era’s zeitgeist. However, the fact that such prognostications were splashed across and avidly consumed by mass popular culture (“The Jetsons,” “Lost in Space,” “My Favorite Martian,” “Star Trek,” and “Planet of the Apes”) is some evidence of that period’s at least interest in, if not optimism about, the future. [But see Kubrick’s adaptation of Arthur C. Clarke’s “2001,” which raised some profound philosophical questions.] In contrast, the skew towards dystopian visions over the past decade or so similarly indicates our darker outlook, rather than what the world will actually look like 20/40/100 years hence.

The post-WWII burst of futurism was, however, marked by a more rigorous and quasi-academic timbre. The inadequacy of pre-WWI and pre-WWII planning led to more intense scenario development (exemplified by Herman Kahn and the RAND Corporation’s work on potential routes into, and results of, nuclear war). New magazines published scholarly prognostications which carried enough of a sheen of thoughtfulness/reliability to reinforce the popular sense that our civilization was advancing to the point where we could feel some confidence about how the future would be. Audits of these predictions, however, were not implemented, and we may suspect that their average quality was only marginally better than Nostradamus’s (mid-16C).

This applies not only to technological visions, but also (perhaps more so) to human affairs, whether framed in terms of economics, psychology, politics, or culture. We can easily extrapolate from current trends (e.g., the rise of China, climate change, medical improvements), but by the time the infamous butterfly in Borneo flaps its wings, the resulting typhoon/monsoon/guerrilla platoon will puncture such neatly drawn projections like—well, a balloon.

Zager and Evans’ hit was the pop version of this deeper cultural trend. We’ll find out in 500 years whether they were right.





Baby, it's cold outside

1/10/2025

Most of the time, nobody pays much attention to Greenland. It’s pretty much out of the way, tucked to the east of Canada and WAY up north. It’s a big island, but it’s almost all covered with glaciers, ice, and snow. While indigenous people (mostly Inuit) have been there for thousands of years (a testament to human adaptability), it came into the European orbit with the Norse explorers in the 10C (the famous “Eric the Red”) and has been tied to the Norwegian and then Danish crowns since the 13C.

But the times are changing and things are no longer moving merely at glacial speeds. Global warming has increased the navigational significance of the Arctic region and melting ice and snow have uncovered stunning amounts of minerals (including strategically important rare earths) that are now extractable.

The resulting geopolitical competition has attracted the attention of all the leading global players, including HWSNBN, a latter-day imperialist, who suggested that the US just buy the place or, perhaps, invade and conquer. His buy-out proposal was soundly jeered by both the Danes and the local government. But you can never tell with him, especially since it would be a real estate deal (on which he is the world’s leading expert) and he wouldn’t have to submit a personal financial statement to float the mortgage. He could be just trash-talking (there’s some precedent for that), trying to scare the Russians/Chinese away, or leveraging the Europeans to step up their defense spending. While we have had military bases there since WWII, I don’t think we’re quite up to unilateral annexation or creating the 51st state, much less attacking a NATO ally. (Of course, he has a habit of talking loudly and carrying a little stick. He might be trying to scare the Greenlanders into staying with Denmark and then leaning on the Danes to give us a good mineral-access deal.)

Meanwhile, the Greenlanders are restless. Their locally-elected government is dominated by pro-independence parties who are planning a referendum this year. Shades of Scotland! Another nail in the coffin of European imperial domination of the globe (pretty much a dead duck in any event!). The prospect of a new state raises several interesting points.

First, from a geopolitical perspective, Greenland and its newly-accessible minerals could become a bit of a beachball, getting whacked at by both superpowers and global mining behemoths. It’s heady stuff, and a lot of money is likely to be dangled in front of the locals, who have depended on fishing for the bulk of their economic activity. However, they’ve been subsidized by the Danes for decades (currently almost $10K/person, or about 18% of GDP), not to mention the payments and economic stimulation from long-established US military bases, which would be at risk if the Chinese or Russians were to come nosing around (for an engaging fictional take on all this, see the most recent season of “Borgen,” the Danish political thriller on Netflix). All in all, it’s hard to imagine these folks successfully playing in the “big leagues.” In this way, Greenland is much like other natural-resource-dependent countries; let’s hope they figure it out better than most.

Second (and more fundamentally), we have to ask what the hell a place with fewer than 60,000 people is doing being a country to begin with. To be sure, there are a dozen or so smaller countries scattered around the world (mostly post-imperial cast-offs on islands in the Caribbean or Pacific), but, I mean, really…? Greenland’s population is one-seventh the size of Iceland’s. There are more people in my electoral district, which chooses one of eleven seats on the Board of Supervisors here in San Francisco.

Since Greenland already has autonomous self-government under Danish sovereignty, they don’t have to handle/pay for defense or foreign affairs. Even buying a decent one-bedroom apartment in NYC to be the Greenlandic Embassy/UN Mission would set them back $20/head (not counting utilities and maintenance). Where are they going to get the people/expertise to handle everything on their own? Or are they hoping the (pretty rich) Danes will continue to throw money at them? Indeed, it’s hard to imagine they would be considering independence if they couldn’t shelter under the EU and NATO.

This problem of micro-states arose with the wave of independence in the mid-20C as European empires unraveled (there were a few that pre-date that era, leftovers from the Holy Roman Empire (Luxembourg, Liechtenstein) or fringes of unconsolidated France (Monaco, Andorra) or Italy (San Marino)). In the 20C, a fair number of places were so minute that they stayed as formal colonies (with new names) or were formally integrated with the home country. But many insisted on independence to be ‘au courant,’ and have usually remained on someone’s dole. Being independent allows—I guess—a certain degree of self-respect (carrying a flag at the Olympic Opening Ceremony and all that), but not much else.

After all, we live in a world of states, each of which expects everyone else to take on the same form/status. As I have noted elsewhere, this often doesn’t work out, either because there was no coherent political community from the get-go, or because the nature of the community or economy underwent significant change and the trading/community patterns no longer match the land boundaries.

The micro-states don’t have those problems (most are islands), but they’re not capable of functioning at the international level. In other words, they pretend to independence, but are functionally (and usually financially) dependent either on their historical ‘owner’ or on whatever multi-national trades in whatever they have to offer in world markets. The rich/big countries throw some money at them on general principle, but the people are usually stuck (either physically or metaphysically).

So, perhaps the Greenlanders will opt to raise their own (red-and-white) flag. Perhaps they will sell themselves to the orange-haired one. But I suspect they’d be better off lying low, sticking with the (generally friendly and good-natured) Danes, and staying home and warm. Despite global warming, when it comes to geopolitics, baby, it’s cold outside.


Revolutionary Era.2

1/3/2025

A month ago, I wrote about the incipient 250th anniversary of the American Revolution and added some comments about the unlikelihood of similar events here in the 21C. This is a reconsideration of that latter point.

One of the benefits of being an open-minded historian is that I get to change my mind after taking a position, discovering I didn’t know as much as I thought I did, and then researching and thinking a bit more. So, my skepticism about future revolutions is an instance of (almost instant) revisionism; and likely not the last, especially since I’m still sorting out what I want to say for my upcoming OLLI course this winter.

The first point to make is that the nature of revolution has changed over the centuries. Or, in other words, we use the same word to describe some different things. This is partially a function of ordinary semantic sloppiness, but more fundamentally, as phenomena occur that are similar to previous phenomena, we use the same word to describe them. It’s one aspect of the famous Mark Twain misattribution: “History doesn’t repeat itself, it rhymes.” The rise of ideologies, nationalism, and the taking root of beliefs in democracy (all themselves products of the revolutions of the 17/18C) became essential parts of the discourse of revolution later on. More importantly, participants in later phenomena, e.g. the Russian Revolution of 1917, are conscious of their past, in this case, the great French Revolution of 1789. The desire to replicate or to avoid such precedents not only changes attitudes and outcomes, but makes the nature and purpose of these echoes different. The Revolutions of 1848 across Europe were in a dialectic with (i.e., they were responding to) what happened in Paris in 1789 (and 1830 and 1834, etc.). Revolutionaries of the 19C and 20C had read their Marx (and, later, Lenin) and sought different goals as a result. So, it’s only by taking a step back and recasting a model of revolution more abstractly and generally that we can keep the word and the actions connected.

With this broader perspective, we can see that the 21C is filled with revolutions—some successful, some failed—and that there is no particular reason to think that we’ve hit some metahistorical wall ending an era (or, at least, the label for an era). In the last three years alone, we can see the culminations (one always has to be careful not to characterize a particular revolution as having come to a “conclusion”) of revolutions in Afghanistan, Bangladesh, and Syria. There are ongoing struggles (which we could call revolutions in process) in Venezuela, Sudan, Myanmar, and (arguably) elsewhere. It’s anyone’s guess as to whether any in the former group will stabilize (and, if so, how) and whether any in the latter group will “succeed.”

But just to take the three that seem to have come to some climax, we can see that revolutions overlap with all sorts of other labels for major political violence, such as insurrections, civil wars, and wars of independence. They don’t fit into any clear and tidy model. The forced departure of Sheikh Hasina in Bangladesh was the result of years of turmoil and political infighting, but relatively little violence (at least as compared with the following two examples). A civil war in Syria has been bubbling at various temperatures for more than a dozen years. And for any semblance of political stability, Afghanistan really has to look back at least fifty years, to the monarchy overthrown in a coup in 1973, having suffered in the meantime from invasions by the USSR and the US, tribal disputes, and all manner of incoherence.

Indeed, both Syria and Afghanistan faced their own religious disputes within the broader framework of Islam. The former was ruled by a family from the Alawite sect, which comprises less than 15% of the population. The latter faced disputes among different religious/tribal factions until the radically conservative Taliban succeeded in securing control of the capital in 2021. Both countries’ incumbent governments were the products—to a greater or lesser degree—of externally imposed structures and institutions, and their revolutions can be said to be in part a rejection of those foreign injections of power (imperialism). In both cases, revolutionary forces sought to forge fundamentally new structures of government and social organization. In contrast, Bangladesh, whose independence and coherence date from the 1971 secession from Pakistan and the 1947 Partition of British India, seemed relatively democratic, and the recent eruption and ouster of the incumbent government did not seem driven by either religious or ethnic issues, but rather by a combination of concerns about corruption and incompetence.

The Taliban’s triumph in 2021 led them to consolidate governmental control (although the regime remains a political outcast internationally). The nominal governmental structure in Bangladesh remained in place after August 2024, with a technocratic caretaker government still sorting out its constitutional and operational plans. And, as to Syria, it is far too early to tell what will emerge from the December 2024 climax, even on an interim basis.

This is, of course, just a skimming of the many ways we could compare the developments in these countries and the degree to which the facts on the ground aligned with any number of models of revolution. It’s no wonder historians are inherently wary of social science approaches to the diversity of human behavior and social structures.

One notable aspect of these revolutionary developments is that none were well-predicted. Some historical developments resemble slow-motion car crashes where there are few surprises. Here, however, the world was stunned by the rapidity of the changes that eventually emerged out of recognized cases of instability. Who on the outside has any real sense of what is going on in China or Russia? Could there be secret cracks behind the facades of authoritarian control? Even so, aren’t coups there more likely than revolutions?

Historians are not inherently better at prediction than any other group, so I won’t be going down the road of specific prognostication of when and where the next revolution will break out or succeed. As I have noted elsewhere, the state system and most established governments around the world are facing crises of competence and legitimacy. There are a lot of distributed and uncontrollable means of military power sitting out there. People are unhappy. All the key ingredients for revolution are there.


