A friend of mine recently connected me with a think tank called New Consensus and, in particular, their rather stunning set of proposals for national renewal called “Mission for America.” It’s led by the folks who originated the “Green New Deal” concept, some notable portions of which ended up being enacted by the Biden Administration. It makes for heady reading. You can check it out here.
For all the modern moralizing and disparagement, mind-altering substances have been immensely popular across human history. Indeed, taken together, commercializing and promoting their use has been a significant driver of geopolitics, globalization, empire, and (often) the economic foundation of the shape of our world—and far more so than the efforts to attack their supply or deter their use.
Indeed, the moral righteousness around drugs has often been highly selective. I would bet that the lawyers going after the Sacklers for the latter’s role in the 21C opiate crisis consume a fair amount of alcohol and caffeine (ditto for the many “soldiers” (and “generals”!) in the US “War on Drugs” from the 1970s onward). The temperance movement which flourished in the 19C and culminated in the US (short-lived) Prohibition Era (1920-33) at least condoned tobacco, sugar, and caffeine. So, this is (yet another) area where we have to be careful of hypocrisy and anachronistic judgmentalism. Speaking broadly, such drugs (my list includes opiates, tobacco, alcohol, sugar, caffeine, and cannabis) have been used not only as sources of profit, but also as tools of control, both in terms of imperial relationships and domestically. Regulation/prohibition, taxation, import monopolies, export monopolies, coercive labor regimes, and a continuum of violence from petty criminality to full-scale warfare are commonplace in the history of drugs. Indeed, I suspect that one could rather robustly populate a multi-dimensional matrix, with drugs, power structures (states), and time as its three axes. Probably the most famous such situation in (relatively) recent history is the British cultivation and export of opium from India to China in the 19C. Famously wrapping themselves in the flag of “free trade,” the British insisted that China allow opium imports and unleashed two wars in the 1830s/50s to enforce their power to push drugs. Sugar was a key component in the British/American three-legged trade pattern of the 18/19C, famously captured in the phrase: “rum, molasses, and slaves.” Tea from China, and later India, was similarly critical to Euro-Asian trade patterns (not to mention its centrality to British culture). Coffee was a later and (until the 20C) lesser trade factor. Tobacco from the British colonies in North America was important from the days of Sir Walter Raleigh (16C) onward.
From another perspective, we can see the efforts of the US to prevent a variety of drugs from entering the country manifested in all sorts of quasi-military actions in the vast majority of countries in the Western Hemisphere over the past 50 years. There’s some evidence that the “War on Drugs,” initially promulgated during the Nixon Administration, was more focused on criminalizing the behavior of minority communities and other political adversaries than it was aimed at directly improving the social fabric of the country. Apart from the impacts of use/abuse, the production of such commodities has been the site of many forms of oppression. Much of the sugar produced in the Caribbean in the 17C-19C came from slave labor, as was a significant portion of the tobacco produced in the southern US. Even without formal slavery, exploitative labor structures can be found in coca farming in South America as well as opium farming in 18C India and 20C Afghanistan. Notably, the differing roles of states, formal state-sponsored commercial enterprises (e.g., the British and Dutch East India Companies), and less formal organizations (e.g., drug cartels that take over regional/local governmental administration) ensure that we can’t just look for whose flag is being flown to understand “cui bono” (i.e., “who benefits”). This is not just a historical concern. The NYT ran a piece last month on the exploitation of sugar workers in India. I have noted before that “history” is not just a matter of kings, battles, technology, or philosophy. As my friend, Jim Grossman of the American Historical Association, says: “Everything has a history.” As importantly, all the pieces, angles, events, and descriptive language are interconnected. You can’t write about the naval arms race between Britain and Germany in the early years of the 20C and how it contributed to WWI without understanding how the Brits accumulated the cash to build the boats.
Slaves in Jamaica, peasants in India, opium buyers in China, tea drinkers on every continent, and millions of others could tell you how that capital was accumulated. The distinctive thing about drugs in history is not that they were vehicles for profit and exploitation (within or between empires); nor that they were items of human consumption (as the many great histories of food, clothes, and other “things” amply demonstrate); rather, it is that these mind-altering substances are tangled up with the nature of human consciousness, all manner of social convention, and, therefore, morality. Producers, traders, and users have all been subject to condemnation or condonation in ways not usually associated with corn, cotton, or transistor radios (remember those!). When governments get involved, whether as regulators or facilitators, the social/political and moral landscapes get even more complex. These entanglements make it more difficult than usual for historians to “unpack” their own (personal or societal) judgmentalism and package/frame/interpret the practices of the past. The evolving characterizations of human consciousness (torpid or energetic, mystic or hallucinogenic) add another layer of complexity. How are we to see those who used (or opposed or produced or traded) opiates, tobacco, alcohol, sugar, caffeine, and cannabis? Teetotalers? Churchill with his cigars and champagne? Shamans? MADD? Plantation owners? Rum-runners? Cocaine mules? Rastas? Betel-chewers? “Lotus-eaters” of 18C opium or 21C opiates? It’s a bewildering set of exercises that forces any seeker after coherence and consistency to pause (which is, after all, History’s job).

The 24th Amendment to the Constitution was adopted in 1964 as part of the push for civil rights legislation of the 1950s/60s. It prohibits states from imposing a tax on voting, a device widely used in the former Confederacy. (Unsurprisingly, 36 of the 38 states that initially ratified the amendment were from outside the South.)
The Poll Tax was introduced after the Civil War and became widespread throughout the region by the early 20C. Basically, it was intended to keep poor people from voting, and since almost all Black residents of the southern states were poor, it was an effective device to keep Blacks from political power. Supporters of this approach to politics have since moved on to other modes of voter suppression, but this particular technique is consigned to history. Beyond the racist and classist aspects of this approach to democracy, there is a more fundamental point: we should want everyone to vote. As I have suggested in my new draft Constitution (portions of which have been posted here over the past few years), such a policy should be affirmatively stated in our basic document. However, there’s a way to make that happen even without a constitutional amendment/convention. Voting in Presidential elections has averaged about 58% of the eligible population over the past 40 years. As we all know from anecdotal evidence and our own practice, turnout in state/local and off-cycle elections can be considerably lower. Although turnout in the 2020 Presidential election was the highest on record at 66%, there is a broader concern about what we might call “voter ennui.” There is widespread disenchantment with the political process, particularly among younger citizens. They need a bit of a boost to get them to the polls. A few other countries mandate voting, notably Belgium, Australia, South Korea, Singapore, and much of Latin America. Such requirements are often nominal, with little enforcement mechanism. Several places, such as Venezuela, Italy, and Austria, have repealed requirements in the past 30 years. A mere aspiration or admonition doesn’t seem practically effective. Criminalization of non-voting seems a step too far. Most countries that require voting impose only a small fine for failure to vote.
A nominal penalty, however, implies that the duty is merely pro forma and not to be taken too seriously. Instead, the cost of non-voting should be significant, reflecting a broader belief in our shared responsibilities of citizenship and our need to act as a single political community. Fortunately, we already have a mechanism in place, and a precedent that can be adapted, to encourage voting: the tax code. We could increase current taxation rates by, say, 1% across the board (e.g., if you were in a 10% bracket, you would now be in an 11% bracket). Then, we would apply an offsetting citizenship credit, supported by an official electronic certificate from your local registrar; the equivalent of an “I voted” sticker. Those who didn’t vote would thereby pay a meaningful additional tax. By using the existing tax collection mechanism, we would minimize the bureaucratic processes of enforcement. We would also ensure that the “fine” for not voting was graduated by income. One additional twist would be to impose such a “citizenship” charge on non-citizens: a surtax, as it were, for the privilege of being a guest in this country. As a matter of principle, taking steps to ensure that all citizens had an easy method of voting would have to be a prerequisite to such a proposal, so that this “non-poll tax” would be part of a broader set of policies and standards around registration, access, and methods for voting. Given the history of voter suppression, this would argue for a more extensive federalization of the voting process; a step I take more fully in my new draft constitution. Indeed, we could think of this voting incentive as part of a larger package of principles and actions on the nature of democracy and citizenship in our society. Instead of being seen as an aggravation or obligation, voting should be characterized as an integral part of participating in the direction of our society.
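The arithmetic of the proposal can be sketched in a few lines of code. Everything here is hypothetical: the bracket bounds and rates are invented for illustration, and the citizenship credit is modeled simply as a refund of the one-point surcharge on taxable income for anyone who voted.

```python
# Illustrative sketch of the "non-poll tax" arithmetic described above.
# Bracket bounds and rates are invented, not actual US tax law.
BRACKETS = [(11_000, 0.10), (44_725, 0.12), (float("inf"), 0.22)]  # (band upper bound, base marginal rate)
SURCHARGE = 0.01  # the across-the-board 1-percentage-point increase

def tax_owed(income: float, voted: bool) -> float:
    """Progressive tax at surcharged rates, with an offsetting credit for voters."""
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        band = max(0.0, min(income, upper) - lower)
        owed += band * (rate + SURCHARGE)  # every bracket pays its base rate + 1 point
        lower = upper
        if income <= upper:
            break
    if voted:
        owed -= income * SURCHARGE  # citizenship credit cancels the surcharge
    return round(owed, 2)
```

Under these assumed rates, a $30,000 earner who skips the election pays $300 more than one who votes, while a $100,000 earner pays $1,000 more: the "fine" is automatically graduated by income, which is the point of routing it through the tax code rather than a flat penalty.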
You can ask those who have become US citizens how important the right to vote is to them. You could (theoretically) ask those who live in places where voting is a sham. One of my favorite slides in my lectures on the history of democracy shows the people of South Africa lined up by the hundreds for their first chance to vote for their own representatives after decades of struggling against apartheid. Individually, many of us take it for granted. Some say voting doesn’t matter; that the two parties are pretty much the same (a rather more plausible argument a decade ago); that the system is broken so that democratic choices are stifled by a dysfunctional political process; that the world is going to hell anyway, so what does it matter. I will spare you the usual recital of democratic virtues in rebuttal. My response would be: “Too bad. Sorry that it’s a bit of an inconvenience; but it’s part of the admission price for being here.” Indeed, it’s logical nonsense to have rights without responsibilities, and such a political system will sooner or later break down. Societally, we don’t take the responsibility of voting seriously. Public discourse is full of mumbles. Those who promote voting are dismissed/ignored as lovely “do-gooders.” Voting needs to be brought back from the fringes of our political discourse. It should be fostered, facilitated, and celebrated. Debating my proposal would certainly contribute to that. Implementing it would ensure that voting wasn’t merely a side-show to our culture as a country.

When Jesus said: “Suffer the little children…to come unto me” (Matthew 19:14), he (or rather the 17C authors of the King James Version of the Bible) was using a now archaic meaning for “suffer.” In our modern diction, we would say “allow.” However, what we do as a 21C society is make them suffer (in the modern sense of the term). It is horrible and hypocritical and not historically unusual.
I have noted elsewhere the anomie and resentment that is deeply embedded in young people today: the sense of not only being dealt a tough hand (e.g., Covid), but facing considerable disappointment in terms of career and housing opportunities, as well as a looming climate catastrophe. On top of this, they recognize the profound dysfunction of our political culture, which makes these substantive challenges seem unsolvable. There is considerable evidence that parents allowing (“suffering”) their children to immerse themselves in smartphones and, especially, social media is a principal cause of the current wave of disaffection/alienation and broader psychological distress with which many young people contend. Indirect peer pressure (children insisting on access because all their friends have it) seems to overwhelm whatever prudent and “common sense” response parents might have. Our society seems to have no problem in saying that certain brain-altering agents (tobacco/alcohol/cannabis) are off-limits for those who (as a class) have an insufficient capability of managing themselves. We have been slow to catch up to technology and add screens/social media to this list. It is not clear whether those whose youth has been smartphone-dominated (i.e., born since 2000) can recover, nor how many more will be allowed to harm themselves in this way. Juxtaposed against these woes is a cultural mythology (hardly unique to the modern US) of cherishing our progeny. It’s built on a profound, genetically rooted set of practices. Pretty much every parent makes sacrifices (often heroic) to support and protect their children, who are especially vulnerable to the vicissitudes of life. Limited medical knowledge prior to the 20C often made childhood a minefield of illness and death. For millennia, financial necessity meant marshaling all family resources to such a degree that the sharp limitation of child labor is a marker of social and economic progress.
When we look at the health, education (formal and practical), and life choices which the modern world provides to our youth, there is no doubt that they are, as a group, better off than those of a century or more ago (even if a sharp family-wealth gradient remains). What, then, are we to make of a society which sends those children to schools where “active shooter” drills are commonplace? What, then, are we to make of a society where schools have given up their standards and allow (suffer?) their children to pass classes and graduate high school without basic educational capabilities? Some of these are peculiarly US issues, but the increase in climate-related illness/hunger/risks is world-wide. There the dangers are less immediate, but pretty much universal in impact; and mitigation/reduction efforts founder on the short-term economic claims of the generations who have already profited by modern capitalism’s exploitation of the globe. What are we to make of an immensely wealthy America which allows a set of breakthrough Covid-driven child-support programs to founder? One where schools in general and child-support systems of many kinds are chronically underfunded? There are valid points (e.g., liberty, limited government) underlying all the arguments which led to the demise of these practical improvements in the lives of children. No one is—on the face of it—against children, health, and opportunity; it’s just that the cost of providing them is (apparently) too high and something has to give. Such arguments around the need for compromise and trade-offs are normal in political discourse. When they are made, for example, by competing industries around the need to regulate or benefit one such group to the detriment of the other, we may have one policy preference or another. But children have few lobbyists and lawyers, and the resulting economic benefit flows to embedded/voting/older generations.
If it weren’t so predictable, we could look at those who want the State to intervene in support of the “rights of the unborn” but are unwilling to support such intervention on behalf of the health and education of the “recently-born.” We could look at the champions of “family values,” a phrase which has become a euphemism for parental protection/insulation from the values or evolving beliefs of children. We could look at those who demand the liberty to carry dangerous weapons to such a degree that the liberty of others to be free from the fear of dangerous weapons is shoved into a closet. Globally, issues of health and hunger are much more dire than in the rich “West.” Children there suffer the doubly shadowed status of being both children and residents of poorer countries (and “of color”), which, combined, makes them nigh invisible to most with wealth. Pretty much every crisis we read about, be it political, economic, or cultural, is a more dire crisis for children. Little of this is new, historically speaking. But the current situation seems more acute because of the combination of threats and lost opportunities which stand in sharp contrast to the amassed wealth (both individual and societal) which could be deployed. We cannot blame the children when we seem to actually care so little about them.

As the dullest electoral primary season in recent history is effectively over (but for the breathless media reports and widespread hand-wringing culminating in the national conventions/infomercials), real attention has disproportionately shifted to the geriatric general election (with its breathless media reports and widespread hand-wringing). In particular, I know a fair number of folks who are distraught (not unreasonably) at the prospect of the return of he-who-shall-not-be-named.
With that perspective in mind, I have been wondering why this prospective disaster-in-the-making is different from some other historical precedents. We rarely can predict the future with any degree of confidence. Looming disasters/slow-motion train wrecks, including the US Civil War, WWI, and WWII, were amply foreseen, but only vaguely or conceptually. The natures of those calamities, not to mention their outcomes, were in line with the views of only a few prognosticators; even more so, their aftermaths. In each case, horrors, shocks, and destruction set the stage for periods of amazing growth and activity. History has no straight lines. In my work as a teacher of the history of democracy, I use many examples of the erratic course of “progress.” The promise of the post-Civil War amendments for Black people was repeatedly dashed by the deep inertia of racism legally, politically, and culturally. Weimar Germany morphed into Nazi Germany. The French Revolution was, famously, a decades-long ping-pong match between radicals (of various flavors), moderates (of various flavors), and monarchists (ditto). Even to take a current situation, there’s a potential future history of 21C US politics in which the 2016-2020 era is a fluke, washed out of significance within a decade. (Yes, there’s also a scenario in which the Biden era is seen as the last flailing gasp of mainstream politics as the country/world swirls around the drain.) We just don’t know. I’ve been doing some research lately about what has to be one of the most remarkable flip-flops of political structure/culture (certainly in European history, if not globally): the mid-17C Civil War which led to a British republic. Frustration with a politically clumsy King (Charles I) led to armed conflicts in 1640s England, Scotland, and Ireland (not yet formally combined into the United Kingdom) between Royalist forces and a Puritan-led Parliament.
Disputes mainly centered on an overbearing and increasingly authoritarian state, taxation without consent, and a bewildering variety of religious differences. (Voltaire later remarked, with a pithy double-back-handed compliment, that “England has 42 religions, but only two sauces.”) Parliament raised an army to fight the King, but lost control of this “New Model” Army, which became its own political force, eventually leading to the capture and execution of the King (1649). Oliver Cromwell, a minor provincial figure who had risen to command of the Parliamentary Army, became “Lord Protector,” but Parliament was pretty ineffective and he (and, briefly, his son) ruled until things fell apart and, by popular demand, the monarchy was reinstated in 1660 under Charles II. Charles II and his brother James II ruled until 1688, when the latter’s pro-Catholic policies proved too much for the English and he fled to France. This “Glorious Revolution” led to the establishment of a constitutional monarchy with a dominant Parliament that looks a lot like the structure of British government through the 20C. What do I take from this (much-abbreviated) story? First, the see-sawing of power between conflicting philosophies and parties and structures. The French Revolution, which began a century later, produced its own version of this swinging back-and-forth. Both were dramatic illustrations of the futility of long-term prediction and the power of contingency. There were times of great turmoil and danger, and then things settled down. Virtually every party and outlook had, at various times, reasons to rejoice and feel triumphant, only to shortly find themselves with the short end of the stick, politically speaking. Second, the road to our current political culture (a global version of liberal democracy) was far from assured.
It could easily have been truncated (if Charles I and James II had been more moderate, they could have maintained a vibrant monarchy for some time) or, alternatively, radically accelerated. There were active groups in the late 1640s that proposed structures of power and principles that would have been recognizable to American and French Revolutionaries a century-and-a-half later. Indeed, it was the “republicans” (Cromwell et al.) who did more to suppress these proposals than the royalists of the day. So, in terms of our current predicament, there is reason to fear, but not likely—on the history—reason to despair. Things may turn out well or badly—but we have no idea today. It is fair to ask, beyond this framing, whether the nature of modernity, and in particular its legal/social structures, the disintermediation of social media, and the sticky nature of the degradation of the global climate, makes such analogies from history irrelevant. Could well be. Still, there is no basis to crumble into a small puddle. The Levellers of mid-17C England would have seen their ideas of voting power and equality embodied by most countries in the world by the 20C. The world’s most powerful hereditary power structure is a small, bleak country in East Asia, and the Bourbons, Windsors, etc. are consigned to shadow plays of performance art. After Weimar fell, the “thousand-year Reich” lasted only a bit more than one decade. This is not to dismiss the darkness that occurred during the down-cycles. Nor is there any reason to think that democracy is assured by some simplistic reading of “human progress.” There is work to be done. The history of geopolitics is an unending tale of “rise and fall.” The king of the hill in the 19C is scrambling for position in the 20C. Different Chinese dynasties held a range of positions in east Asia over thousands of years. The Mughals in India were replaced by the Brits in the 18C.
The Spanish took over from the Aztecs and the Incas in the 16C, but their empire in the Americas had collapsed by the early 19C. There is much to be told about the whys and hows, patterns to be traced and exceptions noted.
This history is usually a story of power—primarily military/coercive and economic (sometimes also coercive)—and, according to the Realist school of international relations, that’s all there is. Notions of morality and ideologies, central to the Idealist school of understanding how countries actually behave, don’t count for much (however much one might aspire to morality in international affairs). Of course, from a historical perspective, the nature of those ideas and ideals has changed over time: you can’t understand European politics from the 12C through the 19C without looking at the internal structures and changing configurations of Christianity during that era. Ditto the variations of Islam across North Africa through Central Asia from the time of Muhammad. For us in the 21C, the ideological sprawl of modernity is everywhere: Communism, Fascism, Liberalism, Nationalism, democracy, etc. For example, the conceptualization of the individual and ideas of international cooperation spawned in the 19C have played a global role, not least in the concept of human rights. When combined with mass media and “public opinion,” there is motivational power—a “soft” power—in these ideas; at least to such a degree that virtually all countries pay at least lip service to the ideas of democracy and law and (dare I say it) morality. Yes, there are many variations and exceptions, and such “soft power” won’t win a war on its own, but what passes for global public opinion has had some alignment and securing its approbation is attractive even to the most craven (although this gets pretty faint at the extreme end). As with other forms of power, this public esteem/consideration/acquiescence can be spent. It’s usually not as dramatic as a ship being blown up, or an IMF deal being reneged upon, but over time the reputation of states that gives them credence can also fade. Probably the clearest example of this phenomenon lately is Israel. 
Here is a country that was established at the behest of European/Western countries in the aftermath of WWII in recognition of the horrific crimes committed against the Jewish people in the Holocaust. Tainted, to be sure, from the beginning by the impact of its establishment on other peoples who had also been living on the same land, Israel has nevertheless garnered wide respect through its alignment with Western democracies, its material success, and its embrace of Jews leaving less friendly places around the world, as well as accommodation for the actions it has taken vis-à-vis other groups in the Middle East. Notwithstanding the history that it stands on, Israel’s more recent pattern of aggressive behavior and settler imperialism has done much to erase its accrued goodwill. The latest excesses in Gaza have created a moral deficit that will take decades to offset. A second example is South Africa, which gained substantial moral credit from its decades-long struggle for majority rule and democracy (now marking its 30th anniversary), embodied in the person and leadership of Nelson Mandela. Its leadership since then has been spotty and prone to several varieties of corruption that often plague countries without deep practice in national community and civil society. Despite its regional (economic) influence and the lack of competitors in Africa for leadership, it risks leaving Mandela to history and becoming rather ordinary. In the 20C, the Soviet Union initially garnered great credit for its (nominal) commitment to socialism and its success in modernizing its country (the costs and oppressions were helpfully hidden). It spent its moral capital crushing Poland, Hungary, and Czechoslovakia and had few friends when its economic contradictions finally undermined its political culture and its empire.
Finally, our own United States, which has claimed a mantle of democracy for over two centuries and demonstrated a commitment to global improvement over the 20C, most particularly during the two world wars but also in its attention to democracy and human rights outside of wartime, is also facing a reckoning. To be sure, our efforts towards the world have always been colored by the promotion of our own interests and a string of excesses in the projection of power. For all those shortcomings, the US has likely done more good for others (both individuals and groups) than any other modern country. Still, memories are short (and getting shorter) and there’s no small sense of “what have you done for me lately?” in the perception of 21C America in the world. It is natural that both the power structures and the wider population within these countries (and there are others) have drunk deeply of the stories and characterizations that reflect this moral high ground. It becomes part of the national psyche, but that only makes the erosion more easily seen from the outside. It comes as a great shock to learn that one is not universally loved/respected. But the conflicts, compromises, and shortcomings inherent in the exercise of power (of all flavors) are more visible externally, and the harms they cause are appropriately counted against whatever moral “assets” have been accumulated. Countries that claim the “moral high ground” in terms of international stature, competition, and conflict seek to reap both tangible and “soft” benefits by holding themselves out as role models. They often seek to redefine morality so that their actions and beliefs place them on that “high ground” as well. Some countries, on the other hand, don’t care. But those that trade in this “moral economy” need to remember that even the loftiest and most robust enterprises can go bankrupt if they don’t watch their balance sheet.
One of the delights of spending a lot of time with young people is that I get sharply different perspectives on the world. It’s especially good when I can get myself to disengage from my professorial lecturing mode and listen to what they see and what I can get them to say.
A minor example occurred a few weeks ago when we were talking about current social relationships among college students. I heard a term for a particular personal attitude and wrote it down on the board as “dog-like,” only to see smirking eyes, which one of the group explained by saying that the term was “dawg-like.” My point has less to do with the specific situation or definition (you are welcome to google it, but the references to its use 10+ years ago as a complimentary and endearing term seem to be already outdated), but rather the personal distance of perceiving social change. Not having raised kids, I didn’t go through this process from the elder perspective (although I was part of the younger end of the “generation gap” back in the day). At this point, with students who could be my grandchildren (many born 2000-2005), the “gap” is wide and widening. Different mores, styles, and language are commonplaces of modern life, amplified by a technology-based acceleration of the normal pace of change in how we live. As a historian, I am always on the lookout for ways in which “change over time” manifests. From a historical distance, it is a challenging exercise to parse incremental social developments and see which ones—which often garnered a lot of attention at the time—fall into one of three characterizations: 1) the shock of the new, 2) the adjustments of transition, and 3) more long-lasting alterations. The first two are, pretty much by definition, transitory, but it is usually nigh unto impossible to sort them out while we are in the middle of things. The impact of AI is a case in point: great brouhaha a year ago, followed by various degrees of cautionary tales and, simultaneously, wider changes in current practices and further improvements. It seems clear that “AI”—whatever it is—is still way too amorphous to define, much less assess.
Much the same could be said of the longer-lasting, more encompassing terms “computer age” or “information age.” Telling my students about how I used “punch tape” to “code” a “time-share” computer some decades back produced glazed eyes from them, but fifty years ago it was as wondrous as Siri (introduced in 2011) or the app du jour of 2024. Indeed, good arguments could be made (simultaneously) that 1) there’s nothing profoundly different about AI; it’s just souped-up computing processing; and 2) we are only at the start of the “AI revolution” which will fundamentally change the idea of what it is to be human. How many of the current debates will have faded in ten years (or fifty) as archaic or trivial transitional questions (how many outdated connector cables do you have in your closet?); and how many will be seen as fundamental framings of the way the 21C worked out? At another point, I asked some students about their use of social media and was not surprised to learn that few use Facebook (apparently now principally the preserve of “older” folks). But I was quite intrigued to hear that their use of social media generally is going down considerably. The main reason is the proliferation of ads, which makes it harder both to find postings from people with whom one has a “real” connection and, reciprocally, to ensure that your social media contacts will actually see your posting. Whether these changes will be long-lasting we will discover in due course. But the very question of whether either the “rise” of social media or the current drawing-back from its use among younger people is a blip or a trend highlights the current uncertainty of these phenomena. It also makes clear that much of the surrounding breathless excitement over the past 15 years has been created without any sense of durability or significance. Stated simply, we don’t know what’s going to stick, alter, or fade.
Similar points could be made about drug use patterns, personal savings rates, gender identity, or indeed, the importance of a college education. There was a time when jazz music was an important part of the US music scene, but now it seems to be waning and the demographics may consign it to a place not far from folk music as part of our popular culture. There was a time when all “businessmen” wore neckties every day, and now I just hang on to a fraction of my collection for use at major social events. Bumps are not trends, and trends rarely last; while we’re in the middle, we can’t tell what is a “secular” change and what is a fad, or what is the “norm” and what is the anomaly. Thus, the benefit (to me) of engaging with a distant generation in the classroom and even more distant generations in my research and lecturing; each provides a critical corrective to getting stuck in my own cultural cul-de-sac, imagining that a person’s range of experiences—whether my own, my students’, or those of the past—has some fixed meaning and effect. There was a time (starting in the late 19C) when the terms “moron” and “imbecile” were considered scientific and enlightened; an improvement over the then commonly used “lunatic” or “mad.” As is often the case, such terms gradually acquired derogatory connotations and were consigned to the dust-bin of history, at least in official channels and the “better” parts of society. For a while, “mental retardation” was the preferred terminology, but in 2010, the official US government term was changed to “intellectual disability.”
Similar paths have been followed by labels for a variety of ethnic and “racial” groups. The early 20C saw the establishment of the National Association for the Advancement of Colored People (1909), a term that by mid-century seemed outdated and burdened with considerable social baggage. At various stages, colored, negro, black, afro-american, and african-american have all had their vogue. Names were often changed because of the negative connotations that had accumulated over time and a desire for a fresh and positive denominator for a group that sought a clear identity. Of course, some terms have been so heavily weighted with animus that they are de facto barred in common social discourse (e.g., the “N-word”). In the last decade or so, “people of color” has gained in currency; sometimes as a larger grouping of all non-“White” peoples, sometimes as a grouping of all such peoples who are neither “Black” nor “White.” (As I have noted elsewhere, this is a particularly US issue, since in most of the world, “White” people are the exception. Not to mention that “White” people aren’t white, and whatever color their skin might be called, it’s a color; so the whole discourse is heavily skewed by the history of global epistemological power.) The linguistic history of labels for people with various physical or mental limitations or abnormalities is also the story of the interactions of medical/scientific nomenclature and theories with changing social perceptions; most recently seen in the differentiation of “disabled people” from “people with disabilities.” At this point, I should acknowledge that I have not had the experience/perspective of any of these groups (i.e., I am “privileged”). I also believe that people should be able to choose the labels by which they wish to be addressed/referred to.
Historically speaking, the repeated restructuring of such terminology frames seems to be unique to the last century and a half (i.e., a product of modernity) and, I suspect, has contributed to the social confusion/disorientation that many have experienced. The underlying issue, however, is not the labels by which groups are identified. It is the animus or disparagement with which such groups are regarded. The (repeated) efforts to change people’s minds by campaigning to change the socially acceptable language which they use seem to me to be at best ineffective, and sometimes counterproductive. There is some research in linguistics which suggests that the language people use reflects, and perhaps shapes, their epistemology. Inuit people have dozens of words for snow, while in the 21C US, we have dozens of words for salty, crunchy, plant-based foods. So, changing language perhaps can change minds. I certainly agree that mustering the forces of linguistics for social justice makes sense but, at the same time, let’s not pretend that new labels will give us much purchase for deeper social change. Indeed, I can’t help but suspect that arguments over labels and nomenclature are but a substitute for a meaningful dialogue (and the hard psychological work) about the feelings behind the words. Instead, we argue whether “person of color” means something different than “colored person,” or “deaf” means something different than “hearing impaired.” There might be some (moral?) improvement in the shift from “disabled person” to “person with disabilities.” Is it better to refer to someone as “fat,” “obese,” “zaftig,” “weight-challenged,” or “full-figured”? I understand the effort not to define such an individual by a particular aspect of their lives. However, there seems to be some degree of awkwardness about this change.
It also runs against a general preference for using the “active voice” (perhaps because it casts these individuals as passive subjects of the condition of their bodies). There is also something off-putting and distancing about this mode of construction; it carries a “scientific” and analytic connotation that seems disconnected from the realities of social relations. Perhaps, at the margin, this kind of language is OK, but it’s still no substitute for figuring out ways to stop derogating people for aspects of their lives that 1) they cannot change, and 2) have no moral significance. In my own practice, I try not to use such adjectives at all. I find that the patterns of speech I developed over the years (referring to others as, e.g., “the young black woman in my class”) would be better if I just referred to such a person as “a student.” Most of the time, identifying adjectives don’t really add anything to the point of my talking about them; and, where it’s relevant, I can add the appropriate adjective back in. This requires some effort to be conscious of what I’m saying; an effort over and above the deeper work of changing underlying attitudes of prejudice, avoiding terms designed to incite animosity and resentment, and actually seeing people for who they are.

At the turn of the 19C, the English Romantic poet William Wordsworth wrote a sonnet titled “The World Is Too Much With Us.” In his poem, Wordsworth decried the materialism of that age (a time which seems so organic, measured, and simple to us now) and the alienation from nature which were among his frequent themes. He would likely be saddened, but unsurprised, by our current state.
Here in the early 21C, the world is way too much with us. The “world” of which Wordsworth warned comprised overlapping spheres of affairs, commerce, and “society.” He wasn’t (particularly) talking about the “world” in the sense of globalization or the intrusiveness of technology (whether a telephone ringing or the comments of “Alexa” or “Siri”). Rather, his focus was on the attitudes of the emerging bourgeoisie of late Georgian England. You could say it was a critique of capitalism, avant la lettre. That spectre is still with us, of course. In spades. However, I take the phrase in a broader and more direct sense. The noise of the world, whether of news, sports, or popular culture, is hemming us in. Yes, this noise exacerbates our distancing from nature (or even a walk in that nature-imitator, the “park”). Its insistence (and not just its sonic loudness) crowds out our peace of mind. The newspapers of Wordsworth’s time have blossomed/mushroomed/metastasized into streaming, “social,” and other “media” to such a degree that we must make a sustained effort to escape them. On top of the noise and ubiquity, however, are the aggressive demands for our attention (born of advertising/consumer marketing and sharpened by the overdramatization and hyperbolization of language). Beyond the incessant clamor of memes and items to be purchased lies the disorientation of the material world wrought by technology. Wordsworth wrote before the “industrial revolution” had much broad impact on English (much less global) living and working patterns. The Luddites were still a decade in the future for him. But for us, “disruption” is standard. We have not yet digested the globalization of commerce of the late 20C. The information/robot/AI revolution is, hauntingly, still in its infancy. We’ve gone from Cronkite to cable TV to far more than 500 channels in less than fifty years. We adapt our lives to our appliances. What we work on, how we work and, indeed, why we work are new in every decade.
Families, the traditional bastion of social stability, spin apart geographically, even as transportation and communications seemingly make it easier to maintain those ties that used to be “in the flesh.” Careers, another mode of continuity, face pressure from the “gig economy,” portable pensions, and job-hunting apps. There is much freedom and choice in all this; benefits not to be sneezed at. Nor is it useful to imagine a prior world as some pre-lapsarian idyll. But there is a cost; real, if hard to grasp. Social changes, too, have brought many gains; chipping away at millennia of social injustice. Relationships—whether personal, social, or commercial—are more complex and dynamic. Embedded expectations of who people are and how to relate to them are upset. All this takes some getting used to; plus, there is so much, and the pace of change has accelerated so greatly, that it can easily seem overwhelming. This is what I mean by “the world is too much with us.” It is manifest in psychological distress, drug use, political animosity, dis-tethering of established patterns, disorientation, and nihilism. Some seek to reject modernity (or at least the parts of it they fixate upon). Some despair. Some are uncertain. Some disconnect. Social fabrics are eroding; which would be challenging enough if their weaknesses did not undermine the possibility of the political action necessary to even try to wrestle with all this (and the climate crisis, too). Indeed, there is a mutually-reinforcing cycle between lack of confidence in joint social/political action and the inability of societies/governments to figure out what to do. This is hardly a uniquely American problem. It can be seen across the “West” and, in different configurations, among those societies for which modernity is only partial. Despite Wordsworth’s warning, what we are facing is new, at least in degree. The increased quantity of stress has changed the quality and, as shown in all manner of physical (e.g.
polar ice melting) and social (e.g. from discontent to revolution) phenomena, there are discontinuities of response/tipping points. My point in all this is not to join those in despair/nihilism. Instead, it is to highlight the fundamental and interrelated nature of what we are facing. Superficial and symptomatic solutions (including any number of “normal” political/policy proposals) will only get us so far. Indeed, I don’t hold out too much hope for “macro” solutions, whether governmental or social. Rather, the best defense against the world being too much with me is to fortify myself, figure out what is really essential in me, and work to reject the worldly intrusions/distractions on my attentions and actions. It means managing myself in my slurping from the firehose of media—political, entertainment, gossip. It means constructing activities (hobbies?) that are meaningful to me. It means engaging with other people on a regular, extensive, and substantive basis. It means tamping down “appetites” of whatever variety (not just food/drink). It means centering on myself without being arrogant and greedy. As the old Sufi story says (in the broadest and calmest way): “If I need enough, and want little enough, I shall have delicious food.”

I’ve been reading an impressive work of historical synthesis about the Revolutions of 1848 (Christopher Clark’s Revolutionary Spring). It’s certainly not a “mass market” book (although even at 700+ pages, it’s pretty accessible); but it addresses one of the most problematic sets of events in modern European history. The winter and spring of that year saw a widespread series of uprisings across continental Europe, in dozens of locations from Poland to Sicily to Belgium. The “top-line” description of these events is “the revolutions that failed,” since virtually all of them did (at least eventually, and at least on the surface).
Governments were toppled, absolute monarchies granted constitutions, radicals and socialists tasted power—briefly—and then the forces of reaction (entrenched social/political/military elites) reasserted control, and scores of people were executed. In France, where the most extensive activity took place, the constitutional monarchy established in 1830 fell, to be replaced by a radical republic in 1848, then a more conservative republic, then by the re-establishment of the Empire under Napoleon’s nephew by 1852. Yet, despite these reversals, the ideas of change, still strongly echoing across the Continent from the great French Revolution of 1789 (Liberty, Equality, Fraternity!), sometimes hung on or were reinstated over the course of the following decades. We in the US tend to dismiss revolutions as somebody else’s opportunity/problem. We tend to think, as Sinclair Lewis wrote in his 1935 dystopian novel-turned-play about a fascist take-over, “It Can’t Happen Here.” Those revolutions that have occurred recently—the fall of Communism thirty+ years ago, the brief moment in Tiananmen Square in Beijing in 1989, the short-lived “Arab Spring” of 2011—have been distant blips for most of us. Our own Insurrection of 2021 was appalling but highly localized and easily dismissed as a fringish fluke. But just because we’re out of revolutionary practice doesn’t tell us much about the present or future. Most modern revolutions have come from the “left,” embodying demands for social justice and more distributed political power; testing whether the embedded power structure was too ossified to withstand the energy of the “people.” Some were implemented (more-or-less) through existing legal/constitutional structures, but most involved violence. Nor have they been distinctly “Western” affairs, despite the disproportionate amount of ink spilled on Europe and the Adams/Jefferson/Washington events of the 1770s-80s.
The events of 1848 offer us some useful reminders in our current situation: 1) You can never be sure what will happen next. The Revolutions of 1848 were, generally, surprises. There were agitations, protests, and intellectual ferment, to be sure. But the uprisings and violence were each the result of local culture, personalities, and power structures. Most incumbent governments were caught off-guard. Contingencies were dispositive. In times of turmoil, politics (not to mention violence) has been highly dynamic; those who lit the match were often supplanted by others with different priorities or even completely different orientations. We can see in France in the 1790s, again in the 1840s, and in Russia in 1917 a bewildering array of claims to power, some of which lasted only a few weeks. Is the current environment similar? There is certainly vast discontent in the country and a lot of ideas for change. We had an unsuccessful insurrection three years ago, and there are more than a few echoes from Napoleon III to our own orange-haired would-be emperor. There are vague rumblings of a “civil war.” Just as Monty Python famously put it, “Nobody expects the Spanish Inquisition!”, so too revolutions, while plotted and feared, are rarely announced in advance. The only claims of inevitability (of success or failure) come from lazy historians in retrospect. In the US, the propensity to violence from the 1960s to 1980s lay mostly on the “left”; but lately it is the “right” that seems most agitated and ready to force issues. History gives us enough examples of different revolutionary paths that most general story lines have been written, even if specifics will vary significantly. 2) Outbursts of “revolutionary” energy often dissipate quickly. Coalitions of convenience and discontent don’t easily translate into coherent government and stable public order. Many are just along for the ride or are quickly disillusioned and return to the sidelines.
Sicily and several parts of the Austrian Empire colorfully illustrated this in 1848. It’s much easier to critique and disrupt, and much more challenging to articulate policies and gain widespread support (as the House GOP has regularly demonstrated recently in its own small way). 3) “Progress” is an illusion. 1848 saw great claims, excitement, and celebrations. Then not. Constitutions granted were revoked, newly-minted parliaments were disbanded, freed people were enslaved, and cultural changes stuttered. Steps forward do not inherently build on themselves; but sometimes, they do. This is particularly true over time. Narratives of progress—whether for the US, Europe/the “West,” or the world—are fine as history; but, as stockbrokers all tell you: “past results are no guarantee of future performance.” Moreover, what counts as “progress” depends not only on one’s political predilections, but on digesting the actual results of past events. Things often don’t turn out the way their sponsors hoped. 4) Historical assessments depend on when they are being made. This is closely tied to the last point. One way historians distinguish themselves from journalists is that the latter write while it’s far too early to tell what will happen. But even at some distance, assessments change, and not just because of differing historiography. The revolutionary jubilations of early 1848 were pretty much reversed by the following year. The dispersal of political power to the “lower” classes moved incrementally over the following century. The Austrian Empire remained intact for another twenty years, until the shock of a loss to the Prussians forced modest changes. France, too, remained a monarchy/empire until 1870, when the Prussians (again) knocked it over and a republic finally took root. In other places, social services and the spread of the franchise moved incrementally and locally for decades. There was little of what we would recognize as full-on democracy until after WWI.
Historians along the way (and through today) have talked about the “success” or “failure” of the Revolutions of ’48; but if we stop focusing on the initial spasm and stretch our view to a century, much of what the “revolutionaries” sought actually resulted. It all depends on when you ask the question. These are not, I hasten to reiterate, “lessons of history.” Restating and interpreting (& reinterpreting!) the events of the past is what historians do, but projecting those past events into current or future situations is a game for mugs and pundits. Precedents make for plausibility and merely help the historian be “not surprised” by current events. It’s a long way from plausibility to prediction, or at least it should be.
Condemned to Repeat It