Steve Harris

History and Truth

6/25/2021

History, like any other modern field of study, knows better than to claim that it produces the “truth.” As with theories of cosmic formation or of the modes of coronavirus transmission, historians frame understandings of the past as best they can, knowing that their successors will ask new questions, utilize newly-discovered sources of information, and propound new answers that will seem, at least for a while, to be an improvement; but there is no final “truth.” Indeed, as the recipient of many cocktail-party propositions/inquiries about history, I always respond with “Yes, but it’s more complicated than that…”

Physicists and immunologists are perhaps more effectively inoculated against hubris than historians are, thanks to the frequency of paradigm shifts and lesser revisionisms in their fields. Most professionals can likely recall more than one occasion on which they discovered that they were “wrong” about some received theory.

The percentage of those who interpret history without the benefit of a degree or extensive study is considerably higher than in those fields, and their platforms (even before the 21C media circus) have been more widespread and well-known. Indeed, the insight that journalists write the “first draft of history” captures a significant portion of this phenomenon. (Professional) historians who claim the mantle of “public intellectual” comprise a closely-related group. Perhaps it is the limelight, the heat of public intellectual battle, or the melodramatic tone of modern media that leads them to forget (or at least elide) some necessary humility and to omit a degree of self-scrutiny about the theories they propound.

Both sides in the recent debates about the NYT’s 1619 Project provide examples of historical interpretations that overreach and, therefore, undermine the public benefit they seek to advance. Claims that racism is the US’ “original sin” require a fair amount of theological foundation, even if the evilness of racism is easily understood. By defining US history in this way, rather than merely characterizing racism as an important aspect of US history for 400 years, the 1619 authors oversimplify the story; they make it seem as if they are telling “the truth,” rather than engaging in an important illumination of our history.

On the other hand, critics of 1619 lazily attack it for overstating the story and readily fall into a narrative of exoneration and excuse. Indeed, the ‘whitewashing’ of US history for the past two hundred years—in which Blacks were all but absent and Whites were avatars of national triumphalism—is at least as problematic.

The Arab-Israeli conflict is another site of such behavior, as are the characterizations of Japanese behavior in China and Korea in the early 20C. Where history is hotly contested, where there is no consensus on the likely “truth,” claims are more strident, as if rhetoric and table-pounding would lead to a resolution.

Much of the time, this type of “history” is signaled by the use of the definite article “the” rather than the indefinite “a”: “the truth about…” vs. “a truth about…” Sometimes it is coded in narratives. Sometimes it is rationalized away with claims to social justice or patriotism. Sometimes it overloads the proposed theory with far too much explanatory power. The idea of “race,” for instance, was understood quite differently in the 17C than in the 20C. Similarly, claims that the “Founding Fathers” were supporters of democracy (i.e., universal voting and equally-distributed power) are just as unfounded.

As an alternative, I suggest that historians (and wannabes) be mindful of the nature of their enterprise. Charles Beard’s “Economic Interpretation of the Constitution” (1913) was, in some ways, a forerunner of the 1619 Project. It was extremely controversial and resonated with Marxist worldviews common in that era, and it remained an important part of American historiography for much of the 20C. Beard claimed that the design of the US government in 1787 was driven by economic interests, rather than by the hallowed political/liberty narrative embedded in our national culture. Importantly, however, he characterized his work as an “interpretation” and rooted it in the ongoing debate on the nature of the country and its history.

We (historians who address the public) might do better to remind ourselves that we are engaged in a process of searching for the truth, trying to figure out whether we can learn something new by devising new questions and interpretations of the past. We know all too well that most historical events/developments have multiple causes and that there is no simple mechanism to assign relative weights. Even in the “hard” sciences, there are few ‘Eureka!’ moments; fewer still in History.

The modern discipline of History arose in the 19C in response to the increased awareness of the pace of social change over the previous few centuries and the increasingly apparent inadequacy of Scripture-based narratives of the past. Wider public awareness of science was (more-or-less) simultaneous and, while History is sometimes described as a “science,” its task is far more complex and its answers are necessarily more tentative and disputable.

The underlying question that few historians (public or otherwise) raise or respond to is: “What is the benefit or insight resulting from this new interpretation?” There are, generically, several answers to this question, including 1) giving voice to those silenced, 2) putting a new theory into the mix, 3) reading old sources in new ways, or even 4) just getting published. If historians would write more detailed (and self-reflective) prefaces to their works, both they and we could better understand their purposes. We would also get better History as a result.

Yes, we’re all searching for the “truth,” but it takes no little chutzpah to claim to have found it.


Google U

6/18/2021

Last year, Google announced the deployment of its Google Career Certificates as an alternative route to employment without the need for a college degree. This marks another step in the much-needed disruption/revolution of higher education, aspects of which I addressed in my piece on April 30.

However, the real significance of the Google competitive threat is not in the quality of its courses or their relative inexpensiveness; rather, it is in Google’s role as a leading-edge employer and its willingness to forego the ornate sheepskin that certifies completion of a “college education” as a prerequisite for entry into the working world.

Google as “instructor” may not be fundamentally different from the University of Phoenix or Southern New Hampshire University, or, for that matter, Coursera or other MOOC (“massive open online course”) purveyors. But Google as employer (and, by extension, other leading STEM companies and smaller firms hiring “certifiable” talent) is a different matter.

If you are just out of high school or community college, fitting in another (not cheap) two-to-six-year gig at your regional public university may not look so attractive by comparison, if (and this is key) there is a large pool of potential employers that don’t care whether your certificate is signed by multiple Ph.D.s or by the VP of Telecom Training at AT&T.

Whatever the values of the traditional university education, including breadth of courses and socialization, the crucial component of its economic value is the fact that the certificate (i.e., the diploma) comes from someplace that has been “accredited” to issue it. Universities jump through all sorts of hoops to secure/maintain their “accreditation,” but the process is redolent of the College of Heraldry in the British Empire. The accreditation institutions (a whole bureaucratic apparatus in themselves) issue their educational equivalent of the “Good Housekeeping Seal of Approval” to assure students (and research funders) that the university meets certain standards. As a practical matter, employers rely on this accreditation to accept the diplomas issued by those institutions as the basis of hiring.

This is also important for those students going on to graduate and professional schools, but that is a relatively small group.

A significant portion of students, particularly those at non-elite universities, just want to get their diploma and get out into the “real” world; the diploma is their admission ticket to a job (or a “better” job). For many, their GPA isn’t all that important. This “get by and get out” mentality was reinforced by the extra pressures of pandemic/remote learning.

So, if Google (et al.) are willing to take another admission ticket, what is it that traditional universities have to offer? What is distinctive about the college experience that justifies the extra cost and hassle? Many students (and parents) value the prestige, the liberal arts, the breadth of educational exposure, and the many aspects of socialization that occur on college campuses (e.g. sports, extra-curriculars, dorm life). Many students (and parents) value the enhanced opportunity for graduate/professional education. Google U isn’t going to directly affect them.

However, many can’t afford those things, or aren’t interested in them. So, we will have to see how many forego the traditional collegiate experience. Even those who have bought into the traditional modern mindset of “to get a good job, get a good (college) education” will have to question whether that mantra is outdated.

There isn’t (and likely won’t be) a private-sector “accreditation” agency that can provide assurance to hiring companies that the virtual certificate issued by Google U (or Health Tech Training Institute, or other groups) will be worth the (non-) paper it is printed on, so things are likely to be messy for quite a while. But the nature of disrupting traditional business models is such that we shouldn’t expect the new entrants merely to replicate or tweak those past models. There are sure to be failures, as the MOOC mania of a few years ago demonstrates.

On the other hand, the burgeoning disruption of the hiring business (e.g., LinkedIn, Indeed, Monster) shows that this component of the human supply chain can change. It can also provide an accelerator for accommodating this shift. A hiring manager will need only check the “Google U” box under “acceptable educational background” on the hiring intake template to open this up.

This development does not portend the end of traditional education. As I noted above, there are many segments for which either the experience or the outcome is worthwhile. Still, if even a modest portion of the student body pool evaporates (as it were)—say, 20%—then the economic impact on certain groups of schools, particularly non-elite public universities, could be significant, especially given their less than rosy financial outlook generally.

From an institutional perspective (as compared with the student’s or society’s perspective), Google U and the erosion of the accreditation monopoly raise the question of whether these schools could develop alternative modes of instruction and certification. Here, the bureaucratic and institutional constraints (e.g., legislatures, unions, pedagogical open-mindedness) make it unlikely that radical and innovative ideas could be implemented, at least in the short-to-medium term.

Google, social media, Uber, mobile phones, Tesla—just a short list of disruptors whose shock waves are still spreading and whose social impacts are profound. In the world of education, technology has been seen as a disruptor as well, but one whose impacts are being incorporated into existing models with no more than the expected hiccups and stumbles. The attack on accreditation, however, goes to the heart of the university’s role in 21C US society. It will not be a challenge easily met.


De-merit

6/11/2021

On the face of it, a meritocracy sounds sufficiently laudable to rank right up there with mom and apple pie. So, how to explain the current turmoil over the idea that the “best” should rise to positions of power and leadership? As usual, there are historical, semantic, and implementation aspects to this issue.

In a premodern world, power (governmental positions/largesse, mercantile advantage) was typically distributed to those who were related to (or at least part of the same social set as) royal families, aristocracy, and nobility. Kings’ younger brothers were made admirals and generals willy-nilly. Land was granted to the favorites of Queens (or to mistresses). One of the most famous examples from late 19C Britain was the Cabinet appointment of Arthur Balfour by his uncle, Prime Minister Robert Cecil, Marquess of Salisbury. This elevation of his nephew gave rise to the catch-all explanation of mysterious good fortune, “Bob’s your uncle” (the term “nepotism” itself is much older, rooted in papal favoritism toward “nephews”).

Increased professionalization of the military and the civilian bureaucracy in 18/19C Europe (the first US Civil Service Reform Act, for example, came in 1883) marked the shift to more professional management (of course, for centuries China’s famous imperial service had been based on a rigorous examination for wanna-be mandarins). As corporations increased in size and in distance from their investors (late 19C-mid 20C), they, too, responded to competitive pressures by professionalizing their management. Standardized testing for academic admissions (the SAT began in 1926) expanded the scope of “merit”-based advancement in society.

From this perspective, it’s hard to see “meritocracy” as any sort of bad thing. It was a marker of social progress, a manifestation of modern, organized, and rationally-based decision-making; both public and private sectors seemed poised to benefit from moving away from “who do you know?” as the standard of hiring/promotion in increasingly large and complex activities.

Still, there is no progress without downsides. Here, the idea that “merit” could be measured set up several problems. The first was the conflation of the measurable with ability or value. People like simple answers, and formal, statistically-based, well-organized decision-making structures fill the bill, regardless of whatever edges of judgment and insight are carved off in the process. The second was another frequent human response: “gaming the system,” most famously manifest in test-cramming courses in several East Asian countries. If objective criteria are established, they can be targeted for success. This fosters an environment in which the selection process is more important than the underlying values for which the selection is made. Individuality is suppressed in favor of conformity. Resources are applied to secure success at the earliest possible step in the selection process (e.g., competition for private kindergartens in Manhattan) and carried to pathetic extremes (e.g., the recent college admissions scandals, perpetrated by very privileged/rich families).

The most insidious problem, however, was that people started to believe that the “merit” system was definitive (instead of a rough bureaucratic approximation). In other words, “merit” for purposes of selection/advancement was conflated with moral worth from a social perspective. This has had all sorts of pernicious effects. For example, most people would think that someone who scored well on academic tests or met thoughtfully-designed hiring/promotion criteria was better than someone whose interests and values were not susceptible to the measurability/bureaucratic mentality. (Why is it, exactly, that being a lawyer is better than being a carpenter?) This is particularly true of those with “merit,” who tend to believe their own version of the “chosen people/elect of God” self-justifying worldview. Such an approach profoundly distorts society, government, business, employment, and personal life choices. [You can get a more detailed exploration of this issue from Michael Sandel, a pretty-insightful Harvard professor, here.]

Then there is the rather large question of whether “merit,” even if it were an accurate assessment of moral value or individual capability, is fairly determined in our society. The standards for “merit” have been set by a society with a long history of domination by white males from wealthy families with extraordinary opportunities. It can’t be surprising that their standards of “merit” reflect their own criteria of quality. But other people have other perspectives, and it is possible (likely?) that the embedded structures are highly discriminatory.

Moreover, it seems pretty clear that “merit” is more a result of nurture than of nature. Growing up in a stable, supportive family; access to attentive, challenging, and enriching educational and cultural experiences; and physical and environmental health are all highly causative of “merit,” but they are not equally distributed in society. So, it’s no wonder that prototypical elite college freshmen have most of these advantages. We have to ask whether their “merit” is really theirs. [Full disclosure: I had all of these, and a lot of them.]

Does that mean that those with “merit” aren’t relatively smart/capable? No. But it does undercut, in terms of both salaries and societal esteem, the idea that they “earned” it. It does undercut the idea that our society really believes in the equality of opportunity we so often proclaim. It does undercut the idea that our standards for value and “merit” are really as objective as we might like to think.

I’ve consistently put “merit” in quotes throughout this essay as a reminder of how easy it is to forget that it is a social construct. If we really believe that the quality of the individual matters, then we are losing out on a lot of talent. If we insist that “merit” is moral value, we are losing out on reframing our society to reflect a broader sense of value and ethics. We are not just harming those with less “merit”; we are harming ourselves.


____-Americans

6/4/2021

The language we use says a lot about us individually, and the terms embedded in our cultural discourse, phrases we use without a second thought, say a lot about our society, both in terms of the various heritages present here in the US and of the way we see ourselves in the world. In my recent course on globalization, I have emphasized that the late 20C and early 21C have been characterized by a dramatic increase in global awareness/epistemology, especially among people in the US. Living in a big and dominant country, we have found it easy for centuries to ignore most of the rest of the world, or at least to push it to the back corners of our brains. That’s been changing, and I’ve been thinking about how this development has shown up in my own life.

I was struck the other day while reading an article about political trends and the surprising failure of the Democrats to garner the bulk of the newly-expanded Hispanic-American voting population. The article noted that Cuban-Americans, especially in South Florida, tended to vote more Republican than Hispanic-Americans elsewhere, and that certain groups of Hispanic-Americans in South Texas were more amenable to socially conservative stances. In other words, they didn’t fit the stereotypes promoted by the political commentariat, in which “poor,” “striving,” “brown” immigrants were committed to social justice and the removal of “racist”/“reactionary” Republicans.

The electoral implications of these revelations are significant, but my concern is how this incessant ethnic categorization is more broadly facile and socially destructive. In other words, using “Hispanic” to describe all people who have come to the US from Central and South America is coarse. Using it defines them principally as not quite American (as with other hyphenated categories) and makes that difference more important than their particular cultures or self-identities.

We use the term “Hispanic” about twenty times more frequently than we do “Mexican-American” even though those of Mexican heritage in the US comprise more than 60% of those of some Latin American origin. One need not go very much farther down Central America to realize that phrases such as “Salvadorean-American,” “Panamanian-American,” or “Paraguayan-American” are no real part of our language (or our thinking).

We can see the same phenomenon with regard to the recent concern about anti-“Asian-American” discrimination and hate crimes. Outside the question of “what kind of food shall we have tonight?”, there is little time for distinction between those of Chinese, Korean, Thai, or Philippine extraction. Have we, as a society, advanced only microscopically beyond “they all look alike”? The phrasing of “anti-Asian” hate crimes might at least be accurate insofar as it reflects the same coarseness of thinking and over-grouping of people from Asia by those who attack and discriminate. But using this term doesn’t reflect/respect the individuals attacked, who have recently come from places whose cultures are at least as diverse as Europe’s.

The language used with regard to those of European descent is clearly different. How often have you referred to/thought of someone as a “European-American”? The cultural dominance of Europe is so strong that our emphasis on national difference ensures we talk of “Italian-Americans” or “German-Americans” distinctly. Does anyone even use “Anglo-American” or “British-American” as a significant domestic ethnic identifier?

This practice pre-dates the rise of “identity politics” in the late 20C. Americans, mostly mongrels, seem sufficiently insecure about their “melting pot” identities that they insist on ‘othering’ everyone else in sight. And we’ve not been troubled to be particularly precise about it, at that.

I’ve noticed that the costs of regularly using such phrases seem to far outweigh whatever identification benefit might come from their use. In other words, when talking to a colleague about a student’s performance, why do I say “this Black woman wrote a great paper” or “this Japanese guy didn’t know how to use the library”? There might be some benefit if my colleague were going to meet them (an identification benefit); but generally, that’s rare. More likely, we’re just talking about them as examples of student behavior. I suspect that I’m using this language more as a memory aid, with the unfortunate side-effect of gratuitously classifying the student in an irrelevant context. So, I’ve undertaken a conscious effort to drop the ethnic (& gender) adjectives unless there’s a specific benefit I can think of in the moment.

What this comes down to is being more mindful in my language (and thought). Most of us are well past the stage of using the classic ethnic slurs, despite what we might have seen/heard as we were growing up. I hope I am ready for another step forward.

As to the second aspect of this question, I have noticed that the default term of reference for US citizens (both how we use it and how others refer to us) is “American.” Whether as a noun or an adjective, it’s everywhere. Of course, I’m not urging the mouthful of “United States of America” in everyday conversation; but I know there are a whole bunch of other countries in this hemisphere, all of which are filled with “Americans.” Other than the fact that the US has been the most powerful country in the hemisphere for two hundred years, I can’t think why we should claim exclusive rights to the adjective “American.”

In my writing and academic speaking, I have tried to shift to “US” or “U.S.” instead. I know that there are other “United States” in the world (including the “United Mexican States” and other historical examples), but I think it’s a clear improvement, even if it’s awkward not to have a good casual substitute for “American.”

As a white male from the US, I have been deeply immersed in systemic “othering” for my whole life. I have some work to do to dig out.


Condemned to Repeat It: Musings on history, society, and the world.

I don't actually agree with Santayana's famous quote, but this is my contribution to my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
