Steve Harris

Froth and Fundamentals

3/8/2024

One of the delights of spending a lot of time with young people is that I get sharply different perspectives on the world. It’s especially good when I can get myself to disengage from my professorial lecturing mode and listen to what they see and what I can get them to say.

A minor example occurred a few weeks ago when we were talking about current social relationships among college students. I heard a term for a particular personal attitude and wrote it down on the board as "dog-like," only to see smirking eyes, which one of the group explained by saying that the term was "dawg-like." My point has less to do with the specific situation or definition (you are welcome to google it, but the references to its use 10+ years ago as a complimentary and endearing term already seem outdated) than with the personal distance of perceiving social change. Not having raised kids, I didn't go through this process from the elder perspective (although I was part of the younger end of the "generation gap" back in the day). At this point, with students who could be my grandchildren (many born 2000-2005), the "gap" is wide and widening.

Different mores, styles, and language are commonplaces of modern life, now amplified by a technology-based acceleration in the normal pace of changing how we live. As a historian, I am always on the lookout for ways in which "change over time" manifests.

From a historical distance, it is a challenging exercise to parse incremental social developments and see which ones—which often garnered a lot of attention at the time—fall into one of three characterizations: 1) the shock of the new, 2) the adjustments of transition, and 3) more long-lasting alterations. The first two are, pretty much by definition, transitory, but it is usually nigh unto impossible to sort them out while we are in the middle of things.

The impact of AI is a case in point: great brouhaha a year ago, followed by various degrees of cautionary tales and, simultaneously, wider changes in current practices and further improvements. It seems clear that—whatever "AI" is—it's still way too amorphous to define, much less assess. Much the same could be said of the longer-lasting, more encompassing terms "computer age" or "information age." Telling my students about how I used "punch tape" to "code" a "time-share" computer some decades back produced glazed eyes, but fifty years ago it was as wondrous as Siri (introduced in 2011) or the app du jour of 2024. Indeed, good arguments could be made (simultaneously) that 1) there's nothing profoundly different about AI; it's just souped-up computer processing and 2) we are only at the start of the "AI revolution," which will fundamentally change the idea of what it is to be human. How many of the current debates will have faded in ten years (or fifty) as archaic or trivial transitional questions (how many outdated connector cables do you have in your closet?); and how many will be seen as fundamental framings of the way the 21C worked out?

At another point, I asked some students about their use of social media and was not surprised to learn that few use Facebook (apparently now principally the preserve of “older” folks). But I was quite intrigued to hear that their use of social media generally is going down considerably. The main reason is the proliferation of ads which makes it harder both to find postings from people with whom one has a “real” connection and reciprocally to ensure that your social media contacts will actually see your posting.

Whether these changes will be long-lasting we will discover in due course. But the very question of whether either the "rise" of social media or the current drawing-back from its use among younger people is a blip or a trend highlights the current uncertainty of these phenomena. It also makes clear that much of the surrounding breathless excitement over the past 15 years has been created without any sense of durability or significance. Stated simply, we don't know what's going to stick, alter, or fade.

Similar points could be made about drug use patterns, personal savings rates, gender identity, or indeed, the importance of a college education. There was a time when jazz music was an important part of the US music scene, but now it seems to be waning, and the demographics may consign it to a place not far from folk music in our popular culture. There was a time when all "businessmen" wore neckties every day, and now I just hang on to a fraction of my collection for use at major social events. Bumps are not trends, and trends rarely last; while we're in the middle, we can't tell what is a "secular" change and what is a fad, or what is the "norm" and what is the anomaly.

Thus, the benefit (to me) of engaging with a distant generation in the classroom and even more distant generations in my research and lecturing: each provides a critical corrective to getting stuck in my own cultural cul-de-sac, imagining that a person's range of experiences—whether my own particulars, my students', or those of the past—has some fixed meaning and effect.

1 Comment
Mark Carnes
3/9/2024 08:45:05 pm

A thought about your students' smirking at your lack of comprehension of the meaning of "dawg". (And, yes, I had to look it up in a dictionary of current slang.) I used to think that I, as a professor, needed to bridge the linguistic gap between myself and my students. How can I communicate if we don't share a common vocabulary? But I've come to appreciate that students perceive instructors as pseudo-parents: objects who constrain the students' freedom and desires (while sometimes, perhaps surprisingly, offering solace and encouragement). But if students are going to take their place in the world (and remove us from that place!), then they need to forge bonds with each other. And, thus, a measure of distance from us is essential to their enterprise. We cannot be among their dawgs, as it were, a sentence whose diction shows that I pose little threat of encroaching on their affiliative sensibilities.


    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but this is my contribution to my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."

