A minor example occurred a few weeks ago when we were talking about current social relationships among college students. I heard a term for a particular personal attitude and wrote it down on the board as “dog-like,” only to see smirking eyes, which one of the group explained by saying that the term was “dawg-like.” My point has less to do with the specific situation or definition (you are welcome to google it, but the references to its use 10+ years ago as a complimentary and endearing term already seem outdated) than with the personal distance from which we perceive social change. Not having raised kids, I didn’t go through this process from the elder perspective (although I was part of the younger end of the “generation gap” back in the day). At this point, with students who could be my grandchildren (many born 2000–2005), the “gap” is wide and widening.
Different mores, styles, and language are commonplaces of modern life, amplified by a technology-driven acceleration in the normal pace of changing how we live. As a historian, I am always on the lookout for ways in which “change over time” manifests.
From a historical distance, it is a challenging exercise to parse incremental social developments and see which ones—which often garnered a lot of attention at the time—fall into one of three characterizations: 1) the shock of the new, 2) the adjustments of transition, and 3) more long-lasting alterations. The first two are, pretty much by definition, transitory, but it is usually nigh unto impossible to sort them out while we are in the middle of things.
The impact of AI is a case in point: great brouhaha a year ago, followed by various degrees of cautionary tales and, simultaneously, wider changes in current practices and further improvements. It seems clear that—whatever “AI” is—it’s still way too amorphous to define, much less assess. Much the same could be said of the longer-lasting, more encompassing terms “computer age” or “information age.” Telling my students about how I used “punch tape” to “code” a “time-share” computer some decades back produced glazed eyes, but fifty years ago it was as wondrous as Siri (introduced in 2011) or the app du jour of 2024. Indeed, good arguments could be made (simultaneously) that 1) there’s nothing profoundly different about AI; it’s just souped-up computer processing, and 2) we are only at the start of the “AI revolution,” which will fundamentally change the idea of what it is to be human. How many of the current debates will have faded in ten years (or fifty) as archaic or trivial transitional questions (how many outdated connector cables do you have in your closet?), and how many will be seen as fundamental framings of the way the 21C worked out?
At another point, I asked some students about their use of social media and was not surprised to learn that few use Facebook (apparently now principally the preserve of “older” folks). But I was quite intrigued to hear that their use of social media generally is declining considerably. The main reason is the proliferation of ads, which makes it harder both to find postings from people with whom one has a “real” connection and, reciprocally, to ensure that your social media contacts will actually see your posting.
Whether these changes will be long-lasting we will discover in due course. But the very question of whether either the “rise” of social media or the current drawing-back from its use among younger people is a blip or a trend highlights the current uncertainty of these phenomena. It also makes clear that much of the surrounding breathless excitement over the past 15 years has been created without any sense of durability or significance. Stated simply, we don’t know what’s going to stick, alter, or fade.
Similar points could be made about drug use patterns, personal savings rates, gender identity, or indeed, the importance of a college education. There was a time when jazz music was an important part of the US music scene, but now it seems to be waning, and the demographics may consign it to a place not far from folk music in our popular culture. There was a time when all “businessmen” wore neckties every day, and now I just hang on to a fraction of my collection for use at major social events. Bumps are not trends, and trends rarely last; while we’re in the middle, we can’t tell what is a “secular” change and what a fad, or what is the “norm” and what the anomaly.
Thus, the benefit (to me) of engaging with a distant generation in the classroom and even more distant generations in my research and lecturing: each provides a critical corrective to getting stuck in my own cultural cul-de-sac, imagining that a person’s range of experiences—whether my own particulars, my students’, or those of the past—has some fixed meaning and effect.