As a historian who wrestles with issues around democracy, nationalism, and the state, I guess I qualify to some degree as an “intellectual historian.” I still vividly recall a class in Modern European Intellectual History that I took at Brandeis in about 1975 in which Prof. Izenberg held me in rapt attention for most of a semester. My head was spinning with the ideas of Marx, Nietzsche, Rousseau, Burke, and Sartre (to name just a few).
However, I’ve long had trouble with the concept of “intellectual history” on several fronts. First, it has a tendency to turn into an abstract jousting of ideas from the “great thinkers” of different eras and cultures without regard to the vastly different contexts in which those ideas were developed. This sort of “dueling philosophers” history all too often ignores that Virgil and Adam Smith lived in very different worlds when they each wrote about what we might now call “economics.” Current intellectual historians use such set-ups more to demonstrate their own dexterity and sophistication than to say much useful about either figure. Historians, even “intellectual historians,” can’t ignore context.
This used to be a bigger problem than it has been in the last few decades. A movement (the “Cambridge School,” as it’s known in the trade) argued very hard and generally successfully that language (the specific words used by the subjects of intellectual history, e.g., Plato’s references to “the people”) had to be read in context, and that its meaning typically migrated over the decades/centuries. To compare and connect Aristotle, Machiavelli, and Carl Schmitt requires some serious work to parse the meaning of (e.g.) “democracy” in a particular time and place.
Speaking of language, intellectual history is also susceptible to getting caught up in jargon. Such esoteric language sometimes sounds impressive, but quickly becomes more about demonstrating the author’s erudition than about actually communicating something meaningful to the reader.
More fundamentally, intellectual historians all too often conflate the ideas they’re studying with the broader culture from which they emerge. For example, studying the insights of David Hume (18C Britain) is all well and good. He was a really smart guy (and, apparently, quite charming). An active participant in what we now call the “Enlightenment,” he challenged a raft of commonly-held ideas about politics and religion. However, the literacy rate in his neighborhood was still in the low double digits, and even fewer had the time, money, and inclination to read philosophical texts. So, to hold Hume out as an exemplar of the broader culture of mid-18C Britain is quite deceptive. Similarly, Giambattista Vico, a Neapolitan of the early 18C, wrote a highly insightful book (“The New Science”) in 1725, but it wasn’t translated into German or French for a century, and so had limited impact on the Enlightenment (Hume would have been quite interested). In terms of intellectual history (i.e., the actual transmission of ideas), then, Vico might belong more to the 19C than his own 18C.
One of the underlying causes of this approach to history is that smart, intellectual historians tend to focus on, write about, and (perhaps) overestimate the significance of smart intellectuals from the past. It’s a bit too much of the guys in the club patting themselves on the back, as if only “intellectuals” (remember, this is a very small group even today!) could be a source of important ideas. Instead, we might reflect that the ideas and beliefs of the (great unwashed) masses of people (who, by the way, didn’t have the time to write books) actually constitute a better assessment of the culture. The intellectuals-only approach is (almost literally) superficial. It’s as if an alien spacecraft were to survey the Earth, note that 70% of the surface was water, and conclude that we lived on a watery orb, when in fact water constitutes only .02% of the total mass of material in our planet.
At one level, discovering who was the first to publish an idea is an important historical function (remembering that 1) he (almost always a “he”) was not necessarily the first to think that idea, and 2) even if someone else came up with the idea and wrote it down, unless it was preserved over the centuries we wouldn’t know about it). From another perspective, it’s as if we were only measuring high tides, not the overall average water level (to run another maritime metaphor). There is ample evidence that human culture takes a long time to change, and the distance from the creation of an idea to its widespread acceptance is often a matter of centuries. “When (if ever) did the great bulk of Europeans (or Africans for that matter) believe in the second coming of Jesus?” seems far more important than when some late antiquity synod made such a declaration. Ditto for democracy; folks have been talking about it for 2500 years (+/-), but as we look around, there’s only intermittent evidence that it has sunk in.
Of course, it’s hard to know what the unlettered and unpublished thought. So the focus on the few who did think (and publish) is one way in which history falls prey to the fallacy of availability as a substitute for significance. The flip side of this last point is that we have to remember that most intellectuals were writing to other intellectuals, not to the unlettered. Grand ideas of utilitarianism, socialism, epicureanism, etc., are fine for those who choose to wrestle with them; but most folks don’t. Most folks don’t have the time/interest/preparation to engage in such pursuits. (It’s one reason intellectual history so often seems disconnected from the periods in which such ideas were broached.)
With rare exceptions (e.g., Marx-Lenin-Russian Revolution), it’s usually pretty hard to track ideas directly and immediately into the “real” world, where most folks don’t get beyond working and living and eating. But even if we can’t pin the significance of ideas down very much, we feel certain that they’re important (especially if we’re already intellectuals).