Friday, February 20, 2026

Burst

There are still places on this earth where people do burst into song.

I was in Dundalk, Ireland, in an ordinary, nondescript Italian restaurant on the high street, with some friends who were there attending a conference, along with a few other miscellaneous diners, when suddenly a woman at a table nearby, probably in her forties or so, let out with the refrain to Cat Stevens’s “Moonshadow” – and at once we were all, instantly, “leapin’ and hoppin’” along. The strange thing was that the waiters and waitresses took no notice of it; she sang, and we all came in on the chorus. At the end of it, there was applause, and people looked ‘round the room at one another, and someone called out “Well, give us another song!” And someone did.

This would never, of course, happen where I live, in the United States. Here, no one bursts into song, except maybe in a Karaoke bar, and badly. If you were to start singing at the top of your lungs in a TGI Friday’s or a Pizza Hut, people would think you were crazy, and before long the police would be called, and you’d be taken away to face charges.

But here, in Ireland, it's understood that people do do such things, and that it's a perfectly ordinary part of life. People could, actually, burst into song in a public place, and just as in an episode of “Pennies from Heaven,” the other diners – if they didn’t join in themselves – would just carry on with their business as though it were the most usual thing in the world.

*     *     *     *     *

Growing up in the suburbs of Cleveland, Ohio, I was subjected to my parents’ deep and inexhaustible love of musicals. Living far from New York, we never actually went to one, but they blared forth from the stereo all the same. “My Fair Lady” – “Oklahoma!” – “The Most Happy Fella” – “The Pajama Game” – “The Mikado” – “Carousel” – “South Pacific” – the list seemed endless. My mother loved them, and my father loved them even more, turning out all the lights in the living room and turning up the stereo as far as it would go without distortion. “We are gentlemen of Japan!” … “All I want is a room somewhere” … “A p'liceman’s lot is not a happy one” …“Some enchanted evening” … there was an instant drama in every line, and though I hated the music itself, some portion of its feeling got under my skin, and planted a kind of weird seed for the future. I remember hating these songs, the darkened room, the way that my father would, from his prostrate position on the couch, quiver quietly with tears.

But now, decades later, when I hear these same songs, I’m the one that's weeping. Not for the fate of poor Little Buttercup, or Eliza Doolittle, or Billy Bigelow -- but for the lost world in which people indeed did burst into song. And in that world, it now seems to me, there was such a tremendous reservoir of pent-up emotions, such a backlog of heartache, that it was absolutely impossible to speak – it could only be sung. And yet, once the final lines had finished and the strings faded, everyone went back to their ordinary, pent-up ways – and it was this final loss, the loss of the miraculous world of people actually singing about their feelings, that touched me the most. For even then, when such things were common knowledge, they failed to transform the world with their beauty; people went back to ugliness, got on with their tawdry lives, and if asked would probably have said “Singin’? I didn’t hear no singin’! Did you, pal?”

All of this came rushing back to me recently, when I re-watched the 1981 film Pennies from Heaven with Steve Martin and Bernadette Peters. I’d seen it when it first came out in a theatre – a lonesome one, populated only by myself and the love of my life, two little old ladies, and an usher collecting for the Jimmy Fund. Somehow, the lip-synching made these songs work, and made the tragedy of singing one’s heart out a double loss of self-representation – for the voice these actors sang with was not their own. Of course, that was often true in the classic Hollywood era, when Marni Nixon provided the voice for everyone from Deborah Kerr to Audrey Hepburn, but I never knew that back then. Later, of course, I saw Singin’ in the Rain and realized what a strange fate it was to be the unknown voice of a famous actress. It’s an uncanny kind of real-time pathos that goes back at least to Cyrano de Bergerac and right up through Disney’s Little Mermaid: to watch another woo one’s love with one’s own words, or voice. It’s a kind of torture, really.

Some years after seeing the 1981 film, I tracked down a copy of the BBC series with Bob Hoskins. Until then, I hadn’t realized how heavy-handed the irony was – that a man who sold love-songs for a living could be, not just a low-grade jerk, but a complete asshole – so that the transformations he underwent and witnessed were a sort of false, cruel magic, one in which the kiss of song transformed toad-like people into temporary princes, not because they were really princes inside, but because they and the whole goddamned world were so deeply, perversely, incurably toad-like that this formed for them a kind of cruel joke. I wondered whether Dennis Potter was a cruel man, or whether, in the manner of other dark comedies, it was meant to be humorous. His genius, I decided, was that it didn’t matter – it worked either way.

But what's happened to musicals today? It's strange to realize that Pennies from Heaven was in fact the very last musical produced by MGM before its long descent into financial ruin and dissolution, its legendary film library sold to Ted Turner, and then to Warner's. It was Disney -- or, more properly, Ashman and Menken -- who revived the musical feature film in 1989 with The Little Mermaid. Somehow, bursting into song had become something that was too much for any human actor; it had transcended that world and moved into animation. Even Broadway was swept by this phenomenon, as adaptations of Disney films have been among the highest-grossing and longest-running stage musicals.

There are exceptions, it's true, but they're rare. I have to confess that Stephen Sondheim, for all his storied career, has always left me cold. His melodies never quite soar; like tethered birds, they tweet too close to the ground, and are soon forgotten. Wicked -- which I've seen in its current incarnation at the Gershwin -- is transcendent on stage, but the soundtrack doesn't really do it for me. There's soaring -- who can forget their first experience of "Defying Gravity"? -- but without the visuals, the music itself sounds overblown and frantic.

And then there's Glee. Many love it. I loathe it. Not because it lacks magical moments of bursting into song, but because there are too many. You can't just constantly be bursting into song; as Alan Jay Lerner put it,
"A man's life is made up of thousands and thousands of little pieces. In writing fiction, you select 20 or 30 of them. In a musical, you select even fewer than that."
And that to me is what a musical is -- a series of dramatic moments, carefully curated. But a serial musical is like a serial killer -- you never know when it will strike again; we live in dread instead of wonder. Each song must count, must form the peak of one of the mountains of our existence. We mustn't descend too soon to our tawdry, toad-like world -- we must allow these shadows of our better selves to burst, to break, to echo down the corridors of everyday life, daring to sing out loud.

Car sitters

Thomas Francis Barrow, from "The Automobile" (1966)
You've seen them. In your neighborhood, around the corner, at the edge of the park, or maybe even in front of your house. They are neither coming nor are they going; often they're idling, but sometimes the engine is off as the driver -- or the person who would be the driver, if the car were being driven -- sits within. Smartphones have made the problem worse -- one can see just by the tilt of the head what the person is doing -- but the practice pre-dates such devices.

I hate them. True, they haven't done anything to me; they're peaceful and rarely make a scene, other than the still-life of a car with a trail of exhaust -- but they occupy space in a way that makes no sense to me. I'd like to walk down the sidewalk -- often, I'm walking my dogs -- but the sight of a car-sitter puts me off, causes me to cross to the other side. It's more pronounced at off hours -- I walk my dogs early and late, but there are always a few of these vehicles sitting out there, even at four in the morning.

"Get on with it!" I want to shout -- but I don't imagine the car-sitters would hear me. Some are blasting their stereos, but even the quiet ones are fixated on their interior space; were I to shout and be heard, I would be as much an intrusion to them as they, by their just being there, are to me. And after all, I don't own the streets, or wish to exercise control over people other than myself -- but I would like those who are in motorized vehicles to, well motor. Get on with your lives, head toward your destination, whatever it may be. But go! -- don't just sit there! -- move along!

Saturday, July 26, 2025

Tom Lehrer

The news of the death of Tom Lehrer hit hard.

No prophet is accepted in his own country, it's said. And yet, oddly enough, nearly fifty years ago there once was a man by the name of Tom Lehrer -- erstwhile Harvard mathematics professor, wry songster, and television parodist -- who managed to make music out of some of the most irksome and intractable issues of his day. Racism, pollution, Jim Crow, pornography, the nuclear arms race, the Second Vatican Council, and even World War III were all fodder for his astonishing show tunes, and there often seemed to be scarcely any line of propriety he wouldn't cross. And yet, since he sang every one with such verve, he managed to make nearly everyone laugh at their own folly, rather than throw rotten vegetables at the stage. By the late 1960's, his songs were the stuff of legend, exchanged by high school and college students like a sort of secret code: do you know "The Vatican Rag"? "Poisoning Pigeons in the Park"? "Smut"? At my high school, all the cool kids (that is, all the geeks, since I went to an alternative hippie Quaker school) had at least one of his songs memorized.

It was amazing to me and my friends in the late '70's and early '80's to think that many of these songs were first heard on network television in 1964 and 1965 on a program called That Was the Week that Was. You'll have to remember, this was a long, long, long time before Stephen Colbert.

Lehrer had retired from comic songstering for more than 25 years when he re-emerged briefly at a concert in 1998, where he was introduced by his old friend Stephen Sondheim and performed a lovely redux of his famous anthem "Poisoning Pigeons in the Park."

Asked in an interview some years ago why he wasn't writing songs satirizing our present moment, he observed that these days,  "everything is so weird in politics that it's very hard to be funny about it." True enough -- but he will be missed nevertheless.

Wednesday, July 16, 2025

Conspiracy Theory Theory

We seem to live in the age of conspiracy theories. Not that they haven't been around for a while, but that something in this postmodern moment seems to supply them, amplify them, and keep them alive longer than at any time in the past. An event scarcely settles into public awareness before someone steps forward to "question" it -- not simply to question why or how it happened, but whether it happened at all.

In the course of this post, I'm going to try not to mention too many specific such theories -- I have no desire to give their promulgators more attention -- but I do want to look at some of the reasons for the plethora of such theories at the present moment, and I believe I can do so without going into the particulars of any of them (though I will, on occasion, mention such theories about the Kennedy assassination, which in many ways is the model for them all).

What kinds of events generate these theories? And what does their structure tell us about the shifting shape of mass media, and our own shifting modes of understanding the 'true' and the 'real'?

To generate such theories, an event has to be of a very specific nature: something that, at its very occurrence, was completely unanticipated, spectacular, and ideologically charged. Whenever something terrible happens to an individual -- a friend dies, one's home catches fire, or one's finances collapse -- we ask ourselves why.  Sometimes, there's a simple answer: he got cancer, the ashes were still hot, you invested in what turned out to be a pyramid scheme.  Even then, though, there are leftover questions: Why did someone with such a promising future have to die? How could such a small mistake destroy a home? Why didn't I realize much sooner that the promised returns were too good to be true?

It can take an individual years to sort through these issues, and to eventually decide it's time to move on. But what about when such a disaster overtakes an entire region, nation, or the planet as a whole? The causes and consequences of these disasters are vast and complex, and there may never be a completely clear and unambiguous reason why. Why did the Light Brigade charge into the cannon? How was it that Roosevelt didn't anticipate the attack on Pearl Harbor? How could a lone gunman have killed the leader of the free world, flush in the youth of his success?

The psychic cost of letting ourselves understand and accept such events is enormous, and the collective soul searches for some, for any kind of release from this torment. In the past, scapegoating was the easiest option; to identify and attack the purported cause of something terrible is undeniably cathartic, even if it later turns out that the person or people blamed were innocent.  It's harder to do today, although it can still happen.  But as it turns out, alleging a conspiracy -- even one so vast and profound that it turns out that what we thought happened didn't actually happen at all -- provides an even more effective balm.

On the face of it, these claims are absurd, but that doesn't matter. The people who make them do not put themselves under the burden of constructing a single, plausible alternative to what everyone else believes has happened; they simply exploit doubt and uncertainty to a sufficient degree that they can discard what they regard as the "official" and therefore false version. Having done that, they hint vaguely at a series of dark alternatives; they don't have to pick just one, and if one is shown to be patently false, they just point to another, and another, and another.

The Kennedy assassination's "grassy knoll" is one such point. One frame of the infamous Zapruder film seems to show a puff of smoke at this location. Is it a puff of smoke? A puff of car exhaust? A slight fogging of the film? The first attack is simply to cast doubt on the "single shooter" version, and in this charge the puff of smoke is just the beginning. The second move is to speak of another shooter as if he or she was definitely known to exist, and to search all the testimony one can find for anything that correlates with this possibility; if any such claim is doubted, one simply cites another.

And here is why in this age, such theories have such traction: the informational background -- official reports, statements, maps, photographs, blog postings, 911 recordings, satellite photos, and so forth -- is so vast, and daily growing so much vaster, that the amount of informational fuel available is, for all practical purposes, infinite.

As anyone who has tried can confirm, it's impossible to ever win an argument with a conspiracy theorist. For one, they have bushels of information at their fingertips, and warehouses more if those run out.  For two, they don't have to produce a coherent account of what actually took place, just cast doubt on every particular claim, one after another, until eventually the whole thing shudders under piecemeal attacks.

Recently, some such theorists have defended their statements by saying that they are just practicing "critical thinking," that they are "questioning assumptions" and doing "investigative journalism." But none of these actually apply: what these theorists -- who often have, or are given, the name "truthers" -- actually possess is a very poor understanding of the nature of truth. To them, "truth" must be consistent, not only with every conceivable piece of data, but with their own ideological presuppositions. Anything inconsistent stinks to them of untruth. And yet, the strange fact is, the truth of any actual event is full of inconsistencies, many of which simply cannot be resolved completely.

The "truther" path dares those who accept a commonly-known fact or event to "prove" it happened. And yet nearly all human events cannot be proved in this way.  Prove that the ancient Egyptians existed! One points to the pyramids, and the Temple of Luxor.  But what if these were actually fake ruins put there by the Greeks thousands of years later? Prove that evolution is evidenced in fossilized life! But what if these fossils were put there by God to test our faith and confuse us?

Truthers are fond of documents taken out of context; they obsess over documentation, but when documentation is provided, they simply say that it was forged or faked. They insisted that neither the President's short-form nor his long-form birth certificate was "real" -- and yet an obviously, clumsily faked Kenyan one was held forth as confirmation of their suspicions. They point to photographs quite a bit -- and yet when it's shown that the image doesn't depict what they claimed, they simply say the photograph was altered by people trying to discredit them.

And they love eyewitness testimony. Never mind that it's been conclusively shown that eyewitness testimony is quite often unreliable; they love the way the acid of testimonial inconsistency eats away at the "official" version.

The irony is that the reason eyewitness testimony is unreliable is the exact same reason that conspiracy theories are so attractive to their adherents. It's because the human mind detests information in a vacuum; we make stories of our memories even as we are making our memories, and the stories stick -- we hate to change them later.  In a classic Peanuts strip, Lucy sees what she thinks is a rare butterfly from Brazil on the sidewalk,  and wonders at the amazing natural ability of these tiny creatures to travel so far. In the next panel, her kid brother Linus points out that it's actually a potato chip. To which Lucy replies,  "Well, I’ll be! So it is! I wonder how a potato chip got all the way up here from Brazil?"



And so it is with the conspiracy theorist. The things which evidence may or may not point to are taken as givens, and any attempt to show that the preponderance of the evidence shows that it's a potato chip of local origin simply does not compute.  And, in the age when anyone who wants to can Google up millions of pages of information about butterflies, those of us who still see a potato chip are in trouble.

Monday, May 26, 2025

The Problem with Evil

The word "evil" seems to be trending once more. It's a harsh word, a powerful word, a sweeping word. There's no way to dilute it or qualify it; a person or a deed can't be more or less evil, just a little evil, moderately evil -- it's all or nothing. We reach for it in the same way we reach for the emergency brake switch on a train -- as a last resort, knowing that pulling that switch will be an irrevocable act.

"Evil" works for us when nothing else will. Like a pair of asbestos mitts, it enables us to handle things we could otherwise not bear to touch. "Evil" enables us to categorize and keep safe distance from people who would otherwise almost crush us with their horror, their unfathomability: Hitler, Stalin, Pol Pot, Idi Amin. And it gives us unlimited power to denounce them, to condemn them, to convince ourselves that never, never, never would we have anything to do with them. Those who are "evil" are consigned to the world of devils and demons, the underworld of those whose motivations, personalities, influences, or thoughts no longer matter. How could they? -- they're evil.

But "evil" also blinds us. It convinces us that, while some judgments are just a matter of perspective or cultural context, others are absolute, and apply universally. And yet when, in re-creations such as that in the film Argo, we see the United States denounced in billboards as "The Great Satan," we smirk and think how ridiculous that is: "What, us, Satan?"

And this is the essential problem. In the relentless march of cultural and historical amnesia that our modern media-saturated world embodies, "evil" is the ultimate memory zapper. It convinces us that all we need to know about someone is that they were "evil" -- no sense in learning more about their lives or motivations. Case closed. The fact that so many of the people we write off in this manner were, in their country and in their heyday, beloved by millions and widely admired, strikes us as irrelevant. The fact that so many people who ended up being "evil" started out being "good" seems merely an inconvenient detail. When we see women holding up their babies for Hitler to kiss, or families weeping at the funeral of Kim Jong-il, we think to ourselves, what foolish people! How were they so hoodwinked?

But perhaps it is we who wear the blinders. "Evil" works so well in retrospect; it solves for us the problem of history. But if we really want to prevent future Hitlers and Stalins from arising, it's absolutely useless.  No one steps forward and calls themselves evil -- to do that would be almost comic, like Austin Powers or Aleister Crowley. No, the people who we may, eventually, find to be evil will always be people who arrived to meet some human wish or another: the wish for stability, the wish for prosperity, the wish for revenge, the wish for power. They will begin by being very attractive people indeed, so attractive that we don't see them in time to stop them -- or ourselves.

Wednesday, May 7, 2025

Panic over the airwaves

It's always been something of a miracle. From around 1920, when the development of signal amplification enabled the human voice to travel over radio waves, the thought that sounds spoken hundreds or thousands of miles away could, by some mysterious process, appear in our homes and cars as though the speaker were present has been perhaps the first and greatest "miracle" of technology. Later, with the development of television, sights traveled by the same means, adding to this wonder, and shaping the baby boomer generation in a way no generation had ever been shaped before. But of course there were those who wondered: wasn't sending all this electricity, this radiation, through the air a sort of health hazard? Curiously, those who worried about that also worried about the tawdry nature of much mass entertainment, or the potential of words and sights so widely transmitted to alter or control our minds.

In a debate in the Radio Mirror back in 1934, Charles Shaw of NYU tussled with no less a figure than Nikola Tesla, the man who in many senses invented radio, long before Marconi. Shaw voiced the concern that, since radio engineers in close proximity with transmission equipment seemed to have higher body temperatures, perhaps radio waves were going to bake us all while we slept! Not only that, but radio's noise and drivel "lowered our cultural standards." Tesla was left to point out that radio waves were far too weak in amplitude, and the wrong frequency, to do any cooking; as to radio's content, "You can't blame lowering our culture on radio," he insisted, "blame it on yourself and myself. The type of program that comes over the air is the type you and I want to listen to."

A light pole in San Francisco
But in recent years, alas, these same two intertwined fears have arisen again about cell phone frequencies. Cell phones were "cooking" our brains, they said, or were leading to an increase in cancer. The panic has increased with the implementation of "5G" technology -- particularly since 5G will require many more antennas to give its higher frequencies coverage. Ironically, these higher frequencies are exactly the reason that 5G is harmless; it is much less able than lower frequencies to penetrate solid objects, which would include people's skulls and buttocks. The human body is impenetrable to frequencies above 70 megahertz, a fact which Tesla took advantage of in an experiment that demonstrated their safety. He took alternating current of a very high frequency, but a very low amplitude (power) and used it to electrify himself and others (Mark Twain among them). The high frequency prevented its penetrating the body, and the low amplitude eliminated any other risk of harm. And yet, a lightbulb in the hand of such an electrified person would glow!

Mark Twain in Tesla's lab in 1894
People today don't get much of an education in these matters, it seems. They tremble in fear of the word "radiation," not understanding that all sound and light and heat are also radiation. They confuse high-level radiation, which is known as ionizing radiation (x-rays, gamma rays, etc.) and which can be lethal, with the many forms of low-level radiation (infrared waves, radio waves, etc.) which are largely harmless. They also don't understand frequency and amplitude, the two fundamental principles at stake. As noted, the higher frequencies of the radio band are in fact the weakest in terms of being able to penetrate things -- this is why, for instance, submarines use a very low frequency, so that their signals can penetrate the deep ocean, and even the earth itself. At the other end of the spectrum, so-called "short" waves are useful, since they can't penetrate the earth's ionosphere, but bounce off it instead, increasing their potential range. Getting back to cellular signals, there's the fundamental fact that the higher the frequency, the more information a signal can carry, and more and more information is what's wanted. So, some years ago, the FCC moved television off the old VHF (Very High Frequency) and UHF (Ultra High Frequency) bands, and sold those frequencies off to cellular carriers. UHF topped out at 3 GHz, and the new services will utilize the top end of UHF, and on upwards to 6 GHz. So at least a portion of this "new" frequency will be the same old frequency that once brought you midnight horror movies and Bowling for Dollars.
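
If you want a feel for the scale involved, here's a quick back-of-envelope sketch of my own (the band values are rounded and illustrative, not official FCC allocations), using the textbook relation that wavelength equals the speed of light divided by frequency:

# Rough illustration: wavelength = speed of light / frequency.
# Band values below are rounded, representative figures, not official allocations.
C = 299_792_458  # speed of light, in meters per second

bands_hz = {
    "VLF (submarine signals)": 10e3,     # ~10 kHz
    "Shortwave (HF)":          10e6,     # ~10 MHz
    "VHF (FM radio, old TV)":  100e6,    # ~100 MHz
    "UHF (old TV, 4G phones)": 2e9,      # ~2 GHz
    "Upper 5G mid-band":       6e9,      # ~6 GHz
}

for name, freq in bands_hz.items():
    wavelength = C / freq   # in meters
    print(f"{name:26s} {freq/1e6:>8.2f} MHz  ->  wavelength about {wavelength:,.2f} m")

The wavelengths run from tens of kilometers for the submarine frequencies down to about five centimeters at the top of the 5G mid-band -- which is the intuition behind why the low end can burrow through seawater while the high end has trouble getting past a wall, let alone a skull.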

And then there's amplitude. The 5G signal will be far too low in energy to do any damage. With old-fashioned single-transmitter systems such as radio, many watts of power were needed to give the signal a wide range, but the 5G antennas that spark these worries are in fact very, very low power -- they're essentially "repeaters," picking up and rebroadcasting a very low power signal to give it range. No cell phone company would waste more electricity than is needed to power these mini-antennas -- and even if they tried, the FCC's regulations on phones would prevent them from doing so. The current regulation for phones is for an absorption rate of 1.6 watts per kilogram of mass, which isn't enough amplitude to warm the surface of your skin more than a tiny fraction of a degree, assuming your cellphone is in direct contact with it. All phones sold in the US must meet this standard. Ultimately, these high frequencies, because they can't penetrate the body, are dissipated in the form of heat -- and if it's heat you're worried about, the electric hot pad you use for your sore neck puts out hundreds of thousands of times more.
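
To put a rough number on that "tiny fraction of a degree," here's a deliberately pessimistic sketch of my own -- not an official calculation -- that imagines a kilogram of tissue absorbing the full 1.6 watts with no cooling at all, no blood flow, nothing, and uses a common round-number specific heat for soft tissue of about 3,500 joules per kilogram per degree:

# Worst-case, no-cooling estimate of heating at the FCC's SAR limit.
# The specific-heat value is a rough, commonly used approximation for soft tissue.
SAR_W_PER_KG = 1.6             # FCC localized absorption limit, watts per kilogram
TISSUE_SPECIFIC_HEAT = 3500.0  # joules per (kilogram * degree C), approximate

def worst_case_rise(seconds: float) -> float:
    """Temperature rise in degrees C if every absorbed joule stayed put as heat."""
    absorbed_joules_per_kg = SAR_W_PER_KG * seconds
    return absorbed_joules_per_kg / TISSUE_SPECIFIC_HEAT

print(f"One minute at the limit, zero cooling: ~{worst_case_rise(60):.3f} degrees C")

Even under those no-cooling assumptions, you get less than three-hundredths of a degree per minute; in a living body, circulation carries that heat away long before it can accumulate.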

All this leaves us with just the content of our signals to worry about. And here I would agree with Tesla -- we get what we deserve. Even when, at times, it seems we don't want it -- since now, with all our clicks tracked in one way or another, the system itself works to try to predict our desires. And yet, despite the screaming echo chambers of the 'net, it's just the sound of our own voice, really -- and we have only ourselves to blame for it if we listen.

Friday, May 2, 2025

Fake News and the Death of the Donut

Many years ago -- in another age, an age where newspapers, television, and magazines thrived, and journalism schools were filled with bright young students -- one commonly taught principle of fair reporting used the humble donut as its metaphor. The center of the donut -- the hole -- represented facts and matters of general consensus: George Washington was the first president of the United States, men walked on the moon on July 20th, 1969, and vaccines were vital to winning the war against dangerous childhood diseases. No one disagreed -- then, at least -- about these matters, and there was no need to present alternative views about them.

The donut itself consisted of matters about which reasonable people might disagree: Was the 55-mile-an-hour speed limit a good idea? Was Sir Winston Churchill the greatest British Prime Minister? Should we get rid of daylight saving time? When an issue fell within the donut, responsible journalism called for balance, and the representation of opposing views. Then there was the outside of the donut. Earth was colonized by aliens in 40,000 B.C. -- the Holocaust never happened -- Abraham Lincoln was never assassinated, but lived to the age of 87 in a secret log cabin in the hills of Virginia. These ideas were the stuff of the "lunatic fringe," and the proper way for serious journalists to respond to them was not to respond to them at all. This model, of course, assumed that journalists -- because they alone had the opportunity to disseminate news to millions of people -- could and should function as gatekeepers of our shared sense of reality. In such a day, Walter Cronkite could reasonably say, "And that's the way it is," knowing that the stories on the evening news had been carefully reported, fact-checked, and vetted before they went on the air.

We shouldn't necessarily blame the journalists of today for the death of the donut. It's still there, to some extent, at the larger national and international newspapers, and some (though not all) network news shows. But the gate that the press was keeping was in a wall -- a wall representing the cost of disseminating news, printing papers, erecting transmission towers and building television studios -- that simply no longer exists. There's no need for any news to pass through this gateway; like the lovely wrought-iron gate of Cremorne Gardens in London, it stands by itself, is easily walked around, and the gardens to which it once led have long vanished. There's no chance of such a barrier being rebuilt in the future, and none of the efforts of social media sites such as Facebook or Twitter are going to have much effect, since anyone who wants to can find a workaround to whatever filters or barriers they erect.

Is there any hope at all? Well, certainly the remaining outlets of old-fashioned journalism should be taking the lead by calling a lie a lie, and continuing to robustly fact-check their own stories. Sites such as Snopes.com can help, and the increase of traffic there is a healthy sign that some people actually do want to check up on an implausible or dodgy story they've heard. But what it really means is that everybody is going to have to do their own checking, and that in addition to teaching mere facts, the task of education, now more than ever, must be to give students the tools to sort the wheat of information from the chaff of useless babble, and the poison of disinformation, rumor, and conspiracy theories.

There's only one problem with this hope, of course -- that those who write, share, and consume those poisons have the same robust tools to keep reality from their gates as do those who favor reality. It's going to be a bumpy night.

Saturday, April 26, 2025

Dissolution

Throughout the latter part of the twentieth century, there was one common understanding, in the United States at least: a college education was a good thing. Returning soldiers took advantage of the original GI Bill, and their children took advantage of what was, for a time, a very affordable college education (at least at public colleges and universities). One result was the greatest economic boom in the nation's history, but another was its most educated public. It would be hard for anyone back in the 1980's (to pick a decade) to imagine a time when college attendance wasn't a reachable ideal and an unquestioned good for anyone who could manage it.

Today, a college education has been questioned on every front; tuition has risen dramatically, and pundits -- who seem to take for granted that the best way to measure the value of something is in the increased earning power it confers -- declare that a useful trade-school education would be a much better deal. We're told that we have to let go of the idea that college is for everyone, as pressure mounts on public institutions (public in name only, as decades of cuts by state legislatures have meant that most state colleges and universities receive less than 1/3 of their funding from the state). And now, at the behest of the current administration, there's an active attack on the fundamental existence of universities, one aimed -- so far -- at the wealthiest and most prestigious among them. Various reasons are offered for this antipathy, but the likeliest one, to my mind, is that it's just part of a broader anti-intellectual drive, of the sort that's been seen many times before in authoritarian regimes.

All this puts me in mind of a much earlier moment when a large number of institutions whose work was universally understood as contributing to the greater good came under government pressure to prove that worth. Secured in beautiful gothic buildings, supplied with dormitories for their residents, dining halls for their meals, and chapels for their worship, the inhabitants of these institutions little dreamt that, in a few short years, their buildings would be reduced to roofless rubble -- and by their own king. Thomas Cromwell (yes, that one -- whose career is revisited in Hilary Mantel's recent novels) sent out "visitors" to investigate the practices at these places, and the reports that came back were damning: the people there were not following their own rules, they were living lives of suspicious comfort, and worse yet, by providing spiritual services to their neighbors -- like the loaning of blessed girdles to aid women in childbirth -- they were perpetuating false superstitions and scurrilous beliefs.

The king, Henry VIII, used these reports as the justification for what would, earlier in his own reign, have been an unthinkable act. He, with the collusion of Parliament and a few strokes of a pen, dissolved all the monasteries of England, seized their buildings and properties, and sent troops to round up the valuables. The golden altar furniture and books were added to the King's own collections, while the buildings and land were given to his political allies as their personal estates. The monks and nuns were supposed to receive pensions, but there seems to be no record of many ever collecting them. What the King's men left behind, local villagers pillaged, finding new uses for door-hinges and window glass. The lead that had protected the roofs was rolled up and hauled away (it being a valuable metal at the time), and before long, the unprotected beams and boards rotted and collapsed.

To add to the irony, some of the funds raised by some of these closures were used to fund colleges and endowments, including what later became Christ Church at Oxford.