Sunday, December 18, 2016

Fake News and the Death of the Donut

Many years ago -- in another age, an age when newspapers, television, and magazines thrived, and journalism schools were filled with bright young students -- one commonly taught principle of fair reporting used the humble donut as its metaphor. The center of the donut -- the hole -- represented facts and matters of general consensus: George Washington was the first president of the United States, men walked on the moon on July 20th, 1969, and vaccines were vital to winning the war against dangerous childhood diseases. No one disagreed -- then, at least -- about these matters, and there was no need to present alternative views about them.

The donut itself consisted of matters about which reasonable people might disagree: Was the 55-mile-an-hour speed limit a good idea? Was Sir Winston Churchill the greatest British Prime Minister? Should we get rid of daylight saving time? When an issue fell within the donut, responsible journalism called for balance, and the representation of opposing views. Then there was the outside of the donut. Earth was colonized by aliens in 40,000 B.C. -- the Holocaust never happened -- Abraham Lincoln was never assassinated, but lived to the age of 87 in a secret log cabin in the hills of Virginia. These ideas were the stuff of the "lunatic fringe," and the proper way for serious journalists to respond to them was not to respond to them at all. This model, of course, assumed that journalists -- because they alone had the opportunity to disseminate news to millions of people -- could and should function as gatekeepers of our shared sense of reality. In such a day, Walter Cronkite could reasonably say, "And that's the way it is," knowing that the stories on the evening news had been carefully reported, fact-checked, and vetted before they went on the air.

We shouldn't necessarily blame the journalists of today for the death of the donut. It's still there, to some extent, at the larger national and international newspapers, and some (though not all) network news shows. But the gate that the press was keeping was in a wall -- a wall representing the cost of disseminating news, printing papers, erecting transmission towers and building television studios -- that simply no longer exists. There's no need for any news to pass through this gateway; like the lovely wrought-iron gate of Cremorne Gardens in London, it stands by itself, is easily walked around, and the gardens to which it once led have long vanished. There's no chance of such a barrier being rebuilt in the future, and none of the efforts of social media sites such as Facebook or Twitter are going to have much effect, since anyone who wants to can find a workaround to whatever filters or barriers they erect.

Is there any hope at all? Well, certainly the remaining outlets of old-fashioned journalism should be taking the lead by calling a lie a lie, and continuing to robustly fact-check their own stories. Sites such as Snopes.com can help, and the increase in traffic there is a healthy sign that some people actually do want to check up on an implausible or dodgy story they've heard. But what it really means is that everybody is going to have to do their own checking, and that in addition to teaching mere facts, the task of education, now more than ever, must be to give students the tools to sort the wheat of information from the chaff of useless babble, and the poison of disinformation, rumor, and conspiracy theories.

There's only one problem with this hope, of course -- that those who write, share, and consume those poisons have the same robust tools to keep reality from their gates as do those who favor reality. It's going to be a bumpy night.

Saturday, June 4, 2016

They shall not pass!

There's been a lot of talk in the press these days, with the rise of he-who-must-not-be-named in American politics, about how to stop fascistic, totalitarian figures from coming into power. The assumption seems to be that folks in the US and UK have never had to deal with such a figure, and so must look to Germany, or Spain, for examples -- but in fact, that's not true at all.

Back in Britain in the 1930's, there was a divisive political leader whose rallies were beginning to cause concern. A former political liberal, he'd taken to denouncing his previous views; at massive indoor rallies (including one at the Albert Hall), he stirred his listeners to passionate cheers by denouncing immigrants, Jews, and a shadowy cabal of forces that were driving the common man down. He loved to egg on protesters at these rallies, and kept special squads of "stewards" to rough them up and throw them out -- but in fact the protesters served a purpose. Indeed, he credited them with making his speeches all the more effective, even saying he looked forward to them.

The man in question was Sir Oswald Mosley. And Britons weren't quite sure what to make of him; though many despised his views, his party -- the British Union of Fascists -- was granted the same rights as any other, and in the early 1930's its polling numbers were on the rise, although it did not yet have any elected member of Parliament. Mosley, inspired by visits with Mussolini and Hitler, decided to give his party a paramilitary flavor, initially going with plain black futuristic uniforms, but later moving on to jackboots and a peaked cap (for Mosley himself, at least). From their large indoor rallies, his followers graduated to outdoor ones with marches; when Mosley was clobbered with a flying brick by protesters at one, he wore his bandages as a badge of pride. It was part of his strategy to march directly into areas, such as the East End of London, where the bricks were most likely to fly; this both fed his sense of justified anger, and forced the police into serving as his own personal security.

Things came to a head in October of 1936; Mosley had planned a rally and march through the East End, and had pulled out all the stops; his new uniform was ready, as were many of his followers, some of whom rode in specially fortified vans. The Metropolitan Police, under the leadership of Sir Philip Game, mustered 6,000 officers, one of the largest forces since the days of the Chartist rallies in the 1840's, but even then it soon became clear that they were stretched thin; attempts to disperse a crowd of nearly 100,000 East Enders proved futile. Mosley's planned route would have taken him down Cable Street, and those opposed to him erected barricades of bricks and sticks, along with a lorry and some construction equipment they'd procured from a nearby builder's yard. They hoped to block the march by their mere presence, though they were also prepared to fight; one witness described men wrapping barbed wire around chair-legs as improvised weapons. The chant of one and all was simply this: "They shall not pass!"

In the end, Sir Philip Game persuaded Mosley to call off that part of his plans, and to march back instead toward Hyde Park, where he'd come from. Public alarm over the events led to the passing of the Public Order Act, which, although it did not outlaw Mosley's party, prohibited political uniforms such as those the "blackshirts" had favored. When Britain and Germany went to war, the BUF was banned, and Mosley and many of its other leaders were imprisoned for the duration. One, though, did escape: the scar-faced William Joyce, who as "Lord Haw-Haw" broadcast mocking ripostes to nearly every one of Churchill's radio speeches. After the war, Joyce was convicted of high treason, and hanged at Wandsworth Prison. Sir Oswald returned to political life, though in his one election -- for Kensington North in 1959 -- he polled only 8% of the vote with an anti-immigration platform that included deporting those from the West Indies.

Monday, May 30, 2016

Girl number twenty unable to define a horse!

Thomas Gradgrind
Consider this a sort of open letter to the American education system. It's been fourteen years since the passing of the "No Child Left Behind" law, and in my college classes, I now have students who've experienced its effects for much, if not all, of their primary and secondary educations. The neo-utilitarians have had their day, and here are the results: a generation of students who have learned the real lesson of the new regime -- hunker down, follow instructions, and learn whatever is going to be on the test. Fiction is out; nonfiction is in; reading for pleasure is out; reading for content is in. After all, the only skills that matter are the ones that can be measured.

And they're not very good students. They're good people -- they have all the hopes, fears, and aspirations that my students have always had -- but as students, they're uniquely unprepared for college, at least on the side of the humanities. Their habit of scanning syllabi and readings to see what will be tested, what will be measured, makes them poor readers; when they encounter an ambiguity, or a difficult reading, they tense up in anticipation of being taken to task for some vague failure. My endeavor has always been to share my love of literature, to (I hope) instill in students an excitement, an interest, a sense of something personally significant for themselves, in everything they read -- but it's getting tougher each semester. Their curiosity has been hammered; their personal experience belittled; their idiosyncrasies ironed out. As a result, the pleasure of discovering something new has given way to an anxiety about the unfamiliar; it's like a jungle out there -- sometimes I wonder how they keep from going under.

Are they better prepared for the wonderful world of employment? Perhaps, so long as the employment they find demands repetitive goal-oriented work, with little room for innovation, and frequent employee evaluations. They may, so long as they've had the prerequisites, do well in science, technology, engineering, and math -- though only at a rudimentary level. I can't see any new inventions, innovative theories, or speculative hypotheses coming from this generation, as anything they did or thought which deviated from the appointed path earned them the electric shock of a poor mark. In the humanities, they arrive with only the most superficial skills, and with what I find an astonishingly low level of motivation -- motivation here being defined as some strong desire from within one's own self to learn and grow. That desire seems to have been largely amputated, and in its place our brave new Benthamites have instilled a simple wish for nothing more than a clear set of instructions and evaluative rules. Learn, do, demonstrate. Wash, rinse, repeat.

It's a sad day for me, and for college educators generally. We can't turn back the hands of time -- we'll have to do what we can to help these students regain their own self-confidence, to revive and replenish their ossified curiosity and shrivelled sense of self. It will be hard work, and until that work is done, the work that professors such as myself used to do -- of teaching our field of study -- will have to be postponed. It'll be like a pre-college program, only it will take up college time. We ourselves have been sent the memo; our departments, programs, and fields of study already have to produce the "documentation" of what skills and knowledge we add, and our "metrics" for assessing the "outcomes" of our labors. The madness of measurement has overtaken us all, and if anything resembling real education is going to take place, it's going to have to do so by flying under the radar of this new regime.

Tuesday, April 5, 2016

Phil Ochs Time

As I read the latest news from the state of Mississippi, which once again has taken the lead in raw, unbridled, smug, self-satisfied hatred, I felt increasingly angry. And then, as I sometimes do, I went beyond my anger, beyond my frustration: I decided it was Phil Ochs time.

And it almost always is. It's no coincidence that Billy Bragg evoked him with "I Dreamed I Saw Phil Ochs Last Night" -- for indeed, he's still "as 'live as you or me" -- in fact, he's more alive now than he ever has been. Phil always claimed to just grab his songs from the headlines -- as witness the wry title of his first LP, All the News That's Fit to Sing -- but there was a whole lot more to them than that. Phil was a sort of tuning fork for his times, but in being that, he was tuned right in to many of the timeless foibles every era shares.

Many hailed (or dismissed) Phil as a mere "protest" singer -- a label I'm sure he wouldn't have minded -- but there was so much more to him than that. He could write the greatest anti-war ballad ever penned ("I Ain't Marching Anymore"), and at the same moment depict the common sailor in a way any Navy veteran couldn't help but admire ("The Men Behind the Guns"). At a moment when many hardcore lefties were dismissing JFK as a sellout, Phil wrote an elegy for the slain president that stands with Whitman's among the finest ever written -- and yet, his acerbic tirades against American interventionism never lost their edge, as in "The Marines Have Landed on the Shores of Santo Domingo" or "Cops of the World." And he always had a crazy sense of ironic humor (on both sides), which bubbled up in songs such as "Draft Dodger Rag" and "Love Me, I'm A Liberal."

Another side of Phil was that he dug deep into the annals of poetry, and had an uncanny knack for setting poems to music. His version of Alfred Noyes's "The Highwayman" rescues that poem from what, in mere printed words, might seem maudlin, crafting it into a ballad that can bring a tear to the eye of the most hardened anti-romantic. But still better is his setting of Edgar Allan Poe's "The Bells," which transmogrifies a tuneful poem into a poetic tune, complete with guitar harmonics for the bells themselves -- it's a masterpiece.

It's often said that Phil felt despair, and lost his way, later in his career. Despair he surely felt, but his way was always sure; in his late musings such as "The Flower Lady" and "Pleasures of the Harbor," he reached a lyric height far beyond the common songwriter's scope, and with "Crucifixion" he revisited the Kennedy assassination with an apocalyptic overlay of epic proportions. Many of these songs are the equal -- some might say, the better -- of Dylan's anthems of the time. Phil and Bob were, indeed, always linked together; it was Phil who brought Bob out of motorcycle-accident retirement to Madison Square Garden's "Evening with Salvador Allende." And, back in the day, it was to Phil's apartment that Bob went to share "Mr. Tambourine Man" for the first time.

Phil Ochs took his own life forty years ago in Far Rockaway, New York. Ochs's biographer, Marc Eliot, wrote about how personally that loss was felt. He predicted that, when Dylan dies, his death -- though greater in some wider sense -- will never be felt as personally as Phil's was.

And I agree.

Tuesday, July 7, 2015

Debt

West Germany signs up for its 50% debt reduction in 1953
What, exactly, is debt? The question takes on new urgency as, yet again, the economic wise men of the European Union declare that Greece must submit to their terms -- for, after all, is there not a great debt at stake? Not to address it in the EU's terms, clearly, would be irresponsible and disastrous -- and so, its ministers speak to Greece -- the cradle of the civilization they claim to represent -- as one would to a child.

The Greek government has debts, indeed, and Greek banks have still more. But what does this really mean? Nietzsche was hardly the first to notice that the German word for guilt (Schuld) derives from the older concept of debt (Schulden). In English texts, "forgive us our debts" was an older and more common form of the Lord's Prayer's "forgive us our trespasses." Debt is sin, Christ is our "redeemer," and entrance into the kingdom of heaven is to be -- in these terms -- a forgiveness of debts.

We in the U.S. like to pillory the profligate -- it's almost a national passion. Republicans in the US Congress took the lead in making personal bankruptcy more onerous, and have harped, from the Reagan era to our own, on the evil of deficits, and the need to cover any expense with a parallel cut. It's become almost a mantra with the Tea Party set.

But there's a problem here. For one, the debt of nations is not at all like the debt of people. Ordinary people can't print their own currency, or re-value it, or issue bonds to fund their new chimney. Nations can do these things, because they hold larger, essential reserves -- a labor force, roads, cities, minerals and other natural resources -- with which to generate future wealth; that capacity is what makes it possible to evaluate their worth, and to trade in it. But even more than this, the debt of nations is not guilt, nor is their spending like the profligacy of private persons.

Greece has suffered. It has suffered for its overspending (which, as Nobel laureate Paul Krugman has pointed out, was not so much more horrific than that of many other nations), and for its failure to live up to the expectations of the creditors who loaned it and its banks generous funds during what seemed to be boom times. The EU has imposed austerity to an amazing extent -- cuts in pensions, tax increases, cuts in social services, privatization of national treasures -- and Greece's entire economy has shrunk by more than a quarter. If Greece today is less well-prepared to pay back its debts -- or rather, the debts of its banks -- than it was several years ago, it's largely because it's taken the EU prescription of austerity.

And what is "austerity"? Very clearly, it is a special sort of punishment, one reserved for errant nations. They must be made to pay! Even if every last citizen is to starve, even if the cure kills the patient, pay they must. The latest statements from the German finance minister make it quite clear that the goal of the latest agreement is not to make it more likely that Greek debt will be recouped -- rather, it is to punish the Greeks.

One should see at a glance how much this sort of "debt" is like "guilt." But there's one way in which "debt" is different, when it's money that's at stake: those who loan money expect a profit from it. They don't lend out of kindness, nor was the debt incurred out of errant sins. They gave because they expected to receive, and then some.

But let's not focus on Greece -- let's look at Ireland, a country which willingly accepted the EU's austerity prescriptions. Now, seven years later, we are told that there are "signs of recovery" -- signs evident to economists, it seems, but hard for the ordinary Irish citizen to see. Any number of years is too many to endure such pain, the more so when all it really does is satisfy the creditors (who, one might imagine, would prefer to be satisfied sooner -- but not, it seems, as much as they prefer the glee of punishing another).

And seven years, interestingly, is the time laid out for Sh'mittah -- the traditional forgiveness of debts in the Jewish Torah. In that year, debts are to be forgiven, and the poor welcomed to glean in the fields. Seems to me it's time.

Wednesday, March 25, 2015

Dissolution

Throughout the latter part of the twentieth century, there was one common understanding, in the United States at least: a college education was a good thing. Returning soldiers took advantage of the original GI Bill, and their children took advantage of what was, for a time, a very affordable college education (at least at public colleges and universities). One result was the greatest economic boom in the nation's history, but another was its most educated public. It would be hard for anyone back in the 1980's (to pick a decade) to imagine a time when college attendance wasn't a reachable ideal and an unquestioned good for anyone who could manage it.

Today, college education is questioned on every front; tuition has risen dramatically, and pundits -- who take for granted that the best way to measure the value of something is in the increased earning power it confers -- declare that a useful trade-school education would be a much better deal. We're told that we have to let go of the idea that college is for everyone, as pressure mounts on public institutions (public in name only, as decades of cuts by state legislatures have meant that most state colleges and universities receive less than a third of their funding from the state). Even those who talk up the importance of college are insisting on "accountability," which means endless rounds of assessments and measurements of "outcomes," and even the supposedly liberal President Obama and his secretary of education, Arne Duncan, wax ecstatic talking up their plans for cradle-to-grave tracking of the correlation between education and earnings.

But in the midst of this neo-utilitarian fervor, something has been forgotten. As Mark Thomason wrote in a comment on a recent New York Times editorial,
I send my kids to college as a growth experience. It changes them, in good ways. I hope they do well financially, but I am not sending them to a trade school, I'm sending them to complete their education and complete growing up. It did me a lot of good, and it is doing them a lot of good.
The only difficulty is that this good -- which I agree is the most important aspect of a college education -- is very difficult to quantify. It doesn't necessarily lead to higher earnings; those who are inspired by their college experience to undertake creative careers in the arts, or to work for a better society, often find they're earning a great deal less. But their personal satisfaction, and the benefits of their labors to the world, though not necessarily tangible or measurable, are certainly vital.

All this puts me in mind of a much earlier moment when a large number of institutions whose work was universally understood as contributing to the greater good came under government pressure to prove that worth. Secured in beautiful Gothic buildings, supplied with dormitories for their residents, dining halls for their meals, and chapels for their worship, the inhabitants of these institutions little dreamt that, in a few short years, their buildings would be reduced to roofless rubble -- and by their own king. Thomas Cromwell (yes, that one -- whose career is revisited in Hilary Mantel's recent novels) sent out "visitors" to investigate the practices at these places, and the reports that came back were damning: the people there were not following their own rules, they were living lives of suspicious comfort, and worse yet -- by providing spiritual services to their neighbors, they were perpetuating false superstitions and scurrilous beliefs.

The king, Henry VIII, used these reports as the justification for what would, earlier in his own reign, have been an unthinkable act. He, with the collusion of Parliament and a few strokes of a pen, dissolved all the monasteries of England, seized their buildings and properties, and sent troops to round up the valuables. The golden altar furniture and books were added to the King's own collections, while the buildings and land were given to his political allies as their personal estates. The monks and nuns were supposed to receive pensions, but there seems to be no record of many collecting them. What the King's men left behind, local villagers pillaged, finding new uses for door-hinges and window glass. The lead that had protected the roofs was rolled up and hauled away (it being a valuable metal at the time), and before long, the unprotected beams and boards rotted and collapsed.

To add to the irony, some of the funds raised by these closures went to found colleges and endowments, including what later became Christ Church at Oxford.

I know it sounds crazy -- how could such a thing happen today? But I think it's a cautionary tale, especially for a population which seems unable to resist the siren-song of mere utilitarian value.

Wednesday, March 11, 2015

Blurred Lines

Blurred lines, indeed. The jury verdict of 7.4 million dollars against Pharrell Williams and Robin Thicke offers yet another example of how juries -- that is to say, how most people -- misunderstand originality in music. Because, in fact, music is by its very nature unoriginal -- every melody line, every hook, every grace note is but a variation on a number of ancient themes, progressions, and melodies. And in fact, that's why we like music -- precisely because it feels both new and familiar. As former Vandals drummer and present-day entertainment lawyer Joe Escalante remarked to the LA Times, it's "a dark day for creativity, and in the end, this will be a net loss for music fans" -- but "good news for lawyers and the bitter everywhere."

There are, contrary to popular belief, only a limited number of musical possibilities out there. Nearly all pop music is in 4/4 time, and relies upon a number of common 'progressions' -- chord sequences -- I/V/vi/IV, I/V/vi/iii (the "Pachelbel's Canon"/"A Whiter Shade of Pale" progression), vi/V/IV/V, and so on. There are a few less common ones, and of course jazz and other genres use a wider variety of them than popular music (though, it's been argued, all jazz essentially derives from the basic blues pattern). Within these, there's a restricted number of possible melodies -- many, but far from infinite -- and, like some robot throwing vast amounts of spaghetti against the fridge to see if it sticks, songwriters and composers have tried them all. Some, it seems, are stickier than others -- and stickiness is what listeners want, after all -- so they are turned to repeatedly.
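
To make the combinatorics concrete, here's a minimal sketch -- my own illustration, nothing from the case or its coverage -- that spells out a few of these Roman-numeral progressions as actual chords, assuming a key of C major:

```python
# Illustrative only: how few distinct four-chord loops underlie most pop.
# Roman numerals are mapped to diatonic triads in an assumed key of C major.

MAJOR_SCALE = ["C", "D", "E", "F", "G", "A", "B"]

# scale degree -> (index into the scale, chord-quality suffix)
DEGREES = {"I": (0, ""), "ii": (1, "m"), "iii": (2, "m"),
           "IV": (3, ""), "V": (4, ""), "vi": (5, "m"), "vii": (6, "dim")}

COMMON_PROGRESSIONS = [
    ["I", "V", "vi", "IV"],   # the ubiquitous "four-chord song" loop
    ["I", "V", "vi", "iii"],  # opening of the Pachelbel's Canon pattern
    ["vi", "V", "IV", "V"],
]

def to_chords(progression, scale=MAJOR_SCALE):
    """Render a Roman-numeral progression as chord names in the given key."""
    return [scale[DEGREES[d][0]] + DEGREES[d][1] for d in progression]

for prog in COMMON_PROGRESSIONS:
    print("/".join(prog), "->", " ".join(to_chords(prog)))
# Even allowing any of the 7 diatonic degrees in each slot, there are only
# 7**4 = 2,401 possible four-chord loops -- and a handful account for most hits.
```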

So of course we've been here before. And before. And before. Perhaps the most egregious example until now was the case of George Harrison's "My Sweet Lord" vs. the Chiffons' "He's So Fine" -- or, as it's officially known, Bright Tunes Music vs. Harrisongs Music (details at USC's fabulous Music Copyright Infringement Resource). In that case, it came down to a few of what were called 'grace notes' (actually appoggiaturas), which suggested the possibility of what the judge called 'unconscious borrowing.' The damages awarded for this were spectacular: 1.6 million dollars -- 6.5 million in today's currency, nearly as much as in the Gaye case.

Such an award is justified by its champions as discouraging 'illegal' infringements, but in fact it does no such thing. Were there lawyers enough, there are tens of thousands of songs whose authorship could be litigated in this way -- and almost any new pop song you can imagine would be a fresh candidate. Instead, it will stifle creativity -- Harrison himself admitted he was 'too paranoid' to write any new material for years after the lawsuit -- by preventing the natural and inexorable process of fusing the old and the new that feeds what Joni Mitchell called 'the star-maker machinery behind the popular song.'

Part of the problem is the way music copyright is handled. The songwriter rights, also known as publishing rights, date back to the era when sheet music sales were a key source of revenue. That's not true today, but these underlying rights still apply, since any recording of a song -- including the 'original' one -- relies on a license to 'use' it. It's why, when "My Sweet Lord" was in dispute, the parties weren't Harrison himself or the Chiffons, but Harrisongs and Bright Tunes, the music's publishers. And, when boiled down to sheet music, songs look a lot more similar than they in fact are -- since part of what makes a song a hit is the arrangement and performance of a particular version. In some cases, a cover version does better than the original -- Richie Havens's "Here Comes the Sun," for instance, was a bigger hit on the charts than Harrison's own version -- but in that case, the 'publisher' portion of the royalties went to Harrison anyway -- as would be the case with any cover.

But the problem is, almost all music is a 'cover' of something. Boiled down to sheet music, the similarities are greater than the differences -- but in this modern era, when sheet music isn't even printed in most cases, this hardly seems the point.

What we need, I'd argue, is to throw out the entire existing copyright system for music. Get rid of the 'sandwich' -- publishing rights/performance rights/broadcast rights/non-earthbound communication rights -- and replace it with a system in which 10% of all royalties for all new songs are placed in a fund available to those who can make a case for similarity to older ones; damages should be capped. Then, pay a fixed portion for any performance or rebroadcast, accounting for the (divided) writing royalties. This may sound complex, but in fact, it's been tacitly done within the industry for decades -- which is why cases such as the Gaye/Williams/Thicke one are rare. Let's 'fess up, folks -- when it comes to pop music, there really isn't anything new under the sun.
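
For the curious, here's a back-of-the-envelope sketch of how such a fund might operate. Only the 10% figure comes from the proposal above; the cap amount, the names, and the claim mechanics are placeholders of my own:

```python
# Hypothetical model of the proposed scheme -- the 10% pool rate is from the
# proposal above; the cap and all other specifics are illustrative guesses.

SIMILARITY_FUND_RATE = 0.10   # share of every new song's royalties pooled
DAMAGES_CAP = 50_000.00       # assumed per-claim cap (purely illustrative)

def split_royalties(gross: float) -> tuple[float, float]:
    """Divide a royalty payment between the songwriter and the similarity fund."""
    to_fund = gross * SIMILARITY_FUND_RATE
    return gross - to_fund, to_fund

def pay_claim(fund_balance: float, awarded: float) -> tuple[float, float]:
    """Pay a successful similarity claim out of the fund, subject to the cap."""
    payout = min(awarded, DAMAGES_CAP, fund_balance)
    return payout, fund_balance - payout

writer_share, fund_share = split_royalties(1_000_000.00)
print(f"writer keeps ${writer_share:,.2f}; fund receives ${fund_share:,.2f}")
payout, remaining = pay_claim(fund_share, awarded=250_000.00)
print(f"claimant paid ${payout:,.2f}; ${remaining:,.2f} left in the fund")
```

However the numbers were tuned, the point is that routine similarity would be priced in up front, rather than litigated one blockbuster verdict at a time.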

Friday, January 30, 2015

Burst

There are still places on this earth where people do burst into song.

I was in Dundalk, Ireland, in an ordinary, nondescript Italian restaurant on the high street, with some friends who were there attending a conference, along with a few other miscellaneous diners, when suddenly a woman at a table nearby, probably in her forties or so, let out with the refrain to Cat Stevens’s “Moonshadow” – and at once we were all, instantly, “leapin’ and hoppin’” along. The strange thing was that the waiters and waitresses took no notice of it; she sang, and we all came in on the chorus. At the end of it, there was applause, and people looked ‘round the room at one another, and someone called out “Well, give us another song!” And someone did.

This would never, of course, happen where I live, in the United States. Here, no one bursts into song, except maybe in a karaoke bar, and badly. If you were to start singing at the top of your lungs in a TGI Friday's or a Pizza Hut, people would think you were crazy, and before long the police would be called, and you'd be taken away to face charges.

But there, in Ireland, it's understood that people do do such things, and that it's a perfectly ordinary part of life. People could, actually, burst into song in a public place, and just as in an episode of “Pennies from Heaven,” the other diners – if they didn’t join in themselves – would just carry on with their business as though it were the most usual thing in the world.

*     *     *     *     *

Growing up in the suburbs of Cleveland, Ohio, I was subjected to my parents’ deep and inexhaustible love of musicals. Living far from New York, we never actually went to one, but they blared forth from the stereo all the same. “My Fair Lady” – “Oklahoma!” – “The Most Happy Fella” – “The Pajama Game” – “The Mikado” – “Carousel” – “South Pacific” – the list seemed endless. My mother loved them, and my father loved them even more, turning out all the lights in the living room and turning up the stereo as far as it would go without distortion. “We are gentlemen of Japan!” … “All I want is a room somewhere” … “A p'liceman’s lot is not a happy one” …“Some enchanted evening” … there was an instant drama in every line, and though I hated the music itself, some portion of its feeling got under my skin, and planted a kind of weird seed for the future. I remember hating these songs, the darkened room, the way that my father would, from his prostrate position on the couch, quiver quietly with tears.

But now, decades later, when I hear these same songs, I’m the one that's weeping. Not for the fate of poor little Buttercup, or Eliza Doolittle, or Billy Bigelow -- but for the lost world in which people indeed did burst into song. And in that world, it now seems to me, there was such a tremendous reservoir of pent-up emotions, such a backlog of heartache, that it was absolutely impossible to speak – it could only be sung. And yet, once the final lines had finished and the strings faded, everyone went back to their ordinary, pent-up ways – and it was this final loss, the loss of the miraculous world of people actually singing about their feelings, that touched me the most. For even then, when such things were common knowledge, they failed to transform the world with their beauty; people went back to ugliness, got on with their tawdry lives, and if asked would probably have said “Singin’? I didn’t hear no singin’! Did you, pal?”

All of this came rushing back to me recently when I re-watched the 1981 film Pennies from Heaven with Steve Martin and Bernadette Peters. I’d seen it when it first came out in a theatre – a lonesome one, populated only by myself and the love of my life, two little old ladies, and an usher collecting for the Jimmy Fund. Somehow, the lip-synching made these songs work, and made the tragedy of singing one’s heart out a double loss – for the voice these actors sang with was not their own. Of course, that was often true in the classic Hollywood era, when Marni Nixon provided the voice for everyone from Deborah Kerr to Audrey Hepburn, but I never knew that back then. Later, of course, I saw Singin’ in the Rain and realized what a strange fate it was to be the unknown voice of a famous actress. It’s an uncanny kind of real-time pathos that goes back at least to Cyrano de Bergerac and right up through Disney’s Little Mermaid: to watch another woo one’s love with one’s own words, or voice. It’s a kind of torture, really.

Some years after seeing the 1981 film, I tracked down a copy of the BBC series with Bob Hoskins. Until then, I hadn’t realized how heavy-handed the irony was – that a man who sold love-songs for a living could be, not just a low-grade jerk, but a complete asshole – so that the transformations he underwent, and witnessed, were a sort of false, cruel magic, one in which the kiss of song transformed toad-like people into temporary princes, not because they were really princes inside, but because they and the whole goddamned world were so deeply, perversely, incurably toad-like that this formed for them a kind of cruel joke. I wondered whether Dennis Potter was a cruel man, or whether, in the manner of other dark comedies, it was meant to be humorous. His genius, I decided, was that it didn’t matter – it worked either way.

But what's happened to musicals today? It's strange to realize that Pennies from Heaven was in fact the very last musical produced by MGM before its long descent into financial ruin and dissolution, its legendary film library sold to Ted Turner, and then to Warner's. It was Disney -- or, more properly, Ashman and Menken -- who revived the musical feature film in 1989 with The Little Mermaid. Somehow, bursting into song had become something that was too much for any human actor; it had transcended that world and moved into animation. Even Broadway was swept by this phenomenon, as adaptations of Disney films have been among the highest-grossing and longest-running stage musicals.

There are exceptions, it's true, but they're rare. I have to confess that Stephen Sondheim, for all his storied career, has always left me cold. His melodies never quite soar; like tethered birds, they tweet too close to the ground, and are soon forgotten. Wicked -- which I've seen in its current incarnation at the Gershwin -- is transcendent on stage, but the soundtrack doesn't really do it for me. There's soaring -- who can forget their first experience of "Defying Gravity"? -- but without the visuals, the music itself sounds overblown and frantic.

And then there's Glee. Many love it. I loathe it. Not because it lacks magical moments of bursting into song, but because there are too many. You can't just constantly be bursting into song; as Alan Jay Lerner put it,
"A man's life is made up of thousands and thousands of little pieces. In writing fiction, you select 20 or 30 of them. In a musical, you select even fewer than that."
And that to me is what a musical is -- a series of dramatic moments, carefully curated. But a serial musical is like a serial killer -- you never know when it will strike again; we live in dread instead of wonder. Each song must count, must form the peak of one of the mountains of our existence. We mustn't descend too soon to our tawdry, toad-like world -- we must allow these shadows of our better selves to burst, to break, to echo down the corridors of everyday life, daring to sing out loud.