Wednesday, March 25, 2015

Dissolution

Throughout the latter part of the twentieth century, there was one common understanding, in the United States at least: a college education is a good thing. Returning soldiers took advantage of the original GI Bill, and their children took advantage of what was, for a time, a very affordable college education (at least at public colleges and universities). One result was the greatest economic boom in the nation's history, and another was its most educated public. It would be hard for anyone back in the 1980s (to pick a decade) to imagine a time when college attendance wasn't a reachable ideal and an unquestioned good for anyone who could manage it.

Today, college education is questioned on every front; tuition has risen dramatically, and pundits -- who take for granted that the best way to measure the value of something is the increased earning power it confers -- declare that a useful trade-school education would be a much better deal. We're told that we have to let go of the idea that college is for everyone, as pressure mounts on public institutions (public in name only, as decades of cuts by state legislatures have meant that most state colleges and universities receive less than a third of their funding from the state). Even those who talk up the importance of college are insisting on "accountability," which means endless rounds of assessments and measurements of "outcomes." Even the supposedly liberal President Obama and his secretary of education, Arne Duncan, wax ecstatic talking up their plans for cradle-to-grave tracking of the correlation between education and earnings.

But in the midst of this neo-utilitarian fervor, something has been forgotten. As Mark Thomason wrote in a comment on a recent New York Times editorial,
I send my kids to college as a growth experience. It changes them, in good ways. I hope they do well financially, but I am not sending them to a trade school, I'm sending them to complete their education and complete growing up. It did me a lot of good, and it is doing them a lot of good.
The only difficulty is that this good -- which I agree is the most important aspect of a college education -- is very difficult to quantify. It doesn't necessarily lead to higher earnings; those who are inspired by their college experience to undertake creative careers in the arts, or to work for a better society, often find they're earning a great deal less. But their personal satisfaction, and the benefits their labors bring to the world, though not necessarily tangible or measurable, are certainly vital.

All this puts me in mind of a much earlier moment when a large number of institutions whose work was universally understood as contributing to the greater good came under government pressure to prove that worth. Secured in beautiful Gothic buildings, supplied with dormitories for their residents, dining halls for their meals, and chapels for their worship, the inhabitants of these institutions little dreamt that, in a few short years, their buildings would be reduced to roofless rubble -- and by their own king. Thomas Cromwell (yes, that one -- whose villainy is deliciously revisited in Hilary Mantel's recent novels) sent out "visitors" to investigate the practices at these places, and the reports that came back were damning: the people there were not following their own rules, they were living lives of suspicious comfort, and worse yet -- by providing spiritual services to their neighbors, they were perpetuating false superstitions and scurrilous beliefs.

The king, Henry VIII, used these reports as the justification for what would, earlier in his own reign, have been an unthinkable act. He, with the collusion of Parliament and a few strokes of a pen, dissolved all the monasteries of England, seized their buildings and properties, and sent troops to round up the valuables. The golden altar furniture and books were added to the King's own collections, while the buildings and land were given to his political allies as their personal estates. The monks and nuns were supposed to receive pensions, but there seems to be no record of many, perhaps any, actually collecting them. What the King's men left behind, local villagers pillaged, finding new uses for door-hinges and window glass. The lead that had protected the roofs was rolled up and hauled away (it being a valuable metal at the time), and before long, the unprotected beams and boards rotted and collapsed.

To add to the irony, some of the funds raised by these closures were used to fund colleges and endowments, including what later became Christ Church at Oxford.

I know it sounds crazy -- how could such a thing happen today? But I think it's a cautionary tale, especially for a population which seems unable to resist the siren-song of mere utilitarian value.

Tuesday, March 24, 2015

Uncreating Kenneth Goldsmith

I will always be grateful for Ubuweb -- that marvelous, odd agglomeration of avant-garde texts, audio, and video of which our eponymous hero is a founder. But other than that, I just wish -- sincerely and without prejudice -- that Kenneth Goldsmith would simply go away. If he is true to his word, it should be a simple affair -- I'll just xerox all his books, write my name on them with a magic marker across a strip of duct tape, and declare them to be the "works" of Russell A. Potter. Of course they'll be changed -- changed utterly no doubt -- but that will be but one more Goldsmith-esque transmogrification. Once you declare that there is nothing new under the sun, then you yourself are no longer new -- and (aside from the paisley outfit) you're fairly easy to duplicate.

But I should back up. Mr. Goldsmith has been travelling quite a bit lately -- speaking, in my little neighborhood, at Brown and RISD in praise of what he likes to call "uncreativity." He tells his listeners that he teaches his students to plagiarize actively, and thoughtfully, to "bring into the open" what he -- and perhaps all of us in academe -- have suspected that students have long been doing anyway, just surreptitiously. Bravo to him for his honesty! But shame on us for setting our sights so low. If what we, as professors, ask for can be answered, or even appear to be answered, with a cut-and-paste readymade of already-existing text(s), then we deserve what we already have -- which is, apparently, to be living in the end days of academe. We've asked for the wrong thing.

The academic essay is dead, indeed -- was it ever anything but a careful re-arrangement of summary, weak opinion, and borrowed authority? -- but that does not mean that everything is dead, that writing is dead. Mr. Goldsmith should read -- perhaps he has, and thinks he likes it -- Jorge Luis Borges's "Library of Babel." It's a fantasy in which every possible combination of letters exists in a fabulous library -- which means, of course, that every book that could possibly be written (the true story of your life, the true story of your life with one slight mistake, two slight mistakes, one serious error, five, eleven, or nineteen errors, etc., etc.) already exists there. In Goldsmith's world, we are weary -- behold! (as Nietzsche once wrote of a certain species of conservative historians) -- everything meaningful already exists! Why clutter the world with excess verbiage?

But we aren't there yet -- in fact, very far from it. According to Patheos.com's math, using Borges's (admittedly speculative) scenario, if the Library contained only those variants of Tolstoy's War and Peace with twelve or fewer single-character differences, it would already fill the entire knowable universe -- and this would be but one portion of the larger Library. Our problem is not that words are at an end, and the time for recycling is upon us -- it's that we've scarcely begun. The sensation that all that's said has been said before is not a reality -- it is a pose, a posture, a complaint -- and it's not even new. Let the Goldsmiths of the world spin their fine fabric -- the Emperor has no clothes, and (in point of fact) never did -- and there is plenty of real text yet to be woven.
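(For the numerically inclined, here is a rough back-of-the-envelope version of that calculation -- a sketch only, assuming a text of about three million characters and a plain 26-letter alphabet, round figures of my own rather than Patheos.com's exact ones:)

    # Rough count of War and Peace variants with at most 12 single-character
    # changes, assuming ~3,000,000 characters and a 26-letter alphabet.
    from math import comb

    TEXT_LENGTH = 3_000_000   # assumed character count of the novel
    ALPHABET = 26             # assumed alphabet: letters only, no punctuation

    # Variants with exactly k changes: choose the k positions, then pick one
    # of the other 25 letters at each of them.
    total = sum(comb(TEXT_LENGTH, k) * (ALPHABET - 1) ** k for k in range(13))

    print(f"Variants with <= 12 changes: about 10^{len(str(total)) - 1}")
    print("Atoms in the observable universe: about 10^80")

Run it and the variant count comes out several orders of magnitude beyond the commonly cited 10^80 atoms in the observable universe -- which is the whole force of the point: even this absurdly narrow slice of the Library outstrips the physical world.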

Wednesday, March 11, 2015

Blurred Lines

Blurred lines, indeed. The jury verdict of 7.4 million dollars against Pharrell Williams and Robin Thicke offers yet another example of how juries -- that is to say, how most people -- misunderstand originality in music. Because, in fact, music is by its very nature unoriginal -- every melody line, every hook, every grace note is but a variation on a number of ancient themes, progressions, and melodies. And in fact, that's why we like pop music -- precisely because it feels both new and familiar. As former Vandals drummer and present-day entertainment lawyer Joe Escalante remarked to the LA Times, "it's a dark day for creativity, and in the end, this will be a net loss for music fans" -- but "good news for lawyers and the bitter everywhere."

There are, contrary to popular belief, only a limited number of musical possibilities out there. Nearly all pop music is in 4/4 time, and relies upon a number of common 'progressions' -- chord sequences -- I/V/vi/IV, I/V/vi/iii (the "Pachelbel's Canon"/"Whiter Shade of Pale" progression), vi/V/IV/V, and so on. There are a few less common ones, and of course jazz and other genres use a wider variety of them than popular music (though, it's been argued, all jazz essentially derives from the basic blues pattern). Within these, there's a restricted number of possible melodies -- many, but far from infinite -- and, like some robot throwing vast amounts of spaghetti against the fridge to see if it sticks, songwriters and composers have tried them all. Some, it seems, are stickier than others -- and stickiness is what listeners want, after all -- so they are turned to repeatedly.

So of course we've been here before. And before. And before. Perhaps the most egregious example until now was the case of George Harrison's "My Sweet Lord" vs. the Chiffons' "He's So Fine" -- or, as it's officially known, "Bright Tunes Music vs. Harrisongs Music" (details at USC's fabulous Music Copyright Infringement Resource). In that case, it came down to a few of what were called 'grace notes' (actually appoggiaturas), which suggested the possibility of what the judge called 'unconscious borrowing.' The damages awarded for this were spectacular: 1.6 million dollars -- 6.5 million in today's currency, nearly as much as in the Gaye case.

Such an award is justified by its champions as discouraging 'illegal' infringements, but in fact it does no such thing. Were there lawyers enough, there are tens of thousands of songs whose authorship could be litigated in this way -- and almost any new pop song you can imagine would be a fresh candidate. Instead, it will stifle creativity -- Harrison himself admitted he was 'too paranoid' to write any new material for years after the lawsuit -- by preventing the natural and inexorable process of fusing the old and the new that Joni Mitchell called 'the star-maker machinery behind the popular song.'

Part of the problem is the way music copyright is handled. The songwriter rights, also known as publishing rights, date back to the era when sheet music sales were a key source of revenue. That's not true today, but these underlying rights still apply, since any recording of them -- including the 'original' one -- relies on a license to 'use' them. It's why, when "My Sweet Lord" was in dispute, the parties weren't Harrison himself or the Chiffons, but Harrisongs and Bright Tunes, the music's publishers. And, when boiled down to sheet music, songs look a lot more similar than they in fact are -- since part of what makes a song a hit is the arrangement and performance of a particular version. In some cases, a cover version does better than the original -- Richie Havens's "Here Comes the Sun," for instance, was a bigger hit on the charts than Harrison's own version -- but in that case, the 'publisher' portion of the royalties went to Harrison anyway -- as would be the case with any covers.

But the problem is, almost all pop music is a 'cover' of something. Boiled down to sheet music, the similarities are greater than the differences -- but in this modern era, when sheet music isn't even printed in most cases, this hardly seems the point.

What we need, I'd argue, is to throw out the entire existing copyright system for music. Get rid of the 'sandwich' -- publishing rights/performance rights/broadcast rights/non-earthbound communication rights -- and replace it with a simple formula: if the song draws on earlier songs, assign a percentage of all royalties to the ones most similar. Pay a fixed portion for any performance or rebroadcast, accounting for the (divided) writing royalties. This may sound complex, but in fact, it's been tacitly done within the industry for decades -- which is why cases such as the Gaye/Williams/Thicke one are rare. Let's 'fess up, folks -- when it comes to pop music, there really isn't anything new under the sun.
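(To make the arithmetic of that proposal concrete, here is a minimal sketch of how such a split might be computed -- a toy illustration only, with made-up song names, similarity scores, and percentages, not a claim about how any existing royalty system actually works:)

    # Toy sketch: divide one royalty payment between a new song's writer and
    # the writers of the earlier songs it most resembles, in proportion to
    # hypothetical similarity scores between 0.0 and 1.0.
    def split_royalties(payment, similarities, new_writer_share=0.5):
        if not similarities:                 # nothing similar: the writer keeps it all
            return {"new song's writer": payment}
        shares = {"new song's writer": payment * new_writer_share}
        pool = payment - shares["new song's writer"]
        total_similarity = sum(similarities.values())
        for earlier_song, score in similarities.items():
            shares[earlier_song] = pool * score / total_similarity
        return shares

    # Example: a $10,000 payment, with two (imaginary) earlier songs judged similar.
    print(split_royalties(10_000, {"earlier song A": 0.6, "earlier song B": 0.2}))

The fifty-percent share kept by the new writer is just a placeholder; the point is only that a fixed, predictable division, settled up front, would be preferable to litigating each resemblance after the fact.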

Friday, January 30, 2015

Burst

There are still places on this earth where people do burst into song.

I was in Dundalk, Ireland, in an ordinary, nondescript Italian restaurant on the high street, with some friends who were there attending a conference, along with a few other miscellaneous diners, when suddenly a woman at a table nearby, probably in her forties or so, let out with the refrain to Cat Stevens’s “Moonshadow” – and at once we were all, instantly, “leapin’ and hoppin’” along. The strange thing was that the waiters and waitresses took no notice of it; she sang, and we all came in on the chorus. At the end of it, there was applause, and people looked ‘round the room at one another, and someone called out “Well, give us another song!” And someone did.

This would never, of course, happen where I live, in the United States. Here, no one bursts into song, except maybe in a Karaoke bar, and badly. If you were to start singing at the top of your lungs in a TGI Friday’s or a Pizza Hut, people would think you were crazy, and before long the police would be called, and you’d be taken away to face charges.

But here, in Ireland, it was understood that people did do such things, and that it was a perfect part of the ordinary. People could, actually, burst into song in a public place, and just as in an episode of "Pennies from Heaven," the other diners – if they didn't join in themselves – would just carry on with their business as though it were the most usual thing in the world.

*     *     *     *     *

Growing up in the suburbs of Cleveland, Ohio, I was subjected to my parents’ deep and inexhaustible love of musicals. Living far from New York, we never actually went to one, but they blared forth from the stereo all the same. “My Fair Lady” – “Oklahoma!” – “The Most Happy Fella” – “The Pajama Game” – “The Mikado” – “Carousel” – “South Pacific” – the list seemed endless. My mother loved them, but my father loved them even more, turning out all the lights in the living room and turning up the stereo as far as it would go without distortion. “We are gentlemen of Japan!” … “All I want is a room somewhere” … “A policeman’s lot is not a happy one” …“Some enchanted evening” … there was an instant drama in every line, and though I hated the music itself, some portion of its feeling got under my skin, and planted a kind of weird seed for the future. I remember hating these songs, the darkened room, the way that my father would, from his prostrate position on the couch, quiver quietly with tears.

But now, decades later, when I hear these same songs, I’m the one that's crying. Not for the fate of poor little Buttercup, or Liza Doolittle, or Billy Bigelow -- but for the lost world in which people indeed did burst into song. And in that world, it now seems to me, there was such a tremendous reservoir of pent-up emotions, such a backlog of heartache, that it was absolutely impossible to speak – it had to be sung. And yet, once the final lines had finished and the strings faded, everyone went back to their ordinary, pent-up ways – and it was this final loss, the loss of the miraculous world of people actually singing about their feelings, that touched me the most. For even then, when such things were common knowledge, they failed to transform the world with their beauty; people went back to ugliness, got on with their tawdry lives, and if asked would probably have said “Singin’? I didn’t hear no singin’! Did you, pal?”

All of this came rushing back to me recently when I re-watched the 1981 film Pennies from Heaven with Steve Martin and Bernadette Peters. I’d seen it when it first came out in a theatre – a lonesome one, populated only by myself and the love of my life, two little old ladies, and an usher collecting for the Jimmy Fund. Somehow, the lip-synching made these songs work, and made the self-representation tragedy of singing one’s heart out a double loss – for the voice these actors sang with was not their own. Of course, that was often true in the classic Hollywood era, when Marni Nixon provided the voice for everyone from Deborah Kerr to Audrey Hepburn, but I never knew that back then. Later, of course, I saw Singin’ in the Rain and realized what a strange fate it was to be the unknown voice of a famous actress. It’s an uncanny kind of real-time pathos that goes back at least to Cyrano de Bergerac and right up through Disney’s Little Mermaid: to watch another woo one’s love with one’s own words, or voice. It’s a kind of torture, really.

Some years after seeing the 1981 film, I tracked down a copy of the BBC series with Bob Hoskins. Until then, I hadn’t realized how heavy-handed the irony was – that a man who sold love-songs for a living could be, not just a low-grade jerk, but a complete asshole – so that the transformations he underwent, and witnessed, were a sort of false, cruel magic, one in which the kiss of song transformed toad-like people into temporary princes, not because they were really princes inside, but because they and the whole goddamned world were so deeply, perversely, incurably toad-like that this formed for them a kind of cruel joke. I wondered whether Dennis Potter was a cruel man, or whether, in the manner of other dark comedies, it was meant to be humorous.  His genius, I decided, was that it didn’t matter – it worked either way.

But what's happened to musicals today? It's strange to realize that Pennies from Heaven was in fact the very last musical produced by MGM before its long descent into financial ruin and dissolution, its legendary film library sold to Ted Turner, and then to Warner's. It was Disney -- or, more properly, Ashman and Menken -- who revived the musical feature film in 1989 with The Little Mermaid. Somehow, bursting into song had become something that was too much for any human actor; it had transcended that world and moved into animation. Even Broadway was swept by this phenomenon, as adaptations of Disney films have been among the highest-grossing and longest-running stage musicals.

There are exceptions, it's true, but they're rare. I have to confess that Stephen Sondheim, for all his storied career, has always left me cold. His melodies never quite soar; like tethered birds, they tweet too close to the ground, and are soon forgotten. Wicked -- which I've seen in its current incarnation at the Gershwin -- is transcendent on stage, but the soundtrack doesn't really do it for me. There's soaring -- who can forget their first experience of "Defying Gravity"? -- but without the visuals, the music itself sounds overblown and frantic.

And then there's Glee. Many love it. I loathe it. Not because it lacks magical moments of bursting into song, but because there are too many. You can't just constantly be bursting into song; as Alan Jay Lerner put it,
"A man's life is made up of thousands and thousands of little pieces. In writing fiction, you select 20 or 30 of them. In a musical, you select even fewer than that."
And that to me is what a musical is -- a series of dramatic moments, carefully curated. But a serial musical is like a serial killer -- you never know when it will strike again; we live in dread instead of wonder. Each song must count, must form the peak of one of the mountains of our existence. We mustn't descend too soon to our tawdry, toad-like world -- we must allow these shadows of our better selves to burst, to break, to echo down the corridors of everyday life, daring to sing out loud.

Wednesday, December 3, 2014

New Ideas about Policing -- from 1829

The very idea of a "police force" in the modern sense was in every way a Victorian invention. In London in the earlier part of the nineteenth century, crime was fought by an unwieldy array of forces: parish officers (beadles), private night watchmen, and the infamous "Bow Street Runners," whose principal job was apprehending persons wanted on charges to ensure their appearance in court, but who did little or nothing of what we'd conceive of as "patrolling."

The force behind this force was British PM Sir Robert Peel, whose name gave us two popular early nicknames for officers of the police he established ("Bobbies" and "Peelers"). In 1829, in the Police Act, he set forth a clear set of guidelines for these officers, which became known as Peel's Principles. Peel realized that, absent the public's trust and co-operation, the very idea of a police force was doomed to failure.

The police -- in London and elsewhere -- have changed in many ways since 1829. The Met, as it's known for short, has had to expand its mission and learn to tackle new challenges. The realization that plain-clothes police could help solve crimes led to the establishment of the Detective Division; the challenge of the Fenians, who were willing to blow things up to advance the cause of Irish independence, led to the creation of the Special Branch. The Met now even has special riot control units, some members of which conducted themselves very poorly indeed in the killing of Ian Tomlinson in 2009, an event which -- though absent the element of race -- had much in common with recent American incidents.

But despite that, Peel's original principles would make as much sense for reforming the police in the UK as for reforming them here in the US, where I live. The biggest difference is that the police were imagined primarily as a force to prevent crime, rather than merely to apprehend criminals after the fact. But just as important was Peel's insistence that the police must be thought of -- and must think of themselves -- not as a special class of persons with unusual powers, but simply as "members of the public who are paid to give full-time attention to duties which are incumbent on every citizen in the interests of community welfare and existence."

"The police are the public and that the public are the police" -- that's the way Peel put it. We could hardly do better to "reform" the police than to re-assert this one-hundred-and-eighty-five year old sense of values.

Wednesday, February 5, 2014

The Assessment Craze

In the past decade, the movement for education “assessment” has reached fever pitch, moving from public elementary and high schools into higher education, at least at most state colleges and universities.  This has been done, ostensibly, in response to a public demand for “accountability,” another loaded word, and this demand has been translated into the most absolute of terms. At my own college, every department and program was, in essence, ordered to develop an assessment plan.  The buzzwords of these plans – rubrics, targets, educational outcomes, and so forth – came along with the orders, though each department was ostensibly free to decide what it would describe as its “outcomes,” how it would meet them, and how it would measure its success in doing so. And it came with a not-so-veiled threat: if you don't come up with an assessment plan, we'll come up with one for you.

The next step in this utilitarian, quality-control notion of education, of course, was to make sure that these "outcomes," elaborately measured and assessed, were used to reform -- or more precisely, to penalize -- the educators and pupils who failed to meet the chosen benchmarks. It's a notion so forward-looking that it takes one right back to the 1860's, when Lord Palmerston's government promulgated regulations stipulating that each pupil who failed an exam on a prescribed subject would cost the school 2s. 8d. in the next year's funds. As the eminent Victorianist Richard Altick describes it, "a new premium was put upon rote memory, for throughout the year every effort was bent toward grinding into the child the sentences or the facts the inspector might demand of him." Bradley Headstone, Dickens's dark schoolmaster in Our Mutual Friend, could hardly have been more pleased.

The word "education" comes from the Latin ex-ducere, “to lead forth" and shares a root with “Duke.”  But it can also mean “to draw forth,” and shares this sense with ductile, meaning stretchable or pliable, as with ducts, and duct tape.  On the difference between these two senses a whole worldview depends: if to educate is to lead, to command, to instill by and with authority, then doubtless it makes sense to see how good one’s pupils are at following instructions, mastering principles, and learning protocols.  But if it means more to “draw forth” -- and this is the shade of its meaning I would emphasize -- then it is not a matter of commanding, but of encouraging students to stretch and bend their minds, making pliable the rigid postures into which authority, and life, have pressed us.   The first sense can be measured, up to a point: I’m sure many baby-boomers such as myself remember the 20-item quiz, which began with with “read all questions before you start” and ended with item 20 being “put down your pencil and don’t answer any of the questions.”  But mental ductility, unlike that of physical materials, is almost impossible to quantify, since some part of it necessarily involves breaking the rules, thinking outside of one’s ideological confines, and questioning presuppositions.

There has been, I will admit, an attempt to take cognisance of this vital quality -- the phrase that is usually used is “critical thinking.”  English professors such as myself generally like this phrase; to us it suggests a complex array of thought and capability.  But what is it, and how might we measure it? One current test defines it as “analyzing problems, generating logical and reasonable approaches to solve and implement solutions, reflecting consistent value orientations” (CTAB). Another assessment rubric recently discussed in my department speaks cryptically of “breaking down informational materials,” “using previously learned information in new and concrete situations” and “creatively or divergently applying prior knowledge.” Such definitions offer little encouragement; to me, they sound more like a plan for recycling than a definition of innovative or genuine analysis. But it matters not: in essence, all these plans are, at their foundations, anti-intellectual assaults on genuine learning.  For true learning is not a plan, not a fixed process, and very rarely a readily-measurable thing. As Albert Einstein -- a famously slow learner -- observed, "It is nothing short of a miracle that modern methods of instruction have not yet entirely strangled the holy curiosity of inquiry. For this delicate little plant, aside from stimulation, stands mainly in need of freedom."

I've been teaching at the college level since 1986 -- twenty-nine years and counting. If this new regime of supposed "assessment" has its day, then higher education as we've known it will soon be coming to a close, to be replaced by a utilitarian yardstick that knows no value other than, or beyond, mere functionality.  Would the last person in the university please turn out the lights?

Tuesday, September 24, 2013

I'm J.K. Rowling ... and so's my wife!

It's an infamous scene in Monty Python's Life of Brian (and a sly parody of Kubrick's Spartacus): a centurion arrives at the place of crucifixion with orders to release "Brian" -- but he has just one problem: which one of these poor sods hanging from crosses is Brian? He takes the practical route: "Where's Brian of Nazareth?" he calls out, "I have an order for his release!" And then, one by one, everyone (except of course the real Brian of Nazareth) starts calling out "I'm Brian!" "No, I'm Brian!" -- after which one especially bold fellow calls out "I'm Brian! And so's my wife!"

Needless to add, the real Brian isn't rescued. And that's how I felt about the recent brouhaha over J.K. Rowling's try at a pseudonymous other authorial life as "Robert Galbraith." It's certainly her right to have given it a try -- if I were as well-known for one series of books as she is, I can imagine wanting to escape and re-invent myself. And, as she explained when the whole thing was uncovered, she'd done it very discreetly -- so much so that The Cuckoo's Calling was given the usual treatment accorded first-time novelists whose book hasn't been singled out for a big campaign (that would be most of them): some review copies were sent out, the proper ISBNs and ASINs were sent to major retailers, along with a few copies of the book. It garnered some good reviews, too -- but, just like others of its seeming kind, it sold around 1,000 copies. Tell me about it -- I've been there.

Which is perfectly fine, I suppose, except for what happened once Rowling's authorship was revealed -- the book shot to #1 on the bestseller lists, and the publishers hastened to print the hundreds of thousands of copies they now knew it would sell. As James Stewart commented in the New York Times, it was not just a depressing sign of how little effort publishers put into promoting most new novels, but of how difficult it is to promote a book at all. One can Tweet, and blog, and Tumble all one wants; one can give readings to as many near-empty bookstores as one can stand; one can whisper into as many grapevines as one wants -- but there's no way to make sure a new book, however good it may be, escapes being swept away in a greyish fog of indifference. In one especially sad consequence of the success of the Harry Potter books, Bloomsbury -- which went from tiny publisher to UK giant on the sales of Rowling's books -- no longer even has a slush-pile, which was where the first Potter book's manuscript was rescued from obscurity.

But maybe there is a way. After all, we don't know whether this is Rowling's first outing in disguise. She might well have written others, and who knows under how many names. In fact, it seems to me that she might possibly have written my novel, and perhaps those of other lesser-known writers as well. How could one prove otherwise, in an age when denial is the strongest sign of the truth of what's denied?

So I'll say it now: I'm not Russell Potter (wasn't that name a bit of a give-away?) -- I'm actually J.K. Rowling.

And I'd encourage every other writer I know to say the same thing. Go ahead, prove us wrong! Conduct a computer analysis of our writing habits, track down the falsified contracts, call the publishers' representatives.  In the meantime, while all that's going on, we'll be enjoying selling more books in a day than we have in the past five years.

But seriously: I feel for J.K. Rowling. It's been harder for her to publish something under a pseudonym than it was for Prince Hal to pass unnoticed among his troops at Agincourt. But if she really wants to earn some respect from the actual "Robert Galbraiths" of the world, she should tell her publishers to re-open that slush pile. Heck, she might try reading a few manuscripts herself.

Saturday, September 7, 2013

Breaking Dad


What would my father, a chemist with frustrations of his own, have thought of Walter White?

Like countless others over the past few years, my family and I have been dealing with a drug habit: an addiction to Breaking Bad, the AMC television series about Walter White, a former high school chemistry teacher who takes up a second career cooking crystal meth after he’s diagnosed with cancer. We’ve often debated exactly why we find the show so compelling -- of course there are the great performances by Bryan Cranston as White and Aaron Paul as his partner (and former student) Jesse Pinkman, the inventive camera work, and writing that’s as clear and sharp as a tray of new-cooked product.  For me, at any rate, what really drew me in at the start, and has kept me watching ever since, was the way White’s backstory -- a chemistry genius who had a shot at making millions in a tech startup company, but ended up teaching a roomful of bored high school students -- meshed with his new persona as a dangerous man, a man who can put a bullet through a drug-dealer’s head if he has to. Before his diagnosis, Walt was a man who’d had to swallow his pride, and whittle away his gifts in a job that had become deeply unfulfilling.  After his diagnosis, though his body is wracked with cancer, he regains his pride by deliberately, willfully breaking the law in order to provide for his family -- all the while making it a point of pride that his crystal is chemically superior to anything else on the market.

My father grew up in the farm country of western Washington State, picking strawberries and working in a pea cannery to help his family make ends meet.  His boyhood hero was Thomas Edison, and he dreamt of a career as an inventor, once alarming his parents when his attempt to build a galvanic cell set fire to the corner of the chicken coop. After graduating from Washington State, he got his Ph.D. in physical chemistry from Purdue, and went to work for the one company with the most direct tie to Edison’s genius, General Electric.  At their lighting division “campus” at NELA Park in Cleveland, he too cooked crystals -- crystals of gallium arsenide -- in enormous blast furnaces, then passed electrical current through them to make some of the world’s first multi-colored LED lights.  And, although his bookshelves at home were laden with the paperweight-size awards that GE gave out to its top inventors, the company -- whose profits increasingly came from its financial-services division -- downsized his laboratory again and again, eventually closing down the entire research department and outsourcing it to a subcontractor who hired Ph.D.’s from Hungary at a tenth of my dad’s old salary.  I don’t think he ever forgave GE, and though he tried teaching chemistry for a while, quit in disgust at the low level of motivation among his students.  And then, as with Walt, there came an illness -- not cancer in his case, but Parkinson’s disease, which he came to believe might have been triggered by some of the chemicals he’d been exposed to during his years in the lab; in 2004, he died of the disease and its complications.

He certainly had plenty of reasons to feel resentful.  But would my dad have sympathized with Walter White?

At first, I didn’t think so. My dad was the kind of guy who would walk out of a movie theatre if there was a scene depicting an adulterous relationship; he was as honest as a boy scout and faithful as an old dog.  He never lied on his taxes or went over the speed limit, except once a year when he wanted to “clean the carbon” off the spark-plugs of his blue Oldsmobile Delta 88. But he was proud of his chemical knowledge.  A walk through the woods was an occasion for an explanation of osmosis, the process through which the sap ascended to the branches; when I was a kid, he would bring home old beakers (no round-bottom boiling flasks, alas) and once gave me a chunk of metallic sodium so that I could throw it in a pond and watch it explode.  If there was a mistake in chemistry on a science show -- or even in a science-fiction movie -- he’d write a strongly worded letter to the producers.  And he expected a reply, too.

But lawbreaking?  Deliberately making something that made other people addicted, and sick? And, when it seemed necessary, killing those who threatened him or stood in his way? I couldn’t imagine it. And yet, when I saw those duffel-bags full of rolled $100 bills in the early episodes, I thought to myself: wouldn’t my dad have felt satisfied if, after being forced into early retirement by GE, he’d been able to earn that kind of money?  If he could finally have bought that fishing shack up in the Cascade mountains where he’d hiked as a boy, gotten a new car, or even started his own lab to make something better and more valuable than he ever had for GE? Wouldn’t he be rooting for Walter White?

My Dad believed in good guys and bad guys; back in the days of old-time radio, his moral sense was honed by the Lone Ranger and the Cisco Kid.  He liked a good adventure story, and got a kick out of retro-styled films like Indiana Jones.  And he was a very emotional viewer, although you could only see that river when it spilled over its banks.  He could be a little unpredictable, but there was never any doubt as to where his sentiments lay. I remember watching the climactic scene of James Cameron’s Titanic with him -- where the elderly Rose drops the “Heart of the Ocean” off the bow of the ship -- he sat through it all as stone-faced as Buster Keaton, but when Tom Hanks lost “Wilson” in Cast Away he wept profusely.

But he also had a powerful sense of justice.  Back when he was a student at Mount Vernon Junior College, he’d submitted a science essay for a contest.  The science teacher disqualified the paper, citing as a mistake a formula my dad knew was absolutely correct.  Dad wrote an angry letter to the school’s principal, who refused to overrule the teacher. Years later, when he heard through a friend that his old science teacher had died, he gave a grunt of satisfaction -- “served the bastard right,” he declared -- this from a man who almost never swore.

And, although he had a comfortable middle-class life -- more comfortable than Walt’s -- my dad had his share of money troubles later in life.  When my mom suffered a stroke, and his General Electric health policy refused to pay because he hadn’t contacted them for permission before she was admitted to the hospital, Dad went ballistic.  He picked up the phone and argued his case for weeks, all the way up to the top -- Jack Welch, a former research chemist himself, was head of GE then -- and got them to change their mind.  He was proud of that.

But what if GE hadn’t come through with the money for my mom’s care?  What if my dad found he couldn’t provide for his family, despite all his years of hard work?  And what if he knew that his knowledge of chemistry could cover all his family’s bills -- would he have used it?

I doubt he would have ever considered the path of Walter White, but I have a feeling that he’d have sympathized with him all the same.  I know he’d have been pleased when the show got its chemistry right, and critical when they missed the mark.  “Mercury fulminate!  That’s much too unstable!”  I can almost hear him saying those words, along with a number of explanations of other chemicals mentioned in the show. So I suspect that he would have been just as closely glued to his screen as I am now -- even given Walter White’s increasing lies and deceits -- and I think he’d have been quietly rooting for him. In many ways, he was Walt --  he too had to learn to swallow his pride, to defer his dreams and waste time trying to justify his research to bosses who scarcely understood it.  He’d looked out over a lecture hall full of community college students as they chatted, or dozed, instead of listening to his impassioned explanation of the carbon-hydrogen bond.

The anger in Walt’s voice reminds me of my Dad’s anger, his frustration.  The way Walt uses chemistry in every one of his plans and devices reminds me of the way my Dad would use it to solve every household problem, from cleaning a stain on the carpet with hydrogen peroxide to graphing the temperature curve of the Thanksgiving turkey to tell when it would be done. There’s a purity about chemistry, a symmetry, a predictable architecture -- that actual life often seems to lack.  I think that’s where Walt’s rage comes from -- and it’s a rage he and my father shared. Late in his life, his mind drifting in the mists of Parkinson’s-related dementia, the anger was the one thing he could hold onto.  He raged, raged, against the dying of the light.

And in this dark and unpredictable world, where justice is so seldom really served, this is a rage we all can recognize.  Maybe that’s why we’re addicted to Breaking Bad.
NB: This is a repost from a year ago, on the occasion of the final installments of BB.