Tuesday, March 4, 2014

The Textbook Rental Scam

Ever since there have been colleges and universities, there have been textbooks. In the earliest days -- and by earliest, I mean the twelfth century -- manuscript books were far too costly for students to own; they had to be studied at the college's library. But early on, "school texts," which were often abridged versions of classic texts, were made available, along with anthologies that collected excerpts from the more commonly-used texts; the early era of print saw quite a few of these (as in the one shown above, published in Italy in the fifteenth century by Pellisson). These, too, were rather too expensive for most students to purchase outright, and so was born the idea of renting them.

Here in the United States, it became more common for students to purchase their books, usually through a college bookstore; at the end of the term these could be sold back for some portion of the original cost. This could be thought of, of course, as rental by another name -- the difference between the original cost and the buyback amount being the 'rent' -- but at least it gave students something tangible, something they could even (if they wished to) keep. Publishing college texts was an unglamorous business, and in the humanities at least college texts tended to be plain, durable, and drab.

Then came the publishing-merger boom of the 1990's. As other kinds of sales became less predictable, the idea of a captive audience of students who would have to buy a book became a vital asset, and textbook publishers consolidated even more than trade publishers. In the end, just a few giant publishers -- Pearson being the largest -- ended up controlling most of the market (the list of publishers gobbled up by Pearson is a who's who of the old textbook universe: Scott Foresman, Penguin, Longman, Allyn & Bacon, Addison Wesley, and many more). Controlling as much of the market as it did, Pearson could set its prices as high as it liked -- by one account, textbook prices in the past 20 years have gone up 812%.

What was a student to do? One could try to find a used copy, and until a few years ago the friendly folks at Off Campus Books would often have one available for 20% or 30% less; when buyback time came, they'd also pay a bit more. Many college towns had such stores. But publishing giants like Pearson didn't like this secondary market, and went to extraordinary lengths to undercut it: they issued "new" editions constantly (which had the effect of making the old ones worthless) even when all they'd done was change the page numbers; they included online material with an access code that would be useless for anyone but the original purchaser; they bundled Blackboard support with books in a similar way, so that the book itself would be insufficient. And their strategy worked; Off Campus Books went out of business, as did many similar stores.

French or Spanish texts for $150? Science textbooks for $250? Business textbooks for $450? When a book with the same number of pages and binding sells in the trade-books world for no more than $40? And that's only the half of it; some of the same publishers also market reference books to libraries (another seemingly captive market) for as much as $1,000 a volume.

Who can afford such books? Almost no one. And so, textbook rental has come back, with a vengeance. Amazon, Barnes & Noble, and other online services offer them, and many college bookstores have arrangements with book publishers that allow for rental of a physical book through the store's own system. But beware: often, the rental price is not all that much lower than the cover price (and surely more than a used copy would be) -- and, if you forget to return your book on time, your bank account may be dinged for the full retail cost of the book PLUS late fees. And, even if you DO return the book, it's possible that Amazon (or whoever else it may be) will decide that it's "no longer in acceptable rental condition." If that's the case, they charge you the full price, and send the book back to you.

You'd think that, in the same age when Netflix and other no-late-fee subscription services put Blockbuster and Hollywood Video out of business, high initial costs and huge late fees would be a terrible business model. But students are a captive audience, and many of them don't read the fine print when they sign up to "rent" a book and save a few dollars. Some students have lobbied for regulating textbook prices, and in response, there's now a Federal regulation -- but it doesn't limit book prices. It just mandates that professors announce their textbooks further in advance, giving students "time" to search for better bargains. Good luck with that.

Thursday, February 13, 2014


Throughout the latter part of the twentieth century, there was one common understanding, in the United States at least: a college education is a good thing. Returning soldiers took advantage of the original GI Bill, and their children took advantage of what was, for a time, a very affordable college education (at least at public colleges and universities). One result was the greatest economic boom in the nation's history, but another was its most educated public. It would be hard for anyone back in the 1980s (to pick a decade) to imagine a time when college attendance wasn't a reachable ideal and an unquestioned good for anyone who could manage it.

Today, college education is questioned on every front; tuition has risen dramatically, and pundits -- who take for granted that the best way to measure the value of something is in the increased earning power it confers -- declare that a useful trade-school education would be a much better deal. We're told that we have to let go of the idea that college is for everyone, as pressure mounts on public institutions (public in name only, as decades of cuts by state legislatures have meant that most state colleges and universities receive less than 1/3 of their funding from the state). Even those who talk up the importance of college are insisting on "accountability," which means endless rounds of assessments and measurements of "outcomes." Even the supposedly liberal President Obama and his secretary of education, Arne Duncan, wax ecstatic over their plans for cradle-to-grave tracking of the correlation between education and earnings.

But in the midst of this neo-utilitarian fervor, something has been forgotten. As Mark Thomason wrote in a comment on a recent New York Times editorial,
I send my kids to college as a growth experience. It changes them, in good ways. I hope they do well financially, but I am not sending them to a trade school, I'm sending them to complete their education and complete growing up. It did me a lot of good, and it is doing them a lot of good.
The only difficulty is that this good -- which I agree is the most important aspect of a college education -- is very difficult to quantify. It doesn't necessarily lead to higher earnings; those who are inspired by their college experience to undertake creative careers in the arts, or work for a better society, often find they're earning a great deal less. But their personal satisfaction, and the benefits of their labors to the world, though not necessarily tangible or measurable, are certainly vital.

All this puts me in mind of a much earlier moment when a large number of institutions whose work was universally understood as contributing to the greater good came under government pressure to prove that worth. Secured in beautiful gothic buildings, supplied with dormitories for their residents, dining  halls for their meals, and chapels for their worship, the inhabitants of these institutions little dreamt that, in a few short years, their buildings would be reduced to roofless rubble -- and by their own king. Thomas Cromwell (yes, that one -- whose villainy is deliciously revisited in Hilary Mantel's recent novels) sent out "visitors" to investigate the practices at these places, and the reports that came back were damning: the people there were not following their own rules, they were living lives of suspicious comfort, and worse yet -- by providing spiritual services to their neighbors, they were perpetuating false superstitions and scurrilous beliefs.

The king, Henry VIII, used these reports as the justification for what would, earlier in his own reign, have been an unthinkable act. He, with the collusion of Parliament and a few strokes of a pen, dissolved all the monasteries of England, seized their buildings and properties, and sent troops to round up the valuables. The golden altar furniture and books were added to the King's own collections, while the buildings and land were given to his political allies as their personal estates. The monks and nuns were supposed to receive pensions, but there seems to be no record of many, perhaps any, actually collecting them. What the King's men left behind, local villagers pillaged, finding new uses for door-hinges and window glass. The lead that had protected the roofs was removed and hauled away (it being a valuable metal at the time), and before long, the unprotected beams and boards rotted and collapsed.

To add to the irony, some of the funds raised by these closures were used to fund colleges and endowments, including what later became Christ Church at Oxford.

I know it sounds crazy --  how could such a thing happen today? But I think it's a cautionary tale, especially for a population which seems unable to resist the siren-song of mere utilitarian value.

Wednesday, February 5, 2014

The Assessment Craze

In the past decade, the movement for education “assessment” has reached a fever pitch, moving from public elementary and high schools into higher education, at least at most state colleges and universities. This has been done, ostensibly, in response to a public demand for “accountability,” another loaded word, and this demand has been translated into the most absolute of terms. At my own college, every department and program was, in essence, ordered to develop an assessment plan. The buzzwords of these plans – rubrics, targets, educational outcomes, and so forth – came along with the orders, though each department was nominally free to decide what it would describe as its “outcomes,” how it would meet them, and how it would measure its success in doing so. And it all came with a not-so-veiled threat: if you don't come up with an assessment plan, we'll come up with one for you.

The next step in this utilitarian, quality-control notion of education, of course, was to make sure that these "outcomes," elaborately measured and assessed, were used to reform -- or more precisely, to penalize -- the educators and pupils who failed to meet the chosen benchmarks. It's a notion so forward-looking that it takes one right back to the 1860's, when Lord Palmerston's government promulgated regulations stipulating that each pupil who failed an examination in a prescribed subject would cost the school 2s. 8d. in the next year's funds. As the eminent Victorianist Richard Altick describes it, "a new premium was put upon rote memory, for throughout the year every effort was bent toward grinding into the child the sentences or the facts the inspector might demand of him." Bradley Headstone, Dickens's dark schoolmaster in Our Mutual Friend, could hardly have been more pleased.

The word "education" comes from the Latin ex-ducere, “to lead forth," and shares a root with “Duke.” But it can also mean “to draw forth,” and shares this sense with ductile, meaning stretchable or pliable, as with ducts, and duct tape. On the difference between these two senses a whole worldview depends: if to educate is to lead, to command, to instill by and with authority, then doubtless it makes sense to see how good one’s pupils are at following instructions, mastering principles, and learning protocols. But if it means more to “draw forth” -- and this is the shade of its meaning I would emphasize -- then it is not a matter of commanding, but of encouraging students to stretch and bend their minds, making pliable the rigid postures into which authority, and life, have pressed us. The first sense can be measured, up to a point: I’m sure many baby-boomers such as myself remember the 20-item quiz, which began with “read all questions before you start” and ended with item 20 being “put down your pencil and don’t answer any of the questions.” But mental ductility, unlike that of physical materials, is almost impossible to quantify, since some part of it necessarily involves breaking the rules, thinking outside of one’s ideological confines, and questioning presuppositions.

There has been, I will admit, an attempt to take cognisance of this vital quality -- the phrase that is usually used is “critical thinking.” English professors such as myself generally like this phrase; to us it suggests a complex array of thought and capability. But what is it, and how might we measure it? One current test defines it as “analyzing problems, generating logical and reasonable approaches to solve and implement solutions, reflecting consistent value orientations” (CTAB). Another assessment rubric recently discussed in my department speaks cryptically of “breaking down informational materials,” “using previously learned information in new and concrete situations” and “creatively or divergently applying prior knowledge.” Such definitions offer little encouragement; to me, they sound more like a plan for recycling than a definition of innovative or genuine analysis. But it matters not: in essence, all these plans are, at their foundations, anti-intellectual assaults on genuine learning. For true learning is not a plan, not a fixed process, and very rarely a readily-measurable thing. As Albert Einstein -- a famously slow learner -- observed, "It is nothing short of a miracle that modern methods of instruction have not yet entirely strangled the holy curiosity of inquiry. For this delicate little plant, aside from stimulation, stands mainly in need of freedom."

I've been teaching at the college level since 1986 -- twenty-seven years and counting. If this new regime of supposed "assessment" has its day, then higher education as we've known it will soon be coming to a close, to be replaced by a utilitarian yardstick that knows no value other than, or beyond, mere functionality.  Would the last person in the university please turn out the lights?

Tuesday, September 24, 2013

I'm J.K. Rowling ... and so's my wife!

It's an infamous scene in Monty Python's Life of Brian (and a sly parody of Kubrick's Spartacus): a centurion arrives at the place of crucifixion with orders to release "Brian" -- but he has just one problem: which of these poor sods hanging from crosses is Brian? He takes the practical route: "Where's Brian of Nazareth?" he calls out, "I have an order for his release!" And then, one by one, everyone (except of course the real Brian of Nazareth) starts calling out "I'm Brian!" "No, I'm Brian!" -- after which one especially bold fellow calls out "I'm Brian! And so's my wife!"

Needless to add, the real Brian isn't rescued. And that's how I felt about the recent brouhaha over J.K. Rowling's try at a pseudonymous other authorial life as "Robert Galbraith." It's certainly her right to have given it a try -- if I were as well-known for one series of books as she is, I can imagine wanting to escape and re-invent myself. And, as she explained when the whole thing was uncovered, she'd done it very discreetly -- so much so that The Cuckoo's Calling was given the usual treatment accorded first-time novelists whose book hasn't been singled out for a big campaign (that would be most of them): some review copies were sent out, the proper ISBNs and ASINs were sent to major retailers, along with a few copies of the book. It garnered some good reviews, too -- but, just like others of its seeming kind, it sold around 1,000 copies. Tell me about it -- I've been there.

Which is perfectly fine, I suppose, except for what happened once Rowling's authorship was revealed -- the book shot to #1 on the bestseller lists, and the publishers hastened to print the hundreds of thousands of copies they now knew it would sell. As James Stewart commented in the New York Times, it was not just a depressing sign of how little effort publishers put into promoting most new novels, but of how difficult it is to promote a book at all. One can Tweet, and blog, and Tumble all one wants; one can give readings to as many near-empty bookstores as one can stand; one can whisper into as many grapevines as one wants -- but there's no way to make sure a new book, however good it may be, escapes being swept away in a greyish fog of indifference. In one especially sad consequence of the success of the Harry Potter books, Bloomsbury -- which went from tiny publisher to UK giant on the sales of Rowling's books -- no longer even has a slush-pile, which was where the first book's manuscript was rescued from obscurity.

But maybe there is a way. After all, we don't know whether this is Rowling's first outing in disguise. She might well have written others, and who knows under how many names. In fact, it seems to me that she might possibly have written my novel, and perhaps those of other lesser-known writers as well. How could one prove otherwise, in an age when denial is the strongest sign of the truth of what's denied?

So I'll say it now: I'm not Russell Potter (wasn't that name a bit of a give-away?) -- I'm actually  J.K. Rowling.

And I'd encourage every other writer I know to say the same thing. Go ahead, prove us wrong! Conduct a computer analysis of our writing habits, track down the falsified contracts, call the publishers' representatives.  In the meantime, while all that's going on, we'll be enjoying selling more books in a day than we have in the past five years.

But seriously: I feel for J.K. Rowling. It's been harder for her to publish something under a pseudonym than it was for Prince Hal to pass unnoticed among his troops at Agincourt. But if she really wants to earn some respect from the actual "Robert Galbraiths" of the world, she should tell her publishers to re-open that slush pile. Heck, she might try reading a few manuscripts herself.

Saturday, September 7, 2013

Breaking Dad

What would my father, a chemist with frustrations of his own, have thought of Walter White?

Like countless others over the past few years, my family and I have been dealing with a drug habit: an addiction to Breaking Bad, the AMC television series about Walter White, a former high school chemistry teacher who takes up a second career cooking crystal meth after he’s diagnosed with cancer. We’ve often debated exactly why we find the show so compelling -- of course there are the great performances by Bryan Cranston as White and Aaron Paul as his partner (and former student) Jesse Pinkman, the inventive camera work, and writing that’s as clear and sharp as a tray of new-cooked product.  For me, at any rate, what really drew me in at the start, and has kept me watching ever since, was the way White’s backstory -- a chemistry genius who had a shot at making millions in a tech startup company, but ended up teaching a roomful of bored high school students -- meshed with his new persona as a dangerous man, a man who can put a bullet through a drug-dealer’s head if he has to. Before his diagnosis, Walt was a man who’d had to swallow his pride, and whittle away his gifts in a job that had become deeply unfulfilling.  After his diagnosis, though his body is wracked with cancer, he regains his pride by deliberately, willfully breaking the law in order to provide for his family -- all the while making it a point of pride that his crystal is chemically superior to anything else on the market.

My father grew up in the farm country of western Washington State, picking strawberries and working in a pea cannery to help his family make ends meet.  His boyhood hero was Thomas Edison, and he dreamt of a career as an inventor, once alarming his parents when his attempt to build a galvanic cell set fire to the corner of the chicken coop. After graduating from Washington State, he got his Ph.D. in physical chemistry from Purdue, and went to work for the one company with the most direct tie to Edison’s genius, General Electric.  At their lighting division “campus” at NELA Park in Cleveland, he too cooked crystals -- crystals of gallium arsenide -- in enormous blast furnaces, then passed electrical current through them to make some of the world’s first multi-colored LED lights.  And, although his bookshelves at home were laden with the paperweight-size awards that GE gave out to its top inventors, the company -- whose profits increasingly came from its financial-services division -- downsized his laboratory again and again, eventually closing down the entire research department and outsourcing it to a subcontractor who hired Ph.D.’s from Hungary at a tenth of my dad’s old salary.  I don’t think he ever forgave GE, and though he tried teaching chemistry for a while, quit in disgust at the low level of motivation among his students.  And then, as with Walt, there came an illness -- not cancer in his case, but Parkinson’s disease, which he came to believe might have been triggered by some of the chemicals he’d been exposed to during his years in the lab; in 2004, he died of the disease and its complications.

He certainly had plenty of reasons to feel resentful.  But would my dad have sympathized with Walter White?

At first, I didn’t think so. My dad was the kind of guy who would walk out of a movie theatre if there was a scene depicting an adulterous relationship; he was as honest as a boy scout and faithful as an old dog.  He never lied on his taxes or went over the speed limit, except once a year when he wanted to “clean the carbon” off the spark-plugs of his blue Oldsmobile Delta 88. But he was proud of his chemical knowledge.  A walk through the woods was an occasion for an explanation of osmosis, the process through which the sap ascended to the branches; when I was a kid, he would bring home old beakers (no round-bottom boiling flasks, alas) and once gave me a chunk of metallic sodium so that I could throw it in a pond and watch it explode.  If there was a mistake in chemistry on a science show -- or even in a science-fiction movie -- he’d write a strongly worded letter to the producers.  And he expected a reply, too.

But lawbreaking?  Deliberately making something that made other people addicted, and sick? And, when it seemed necessary, killing those who threatened him or stood in his way? I couldn’t imagine it. And yet, when I saw those duffel-bags full of rolled $100 bills in the early episodes, I thought to myself: wouldn’t my dad have felt satisfied if, after being forced into early retirement by GE, he’d been able to earn that kind of money?  If he could finally have bought that fishing shack up in the Cascade mountains where he’d hiked as a boy, gotten a new car, or even started his own lab to make something better and more valuable than he ever had for GE? Wouldn’t he be rooting for Walter White?

My Dad believed in good guys and bad guys; back in the days of old-time radio, his moral sense was honed by the Lone Ranger and the Cisco Kid. He liked a good adventure story, and got a kick out of retro-styled films like Indiana Jones. And he was a very emotional viewer, although you could only see that river when it spilled over its banks. He could be a little unpredictable, but there was never any doubt as to where his sentiments lay. I remember watching the climactic scene of James Cameron’s Titanic with him -- the one where the elderly Rose drops the “Heart of the Ocean” off the bow of the ship -- he sat through it all as stone-faced as Buster Keaton, but when Tom Hanks lost “Wilson” in Cast Away he wept profusely.

But he also had a powerful sense of justice.  Back when he was a student at Mount Vernon Junior College, he’d submitted a science essay for a contest.  The science teacher disqualified the paper, citing as a mistake a formula my dad knew was absolutely correct.  Dad wrote an angry letter to the school’s principal, who refused to overrule the teacher. Years later, when he heard through a friend that his old science teacher had died, he gave a grunt of satisfaction -- “served the bastard right,” he declared -- this from a man who almost never swore.

And, although he had a comfortable middle-class life -- more comfortable than Walt’s -- my dad had his share of money troubles later in life.  When my mom suffered a stroke, and his General Electric health policy refused to pay because he hadn’t contacted them for permission before she was admitted to the hospital, Dad went ballistic.  He  picked up the phone and argued his case for weeks, all the way up to the top -- Jack Welch, a former research chemist himself, was head of GE then -- and got them to change their mind.  He was proud of that.

But what if GE hadn’t come through with the money for my mom’s care?  What if my dad found he couldn’t provide for his family, despite all his years of hard work?  And what if he knew that his knowledge of chemistry could cover all his family’s bills -- would he have used it?

I doubt he would have ever considered the path of Walter White, but I have a feeling that he’d have sympathized with him all the same.  I know he’d have been pleased when the show got its chemistry right, and critical when they missed the mark.  “Mercury fulminate!  That’s much too unstable!”  I can almost hear him saying those words, along with a number of explanations of other chemicals mentioned in the show. So I suspect that he would have been just as closely glued to his screen as I am now -- even given Walter White’s increasing lies and deceits -- and I think he’d have been quietly rooting for him. In many ways, he was Walt --  he too had to learn to swallow his pride, to defer his dreams and waste time trying to justify his research to bosses who scarcely understood it.  He’d looked out over a lecture hall full of community college students as they chatted, or dozed, instead of listening to his impassioned explanation of the carbon-hydrogen bond.

The anger in Walt’s voice reminds me of my Dad’s anger, his frustration.  The way Walt uses chemistry in every one of his plans and devices reminds me of the way my Dad would use it to solve every household problem, from cleaning a stain on the carpet with hydrogen peroxide to graphing the temperature curve of the Thanksgiving turkey to tell when it would be done. There’s a purity about chemistry, a symmetry, a predictable architecture -- that actual life often seems to lack.  I think that’s where Walt’s rage comes from -- and it’s a rage he and my father shared. Late in his life, his mind drifting in the mists of Parkinson’s-related dementia, the anger was the one thing he could hold onto.  He raged, raged, against the dying of the light.

And in this dark and unpredictable world, where justice is so seldom really served, this is a rage we all can recognize.  Maybe that’s why we’re addicted to Breaking Bad.
NB: This is a repost from a year ago, on the occasion of the final installments of BB.

Thursday, August 1, 2013

Literature is bad for you

With attacks on the humanities -- and on my field of English in particular -- coming from every direction, I feel that it's time to consider some new and perhaps radical strategies to promote the reading of literature in these attention-span-dwindling days. What was the last straw? Was it when a student in one of my classes condemned Edgar Allan Poe's "A Descent into the Maelstrom" as "too detailed"? Was it when a student in the midst of a discussion of Shirley Jackson's The Haunting of Hill House asked whether its author was mentally ill? Or perhaps it was the day that someone asked whether, since they'd already read Hamlet in eleventh grade, they'd have to read it "all over again" in my class?

We who teach literature labor under a false premise: that if a teacher or college professor tells young people that a particular book or author is "good" for them, that they'll a) take our word for it; and b) read it. But what if our saying that it's "good" is precisely the problem? After all, none of the propaganda about the virtues of spinach, whole wheat bread, or tofu has made any of those foods more popular with teenagers. And what sort of honor is it, anyway, to have one's work declared to be a "classic"? I think with sorrow on the moment when Groucho Marx, who was having dinner with the poet T.S. Eliot, boasted that his daughter was studying "The Waste Land" at Beverly High. "I'm sorry to hear that," the great poet had replied, "I have no wish to become compulsory reading."

And herein lies the dark essence of my plan: instead of saying that reading literature is good for you, I think we should start telling people it's bad for them. After all, it has some serious side-effects: for a time, at least, the reader believes in, and worries about, the travails of completely non-existent people. Their hearts beat faster, they break out in a sweat, they turn the pages feverishly -- all in the quest to discover the fate of a woman or a man who has never lived. No less a light than St. Augustine condemned Virgil's Æneid for this reason: why should he be made to weep for Dido, a woman who was little more than a chimera created by a whiff of words, when his own immortal soul, not yet confessed unto Christ, was in so much greater and more profound peril?

The Irish novelist Flann O'Brien put it succinctly: a novel, like a drug, is self-administered in private, and its key effect is to convince the user of the reality of nonexistent beings. So why not regulate, or better yet, ban it, as we do other hallucinogenic drugs? Asking kids to read novels over their summer vacations is little better than popping LSD in their lunch-bags: the results will be much the same. Warning labels, at least, seem to be in order. Searches of backpacks should be conducted at every school, and every one of these illusionary text-drugs confiscated. Libraries? Let them, like those of Dickens's Professor Gradgrind, contain nothing but facts. A book which consists of facts, after all, does not deceive its readers into believing in the reality of imaginary people or places. Teaching literature? Eliminated. Let those who still wish to read novels be obliged to purchase them illegally, on street-corners, and hide them within paper bags until they reach the safety of their homes. And as for e-books, well -- aren't e-cigarettes, as much as the older, match-lit variety, little more than drug delivery vehicles? Fortunately, amazon.com can easily delete all fiction from the Kindle readers of offenders, refunding the cost so that users can more wisely spend their funds on non-fiction and works of reference.

And then, finally, I think you'd get the next generation interested in reading again. The thrill of the forbidden, the discovery of books as contraband, and the risk of arrest would make books cool. People would brag about scoring a gram of Poe down on the corner, or dropping a little Shakespeare in some dark alley. Bootleg editions of Woolf and Cather would be printed with plain brown covers, and once more, you'd have to smuggle copies of Joyce's Ulysses across the border inside boxes labelled "sanitary towels." Underground book groups would form, and meet in secret, shifting locations, the address sent out via encrypted e-mails. Oh, sure, there'd be some places -- Amsterdam, I suppose -- that would tolerate fiction, setting up "reader parks" where you could openly turn the pages of Kerouac or Kesey. But we'd all know that would never work; fiction is just a gateway drug, and the only solution is one of zero tolerance. For, as the town manager in Terry Gilliam's Munchausen notes, we can't have people escaping at a time like this.

Thursday, July 11, 2013

What's a Book?

The late great Maurice Sendak, irascible and sharp as ever in his last interviews, had this to say of e-books: "I hate them. It's like making believe there's another kind of sex. There isn't another kind of sex. There isn't another kind of book! A book is a book is a book." It's a hard quote to improve on, but it's also worth considering, really: what is a book? And is an e-book really a book at all?

I'd say that, to be a book, whether material or virtual, there are a few basic qualifications -- I can think of six off the top of my head:

• A book must contain readable text.
• It must be portable -- the ability to take a book anywhere is one of its key strengths.
• The text must be persistent -- that is, it should still be there if you go away and come back again later.
• You should be able to do what you want with it: store it, loan it, give it away, bequeath it, and (yes) destroy it if you have a mind to.
• It should be able to be annotated, written in, drawn in, dog-eared or place-marked. Call it "interactivity" if you like.
• It shouldn't vanish unexpectedly. And, if undisturbed, it should last for years.

So is a typical e-book a book by these measures? In most cases, no. It meets the first two criteria, yes -- but is it persistent? Some e-books lent by libraries expire after a certain date and can no longer be read; some e-books can only be read in certain places (as with Barnes & Noble's 'share-in-store' café feature) -- that's not real persistence. The fourth qualification, though, is the biggest stumbling block: almost no commercial e-book format allows lending or giving of any kind. If, in a lifetime, you amass a library of physical books on which you spend tens of thousands of dollars, you can give it to a friend, leave it to your kids, or donate it to a library. If you did the same with e-books, you'd have nothing -- your death would be the death of every e-book you'd bought.

Annotation? Some platforms allow this, and there's even one model in which one can see other people's annotations -- wow, just like a book! There are "signed" e-books never touched by an author's hand. But if the lifespan of an e-book is uncertain, the duration of these user-added annotations is even more questionable.

And disappearing? Amazon.com famously deleted copies of Orwell's Animal Farm from its users' Kindles, kindly crediting them 99 cents, after the company was informed by the Orwell estate that the book was still in copyright -- talk about Orwellian. And there's nothing to say Amazon or some other vendor couldn't do it again. What's more, if you decided not to be an Amazon customer, or not to replace a broken Kindle, or if Kindle were to be replaced by some hardware or software that wasn't backwards-compatible with older e-book formats, then your books would have vanished for you.

Lastly, what would one make of some archaeologist of the far future, coming upon a buried e-library? If Amazon didn't exist in the future, there'd be no way to recover anything from these battered e-readers and tablets -- their data was mostly stored in a cloud that once floated in the sky of a lost civilization. And like clouds, there'd be no getting them back.

So I suggest a label, or some sort of certification: Only e-books and readers that met the six criteria above would be certified as genuine "books" -- everything else would have to use some other word: text-sacks, wordblobs, readoids, or libri-fizzles. Anything but "books."

(illustration from wikimedia commons)

Tuesday, June 25, 2013

HG Wells, Television, and "Things to Come"

I'm an enormous fan of the folks at the Criterion Collection, and keenly awaited their restored DVD of HG Wells and Alexander Korda's 1936 film Things to Come. And the disc did not disappoint; the restoration is brilliant, and the bonus features and commentaries are all illuminating as always. But they did miss one point, and I think it's a point worth making: Wells included television in his futuristic vision, and there was a very direct connection between his ideas and the Scottish engineer John Logie Baird, who had invented and demonstrated the first practical system of television a decade before the film's release. There's ample evidence of the connection, and it would have made perfect commentary for the moment at which the rebel artist "Theotocopulos" beams his image around the ultramodern city. Indeed, the word that Wells consistently used in his original treatment/script was "televisor," and that was the very term Baird used for his apparatus.

Baird's original design used an electro-mechanical interface: a spinning disc with a spiral of precisely-placed holes served both as camera and as receiver. The camera/transmitter placed one or more photocells in front of the frame, and the receiver backlit an identically-proportioned disc with a small neon bulb which replicated the pulses recorded by the photocells. This "disc" system was used in the earliest public broadcasts of television in the early 1930's, during which home viewers -- "lookers-in" as they were called -- could tune to the high end of the radio band and receive remarkably clear pictures in the tiny window of their sets. The pictures were only 30 lines in resolution, but image captures and a few surviving off-air recordings show distinct and recognizable images of human faces. The aspect ratio of these broadcasts was tall and narrow -- 3:7 -- which was designed to capture the presenter's upper body, a framing that one of the BBC's engineers wryly called "head and shoulders."
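For the curious, the geometry of such a scanning disc is simple enough to sketch in a few lines of code. This is only a toy model -- the radii, pitch, and hole count here are illustrative stand-ins, not Baird's actual engineering figures -- but it shows the essential trick: the holes are spaced evenly around the disc and spiral slightly inward, so that one revolution sweeps out one complete 30-line frame.

```python
import math

LINES = 30                 # scan lines per frame, as in the early BBC broadcasts
ASPECT_W, ASPECT_H = 3, 7  # the tall, narrow "head and shoulders" frame

def hole_positions(n_lines=LINES, r_outer=100.0, line_pitch=1.0):
    """Angular and radial position of each hole in the spiral.

    One revolution of the disc scans one full frame: hole k sweeps
    scan line k, sitting one line-pitch closer to the hub than hole k-1.
    (r_outer and line_pitch are arbitrary illustrative units.)
    """
    positions = []
    for k in range(n_lines):
        angle = 2 * math.pi * k / n_lines   # holes evenly spaced around the disc
        radius = r_outer - k * line_pitch   # spiral: each hole a step inward
        positions.append((angle, radius))
    return positions

holes = hole_positions()
print(len(holes))                       # one hole per scan line
print(holes[0][1] - holes[-1][1])       # total radial depth of the spiral
```

The same spiral works for both ends of the system: at the camera it chops the scene into successive lines for the photocell, and at the receiver an identical disc, spinning in step, lets the neon lamp's flicker paint those lines back in the same order.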

And what we see in Things to Come is indeed a very tall and narrow aspect ratio. Wells's original treatment also refers on several occasions to the televisor's "disc," though in the final production design no discs are seen, and instead of a close-up of a disc with the outline of Cabal's head and shoulders, we're shown what looks like a filmstrip moving along a table. The screens in the film are of various sizes, including a small one in a translucent frame mounted on wheels, which we see Cabal pushing aside on his desk, disgusted with what he "sees" (a clever shot which eliminated the need to show the screen image). Showing the same image on devices of many sizes, and at several locations, certainly prefigures our age of desktops, laptops, tablets, and smartphones, a connection that goes unremarked by the disc's commentator, David Kalat.

And interestingly, if Wells was fascinated by Baird's invention, Baird had grown up as an avid fan of Wells, whom he jokingly referred to as his 'demigod.' By chance, in 1931, both were passengers on the US-bound Aquitania, and Baird was finally able to meet his hero, though by his account the conversation was awkward, and did not touch on television; the snapshot above shows their meeting. It's hard to imagine a more resonant pairing -- the inventor of television with the pre-eminent writer of science fiction -- and yet what comes through most clearly is that both were, as Baird remarked, 'poor vulgar creatures' -- mere mortals who, as things turned out, would not live to see the ultimate forms the technologies they imagined would take.