Saturday, June 4, 2016

They shall not pass!

There's been a lot of talk in the press these days, with the rise of he-who-must-not-be-named in American politics, about how to stop fascistic, totalitarian figures from coming into power. It might seem that folks in the US and UK have never had to deal with such a thing, and must look to Germany, or Spain, for examples -- but in fact, that's not true at all.

Back in Britain in the 1930's, there was a divisive political leader whose rallies were beginning to cause concern. A former political liberal, he'd taken to denouncing his previous views; at massive indoor rallies (including one at the Albert Hall), he stirred his listeners to passionate cheers by denouncing immigrants, Jews, and a shadowy cabal of forces that were driving the common man down. He loved to egg on protesters at these rallies, and kept special squads of "stewards" to rough them up and throw them out -- but in fact the hecklers served his purpose. He credited them with making his speeches all the more effective, and even said he looked forward to them.

The man in question was Sir Oswald Mosley. And Britons weren't quite sure what to make of him; though many despised his views, his party -- the British Union of Fascists -- was granted the same rights as any other, and in the early 1930's their numbers were on the rise, although they did not yet have any elected member of Parliament. Mosley, inspired by visits with Mussolini and Hitler, decided to give his party a paramilitary flavor, initially going with plain black futuristic uniforms, but later graduating to jackboots and a peaked cap (for Mosley himself, at least). From their large indoor rallies, they graduated to outdoor ones with marches; when Mosley was clobbered with a flying brick by protesters at one, he wore his bandages as a badge of pride. It was part of his strategy to march directly into areas, such as the East End of London, where the bricks were most likely to fly; this both fed his sense of justified anger and forced the police into serving as his own personal security.

Things came to a head in October of 1936; Mosley had planned a rally and march through the East End, and had pulled out all the stops; his new uniform was ready, as were many of his followers, some of whom rode in specially fortified vans. The Metropolitan Police, under the leadership of Sir Philip Game, mustered 6,000 officers, one of the largest forces deployed since the days of the Chartist rallies in the 1840's, but even so it soon became clear they were stretched thin; dispersing a crowd of nearly 100,000 East Enders proved impossible. Those opposed to Mosley, whose planned route would have taken him down Cable Street, erected barricades of bricks and sticks, along with a lorry and some construction equipment they'd procured from a nearby builder's yard. They hoped to block the march by their mere presence, though they were also prepared to fight; one witness described men wrapping barbed wire around chair-legs as improvised weapons. The chant of one and all was simply this: "They shall not pass!"

In the end, Sir Philip Game persuaded Mosley to call off that part of his plans, and to march back the way he'd come, toward Hyde Park. Public alarm over the events led to the passing of the Public Order Act, which, although it did not outlaw Mosley's party, prohibited political uniforms such as those the "blackshirts" had favored. When Britain and Germany went to war, the BUF was banned, and Mosley and many of its other leaders were imprisoned for the duration. One, though, did escape: the scar-faced William Joyce, who as "Lord Haw-Haw" broadcast mocking ripostes to nearly every one of Churchill's radio speeches. After the war, Joyce was convicted of high treason and hanged at Wandsworth Prison. Sir Oswald returned to political life, though in his one election -- for Kensington North in 1959 -- he polled only 8% of the vote on an anti-immigration platform that included deporting those from the West Indies.

Monday, May 30, 2016

Girl number twenty unable to define a horse!

Thomas Gradgrind
Consider this a sort of open letter to the American education system. It's been fourteen years since the passing of the "No Child Left Behind" law, and in my college classes, I now have students who've experienced its effects for much, if not all, of their primary and secondary educations. The neo-utilitarians have had their day, and here are the results: a generation of students who have learned the real lesson of the new regime -- hunker down, follow instructions, and learn whatever is going to be on the test. Fiction is out; nonfiction is in; reading for pleasure is out; reading for content is in. After all, the only skills that matter are the ones that can be measured.

And they're not very good students. They're good people -- they have all the hopes, fears, and aspirations that my students have always had -- but as students, they're uniquely unprepared for college, at least on the side of the humanities. Their habit of scanning syllabi and readings to see what will be tested, what will be measured, makes them poor readers; when they encounter an ambiguity, or a difficult reading, they tense up in anticipation of being taken to task for some vague failure. My endeavor has always been to share my love of literature, to (I hope) instill in students an excitement, an interest, a sense of something personally significant for themselves, in everything they read -- but it's getting tougher each semester. Their curiosity has been hammered flat; their personal experience belittled; their idiosyncrasies ironed out. As a result, the pleasure of discovering something new has given way to an anxiety about the unfamiliar; it's like a jungle out there, and sometimes I wonder how they keep from going under.

Are they better prepared for the wonderful world of employment? Perhaps, so long as the employment they find demands repetitive, goal-oriented work, with little room for innovation and frequent employee evaluations. They may, so long as they've had the prerequisites, do well in science, technology, engineering, and math -- though only at a rudimentary level. I can't see any new inventions, innovative theories, or speculative hypotheses coming from this generation, as anything they did or thought which deviated from the appointed path earned them the electric shock of a poor mark. In the humanities, they arrive with only the most superficial skills, and with what I find an astonishingly low level of motivation -- motivation here being defined as some strong desire from within one's own self to learn and grow. That desire seems to have been largely amputated, and in its place our brave new Benthamites have instilled a simple wish for nothing more than a clear set of instructions and evaluative rules. Learn, do, demonstrate. Wash, rinse, repeat.

It's a sad day for me, and for college educators generally. We can't turn back the hands of time -- we'll have to do what we can to help these students regain their own self-confidence, to revive and replenish their ossified curiosity and shrivelled sense of self. It will be hard work, and until that work is done, the work that professors such as myself used to do -- of teaching our field of study -- will have to be postponed. It'll be like a pre-college program, only it will take up college time. We ourselves have been sent the memo; our departments, programs, and fields of study already have to produce the "documentation" of what skills and knowledge we add, and our "metrics" for assessing the "outcomes" of our labors. The madness of measurement has overtaken us all, and if anything resembling real education is going to take place, it's going to have to do so by flying under the radar of this new regime.

Friday, February 19, 2016

Tolkien, Moorcock, and "Crypto-Fascism"

Recently, an interview in the New Statesman with Michael Moorcock has been making the Facebook rounds (the interview actually came out last summer, but it's always "today" in Facebook land). In it, Mr. Moorcock says some rather pointed things about other science-fiction and fantasy writers, but it's his comments on Tolkien that have turned the piece into clickbait and Facebook fodder:

“I think he’s a crypto-fascist,” says Moorcock, laughing. “In Tolkien, everyone’s in their place and happy to be there. We go there and back, to where we started. There’s no escape, nothing will ever change and nobody will ever break out of this well-ordered world.”

There's a lot to unpack in these few sentences -- but the main problem is that they paint an inaccurate picture of Tolkien's work. The Hobbits are 'fond of comfort,' indeed, but find themselves in a world where, even in their long-secluded bower, the dangers of the 'wide world' are pressing in, and the happiness of former times is no longer possible. We do indeed go 'there and back again' -- for this is a mythic journey, and like all other such quests (whether ancient ones, e.g. Homer's Odyssey, or modern ones, such as L. Frank Baum's The Wizard of Oz), it concludes with a return home -- not, however, to the old home one had left behind, or as one's old self -- but as a renewed person in a renewed world. All is changed -- changed utterly -- even as, externally, these deeper changes may not be evident to all.

And mythic fantasies are not worlds one breaks out of, but into, away from the oppressive, authoritarian demands and expectations of the "real" world. They are, indeed, acts of escape, and as Tolkien noted in his essay "On Fairy-Stories," that's a term he felt had been unjustly maligned:
I have claimed that Escape is one of the main functions of fairy-stories, and since I do not disapprove of them, it is plain that I do not accept the tone of scorn or pity with which “Escape” is now so often used: a tone for which the uses of the word outside literary criticism give no warrant at all. In what the misusers are fond of calling Real Life, Escape is evidently as a rule very practical, and may even be heroic. In real life it is difficult to blame it, unless it fails; in criticism it would seem to be the worse the better it succeeds. Evidently we are faced by a misuse of words, and also by a confusion of thought. Why should a man be scorned if, finding himself in prison, he tries to get out and go home? Or if, when he cannot do so, he thinks and talks about other topics than jailers and prison-walls? The world outside has not become less real because the prisoner cannot see it.
But the biggest problem with Moorcock's misreading of Tolkien comes when he uses it as a broad brush with which to tar later "Tolkienism," such as the books and television series Game of Thrones. But this, too, is mistaken; Game of Thrones may wear the trappings of a Tolkienesque world, but it's not high fantasy, or really even fantasy at all -- it's just a nasty action-adventure soap opera with no deeper ethical or artistic point.

It's a point that many readers, and even avid fans, of Tolkien still miss. The Lord of the Rings is a serious book, as serious as Milton's Paradise Lost or Shakespeare's Tempest. Power is not, within such a world, something one can play "games" with; it's not about families or factions or politics. Something far greater than mere power is at stake; to think otherwise is to think as Sauron did. The greatest power in Tolkien's universe, as he repeatedly made clear, was not to use the greatest power; abnegation was its highest expression. Good is, always was, and always will be outgunned, because it has to follow its own rules, while Evil (within the book's universe) has no such restraints. In fact, I'd go so far as to say that Tolkien's work -- first conceived in the trenches of the First World War, and completed in the shadow of the Second -- may be the most profoundly anti-fascist narrative ever written.

Tuesday, July 7, 2015


West Germany signs up for its 50% debt reduction in 1953
What, exactly, is debt? The question takes on new urgency as, yet again, the economic wise men of the European Union declare that Greece must submit to their terms -- for, after all, is there not a great debt at stake? Not to address it on the EU's terms, clearly, would be irresponsible and disastrous -- and so their ministers speak to Greece -- the cradle of the civilization they claim to represent -- as one would to a child.

The Greek government has debts, indeed, and Greek banks have still more. But what does this really mean? Nietzsche was hardly the first to notice that the German word for guilt (Schuld) derives from the older concept of debt (Schulden). In English texts, "forgive us our debts" was an older and more common form of the Lord's Prayer's "forgive us our trespasses." Debt is sin, Christ is our "redeemer," and entrance into the kingdom of heaven is to be -- in these terms -- a forgiveness of debts.

We in the US like to pillory the profligate -- it's almost a national passion. Republicans in Congress took the lead in making personal bankruptcy more onerous, and have harped, from the Reagan era to our own, on the evil of deficits and the need to cover any expense with a parallel cut. It's become almost a mantra with the Tea Party set.

But there's a problem here. For one, the debt of nations is not at all like the debt of people. Ordinary people can't print their own currency, or re-value it, or issue bonds to fund their new chimney. Nations can do these things because they command far larger reserves -- a labor force, roads, cities, minerals, and other natural resources with which to generate future wealth -- and those reserves make it possible to evaluate a nation's worth, and to trade in it. But even more than this, the debt of nations is not guilt, nor is their spending like the profligacy of private persons.

Greece has suffered. It has suffered for its overspending (which, as Nobel laureate Paul Krugman has pointed out, was not so much more horrific than that of many other nations), and for its failure to live up to the expectations of the creditors who loaned it and its banks generous funds during what seemed to be boom times. The EU powers have imposed austerity to an amazing extent -- cuts in pensions, tax increases, cuts in social services, privatization of national treasures -- and Greece's entire economy has shrunk by more than a quarter. If Greece today is less well-prepared to pay back its debts -- or rather, the debts of its banks -- than it was several years ago, it's largely because it's taken the EU prescription of austerity.

And what is "austerity"? Very clearly, it is a special sort of punishment, one reserved to errant nations. They must be made to pay! Even if every last citizen is to starve, even if the cure kills the patient, pay they must. The latest statements from the German finance minister make it quite clear that the goal of the latest agreement, is not to make it more likely to recoup Greek debt -- rather, it is to punish the Greeks.

One should see at a glance how much this sort of "debt" is like "guilt." But there's one way in which "debt" is different, when it's money that's at stake: those who loan money expect a profit from it. They don't lend out of kindness, nor is the debt incurred out of sinfulness. They gave because they expected to receive, and then some.

But let's not focus on Greece -- let's look at Ireland, a country which willingly accepted the EU's austerity prescriptions. Now, seven years later, we are told that there are "signs of recovery" -- signs evident to economists, it seems, but hard for the ordinary Irish citizen to see. Any number of years is too long to endure such pain, the more so when all it really does is satisfy the creditors (who, one might imagine, would prefer to be satisfied sooner -- but not, apparently, as much as they relish the glee of punishing another).

And seven years, interestingly, is the time laid out for Sh'mittah -- the traditional forgiveness of debts in the Jewish Torah. In that year, debts are to be forgiven, and the poor welcomed to glean in the fields. Seems to me it's time.

Wednesday, March 25, 2015


Throughout the latter part of the twentieth century, there was one common understanding, in the United States at least: a college education was a good thing. Returning soldiers took advantage of the original GI Bill, and their children took advantage of what was, for a time, a very affordable college education (at least at public colleges and universities). One result was the greatest economic boom in the nation's history; another was its most educated public. It would be hard for anyone back in the 1980's (to pick a decade) to imagine a time when college attendance wasn't a reachable ideal and an unquestioned good for anyone who could manage it.

Today, college education is questioned on every front; tuition has risen dramatically, and pundits -- who take for granted that the best way to measure the value of something is the increased earning power it confers -- declare that a useful trade-school education would be a much better deal. We're told that we have to let go of the idea that college is for everyone, as pressure mounts on public institutions (public in name only, as decades of cuts by state legislatures have meant that most state colleges and universities receive less than a third of their funding from the state). Even those who talk up the importance of college insist on "accountability," which means endless rounds of assessments and measurements of "outcomes"; even the supposedly liberal President Obama and his secretary of education, Arne Duncan, wax ecstatic talking up their plans for cradle-to-grave tracking of the correlation between education and earnings.

But in the midst of this neo-utilitarian fervor, something has been forgotten. As Mark Thomason wrote in a comment on a recent New York Times editorial,
I send my kids to college as a growth experience. It changes them, in good ways. I hope they do well financially, but I am not sending them to a trade school, I'm sending them to complete their education and complete growing up. It did me a lot of good, and it is doing them a lot of good.
The only difficulty is that this good -- which I agree is the most important aspect of a college education -- is very difficult to quantify. It doesn't necessarily lead to higher earnings; those who are inspired by their college experience to undertake creative careers in the arts, or to work for a better society, often find they're earning a great deal less. But their personal satisfaction, and the benefit of their labors to the world, though not necessarily tangible or measurable, are certainly vital.

All this puts me in mind of a much earlier moment when a large number of institutions whose work was universally understood as contributing to the greater good came under government pressure to prove that worth. Secure in beautiful gothic buildings, supplied with dormitories for their residents, dining halls for their meals, and chapels for their worship, the inhabitants of these institutions little dreamt that, in a few short years, their buildings would be reduced to roofless rubble -- and by their own king. Thomas Cromwell (yes, that one -- whose villainy is deliciously revisited in Hilary Mantel's recent novels) sent out "visitors" to investigate the practices at these places, and the reports that came back were damning: the people there were not following their own rules, they were living lives of suspicious comfort, and worse yet -- by providing spiritual services to their neighbors, they were perpetuating false superstitions and scurrilous beliefs.

The king, Henry VIII, used these reports as the justification for what would, earlier in his own reign, have been an unthinkable act. With the collusion of Parliament and a few strokes of a pen, he dissolved all the monasteries of England, seized their buildings and properties, and sent troops to round up the valuables. The golden altar furniture and books were added to the King's own collections, while the buildings and land were given to his political allies as their personal estates. The monks and nuns were supposed to receive pensions, but there seems to be no record of many, perhaps any, actually collecting them. What the King's men left behind, local villagers pillaged, finding new uses for door-hinges and window glass. The lead that had protected the roofs was rolled up and hauled away (it being a valuable metal at the time), and before long the unprotected beams and boards rotted and collapsed.

To add to the irony, some of the funds raised by these closures were used to fund colleges and endowments, including what later became Christ Church at Oxford.

I know it sounds crazy -- how could such a thing happen today? But I think it's a cautionary tale, especially for a population which seems unable to resist the siren-song of mere utilitarian value.

Friday, January 30, 2015


There are still places on this earth where people do burst into song.

I was in Dundalk, Ireland, in an ordinary, nondescript Italian restaurant on the high street, with some friends who were there attending a conference, along with a few other miscellaneous diners, when suddenly a woman at a table nearby, probably in her forties or so, let out with the refrain of Cat Stevens's "Moonshadow" -- and at once we were all "leapin' and hoppin'" along. The strange thing was that the waiters and waitresses took no notice of it; she sang, and we all came in on the chorus. At the end of it, there was applause, and people looked 'round the room at one another, and someone called out "Well, give us another song!" And someone did.

This would never, of course, happen where I live, in the United States. Here, no one bursts into song, except maybe in a karaoke bar, and badly. If you were to start singing at the top of your lungs in a TGI Friday's or a Pizza Hut, people would think you were crazy, and before long the police would be called, and you'd be taken away to face charges.

But here, in Ireland, it's understood that people do do such things, and that it's a perfectly ordinary part of life. People could, actually, burst into song in a public place, and just as in an episode of "Pennies from Heaven," the other diners -- if they didn't join in themselves -- would just carry on with their business as though it were the most usual thing in the world.

*     *     *     *     *

Growing up in the suburbs of Cleveland, Ohio, I was subjected to my parents’ deep and inexhaustible love of musicals. Living far from New York, we never actually went to one, but they blared forth from the stereo all the same. “My Fair Lady” – “Oklahoma!” – “The Most Happy Fella” – “The Pajama Game” – “The Mikado” – “Carousel” – “South Pacific” – the list seemed endless. My mother loved them, and my father loved them even more, turning out all the lights in the living room and turning up the stereo as far as it would go without distortion. “We are gentlemen of Japan!” … “All I want is a room somewhere” … “A p'liceman’s lot is not a happy one” …“Some enchanted evening” … there was an instant drama in every line, and though I hated the music itself, some portion of its feeling got under my skin, and planted a kind of weird seed for the future. I remember hating these songs, the darkened room, the way that my father would, from his prostrate position on the couch, quiver quietly with tears.

But now, decades later, when I hear these same songs, I'm the one that's weeping. Not for the fate of poor little Buttercup, or Eliza Doolittle, or Billy Bigelow -- but for the lost world in which people indeed did burst into song. And in that world, it now seems to me, there was such a tremendous reservoir of pent-up emotions, such a backlog of heartache, that it was absolutely impossible to speak -- it could only be sung. And yet, once the final lines had finished and the strings faded, everyone went back to their ordinary, pent-up ways -- and it was this final loss, the loss of the miraculous world of people actually singing about their feelings, that touched me the most. For even then, when such things were common knowledge, they failed to transform the world with their beauty; people went back to ugliness, got on with their tawdry lives, and if asked would probably have said "Singin'? I didn't hear no singin'! Did you, pal?"

All of this came rushing back to me recently when I re-watched the 1981 film Pennies from Heaven with Steve Martin and Bernadette Peters. I'd seen it when it first came out in a theatre -- a lonesome one, populated only by myself and the love of my life, two little old ladies, and an usher collecting for the Jimmy Fund. Somehow, the lip-synching made these songs work, and made the tragedy of self-representation -- of singing one's heart out -- a double loss, for the voice these actors sang with was not their own. Of course, that was often true in the classic Hollywood era, when Marni Nixon provided the voice for everyone from Deborah Kerr to Audrey Hepburn, but I never knew that back then. Later, of course, I saw Singin' in the Rain and realized what a strange fate it was to be the unknown voice of a famous actress. It's an uncanny kind of real-time pathos that goes back at least to Cyrano de Bergerac and right up through Disney's Little Mermaid: to watch another woo one's love with one's own words, or voice. It's a kind of torture, really.

Some years after seeing the 1981 film, I tracked down a copy of the BBC series with Bob Hoskins. Until then, I hadn't realized how heavy-handed the irony was -- that a man who sold love-songs for a living could be, not just a low-grade jerk, but a complete asshole -- so that the transformations he underwent, and witnessed, were a sort of false, cruel magic, one in which the kiss of song transformed toad-like people into temporary princes, not because they were really princes inside, but because they and the whole goddamned world were so deeply, perversely, incurably toad-like that this formed for them a kind of cruel joke. I wondered whether Dennis Potter was a cruel man, or whether, in the manner of other dark comedies, it was meant to be humorous. His genius, I decided, was that it didn't matter -- it worked either way.

But what's happened to musicals today? It's strange to realize that Pennies from Heaven was in fact the very last musical produced by MGM before its long descent into financial ruin and dissolution, its legendary film library sold to Ted Turner, and then to Warner's. It was Disney -- or, more properly, Ashman and Menken -- who revived the musical feature film in 1989 with The Little Mermaid. Somehow, bursting into song had become something that was too much for any human actor; it had transcended that world and moved into animation. Even Broadway was swept by this phenomenon, as adaptations of Disney films have been among the highest-grossing and longest-running stage musicals.

There are exceptions, it's true, but they're rare. I have to confess that Stephen Sondheim, for all his storied career, has always left me cold. His melodies never quite soar; like tethered birds, they tweet too close to the ground, and are soon forgotten. Wicked -- which I've seen in its current incarnation at the Gershwin -- is transcendent on stage, but the soundtrack doesn't really do it for me. There's soaring -- who can forget their first experience of "Defying Gravity"? -- but without the visuals, the music itself sounds overblown and frantic.

And then there's Glee. Many love it. I loathe it. Not because it lacks magical moments of bursting into song, but because there are too many. You can't just constantly be bursting into song; as Alan Jay Lerner put it,
"A man's life is made up of thousands and thousands of little pieces. In writing fiction, you select 20 or 30 of them. In a musical, you select even fewer than that."
And that to me is what a musical is -- a series of dramatic moments, carefully curated. But a serial musical is like a serial killer -- you never know when it will strike again; we live in dread instead of wonder. Each song must count, must form the peak of one of the mountains of our existence. We mustn't descend too soon to our tawdry, toad-like world -- we must allow these shadows of our better selves to burst, to break, to echo down the corridors of everyday life, daring to sing out loud.

Wednesday, December 3, 2014

New Ideas about Policing -- from 1829

The very idea of a "police force" in the modern sense was in every way a nineteenth-century invention. In London in the earlier part of that century, crime was fought by an unwieldy array of forces: parish officers (beadles), private night watchmen, and the infamous "Bow Street Runners," whose principal job was apprehending persons wanted on charges to ensure their appearance in court, but who did little or nothing of what we'd conceive of as "patrolling."

The force behind this force was British PM Sir Robert Peel, whose name gave us two popular early nicknames for officers of the police he established ("Bobbies" and "Peelers"). In 1829, under the Metropolitan Police Act, he set forth a clear set of guidelines for these officers, which became known as Peel's Principles. Peel realized that, absent the public's trust and co-operation, the very idea of a police force was doomed to failure.

The police -- in London and elsewhere -- have changed in many ways since 1829. The Met, as it's known for short, has had to expand its mission and learn to tackle new challenges. The realization that plain-clothes police could help solve crimes led to the establishment of the Detective Division; the challenge of the Fenians, who were willing to blow things up to advance the cause of Irish independence, led to the creation of the Special Branch. The Met now even has special riot control units, some members of which conducted themselves very poorly indeed in the killing of Ian Tomlinson in 2009, an event which -- though absent the element of race -- had much in common with recent American incidents.

But despite that, Peel's original principles would make as much sense for reforming the police in the UK as for reforming them here in the US, where I live. The biggest difference is that the police were imagined primarily as a force to prevent crime, rather than merely to apprehend criminals after the fact. But just as important was Peel's insistence that the police must be thought of -- and must think of themselves -- not as a special class of persons with unusual powers, but simply as "members of the public who are paid to give full-time attention to duties which are incumbent on every citizen in the interests of community welfare and existence."

"The police are the public and that the public are the police" -- that's the way Peel put it. We could hardly do better to "reform" the police than to re-assert this one-hundred-and-eighty-five year old sense of values.

Wednesday, February 5, 2014

The Assessment Craze

In the past decade, the movement for education "assessment" has reached fever pitch, moving from public elementary and high schools into higher education, at least at most state colleges and universities. This has been done, ostensibly, in response to a public demand for "accountability" -- another loaded word -- and this demand has been translated into the most absolute of terms. At my own college, every department and program was, in essence, ordered to develop an assessment plan. The buzzwords of these plans -- rubrics, targets, educational outcomes, and so forth -- came along with the orders, though each department was ostensibly free to decide what it would describe as its "outcomes," how it would meet them, and how it would measure its success in doing so. And it all came with a not-so-veiled threat: if you don't come up with an assessment plan, we'll come up with one for you.

The next step in this utilitarian, quality-control notion of education, of course, was to make sure that these "outcomes," elaborately measured and assessed, were used to reform -- or more precisely, to penalize -- the educators and pupils who failed to meet the chosen benchmarks. It's a notion so forward-looking that it takes one right back to the 1860's, when Lord Palmerston's government promulgated regulations stipulating that each pupil who failed an examination in a prescribed subject would cost the school 2s. 8d. of the next year's funds. As the eminent Victorianist Richard Altick describes it, "a new premium was put upon rote memory, for throughout the year every effort was bent toward grinding into the child the sentences or the facts the inspector might demand of him." Bradley Headstone, Dickens's dark schoolmaster in Our Mutual Friend, could hardly have been more pleased.

The word "education" comes from the Latin ex-ducere, “to lead forth" and shares a root with “Duke.”  But it can also mean “to draw forth,” and shares this sense with ductile, meaning stretchable or pliable, as with ducts, and duct tape.  On the difference between these two senses a whole worldview depends: if to educate is to lead, to command, to instill by and with authority, then doubtless it makes sense to see how good one’s pupils are at following instructions, mastering principles, and learning protocols.  But if it means more to “draw forth” -- and this is the shade of its meaning I would emphasize -- then it is not a matter of commanding, but of encouraging students to stretch and bend their minds, making pliable the rigid postures into which authority, and life, have pressed us.   The first sense can be measured, up to a point: I’m sure many baby-boomers such as myself remember the 20-item quiz, which began with with “read all questions before you start” and ended with item 20 being “put down your pencil and don’t answer any of the questions.”  But mental ductility, unlike that of physical materials, is almost impossible to quantify, since some part of it necessarily involves breaking the rules, thinking outside of one’s ideological confines, and questioning presuppositions.

There has been, I will admit, an attempt to take cognisance of this vital quality -- the phrase that is usually used is "critical thinking." English professors such as myself generally like this phrase; to us it suggests a complex array of thought and capability. But what is it, and how might we measure it? One current test defines it as "analyzing problems, generating logical and reasonable approaches to solve and implement solutions, reflecting consistent value orientations" (CTAB). Another assessment rubric recently discussed in my department speaks cryptically of "breaking down informational materials," "using previously learned information in new and concrete situations," and "creatively or divergently applying prior knowledge." Such definitions offer little encouragement; to me, they sound more like a plan for recycling than a definition of innovative or genuine analysis. But it matters not: in essence, all these plans are, at their foundations, anti-intellectual assaults on genuine learning. For true learning is not a plan, not a fixed process, and very rarely a readily-measurable thing. As Albert Einstein -- a famously slow learner -- observed, "It is nothing short of a miracle that modern methods of instruction have not yet entirely strangled the holy curiosity of inquiry. For this delicate little plant, aside from stimulation, stands mainly in need of freedom."

I've been teaching at the college level since 1986 -- nearly thirty years and counting. If this new regime of supposed "assessment" has its day, then higher education as we've known it will soon be coming to a close, to be replaced by a utilitarian yardstick that knows no value other than, or beyond, mere functionality. Would the last person in the university please turn out the lights?