Thursday, August 11, 2022

I've had all I can stands, and I can't stands no more

The Stand
It should have been a simple transaction. Having decided that a stand for my iPad would be a handy thing to have -- and that several such stands could be had for under $15 at amazon.com -- I glanced at the available models and ordered one. I've been a "Prime" member since that service began, and I suppose it's made me just a tad lazier over the years -- add to that the fact that retail shops often don't carry such small accessories -- so for things such as this one, I just click and expect to see it at my doorstep in two days.

But in this case, that was a false hope. The first stand, ordered on a Tuesday, was supposed to be delivered on Thursday. Alas, when I went to check its status, I was told that it was "now expected Monday." No reason was given, but I noticed that the shipper was an entity previously unknown to me, "Amazon Logistics." There was a "tracking number" but no link to any actual live tracking of any kind.

"I've had all I can stands ...."
So I did what I'd done on other occasions -- I ordered another. This one was slated to arrive on Sunday, so at least I'd have it sooner; I could always return the delayed one, after all. So I waited. And then, just as mysteriously as the first, the second stand went into limbo -- no sign of it, no tracking. The first one's status now simply said "delayed," and to call if it hadn't arrived by Sunday. So I called, and was given a refund, which came with the caveat that I was to return the stand "when delivered." So with no stand in hand, and none on the horizon, I ordered a third one, due to be delivered on Tuesday.

Then on Monday, a miracle occurred -- the delayed stand from Sunday was delivered, by the reliable US Postal Service -- they even put it in the mailbox! When Tuesday rolled 'round, I checked online for the third stand, and was initially pleased to see that it, too, was marked as delivered, by UPS -- another carrier I generally trust. That is, until I checked further and found that someone named "Linda" had signed for it -- who was this Linda? I called Amazon, and they told me that in fact it hadn't been delivered by UPS (despite having a valid UPS tracking number) -- but rather by my old friend "Amazon Logistics"! I was issued a refund for that stand, too. So finally, six days after my order, I had a sort of resolution -- I had, after all, gotten a stand -- but there were still a few dangling threads. What had become of the first stand -- would it travel the world, unable to be returned or delivered, like the legendary "Man who Never Returned" on the MBTA? And what about Linda -- was she enjoying her stand? Or did she even exist? And as for Amazon Logistics, there remained a final puzzle: why would a company that prides itself on quick and reliable delivery launch a new "service" -- one whose packages can't be tracked, and which failed to deliver, or mis-delivered, two out of three of my packages in less than a week? First world problems, I know -- but still, the whole affair left me with a good deal less confidence in the ability of amazon.com to "deliver smiles" -- they looked more like crooked arrows now.

Wednesday, May 4, 2022

Panic over the airwaves

It's always been something of a miracle. From around 1920, when the development of signal amplification enabled the human voice to travel over radio waves, the thought that words spoken hundreds or thousands of miles away could, by some mysterious process, appear in our homes and cars as though the speaker were present has been perhaps the first and greatest "miracle" of technology. Later, with the development of television, sights traveled by the same means, adding to this wonder, and shaping the baby boomer generation as no generation had ever been shaped before. But of course there were those who wondered: wasn't sending all this electricity, this radiation, through the air a sort of health hazard? Curiously, those who worried about that also tended to worry about the tawdry nature of much mass entertainment, or the potential of words and sights so widely transmitted to alter or control our minds.

In a debate in the Radio Mirror back in 1934, Charles Shaw of NYU tussled with no less a figure than Nikola Tesla, the man who in many senses invented radio, long before Marconi. Shaw voiced the concern that, since radio engineers in close proximity to transmission equipment seemed to have higher body temperatures, perhaps radio waves were going to bake us all while we slept! Not only that, but radio's noise and drivel "lowered our cultural standards." Tesla was left to point out that radio waves were far too weak in amplitude, and the wrong frequency, to do any cooking; as to radio's content, he was wise and direct: "You can't blame lowering our culture on radio," he insisted. "Blame it on yourself and myself. The type of program that comes over the air is the type you and I want to listen to."

A light pole in San Francisco
But in recent years, alas, these same two intertwined fears have arisen again, this time about cell phone frequencies. Cell phones were "cooking" our brains, it was said, or causing an increase in cancer. The panic has only grown with the rollout of "5G" technology -- particularly since 5G requires many more antennas to give its higher frequencies coverage. Ironically, those higher frequencies are exactly the reason 5G is harmless: they are far less able than lower frequencies to penetrate solid objects, which would include people's skulls and buttocks. High-frequency currents tend to travel along the surface of the body rather than through it, a fact Tesla took advantage of in an experiment that demonstrated their safety. He took alternating current of a very high frequency but a very low amplitude (power) and used it to electrify himself and others (Mark Twain among them). The high frequency kept the current from penetrating the body, and the low amplitude eliminated any other risk of harm. And yet a lightbulb in the hand of such an electrified person would glow!
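For the numerically inclined, here's a rough sketch of that penetration effect. The plane-wave model is a simplification, and the tissue permittivity and conductivity figures below are assumed, textbook-style values plugged in purely for illustration -- but the trend is the point: the higher the frequency, the shallower the penetration.

```python
# Rough illustration of why higher radio frequencies penetrate the
# body less. Plane-wave penetration depth (where the field falls to
# 1/e of its surface value) in a lossy medium:
#   delta = 1/alpha, alpha = w*sqrt((mu*eps/2)*(sqrt(1+(sigma/(w*eps))**2)-1))
# The tissue values below are assumed, illustrative figures only.
import math

EPS0 = 8.854e-12       # vacuum permittivity, F/m
MU0 = 4e-7 * math.pi   # vacuum permeability, H/m

def penetration_depth_m(freq_hz, eps_r, sigma):
    """1/e penetration depth of a plane wave in a lossy dielectric."""
    w = 2 * math.pi * freq_hz
    eps = eps_r * EPS0
    loss = math.sqrt(1 + (sigma / (w * eps))**2) - 1
    alpha = w * math.sqrt(MU0 * eps / 2 * loss)
    return 1 / alpha

# (label, frequency, assumed relative permittivity, assumed conductivity S/m)
cases = [("100 MHz (FM-band-ish)", 100e6, 66, 0.7),
         ("2.45 GHz (Wi-Fi / microwave oven)", 2.45e9, 53, 1.7),
         ("28 GHz (5G millimeter wave)", 28e9, 16, 25.0)]

for label, f, er, s in cases:
    print(f"{label}: ~{penetration_depth_m(f, er, s) * 100:.2f} cm into tissue")
# Prints roughly 8 cm, 2 cm, and under a tenth of a centimeter --
# millimeter waves barely get past the outer skin.
```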

Mark Twain in Tesla's lab in 1894
People today don't get much of an education in these matters, it seems. They tremble in fear at the word "radiation," not understanding that all sound and light and heat are also radiation. They confuse ionizing radiation (x-rays, gamma rays, and the like), which can be lethal, with the many forms of non-ionizing, low-level radiation (infrared waves, radio waves, and so on), which are largely harmless. They also don't understand frequency and amplitude, the two fundamental principles at stake. As noted, the higher frequencies of the radio band are in fact the weakest in terms of being able to penetrate things -- this is why, for instance, submarines use very low frequencies, so that their signals can penetrate the deep ocean, and even the earth itself. Further up the spectrum, so-called "short" waves are useful because they can't penetrate the earth's ionosphere, but bounce off it instead, increasing their potential range.

Getting back to cellular signals, there's the fundamental fact that the higher the frequency, the more information a signal can carry -- and more and more information is what's wanted. So, some years ago, the FCC moved television off much of the old VHF (Very High Frequency) and UHF (Ultra High Frequency) bands, and auctioned the vacated spectrum to cellular carriers. UHF topped out at 3 GHz, and the new services will utilize the top end of UHF, and on upwards to 6 GHz. So at least a portion of this "new" spectrum is the same old frequency range that once brought you midnight horror movies and Bowling for Dollars.
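To put a rough number on that "more information" point: what higher carrier frequencies really buy is room for wider channels, and capacity scales with channel width. Here's a sketch using the Shannon-Hartley formula; the channel widths and the signal-to-noise ratio are illustrative assumptions of mine, not any carrier's actual specifications.

```python
# Back-of-the-envelope: why wider channels (easier to find at higher
# carrier frequencies) can carry more data. Shannon-Hartley capacity:
#   C = B * log2(1 + S/N)
# The bandwidths and SNR below are illustrative assumptions only.
import math

def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 100  # ~20 dB signal-to-noise ratio, an assumed plausible figure

for label, bw in [("20 MHz (typical 4G-width channel)", 20e6),
                  ("100 MHz (mid-band 5G-width channel)", 100e6),
                  ("400 MHz (millimeter-wave-width channel)", 400e6)]:
    print(f"{label}: ~{capacity_bps(bw, snr) / 1e6:.0f} Mbit/s")
# At the same SNR, twenty times the bandwidth means twenty times the
# theoretical capacity -- hence the hunger for higher frequencies.
```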

And then there's amplitude. The 5G signal will be far too low in energy to do any damage. With old-fashioned single-transmitter systems such as radio, many watts of power were needed to give the signal a wide range, but the 5G antennas that spark these worries are in fact very low power -- they're essentially "repeaters," picking up and rebroadcasting a weak signal to give it range. No cell phone company would waste more electricity than needed to power these mini-antennas -- and even if one tried, the FCC's regulations on phones would prevent it. The current regulation for phones caps the absorption rate at 1.6 watts per kilogram of mass, which isn't enough amplitude to warm the surface of your skin more than a tiny fraction of a degree, even assuming your cellphone is in direct contact with it. All phones sold in the US must meet this standard. Ultimately, these high frequencies, because they can't penetrate the body, are dissipated in the form of heat -- and if it's heat you're worried about, the electric hot pad you use for your sore neck puts out hundreds of times more.
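That "tiny fraction of a degree" is easy to check on the back of an envelope. Assuming a specific heat for soft tissue of roughly 3500 J/(kg*K) -- an assumed round number -- and no cooling whatsoever, the FCC's 1.6 W/kg limit works out like this:

```python
# Rough check of the "tiny fraction of a degree" claim. At the FCC's
# SAR limit of 1.6 W/kg, tissue absorbs 1.6 joules per kilogram each
# second. Dividing by an assumed specific heat for soft tissue
# (~3500 J/(kg*K)) gives the heating rate with no cooling at all --
# real skin sheds this heat through blood flow and the air around it.
SAR_W_PER_KG = 1.6           # FCC localized SAR limit for phones
TISSUE_SPECIFIC_HEAT = 3500  # J/(kg*K), assumed value for soft tissue

rate_k_per_s = SAR_W_PER_KG / TISSUE_SPECIFIC_HEAT
print(f"Worst-case heating rate: {rate_k_per_s:.5f} K/s")
print(f"After a full minute, uncooled: {rate_k_per_s * 60:.3f} K")
# ~0.0005 K per second, about 0.03 degrees after a minute -- and in a
# living body, perfusion carries even that away nearly as fast as it
# arrives.
```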

All this leaves us with just the content of our signals to worry about. And here I would agree with Tesla -- we get what we deserve. Even when, at times, it seems we don't want it -- since now, with all our clicks tracked in one way or another, the system itself works to try to predict our desires. And yet, despite the screaming echo chambers of the 'net, it's just the sound of our own voice, really -- and we have only ourselves to blame for it if we listen.

Friday, August 31, 2018

Dissolution

Throughout the latter part of the twentieth century, there was one common understanding, in the United States at least: a college education was a good thing. Returning soldiers took advantage of the original GI Bill, and their children took advantage of what was, for a time, a very affordable college education (at least at public colleges and universities). One result was the greatest economic boom in the nation's history, and another was its most educated public. It would be hard for anyone back in the 1980's (to pick a decade) to imagine a time when college attendance wasn't a reachable ideal and an unquestioned good for anyone who could manage it.

Today, college education is questioned on every front: tuition has risen dramatically, and pundits -- who take for granted that the best way to measure the value of something is the increased earning power it confers -- declare that a useful trade-school education would be a much better deal. We're told that we have to let go of the idea that college is for everyone, as pressure mounts on public institutions (public in name only, as decades of cuts by state legislatures have meant that most state colleges and universities receive less than 1/3 of their funding from the state). Even those who talk up the importance of college insist on "accountability," which means endless rounds of assessments and measurements of "outcomes"; even the supposedly liberal President Obama and his erstwhile Secretary of Education, Arne Duncan, waxed ecstatic talking up their plans for cradle-to-grave tracking of the correlation between education and earnings.

But in the midst of this neo-utilitarian fervor, something has been forgotten. As Mark Thomason wrote in a comment on a recent New York Times editorial,
I send my kids to college as a growth experience. It changes them, in good ways. I hope they do well financially, but I am not sending them to a trade school, I'm sending them to complete their education and complete growing up. It did me a lot of good, and it is doing them a lot of good.
The only difficulty is that this good -- which I agree is the most important aspect of a college education -- is very difficult to quantify. It doesn't necessarily lead to higher earnings; those who are inspired by their college experience to undertake creative careers in the arts, or work for a better society, often find they're earning a great deal less. But their personal satisfaction, and the benefits their labors bring to the world, though not necessarily tangible or measurable, are certainly vital.

All this puts me in mind of a much earlier moment when a large number of institutions whose work was universally understood as contributing to the greater good came under government pressure to prove that worth. Housed in beautiful Gothic buildings, supplied with dormitories for their residents, dining halls for their meals, and chapels for their worship, the inhabitants of these institutions little dreamt that, in a few short years, their buildings would be reduced to roofless rubble -- and by their own king. Thomas Cromwell (yes, that one -- whose career is revisited in Hilary Mantel's recent novels) sent out "visitors" to investigate the practices at these places, and the reports that came back were damning: the people there were not following their own rules, they were living lives of suspicious comfort, and worse yet -- by providing spiritual services to their neighbors, they were perpetuating false superstitions and scurrilous beliefs.

The king, Henry VIII, used these reports as the justification for what would, earlier in his own reign, have been an unthinkable act. He, with the collusion of Parliament and a few strokes of a pen, dissolved all the monasteries of England, seized their buildings and properties, and sent troops to round up the valuables. The golden altar furniture and books were added to the King's own collections, while the buildings and land were given to his political allies as their personal estates. The monks and nuns were supposed to receive pensions, but there's scant record of many ever collecting them. What the King's men left behind, local villagers pillaged, finding new uses for door-hinges and window glass. The lead that had protected the roofs was rolled up and hauled away (it being a valuable metal at the time), and before long, the unprotected beams and boards rotted and collapsed.

To add to the irony, some of the funds raised by these closures were used to fund colleges and endowments, including what later became Christ Church at Oxford.

I know it sounds crazy -- how could such a thing happen today? But I think it's a cautionary tale, especially for a population which seems unable to resist the siren-song of mere utilitarian value.

Thursday, February 1, 2018

Burst

There are still places on this earth where people do burst into song.

I was in Dundalk, Ireland, in an ordinary, nondescript Italian restaurant on the high street, with some friends who were there attending a conference, along with a few other miscellaneous diners, when suddenly a woman at a table nearby, probably in her forties or so, let out with the refrain to Cat Stevens’s “Moonshadow” – and at once we were all “leapin’ and hoppin’” along. The strange thing was that the waiters and waitresses took no notice of it; she sang, and we all came in on the chorus. At the end of it, there was applause, and people looked ‘round the room at one another, and someone called out “Well, give us another song!” And someone did.

This would never, of course, happen where I live, in the United States. Here, no one bursts into song, except maybe in a Karaoke bar, and badly. If you were to start singing at the top of your lungs in a TGI Friday’s or a Pizza Hut, people would think you were crazy, and before long the police would be called, and you’d be taken away to face charges.

But here, in Ireland, it's understood that people do do such things, and that it's a perfectly ordinary part of life. People could, actually, burst into song in a public place, and just as in an episode of “Pennies from Heaven,” the other diners – if they didn’t join in themselves – would just carry on with their business as though it were the most usual thing in the world.

*     *     *     *     *

Growing up in the suburbs of Cleveland, Ohio, I was subjected to my parents’ deep and inexhaustible love of musicals. Living far from New York, we never actually went to one, but they blared forth from the stereo all the same. “My Fair Lady” – “Oklahoma!” – “The Most Happy Fella” – “The Pajama Game” – “The Mikado” – “Carousel” – “South Pacific” – the list seemed endless. My mother loved them, and my father loved them even more, turning out all the lights in the living room and turning up the stereo as far as it would go without distortion. “We are gentlemen of Japan!” … “All I want is a room somewhere” … “A p'liceman’s lot is not a happy one” …“Some enchanted evening” … there was an instant drama in every line, and though I hated the music itself, some portion of its feeling got under my skin, and planted a kind of weird seed for the future. I remember hating these songs, the darkened room, the way that my father would, from his prostrate position on the couch, quiver quietly with tears.

But now, decades later, when I hear these same songs, I’m the one who's weeping. Not for the fate of poor little Buttercup, or Eliza Doolittle, or Billy Bigelow -- but for the lost world in which people indeed did burst into song. And in that world, it now seems to me, there was such a tremendous reservoir of pent-up emotions, such a backlog of heartache, that it was absolutely impossible to speak – it could only be sung. And yet, once the final lines had finished and the strings faded, everyone went back to their ordinary, pent-up ways – and it was this final loss, the loss of the miraculous world of people actually singing about their feelings, that touched me the most. For even then, when such things were common knowledge, they failed to transform the world with their beauty; people went back to ugliness, got on with their tawdry lives, and if asked would probably have said “Singin’? I didn’t hear no singin’! Did you, pal?”

All of this came rushing back to me recently when I re-watched the 1981 film Pennies from Heaven with Steve Martin and Bernadette Peters. I’d seen it when it first came out in a theatre – a lonesome one, populated only by myself and the love of my life, two little old ladies, and an usher collecting for the Jimmy Fund. Somehow, the lip-synching made these songs work, and made the tragedy of singing one’s heart out a double loss – for the voice these actors sang with was not their own. Of course, that was often true in the classic Hollywood era, when Marni Nixon provided the voice for everyone from Deborah Kerr to Audrey Hepburn, but I never knew that back then. Later, of course, I saw Singin’ in the Rain and realized what a strange fate it was to be the unknown voice of a famous actress. It’s an uncanny kind of real-time pathos that goes back at least to Cyrano de Bergerac and right up through Disney’s Little Mermaid: to watch another woo one’s love with one’s own words, or voice. It’s a kind of torture, really.

Some years after seeing the 1981 film, I tracked down a copy of the BBC series with Bob Hoskins. Until then, I hadn’t realized how heavy-handed the irony was – that a man who sold love-songs for a living could be, not just a low-grade jerk, but a complete asshole – so that the transformations he underwent, and witnessed, were a sort of false, cruel magic, one in which the kiss of song transformed toad-like people into temporary princes, not because they were really princes inside, but because they and the whole goddamned world were so deeply, perversely, incurably toad-like that this formed for them a kind of cruel joke. I wondered whether Dennis Potter was a cruel man, or whether, in the manner of other dark comedies, it was all meant to be humorous.  His genius, I decided, was that it didn’t matter – it worked either way.

But what's happened to musicals today? It's strange to realize that Pennies from Heaven was in fact the very last musical produced by MGM before its long descent into financial ruin and dissolution, its legendary film library sold to Ted Turner, and then to Warner's. It was Disney -- or, more properly, Ashman and Menken -- who revived the musical feature film in 1989 with The Little Mermaid. Somehow, bursting into song had become something that was too much for any human actor; it had transcended that world and moved into animation. Even Broadway was swept by this phenomenon, as adaptations of Disney films have been among the highest-grossing and longest-running stage musicals.

There are exceptions, it's true, but they're rare. I have to confess that Stephen Sondheim, for all his storied career, has always left me cold. His melodies never quite soar; like tethered birds, they tweet too close to the ground, and are soon forgotten. Wicked -- which I've seen in its current incarnation at the Gershwin -- is transcendent on stage, but the soundtrack doesn't really do it for me. There's soaring -- who can forget their first experience of "Defying Gravity"? -- but without the visuals, the music itself sounds overblown and frantic.

And then there's Glee. Many love it. I loathe it. Not because it lacks magical moments of bursting into song, but because there are too many. You can't just constantly be bursting into song; as Alan Jay Lerner put it,
"A man's life is made up of thousands and thousands of little pieces. In writing fiction, you select 20 or 30 of them. In a musical, you select even fewer than that."
And that to me is what a musical is -- a series of dramatic moments, carefully curated. But a serial musical is like a serial killer -- you never know when it will strike again; we live in dread instead of wonder. Each song must count, must form the peak of one of the mountains of our existence. We mustn't descend too soon to our tawdry, toad-like world -- we must allow these shadows of our better selves to burst, to break, to echo down the corridors of everyday life, daring to sing out loud.

Breaking Dad


What would my father, a chemist with frustrations of his own, have thought of Walter White?


Like countless others over the past few years, my family and I have been dealing with a drug habit: an addiction to Breaking Bad, the AMC television series about Walter White, a former high school chemistry teacher who takes up a second career cooking crystal meth after he’s diagnosed with cancer. We’ve often debated exactly why we find the show so compelling -- of course there are the great performances by Bryan Cranston as White and Aaron Paul as his partner (and former student) Jesse Pinkman, the inventive camera work, and writing that’s as clear and sharp as a tray of new-cooked product.  For me, at any rate, what really drew me in at the start, and has kept me watching ever since, was the way White’s backstory -- a chemistry genius who had a shot at making millions in a tech startup company, but ended up teaching a roomful of bored high school students -- meshed with his new persona as a dangerous man, a man who can put a bullet through a drug-dealer’s head if he has to. Before his diagnosis, Walt was a man who’d had to swallow his pride, and whittle away his gifts in a job that had become deeply unfulfilling.  After his diagnosis, though his body is wracked with cancer, he regains his pride by deliberately, willfully breaking the law in order to provide for his family -- all the while making it a point of pride that his crystal is chemically superior to anything else on the market.

My father grew up in the farm country of western Washington State, picking strawberries and working in a pea cannery to help his family make ends meet.  His boyhood hero was Thomas Edison, and he dreamt of a career as an inventor, once alarming his parents when his attempt to build a galvanic cell set fire to the corner of the chicken coop. After graduating from Washington State, he got his Ph.D. in physical chemistry from Purdue, and went to work for the one company with the most direct tie to Edison’s genius, General Electric.  At their lighting division “campus” at NELA Park in Cleveland, he too cooked crystals -- crystals of gallium arsenide -- in enormous blast furnaces, then passed electrical current through them to make some of the world’s first multi-colored LED lights.  And, although his bookshelves at home were laden with the paperweight-size awards that GE gave out to its top inventors, the company -- whose profits increasingly came from its financial-services division -- downsized his laboratory again and again, eventually closing down the entire research department and outsourcing it to a subcontractor who hired Ph.D.’s from Hungary at a tenth of my dad’s old salary.  I don’t think he ever forgave GE, and though he tried teaching chemistry for a while, he quit in disgust at the low level of motivation among his students.  And then, as with Walt, there came an illness -- not cancer in his case, but Parkinson’s disease, which he came to believe might have been triggered by some of the chemicals he’d been exposed to during his years in the lab; in 2004, he died of the disease and its complications.

He certainly had plenty of reasons to feel resentful.  But would my dad have sympathized with Walter White?

At first, I didn’t think so. My dad was the kind of guy who would walk out of a movie theatre if there was a scene depicting an adulterous relationship; he was as honest as a boy scout and faithful as an old dog.  He never lied on his taxes or went over the speed limit, except once a year when he wanted to “clean the carbon” off the spark-plugs of his blue Oldsmobile Delta 88. But he was proud of his chemical knowledge.  A walk through the woods was an occasion for an explanation of osmosis, the process through which the sap ascended to the branches; when I was a kid, he would bring home old beakers (no round-bottom boiling flasks, alas) and once gave me a chunk of metallic sodium so that I could throw it in a pond and watch it explode.  If there was a mistake in chemistry on a science show -- or even in a science-fiction movie -- he’d write a strongly worded letter to the producers.  And he expected a reply, too.

But lawbreaking?  Deliberately making something that made other people addicted, and sick? And, when it seemed necessary, killing those who threatened him or stood in his way? I couldn’t imagine it. And yet, when I saw those duffel-bags full of rolled $100 bills in the early episodes, I thought to myself: wouldn’t my dad have felt satisfied if, after being forced into early retirement by GE, he’d been able to earn that kind of money?  If he could finally have bought that fishing shack up in the Cascade mountains where he’d hiked as a boy, gotten a new car, or even started his own lab to make something better and more valuable than he ever had for GE? Wouldn’t he be rooting for Walter White?

My Dad believed in good guys and bad guys; back in the days of old-time radio, his moral sense was honed by the Lone Ranger and the Cisco Kid.  He liked a good adventure story, and got a kick out of retro-styled films like Indiana Jones.  And he was a very emotional viewer, although you could only see that river when it spilled over its banks.  He could be a little unpredictable, but there was never any doubt as to where his sentiments lay. I remember watching the climactic scene of James Cameron’s Titanic with him -- where the elderly Rose drops the “Heart of the Ocean” off the ship’s rail -- he sat through all of it as stone-faced as Buster Keaton, but when Tom Hanks lost “Wilson” in Cast Away he wept profusely.

But he also had a powerful sense of justice.  Back when he was a student at Mount Vernon Junior College, he’d submitted a science essay for a contest.  The science teacher disqualified the paper, citing as a mistake a formula my dad knew was absolutely correct.  Dad wrote an angry letter to the school’s principal, who refused to overrule the teacher. Years later, when he heard through a friend that his old science teacher had died, he gave a grunt of satisfaction -- “served the bastard right,” he declared -- this from a man who almost never swore.

And, although he had a comfortable middle-class life -- more comfortable than Walt’s -- my dad had his share of money troubles later in life.  When my mom suffered a stroke, and his General Electric health policy refused to pay because he hadn’t contacted them for permission before she was admitted to the hospital, Dad went ballistic.  He picked up the phone and argued his case for weeks, all the way up to the top -- Jack Welch, a former research chemist himself, was head of GE then -- and got them to change their mind.  He was proud of that.

But what if GE hadn’t come through with the money for my mom’s care?  What if my dad found he couldn’t provide for his family, despite all his years of hard work?  And what if he knew that his knowledge of chemistry could cover all his family’s bills -- would he have used it?

I doubt he would have ever considered the path of Walter White, but I have a feeling that he’d have sympathized with him all the same.  I know he’d have been pleased when the show got its chemistry right, and critical when they missed the mark.  “Mercury fulminate!  That’s much too unstable!”  I can almost hear him saying those words, along with a number of explanations of other chemicals mentioned in the show. So I suspect that he would have been just as closely glued to his screen as I am now -- even given Walter White’s increasing lies and deceits -- and I think he’d have been quietly rooting for him. In many ways, he was Walt --  he too had to learn to swallow his pride, to defer his dreams and waste time trying to justify his research to bosses who scarcely understood it.  He’d looked out over a lecture hall full of community college students as they chatted, or dozed, instead of listening to his impassioned explanation of the carbon-hydrogen bond.

The anger in Walt’s voice reminds me of my Dad’s anger, his frustration.  The way Walt uses chemistry in every one of his plans and devices reminds me of the way my Dad would use it to solve every household problem, from cleaning a stain on the carpet with hydrogen peroxide to graphing the temperature curve of the Thanksgiving turkey to tell when it would be done. There’s a purity about chemistry -- a symmetry, a predictable architecture -- that actual life often seems to lack.  I think that’s where Walt’s rage comes from -- and it’s a rage he and my father shared. Late in his life, his mind drifting in the mists of Parkinson’s-related dementia, the anger was the one thing he could hold onto.  He raged, raged, against the dying of the light.

And in this dark and unpredictable world, where justice is so seldom really served, this is a rage we all can recognize.  Maybe that’s why we’re addicted to Breaking Bad.
NB: This is a repost from a few years ago, on the occasion of the final installments of BB.

Sunday, December 18, 2016

Fake News and the Death of the Donut

Many years ago -- in another age, an age when newspapers, television, and magazines thrived, and journalism schools were filled with bright young students -- one commonly taught principle of fair reporting used the humble donut as its metaphor. The center of the donut -- the hole -- represented facts and matters of general consensus: George Washington was the first president of the United States, men walked on the moon on July 20th, 1969, and vaccines were vital to winning the war against dangerous childhood diseases. No one disagreed -- then, at least -- about these matters, and there was no need to present alternative views about them.

The donut itself consisted of matters about which reasonable people might disagree: Was the 55-mile-an-hour speed limit a good idea? Was Sir Winston Churchill the greatest British Prime Minister? Should we get rid of daylight saving time? When an issue fell within the donut, responsible journalism called for balance, and the representation of opposing views. Then there was the outside of the donut. Earth was colonized by aliens in 40,000 B.C. -- the Holocaust never happened -- Abraham Lincoln was never assassinated, but lived to the age of 87 in a secret log cabin in the hills of Virginia. These ideas were the stuff of the "lunatic fringe," and the proper way for serious journalists to respond to them was not to respond to them at all. This model, of course, assumed that journalists -- because they alone had the opportunity to disseminate news to millions of people -- could and should function as gatekeepers of our shared sense of reality. In such a day, Walter Cronkite could reasonably say, "And that's the way it is," knowing that the stories on the evening news had been carefully reported, fact-checked, and vetted before they went on the air.

We shouldn't necessarily blame the journalists of today for the death of the donut. It's still there, to some extent, at the larger national and international newspapers, and some (though not all) network news shows. But the gate that the press was keeping was in a wall -- a wall representing the cost of disseminating news, printing papers, erecting transmission towers and building television studios -- that simply no longer exists. There's no need for any news to pass through this gateway; like the lovely wrought-iron gate of Cremorne Gardens in London, it stands by itself, is easily walked around, and the gardens to which it once led have long vanished. There's no chance of such a barrier being rebuilt in the future, and none of the efforts of social media sites such as Facebook or Twitter are going to have much effect, since anyone who wants to can find a workaround to whatever filters or barriers they erect.

Is there any hope at all? Well, certainly the remaining outlets of old-fashioned journalism should be taking the lead by calling a lie a lie, and by continuing to robustly fact-check their own stories. Sites such as Snopes.com can help, and the increase in traffic there is a healthy sign that some people actually do want to check up on an implausible or dodgy story they've heard. But what it really means is that everybody is going to have to do their own checking, and that in addition to teaching mere facts, the task of education, now more than ever, must be to give students the tools to sort the wheat of information from the chaff of useless babble, and the poison of disinformation, rumor, and conspiracy theories.

There's only one problem with this hope, of course -- that those who write, share, and consume those poisons have the same robust tools to keep reality from their gates as do those who favor reality. It's going to be a bumpy night.

Saturday, June 4, 2016

They shall not pass!

There's been a lot of talk in the press these days, with the rise of he-who-must-not-be-named in American politics, about how to stop fascistic, totalitarian figures from coming into power. It might seem that folks in the US and UK have never had to deal with such a thing, and must look to Germany, or Spain, for examples -- but in fact, that's not true at all.

Back in Britain in the 1930's, there was a divisive political leader whose rallies were beginning to cause concern. A former political liberal, he'd taken to denouncing his previous views; at massive indoor rallies (including one at the Albert Hall), he stirred his listeners to passionate cheers by denouncing immigrants, Jews, and a shadowy cabal of forces that were driving the common man down. He loved to egg on protesters at these rallies, and kept special squads of "stewards" to rough them up and throw them out -- but in fact the hecklers served his purposes. Indeed, he credited them with making his speeches all the more effective, even saying he looked forward to them.

The man in question was Sir Oswald Mosley. And Britons weren't quite sure what to make of him; though many despised his views, his party -- the British Union of Fascists -- was granted the same rights as any other, and in the early 1930's their polling numbers were on the rise, although they did not yet have any elected member of Parliament. Mosley, inspired by visits with Mussolini and Hitler, decided to give his party a paramilitary flavor, initially going with plain black futuristic uniforms, but later graduating to jackboots and a peaked cap (for Mosley himself at least). From their large indoor rallies, they moved on to outdoor ones with marches; when Mosley was clobbered with a flying brick by protesters at one, he wore his bandages as a badge of pride. It was part of his strategy to march directly into areas, such as the East End of London, where the bricks were most likely to fly; this both fed his narrative of justified anger, and forced the police into serving as his own personal security.

Things came to a head in October of 1936; Mosley had planned a rally and march through the East End, and had pulled out all the stops; his new uniform was ready, as were many of his followers, some of whom rode in specially fortified vans. The Metropolitan Police, under the leadership of Sir Philip Game, mustered 6,000 officers, one of the largest forces deployed since the days of the Chartist rallies in the 1840's, but even then it soon became clear they were stretched thin; attempts to disperse a crowd of nearly 100,000 East Enders proved impossible. Those opposed to Mosley, whose planned route would have taken him down Cable Street, erected barricades of bricks and sticks, along with a lorry and some construction equipment they'd procured from a nearby builder's yard. They hoped to block the march by their mere presence, though they were also prepared to fight; one witness described men wrapping barbed wire around chair-legs as improvised weapons. The chant of one and all was simply this: "They shall not pass!"

In the end, Sir Philip Game persuaded Mosley to call off that part of his plans, and to turn back and march the way he'd come, toward Hyde Park. Public alarm over the events led to the passing of the Public Order Act, which, although it did not outlaw Mosley's party, prohibited political uniforms such as those the "blackshirts" had favored. When Britain and Germany went to war, the BUF was banned, and Mosley and many of its other leaders were imprisoned for the duration. One, though, did escape: the scar-faced William Joyce, who as "Lord Haw-Haw" broadcast mocking ripostes to nearly every one of Churchill's radio speeches. After the war, Joyce was convicted of high treason, and hanged at Wandsworth Prison. Sir Oswald returned to political life, though in his one election -- for Kensington North in 1959 -- he polled only 8% of the vote with an anti-immigration platform that included deporting those from the West Indies.

Monday, May 30, 2016

Girl number twenty unable to define a horse!

Thomas Gradgrind
Consider this a sort of open letter to the American education system. It's been fourteen years since the passing of the "No Child Left Behind" law, and in my college classes, I now have students who've experienced its effects for much, if not all, of their primary and secondary educations. The neo-utilitarians have had their day, and here are the results: a generation of students who have learned the real lesson of the new regime: hunker down, follow instructions, and learn whatever is going to be on the test. Fiction is out; nonfiction is in; reading for pleasure is out; reading for content is in. After all, the only skills that matter are the ones that can be measured.

This approach forces students to discard their interests and passions, and instead to think strategically about how to focus their efforts on measurable things. They inevitably develop a habit of scanning syllabi and readings to see what will be tested, but this makes for poor readers; when they encounter an ambiguity, or a difficult passage, they tense up in anticipation of being taken to task for some vague failure. My endeavor has always been to share my love of literature, to (I hope) instill in students an excitement, an interest, a sense of something personally significant for themselves, in everything they read -- but it's getting tougher each semester. Their curiosity has been hammered; their personal experience belittled; their idiosyncrasies ironed out. As a result, it sometimes seems that the pleasure of discovering something new has given way to an anxiety about the unfamiliar.

Are they better prepared for the wonderful world of employment? Perhaps, so long as the employment they find demands repetitive, goal-oriented work, with little room for innovation and frequent employee evaluations. They may, so long as they've had the prerequisites, do well in science, technology, engineering, and math -- though only at a rudimentary level. It's hard to see many new inventions, innovative theories, or speculative hypotheses coming from this generation, as anything they did or thought that deviated from the appointed path earned them the electric shock of a poor grade. Curiosity seems to have been largely amputated, and in its place our brave new Benthamites have instilled a simple wish for nothing more than a clear set of instructions and evaluative rules. Learn, do, demonstrate. Wash, rinse, repeat.

It's a sad day for me, and for college educators generally. We can't turn back the hands of time -- we'll have to do what we can to help these students regain their own self-confidence, to revive and replenish their ossified curiosity. It will be hard work, the more so as professors themselves have been sent the same memo; our departments, programs, and fields of study already have to produce the "documentation" of what skills and knowledge we add, and our "metrics" for assessing the "outcomes" of our labors. The madness of measurement has overtaken us all, and if anything resembling real education is going to take place, it's going to have to do so by flying under and around the radar of this new regime.