Passing glances: Amazon, coincidences, and war

In a private forum on the Authors Guild website, a contributor recently posted a link to a New York Times opinion piece bearing the provocative headline “New York Should Say No to Amazon.” The op-ed began by reminding readers, “This week, word leaked that Amazon may be close to finalizing a deal to set up a major operation in Long Island City, Queens” (the link is in the original), and it went on to subject Amazon itself as well as its possible New York expansion to withering criticism. The forum contributor said he had previously favored the deal but now wasn’t so sure, and he asked for other responses. Because the moves, literal and otherwise, of major tech companies are a pressing concern in New York and elsewhere, I’m reposting here the response I wrote for the forum:

First, it needs to be said that what Ron Kim and Zephyr Teachout propose in their opinion piece isn’t entirely captured by the headline. Rather than declaring flatly that New York City should say no, they propose this: “New York can and should say no, at least until we know a lot more. The public should see correspondences between the Cuomo administration and Amazon to learn what promises have been made. The State Senate should demand a full study to examine the impact of any proposed deal on transportation and the cost of housing. We should hold hearings on Amazon’s union-busting practices.” We ought to know what we’re getting into: I don’t disagree with that.

On the other hand, I’d like to know more about how these things are done in general. Wouldn’t it look bad for New York (where I reside) or any other city to bid for something, be accepted, and then back out? I wouldn’t think we’d agree in principle to subsidize a new sports stadium, a Super Bowl, an Olympics, a national political convention, or the like and then change our mind, and I wouldn’t think we’d do that in the case of a business that’s considering moving here. Surely the best time to consider the issues is before the offer is made—which is not to say we can’t do it now, only that we’ve probably gone about this the wrong way. But maybe I’m wrong; maybe it happens.

While Kim and Teachout lay out some of the areas of contention—namely Amazon’s effect on small businesses, workers, and the publishing industry—I’m not convinced by the evidence and arguments they muster. It’s one thing to say there are reasons to dislike Amazon’s way of doing business or its very existence in the marketplace; it’s another to say we don’t want it here. And, as a practical matter, I doubt very much that rejecting it would improve Amazon.

The monsters of the technology industry are monsters that we as consumers have created; reining them in, if indeed that needs to be done, isn’t likely to be very effective if we go about it piecemeal and purely on a local level. (It’s ironic that what Donald Rumsfeld once dismissed as “old Europe” has gotten far ahead of the United States in this respect; the European Union’s General Data Protection Regulation was years in the works and has already taken effect, while we’re still scratching our heads over what to do.) And where the local level does matter, as in the distortions of income and housing now faced by Seattle and San Francisco, it goes without saying that you can’t have much impact on a company that’s not based in your city at all.


Friday evening, a teaser on the NYT website front page said, “There Are Now Americans Who Have Lived Through Two Gun Massacres.” (The link, which I haven’t followed, goes here.) I know things that can top that. Until 2010 there lived in Japan a man named Tsutomu Yamaguchi, who had survived both atomic bombings, first at Hiroshima and then at Nagasaki, and who lived to the age of 93. (You can find his Economist obituary here.) There was a woman who survived the sinking of the Titanic, on which she worked as a stewardess; later, serving as a nurse aboard its sister ship Britannic, a hospital ship during World War I, she survived its sinking as well. (Her name was Violet Jessop, and her story can be found on Wikipedia and elsewhere.) These radical coincidences are fascinating because they’re incongruous; they invite various strange imaginings about the ways in which individual human lives can be whipped about by the hurricanes of history. But they’re not the kind of thing one wants to see more of. There’s something equally strange, but also more simply wrong because more easily preventable, about surviving two mass shootings in the United States.


The First World War ended 100 years ago today: after the signing of an armistice early in the morning, the guns fell silent—this seems to have become the standard way of expressing it—at the 11th hour of the 11th day of the 11th month of 1918. Apart from the American Civil War, that war was perhaps the first to bring about injuries and deaths on a scale so vast that, while we can grapple with them arithmetically, as mere numbers, we can’t really comprehend them in personal, human terms. Ah, industrialism. It was often the case that more people died in a single battle than any of us will ever meet over the course of a lifetime; that latter figure has been estimated, and though it must be hard to pin down, it seems unlikely to exceed a few tens of thousands. While reading John Keegan’s study The Face of Battle, I copied out one set of figures: “By the time [the Battle of the Somme] ended, 419,654 British soldiers had become casualties on the Somme, and nearly 200,000 French.” Keegan does not, as I recall, report figures for the war as a whole (they can be found elsewhere), but he does tell us this: by the end of it, the British realized “that war could threaten with death the young manhood of a whole nation.”

Thereby hangs a tale—rather, a tale could hang on it, if I ever follow through on an idea. In the late 70s, when I heard that a Dallas theater company I was working for was going to do a vampire play, I was excited to think that it might be something new and different. Instead, it turned out to be only a stage adaptation of Bram Stoker’s Dracula. The writer and the director, whom I still count as friends, were probably on the right side of that question, as the show turned out to be pretty popular. But I was looking for something a little more imaginative, and I thought, Why not do a play that treats the immense wasting effect of World War I as though it were the result of vampires? I still think it might fly, but someone’s going to have to write it to find out.


Reading notes: On Sam Mendes and long-form TV

The 9/24/18 issue of The New Yorker contains an excellent profile of director Sam Mendes by John Lahr, called “Showman” in the printed edition. It reports this, which I had never noticed:

Much to his union’s chagrin, Mendes refuses to benefit from the hard-fought battle for “possessory credit”—you won’t find “A film by Sam Mendes” in the credits for any of his movies. A film, he said, “is written by someone else, shot by someone else. It’s not all me. It’s because of me.”

That comes off as a little less modest than Mendes may have thought, but it’s hard to judge how it sounded when he said it. In any case, it’s clear that he doesn’t think a film comes to exist solely because of him.

Something else that struck me was this:

Outside the window of a seminar room at New College, Oxford, where Mendes faced about a dozen aspiring student filmmakers around a horseshoe-shaped table, the city’s original wall and the college chapel, both built in the fourteenth century, glowed in the sunlight, impervious to the vagaries of time. Mendes, in contrast, was bringing news of change. “The director as a concept, as a cultural phenomenon, is dying,” he said. “Coppola of ‘The Godfather,’ Scorsese of ‘Taxi Driver,’ Tarantino of ‘Pulp Fiction’—these figures are not going to emerge in the way they did in the twentieth century. The figures who are going to emerge will come out of long-form television.” He continued, “Now is an unbelievable time to be alive and a storyteller. The amount of original content being made, watched, talked about is unprecedented. You’re in the strongest position if you write. If you’re a writer, you can also be a showrunner. A showrunner is the new director.” Mendes invoked David Simon (“The Wire”), Vince Gilligan (“Breaking Bad”), and Matthew Weiner (“Mad Men”). Then, like a cinematic Moses coming down from the mountain, he reeled off the eye-watering amounts that will be spent annually on original material in the next few years by the streaming companies: Netflix, $10 billion; Amazon, $8 billion; Apple, $4.2 billion. “These streaming companies are going to steamroll the traditional studio system,” he said. (Hollywood, during the same period, will spend about $2 billion.)

In show business, form follows money. The boom of the streaming services has also changed the shape of filmed stories, shifting the old theatrical formula of “two hours’ traffic” into a new guideline of ten to sixty hours. “They want one never-ending movie,” Mendes said. “The model they’re chasing is ‘Game of Thrones.’ ” As a producer, Mendes understands the market forces; as a filmmaker, he resists the attenuated narrative. “I was brought up to believe that a movie should have a beginning, a middle, and an end. For me, a narrative is something you tell an audience in an evening. You can put your arms around it. It’s singular.” He added, “Even though my company produces a lot of television, I don’t feel comfortable not knowing if an audience is watching, or whether they’re watching all ten hours or ten minutes at a time. That’s where my theatre roots, I suppose, are most clear.”

Although he is realistic about Hollywood’s devotion to action and adventure movies—“They don’t give a shit about Academy movies and critics’ darlings”—Mendes takes heart from such ambitious studio films as “The Revenant” and “The Life of Pi.” “You can only make them if you can marshal the forces and the money from the studios,” he said. “For that, you have to have had a career over the past twenty years. The problem with these young directors is that the only way they can get that cachet is by doing a franchise film.”

One thing this led me to wonder about—I found myself thinking about it when I woke up—is a question that has occurred to me a few times before: how does one teach a course on the kind of TV material that Mendes is talking about? Needless to say, it depends on exactly what you’re trying to teach, but still, a work of art in this realm is longer than anything that’s traditionally been taught in the humanities. You can teach all of Proust’s In Search of Lost Time in a single semester and have room left over for other work. You can read a representative sampling of Henry James’s stories and novels from across his career, combine that with work by related writers such as Hawthorne and Dickens, add in some critical reading, and fit it all in a one-semester seminar, such as the one I took in the SMU English department in the 70s. (No doubt you could do more of all of that in a university that places higher demands on its students.) But how do you deal with the TV work of, say, Aaron Sorkin, which consists of more than one series? How do you deal with Buffy the Vampire Slayer, or The Sopranos, or Lost, or Sex and the City? How do you deal with any of the works that Mendes names? Each of them, so far as I know (I haven’t seen The Wire), is a paragon of modern long-form television, eminently worth considering in its entirety as a single, coherent creation, but how do you study it in a class?

An important part of this, which I wonder about in the case of present-day film courses too, is how the students watch the material. It used to be the case in film history that we’d do some reading on our own, but the films were shown to us in class (or sometimes, I think, in a screening outside the class schedule). Can you now assign the students to buy or rent a DVD, or make the films available for viewing in the library? And the same applies to TV. Teaching Mad Men would be workable if you assume the students watch somewhere between 4 and 10 episodes a week outside of class, but how do you arrange for that? The challenge would be smaller if you assigned only parts of the show instead of dealing with all 92 episodes across its seven seasons, but there would still be a lot to watch. (Critical reading, such as the excellent Mad Men study I read, would probably need to be included as well, but that’s the old-fashioned kind of homework.) In any case, teaching about modern long-form TV begins to look less like an arts and humanities project and more like something in mathematics or the sciences, although even that isn’t a great comparison. The basic calculus course at SMU took three semesters (totaling 9 credit hours of coursework), and there was an intense version that met five days a week for two semesters (for 10 credit hours), but you can learn a lot of calculus in that amount of time. I wonder how much long-form TV you can cover in two or three semesters.

Memory and humility: Two notes on Brett Kavanaugh

One: During the confirmation hearings for the nomination of Judge Brett Kavanaugh to the Supreme Court, disagreements arose over what would seem to be basic facts, such as whether Kavanaugh assaulted Christine Blasey Ford at a party or whether Kavanaugh even attended a party where Ford was present. One thing that’s important to keep in mind while wrestling with questions of what really happened and what it means is that memory can be an unreliable witness. It’s possible that Ford and Kavanaugh differ without either of them lying; that is, it’s possible that neither is knowingly and deliberately telling an untruth. A New York Times article that I found a few days ago explored this from the standpoint of current scientific thinking about how memory works; some elements of the remarkable movie Marjorie Prime touched on the same issues (that film deserves its own discussion), and an Italian researcher whose work concerns memory and history once concluded from his studies, as I recall, that memory is not a record of what happened—memory is itself something that happens. To put it simply, we know that memory can’t necessarily be trusted (to borrow from Joe Orton, our memory plays us false even on the subject of its own reliability), except some of us apparently don’t know that at all, or have conveniently forgotten it, which accounts for many of the loud accusations of lying that we’ve heard. The Times article gave very few examples; other, bigger ones are out there, such as the odd fact that some people remember learning that Nelson Mandela died in prison.

Two: It’s conceivable that a person of either gender who had engaged in wrong or questionable behavior in the past, knowingly or not, could still find a place on the Supreme Court. But a person who, instead of admitting the fact or at least the possibility, and attempting to explain it, and accepting the consequences, insists that nothing of the kind did happen or could have happened and that anyone who says otherwise is part of a conspiracy—such a person is at odds with his or her own life as well as, in the two particular cases I have in mind, at odds with developing norms of our culture. This person may have a claim on our pity but can have only a partial claim on our admiration, no matter how far they’ve advanced in their career, and has no claim to a job arbitrating difficult questions of justice and law and society. In 1991, one such person, an accused perpetrator of sexual misconduct, joined the Supreme Court. That we now have two is no kind of progress.

Passing glances: Where Leonard Bernstein meets R.E.M.

This blog has been on vacation. If I had a greater sense of responsibility, I would’ve hung a sign over the image at the top of the landing page saying, “Gone fission—back sooner or later,” but making that look right would’ve taken some work, which is just what I’ve been trying to avoid lately. Are we still on vacation? The Magic 8 Ball prognosticator says, “Ask again later.” Meanwhile, there’s this.

Over the weekend I found myself thinking of how R.E.M. singer Michael Stipe gave a shout-out to Leonard Bernstein amid the rapid-fire, seemingly random patter of the band’s song “It’s the End of the World As We Know It (And I Feel Fine),” released in 1987. Here it is, in context:

The other night I dreamt of knives, continental drift divide
Mountains sit in a line, Leonard Bernstein
Leonid Brezhnev, Lenny Bruce and Lester Bangs
Birthday party, cheesecake, jelly bean, boom!
You symbiotic, patriotic, slam but neck
Right? Right!

It needs to be said that opinions differ on the exact wording in this song. In the section above, some people hear “I tripped a nice” rather than “I dreamt of knives” and “Mount St. Edelite” instead of “Mountains sit in a line.” No big deal; the sound takes precedence over the sense here. What Stipe—who apparently crafted the lyrics—may be aiming for is neither sense nor nonsense but something like the sound of sense, a rattling, clattering collage of verbal constructs. I wouldn’t call it “stream of consciousness,” though many do, because it’s more jumble than stream, but that too doesn’t matter. As for the four men whose initials are L.B., the Wikipedia entry for the song reports that Stipe dreamed of attending a party where those guys and everyone else possessed those initials. If you ask me, the words don’t read particularly well, but they’re not intended to. What matters is their role in the music—a critic once argued that many of Stipe’s vocal lines serve as merely another instrumental line, which isn’t exactly right but isn’t exactly wrong either—and this effect, while far from anything I know in Bernstein’s work, is something he might have appreciated.

I thought I’d mention this because Saturday was the 100th anniversary of Bernstein’s birth. If you haven’t heard the song, I suggest you give it a spin via the official music video, which includes a telltale nod to a conductor. And if you want a stellar example of Bernstein’s vocal writing, try “Glitter and Be Gay” (a worthy BBC Proms version is here), which comes from Candide, and which sticks in my mind even more tenaciously than R.E.M.’s nifty ditty.

‘The Americans’: Playing the game of Great Power politics in the 80s


Killer looks: Keri Russell and Matthew Rhys as Elizabeth and Philip Jennings in a somewhat stylized promotional image for The Americans. (Photo: via FX)

The Americans, an FX drama about two undercover Soviet agents living with their two children near Washington, D.C., in the 1980s, is hurtling toward the conclusion of its sixth and final season. Though it has always kept its hand in the action and intrigue of the spy game, its recent seasons have become moodier and more brooding, punctuated with anxious silences—one acquaintance of mine felt it lost its mojo and stopped watching—yet the action has recently picked up again as the show maneuvers its pieces toward a resolution.

How ’bout them robot cowboys?! A few notes on ‘Westworld’


Not the farmer’s daughter anymore: Dolores (Evan Rachel Wood) in Season 1, Episode 5, of Westworld. (Photo: John P. Johnson/HBO)

In 1973, a movie called Westworld, written and directed by Michael Crichton, was released. It’s easy to say what it was about: two visitors to an Old West amusement park that’s mostly populated by androids are terrorized by a robot gunslinger run amok. It was straightforward, so simple as to seem nearly crude now, and nearly mindless (in comparison to the sophistication of early modern robot tales such as Karel Čapek’s R.U.R. and Fritz Lang’s Metropolis), but it embodied a long-bubbling fear about machines, automation, and the dehumanizing effects of technology (many shots are devoted to the highly computerized control room and the inscrutable exchanges of the technicians), and it made potent use of the entertainment world’s oldest special effect, actors, in the form of Yul Brynner’s black-clad, glint-eyed, swaggering gunslinger.

In October 2016, HBO launched a series called Westworld, created by Jonathan Nolan and Lisa Joy. (Spoilers lie ahead.) Much remains the same, but much is different.

Bedlam’s fresh but respectful take on Shaw’s ‘Pygmalion’

Say a male phonetics professor rescues a female guttersnipe from the gutter, teaches her to speak the English of the upper classes, and passes her off as a duchess—what then? As nearly everyone will recognize, this is the situation presented by the Lerner and Loewe musical My Fair Lady and, before that, by Bernard Shaw’s play Pygmalion, which is currently being presented by the Bedlam company in New York. Nowadays the play seems more obvious than it must have when it was first produced, roughly 100 years ago, in part because the musical has made the story familiar, in part because the source myth—that of Pygmalion and Galatea—is itself still familiar, and in part because numerous debates have made us well aware of the role language and speech play in social distinctions. Yet, if Shaw’s play is obvious, it’s also subtly provocative, and it’s capable of resonating in ways not addressed by Shaw.