“My fellow Americans, our long national nightmare is over,” Gerald Ford declared in 1974, after being sworn in as the 38th president of the United States. The current presidential election campaign—which has been far from a clean, issues-oriented matter—strikes me as something of a nightmare, and I’m tempted to say that it’s almost over, but the thing about nightmares is that you never really know where you are. Admittedly, the campaign itself is nearing its end. But where the vote count is concerned, anyone who expected to learn the outcome on the night of November 7, 2000, was frustrated in the extreme, and something of the kind may happen again. And when the election finally is settled and a new president takes office, who knows what will follow?
The previous season of The Walking Dead ended with a cliffhanger. Many members of what I think of as Our Gang—the collection of characters at the center of the show’s long-running narrative—were captured by a group that calls itself the Saviors, and in retaliation for Our Gang’s having wiped out almost everyone at a Saviors outpost, their leader, the fearsome Negan, has lined up all the captives and intends, to put it bluntly, to kill one of them. They’re on their knees, and he’s walking back and forth, hefting his barbed-wire-wrapped baseball bat, which he has fondly nicknamed Lucille, wondering how to decide. “Eeny, meeny, miny, moe,” he was reciting (if memory serves) when last we saw him. In attacking that outpost, Our Gang was trying to deal with a threat before it got worse, but I don’t think this development is meant to point out the risks of being proactive. Tonight, when Season Seven of The Walking Dead begins (it’s on the AMC cable channel), the question raised by the cliffhanger will be answered, in a deliberate but deliberately random, and probably rather gruesome, act of violence. Apart from reducing the cast list by one, the setup and the payoff will likely serve mainly to assert yet again the show’s dark vision, in which zombies are a constant and potentially deadly nuisance but the real threat, like that hell spoken of in Sartre’s No Exit, is other people.
Last spring, George Musser published a fascinating article in Aeon called “Consciousness creep,” the gist of which is given by the dek (as we journalists call the story description beneath the headline): “Our machines could become self-aware without our knowing it. We need a better way to define and test for consciousness.” Here’s the conclusion (careful readers of my blog may recall that I quoted the same extract in March):
We’re so conscious of movies as cultural expressions—their performers played up on magazine covers, their directors lionized or criticized or both, their storylines hashed over, their titles entering our language—that we’re apt to forget they’re also technological artifacts, the product of an ongoing and increasingly sophisticated set of developments involving chemistry, optics, mechanical and electrical engineering, and electronics. In the beginning, pictures didn’t move at all; then they did. Then we added sound, followed by color and other improvements. It’s a historical irony that the character of Norma Desmond, responding to someone’s remark that she used to be big, declared, “I am big. It’s the pictures that got small” at the outset of the 1950s, a decade in which movies not only became brighter and sharper, as had been happening all along, but also got bigger and broader: widescreen formats, explored earlier, began to spread in those years. Three-D, the ability to suggest depth, has arrived twice now. The tale continues.
Last night, HBO ran the premiere episode of Westworld, a new series, much anticipated in many quarters, that derives from a decades-old film written and directed by Michael Crichton. The episode was highly suggestive, proposing much, establishing little, apart from the expected idea of a theme park populated by human-seeming androids. Instead of attempting to write about it, which would take time I don’t have, I’ll simply list a few things I thought about before, during, or after watching it.
Though the ability to play multiple roles is essential to the art of acting, there’s something uncanny about seeing the switch happen before us. Even when we know there’s some presentational trickery involved, as when separate performances by Tatiana Maslany are composited into a single scene on the BBC America drama Orphan Black, we’re beguiled by it. (This year, Emmy voters were too, giving Maslany the award for outstanding lead actress in a drama series.) The Peter Brook approach to A Midsummer Night’s Dream, first presented in 1970, in which many of the players of the court scenes also take on characters in the forest scenes, wins us over in part for thematic reasons but in part for purely theatrical reasons—this is a form of magic, especially appropriate to that play, but increasingly popular in other productions. Multicasting is part of the very idea of small companies such as New York’s Bedlam, which is currently using it in an adaptation of Sense and Sensibility. Whether this means something I’ll leave to others to decide, but I can’t help noticing that these theatrical demonstrations of multiple selves seem to have become more common during the last half century, roughly the same period in which our concern with authenticity has grown.
A recent presentation at the Brooklyn Academy of Music,
Tinkerers are everywhere and probably always have been. The very idea of ham radio, for instance, was that it didn’t matter where you were; once you built or bought a transmitter, a receiver, and an antenna, you were set—you could chat with people in another part of the country, the continent, or the world. The network of hams formed an Internet before there was an Internet; when other channels failed or were blocked, hams were sometimes the first to spread the news of disasters and other events, much as the recent coup attempt in Turkey was reported on Twitter as it unfolded (by Zeynep Tufekci, among others). Where computers are concerned, there’s been no shortage of popular histories that have shown the far-flung origins of the devices on our desks and in our pockets. In 1984, Steven Levy published Hackers, a book tracing the hacker spirit in electrical engineers, computer programmers, electronics hobbyists, game creators, and phone phreaks around the country; many of these figures, in places ranging from Boston, Massachusetts, to Albuquerque, New Mexico, carried the flag of the personal-computer revolution. Levy’s book didn’t, as I recall, say much about women or the rest of the world, but Walter Isaacson, writing more broadly about computer history in his 2014 book, The Innovators, looked farther afield, describing the 19th-century British pioneers Charles Babbage and Ada Lovelace, 20th-century figures such as Alan Turing in Britain and Konrad Zuse in Germany, and a variety of women who followed, one way or another, in Ada’s footsteps. The popular histories may yet remain incomplete;