It’s the mid-90s, and I’m visiting a colleague’s house after work. He has an account with an Internet service provider; I don’t, and he has offered to show me what’s out there. So he fires up his computer, and we chat over the hiss, squawk, and chime of two modems flirting by phone. Once they’ve mated, they fall silent, and we turn our attention to the Netscape Navigator web browser. My pal has already discovered and bookmarked a number of sites on the World Wide Web that interest him. He shows me a few, and then I, impatient for a broader view, ask him if there’s a directory of some kind, like the ever-growing lists of computerized bulletin-board systems. How do you find a new place to go on the web, if you don’t know about it ahead of time? Simple, he says, taking us to a page with the excitable name “Yahoo!” at the top. The whole thing is simply a handcrafted list of other websites, organized into categories—just what we want.
Where are you? The question is both easy to answer and not; it depends on what you think I mean. Maybe, dear reader, you would tell me you’re in Scottsdale, Arizona, or maybe you’d say you’re at home, or maybe you feel yourself to be inside your body, inside your head in fact, somewhere behind your eyes and between your ears. All these things and more—such as “I’m in my 62nd year” or “I’m in a good place right now”—are ways of saying where we are. You might even think to yourself, I’m in the first paragraph of your essay, waiting to see where you’re going with this. (I’m with you on that.)
Where am I?
A possible goal: non ridere, non lugere, neque detestari, sed intelligere (not to laugh, to cry, or to condemn, but to understand). From Baruch Spinoza, Tractatus Politicus.
From something that happened earlier today: It’s funny but kind of stupid when a timed pop-up ad gets in the way of a timed display ad on a web page. If I were one of those clever smarty-pants web writers who are always talking about things that happen on the web, I might try to work up an essay about this.
In the ars gratia artis view, works of art are their own end and shouldn’t serve any external purpose, but most of us use art all the time. J. S. Bach composed a piece of music, now known as the Goldberg Variations, that was reportedly meant to occupy the restless mind of a patron while he tried to get to sleep. More recently, composer Max Richter (whom I wrote about here) crafted an eight-hour-long project called Sleep, which is meant to be heard while sleeping.
Tinkerers are everywhere and probably always have been. The very idea of ham radio, for instance, was that it didn’t matter where you were; once you built or bought a transmitter, a receiver, and an antenna, you were set—you could chat with people in another part of the country, the continent, or the world. The network of hams formed an Internet before there was an Internet; when other channels failed or were blocked, hams were sometimes the first to spread the news of disasters and other events, much as the recent coup attempt in Turkey was reported on Twitter as it unfolded (by Zeynep Tufekci, among others). Where computers are concerned, there’s been no shortage of popular histories that have shown the far-flung origins of the devices on our desks and in our pockets. In 1984, Steven Levy published Hackers, a book tracing the hacker spirit in electrical engineers, computer programmers, electronics hobbyists, game creators, and phone phreaks around the country; many of these figures, in places ranging from Boston, Massachusetts, to Albuquerque, New Mexico, carried the flag of the personal-computer revolution. Levy’s book didn’t, as I recall, say much about women or the rest of the world, but Walter Isaacson, writing more broadly about computer history in his 2014 book, The Innovators, looked farther afield, describing the 19th-century British pioneers Charles Babbage and Ada Lovelace, 20th-century figures such as Alan Turing in Britain and Konrad Zuse in Germany, and a variety of women who followed, one way or another, in Ada’s footsteps. The popular histories may yet remain incomplete…
To paraphrase a line attributed to Trotsky, you may not be interested in tech, but tech is interested in you. The current issue of The Economist contains not only an editorial but also an entire multi-article special report on technology and politics (read my final note before clicking). The whole package is worth reading whether or not you already have a grasp of the promise and the threat of digital technologies. As the editorial concludes, “The original vision of the internet, as a self-governing cyber-Utopia, has long since been consigned to history.… But it remains a public good. The danger is that the centralisation of data may undo many of the democratic gains that social media and other technologies have brought.”
From “Seeing the Spectrum,” an article on autism, by Steven Shapin, in the 1/25/16 New Yorker:
There are obvious ways in which the history of autism can be seen as progressive: the quality of life for many people receiving a spectrum diagnosis has undoubtedly improved. Yet this same history has come under attack from proponents of so-called medicalization theory. This set of views, loosely linked to the work of Michel Foucault, criticizes the modern tendency to recategorize human behaviors as medical pathologies demanding expert diagnosis and care. For some writers and activists, medicalization is just a power grab, and its arch-villains are a greedy pharmaceutical industry and an arrogant psychiatric profession, which together have pushed pills for states of mind about which nothing can be done, or should be done, and which rightly belong to the realm of individual moral responsibility. The disease categories developed by modern psychiatry and psychology—such things as social anxiety disorder and mixed anxiety-depressive disorder—have been among the most popular targets for the critics of medicalization, as is autism.