One of the questions I always ask of stories is how they work. Who do they serve? Who benefits? Who, if anyone, is burdened or harmed by them? Who is uplifted? What modes or methods or structures do they employ? Stories—and metaphors, which are often just stories in miniature—are never neutral actors. They always seek some change, whether through resistance or encouragement or both.
We are surrounded by illustrative examples. The phrase “office politics” frames the critical work of negotiating information, power, and agency within an organization as mere gossip, thereby upholding existing hierarchies and preventing (or arresting) structural change. Similarly, the “cloud” obscures the undersea cables, the data centers, and the massive carbon costs of digital experiences behind the kind of fluffy, ephemeral, happy image that Bob Ross delighted in painting. Where “office politics” cloaks structural inequities in a sheen of disgrace, the “cloud” hides the very real and visceral harms of digital technologies behind a friendly facade.
But there’s a different story I want to talk about today, and it’s a timely one: this story says that a certain kind of technology is different from all other technologies by virtue of its wit. Where other machines merely follow the instructions given them, this new kind of machine learns and discovers and creates novel and surprising results. This tech is so smart that there’s in fact a risk that it becomes too smart and gains sentience—a possibility so dangerous it requires that we rapidly expand the capability of this technology so that we have a chance to stay a few steps ahead of it, so that we have a chance to make certain it serves us instead of itself.
I’m talking about machine learning. Which is itself one kind of story—one in which machines do something like “learn,” though here “learning” really means memorizing or putting into storage, and includes nothing so pedestrian as understanding or interpreting. But the more common parlance—“artificial intelligence”—expands on that story to suggest that not only are the machines learning but that they have acquired the ability to think, or to intellectualize, implying that they have desires and personalities and behaviors. One way this story works is that, by ascribing “thinking” to the machines, it triggers associations many people have with “higher” beings—whether species that are smarter than others, or people who are. (I’ll come back to this hierarchical notion of intelligence in a moment.)