story, they were about delivering news in as hard and fast a way as possible. “So and so is no longer in a relationship,” the story said, illustrating the news with an icon of a broken heart, one not unlike the icon of a broken hard drive that signals doom on an old Macintosh. Building the algorithm was one thing, I realized, but delivering stories that felt like they had been delivered by a human was another. I tried to intuit what model of the social world the stories assumed, and if it was one I recognized. Some of the stories didn’t seem like anything I would be instantly apprised of in real life: for example, that some acquaintances were having a party I wasn’t invited to, or that an old ex-boyfriend was now in a new relationship. While in real life I might find out about these things later, they just weren’t things that I needed or wanted to know immediately, as they happened.
I shared my concerns about the bluntness of News Feed with Pasha—that it wasn’t just telling me things quickly but telling me things I typically wouldn’t know about—and she said that she would take them back to the engineers. None of the stories were removed. I wondered, then, if News Feed and the future of Facebook would be built on a model of how social cohesion works—what is comfortable and relevant to you and what isn’t—or if it would be indifferent to etiquette and sensitivity. It turned out to be the latter, and I’m not sure Mark knew the difference. To him and many of the engineers, it seemed, more data was always good, regardless of how you got it. Social graces—and privacy and psychological well-being, for that matter—were just obstacles in the way of having more information.
As I worked with my fellow Customer Support Team members to help engineers test News Feed and work out the bugs, I began to see that we were trafficking in a new kind of programmed, automatic gossip, in which the mere act of updating your profile (or in this case, of Sam updating his profile and linking to me) becomes a story—online and off. The machine becomes the wandering bard, telling stories, real or something other. As Jean Baudrillard wrote, “The map becomes the territory.”
When Sam was done testing a few days later, he removed our relationship from the site and we went back to being single, to the disappointment of our coworkers. And in the meantime, News Feed slowly became the core of the Facebook product, occupying the center of the homepage and, increasingly, the center of our social lives.
• • •
In a sense, Thrax’s Facebook hack in spring 2006—which, in addition to making Facebook look like MySpace, also generated innocuous conversation posted to unsuspecting users’ walls (e.g., “Hey, nice shoes,” or, “This wall is now about trains.”)—was the first to elide our speech, motivated by individual intention, with that of a machine. Unlike the usual viruses that create spammy posts trying to sell something, Thrax’s Facebook worm created conversational messages that sounded like posts a friend might have written. “The whole point of them was that they could have been real,” Thrax explained, describing the hack later to an adoring tech blogger. I doubt, however, that making a philosophical point was the hack’s main goal, as Thrax and the other hacker boys who came to Facebook rarely trafficked in philosophical arguments. They preferred instead to use the Internet to create and distribute as many “lulz,” or jokes, as possible. Lulz, on the Internet, were a goal in themselves, a new way of creating a scene and attracting attention from people waiting patiently to be entertained in front of their screens.
Thrax’s Facebook hack was just the latest in a long sequence of virtual scenes that he had made. He told me about them as we hung around the pool house that summer, tapping away on our laptops at the kitchen table or strumming on guitars in the dark on the living room couches. As