The concept of technology measuring the invisible is something we accept at a fairly young age.
But where we get skittish is when sensors begin to track us. We don’t mind if air quality is measured for harmful emissions, or if thermal patterns are tracked to predict weather. But when things get personal, we flinch at first. It’s a bit like scraping your knee and seeing blood for the first time; it feels like something that’s supposed to stay inside of you has come out. Data has an unexpected intimacy upon revelation. It also cries out for comparison and action. You measure your resting heart rate knowing you’ll gauge the resulting number against an elevated reading during exercise.
And what about Hacking H(app)iness? Can we use technology to identify and predict emotion? Would we want to if we could?
Affective computing is a multidisciplinary field deriving from the eponymous paper 2 written by Rosalind W. Picard, director of the Affective Computing Research Group at the MIT Media Lab. Picard’s work has defined the idea of measuring physical response to quantify emotion. While it’s easy to focus on the creepy factor of sensors or machines trying to measure our emotions, it’s helpful to see some applications of how this type of technology can improve, and is already improving, people’s lives.
In the New York Times article “But How Do You Really Feel? Someday the Computer May Know,” Karen Weintraub describes a prototype technology focused on autism created by Picard and a colleague that helped people with Asperger’s syndrome better deal with conversation in social settings. The technology featured a pair of glasses outfitted with a tiny traffic light that flashed yellow and red, alerting the wearer to visual cues they couldn’t recognize due to Asperger’s (things like yawning that indicate the person you’re speaking to is not interested in what you have to say). 3
It’s easy to imagine this type of technology being created for Google Glass. The famous American psychologist Paul Ekman classified six emotions that are universally expressed by humans around the globe: anger, surprise, disgust, happiness, fear, and sadness. Measuring these cues via facial recognition technology could become commonplace within a decade. Cross-referencing GPS data with measurement of these emotions could be highly illuminating—what physical location has the biggest digital footprint of fear? Should more police be made available in that area?
Picard’s boredom-based technology would also certainly be useful in the workplace. Forget sensitivity workshops; get people trained in using this type of platform, in which, when a colleague looks away while you’re speaking, you get a big text message on the inside of your glasses that reads, “Move on, sport.” Acting on these cues would also increase your reputation, with time stamps noting when you helped someone increase their productivity by getting back to work versus waxing rhapsodic about the latest episode of Downton Abbey.
“The Aztec Project: Providing Assistive Technology for People with Dementia and Their Carers in Croydon” is a report from 2006 documenting sensor-based health solutions for dementia and Alzheimer’s patients in South London. The report starts off with the harrowing statistic that “there are currently some twenty-four million [Alzheimer’s disease] sufferers, a number that will double every twenty years until 2040.” 4 Counting patients’ families, the affected population is far larger still, and the financial burden on all parties involved places significant stress on health care costs.
My grandmother had Alzheimer’s, so I identify with the scenarios described in the report. Wandering is a standard behavior, with patients not recognizing they’ve left or entered a room or even their home. Accidents in the kitchen are common, as is forgetting to eat for days on end. The report identified that previous solutions, including things like locking