A technique known as sentiment analysis is in the news following the latest Facebook privacy scandal.
Facebook is yet again on the back foot due to questionable user privacy practices, this time passing on information about emotionally vulnerable teens for advertising purposes. At the core of the scandal is the revelation that Facebook used sentiment analysis algorithms to target teens who felt worthless, presenting an opportunity for brands to market self-improvement products at a presumably receptive audience.
But though it’s come to light now through Facebook’s unsavory practices, sentiment analysis (also known as “opinion mining”) has become a standard tool in the arsenal of both advertisers and political pollsters in the digital age. So what is it, and how is it applied to our online lives?
Andrew Piper is a professor in the Department of Languages, Literatures, and Cultures at McGill University, where he directs txtLAB, a project dedicated to understanding literature through computational approaches. From the start, says Piper, sentiment analysis has always been tied to marketing, and was first developed as a way to help brands figure out how people felt about the products they were discussing online.
At a basic level, someone performing the analysis would identify a key search term—for example “iPhone”—and then look for modifying words that refer to it in the text. The simplest way to find these is to look at adjacent words, but more complex analysis can also involve deconstructing the component parts of the sentence to find meaning.
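The adjacent-word approach described above can be sketched in a few lines. This is a toy illustration, not a production system: the positive and negative word lists, the keyword, and the window size are all made up for the example.

```python
# Toy sketch of lexicon-based sentiment analysis around a key term.
# The word lists below are illustrative stand-ins for a real sentiment lexicon.
POSITIVE = {"great", "love", "amazing", "fast"}
NEGATIVE = {"hate", "terrible", "slow", "broken"}

def sentiment_near(text, keyword, window=2):
    """Score the sentiment of words within `window` positions of `keyword`."""
    words = text.lower().replace(",", " ").replace(".", " ").split()
    score = 0
    for i, word in enumerate(words):
        if word == keyword:
            # Inspect the neighboring words on either side of the key term.
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if words[j] in POSITIVE:
                    score += 1
                elif words[j] in NEGATIVE:
                    score -= 1
    return score

print(sentiment_near("I hate this slow iPhone", "iphone"))  # -1: "slow" is in the window
```

Note that “hate” in the example sentence falls outside the two-word window, which is exactly the weakness of the adjacent-word method that more sophisticated grammatical analysis addresses.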
“Increasingly we’re able to extract relationships between words in a grammatical sense … so you’re finding the thing you’re interested in, in this case the product, and then finding all the other things which are in a dependent relationship with that word,” Piper explained to Motherboard in a phone call.
Difficulties can arise in the case of irony and sarcasm, where words are often used to convey their opposite meaning (like “good job” or “it’s a hard life”). In longer texts clues can often be found from other indicators of emotional tone, but for short snippets computers often have a hard time, unless other background data can be factored in.
Still, with the enormous amount of media that we collectively post to the internet each day, understanding and even manipulating our emotional responses becomes more and more possible for anyone with the access and computational ability to do so. And the problem is that while academics working in this field must operate within a strict ethical framework, no comparable framework exists for private companies.
“In the last Facebook scandal they were manipulating people’s timelines in order to test if their responses would change. That’s a classic ethics violation from a research perspective: You’re doing something to human beings without their knowledge that could impact their emotional wellbeing,” Piper said.
Sentiment analysis is clearly a goldmine for advertisers, allowing them to deliver promotional material at a moment which they judge to be most emotionally resonant to the target. While the targeting of vulnerable teenagers seems to be beyond the pale for public opinion, these same practices can just as easily be applied to adults who are sad, lonely, or depressed. There are tough questions to be asked about whether and how ethical guidelines could be applied here, just as guidelines apply to advertising for alcohol and tobacco, or in schools.