Advertising in general often works by making you, the consumer, feel deficient in some way. Your laundry isn’t clean enough; buy our detergent instead. Your body isn’t thin enough; try our gym instead. Your dog isn’t organic enough; buy this food instead. But getting super granular and hitting teenagers — kids — specifically when they’re down is something else.
That’s exactly what Facebook is letting advertisers do, though, according to a report from The Australian [paywall].
The paper obtained an internal document from Facebook in which executives promote advertising campaigns specifically exploiting Facebook users’ emotional states. The document apparently outlines an array of teenagers’ emotional states that the company claims it can target based on how kids are using the service, including “anxious,” “defeated,” “insecure,” “overwhelmed,” “stressed,” and “worthless,” among other negative emotions.
The document reportedly also helps advertisers target users at the particular moments when they are interested in “looking good” or “losing weight,” and goes into detail about Facebook’s ability to capture and predict the emotional states of 6.4 million high schoolers, students, and young adults in Australia and New Zealand using its data mining and algorithms.
As Mashable points out, the behavior outlined in the Facebook document may run contrary to Australian advertising law, which prohibits the collection or disclosure of personal information relating to children aged 14 or younger without explicit parental consent.
“Facebook does not offer tools to target people based on their emotional state,” the company said in a statement. “The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.”
Instead, this is research run amok, according to Facebook.
“Facebook has an established process to review the research we perform,” the company said, presumably referring to the protocol finalized in 2016. “This research did not follow that process, and we are reviewing the details to correct the oversight.”
Even if teens are not being actively targeted in the moments they feel their absolute worst, however, the fact remains that Facebook has access to — and collects data on — all our moods, all the time. Everything you post and share can be mined for “sentiment analysis” — the fancy term for figuring out what mood a human was in when they created some digital content.
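In its simplest form, sentiment analysis is little more than counting words against a lexicon of emotional terms. Here’s a toy sketch of that basic approach — the word lists below are made-up illustrations, nothing like the large-scale models a company like Facebook would actually use:

```python
# Toy lexicon-based sentiment scoring. The word sets are hypothetical
# examples for illustration only.
NEGATIVE = {"anxious", "defeated", "insecure", "overwhelmed", "stressed", "worthless"}
POSITIVE = {"happy", "excited", "confident", "great"}

def sentiment_score(post: str) -> int:
    """Crude score: each positive word adds 1, each negative word subtracts 1."""
    words = post.lower().split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

print(sentiment_score("feeling worthless and stressed today"))  # -2
print(sentiment_score("so happy and excited"))                  # 2
```

Real systems are far more sophisticated — they account for negation, context, emoji, and posting patterns over time — but the underlying idea is the same: your words become a signal about your mood.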
Add to that the fact that Facebook has previously run large-scale experiments on users to try to manipulate their emotional state to see what happens, and the picture starts to look good for Facebook’s ability to advertise… and bad for users.