Facebook’s Algorithm Hacks Emotional Triggers

We all know about emotional triggers. It’s just that most of us don’t know we know about emotional triggers. We use emotional triggers with others all the time, quite subconsciously and for a wide range of reasons, along a spectrum from abject selfishness to altruistic benevolence. It’s possible, of course, to use knowledge of emotional triggers in a manipulative way.

But that’s not what we advocate. Even the notion of persuasion, a central tenet of The 7 Triggers to Yes methodology, is considered in the context of cooperative achievement. If you need to persuade someone to take their medicine, that’s a good thing. Persuading someone to act against their own best interests, or indeed without the knowledge that persuasion itself is in play, is no longer persuasion. It’s manipulation.

So, what are we to make of a colossal corporation that seeks to use emotional triggers across its customer base without their knowledge and to its own advantage? Is it ultimately of benefit to its users, as the company claims?

In June of this year, Facebook now famously revealed that it had adjusted its algorithm to affect users emotionally. Specifically, for one week in January 2012, the social media company had run two parallel experiments. After classifying an enormous number of posts as emotionally positive or negative, Facebook reduced the level of positive content in the newsfeed of one set of users while reducing the level of negative content in the newsfeed of another set.
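
For the technically curious, here is a minimal sketch of how such an experiment could work in principle. Everything in it is an illustrative assumption: the toy word lists, the `classify` and `filter_feed` helpers, and the 50% omission rate are invented for this example, and the published study classified posts with the LIWC word-counting tool rather than a toy lexicon.

```python
import random

# Toy word lists standing in for a real sentiment lexicon (the published
# study used LIWC word counts; these lists are illustrative only).
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry"}

def classify(post: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by word counts."""
    words = set(post.lower().split())
    pos = len(words & POSITIVE_WORDS)
    neg = len(words & NEGATIVE_WORDS)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts: list[str], suppress: str, omit_rate: float = 0.5) -> list[str]:
    """Probabilistically drop posts of one sentiment from a feed,
    mirroring the reduced-positive / reduced-negative conditions."""
    return [
        p for p in posts
        if classify(p) != suppress or random.random() > omit_rate
    ]

# One user group sees fewer positive posts, another fewer negative ones.
feed = [
    "I love this wonderful day",
    "Traffic was awful and I am angry",
    "Meeting at noon",
]
reduced_positive_feed = filter_feed(feed, suppress="positive")
reduced_negative_feed = filter_feed(feed, suppress="negative")
```

Note that the filtering is probabilistic rather than absolute, so the suppressed sentiment is reduced rather than eliminated, which matches the spirit of tuning a feed’s emotional mix rather than censoring it outright.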

In the end, less positive content in newsfeeds led to slightly more negative status updates, and less negative content in newsfeeds led to slightly more positive ones. However, the effects were tiny, measured in fractions of a percentage point, and critics argued that the study overstated them. Nevertheless, Facebook’s content does seem to play on our emotions, triggering responses in varying degrees based on the nature of specific messages and the ecosystem surrounding each individual user.

Even so, the blogosphere was plenty uneasy about the way Facebook performed its test. “How could they mess with people’s emotions without them being willing test subjects?” asked blogger Martin Bryant.

Just as quickly, though, Facebook defenders fired back. Caitlin Dewey of the Washington Post argued that users don’t want Facebook to stop tweaking the feed, because experiments such as these actually make Facebook better for the user.

As the data-manipulation dust settles, Facebook’s test leaves us with both moral and practical questions. Is it right to manipulate the emotions of others? Facebook’s approach certainly treated its mass audience quantitatively rather than as individuals. On the other hand, that style of treatment can still lead to better results for individual users, and it’s far from unusual: most major media corporations run some form of A/B testing to gauge audience sentiment, and Facebook’s experiment can be read as A/B testing all its own (a typical assignment scheme is sketched below). Heck, it’s not like Facebook wasn’t testing before!
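
To make that comparison concrete, a typical A/B assignment looks something like the sketch below. The function name, experiment label, and bucket names are all hypothetical, and this is a generic industry pattern rather than Facebook’s actual implementation; the point is simply that deterministically hashing user IDs into experiment arms is routine practice.

```python
import hashlib

def assign_bucket(
    user_id: str,
    experiment: str,
    buckets: tuple[str, ...] = ("control", "variant"),
) -> str:
    """Deterministically assign a user to an experiment arm by hashing
    their ID together with the experiment name (illustrative only)."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

# The same user always lands in the same arm for a given experiment.
print(assign_bucket("user_42", "feed_sentiment_2014"))
```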

Regardless, Facebook certainly could have been more persuasive, which is to say more cooperative. At this point, the Reciprocity Trigger dictates that Facebook users, as well as the media, will pay back Facebook’s behavior with less-than-stellar support.

For more information on how Facebook can affect human emotions, check out our post on the emotional effects of social media sharing.
