The Facebook 2012 Emoticon Experiment

The Facebook Emoticon Experiment ran in 2012 on 689,003 of its own users. More details about the experiment can be found in the link.

For a week, Facebook showed some users fewer positive posts in the News Feed, and found that they then used 0.1% fewer positive words in their own posts: a more depressing feed led people to publish very slightly more depressed updates. This taught Facebook that emotions are contagious, that seeing happy updates might not make you sad as some have suggested, and that showing them could make you use Facebook more. Anger arose for a number of reasons:

  • Facebook didn’t get express opt-in permission or even offer an opt-out, claiming that the Data Use Policy users automatically accept upon sign-up includes consent to “data testing, analysis, research” (though that line wasn’t added until four months after the study was conducted).
  • Facebook purposefully tried to depress people, rather than just testing what made them engage more with the service.
  • Facebook didn’t have an independent ethics board pre-approve the test.
  • One of the study’s authors has also received U.S. government funding to research “Modeling Discourse and Social Dynamics in Authoritarian Regimes,” including how revolutions start.
  • Hiding negative posts to increase engagement is akin to self-serving censorship that turns Facebook into a success theater where people can’t get help for problems in their lives.
  • European regulators are looking into whether the experiment broke privacy laws or settlements Facebook has entered into.
  • There is a general fear that big tech companies have enormous power to influence society, with very little transparency or control given to users.

This has led others and me to call for more ethical experimentation and a broader discussion of the morality of influence wielded by these companies.

On the opposite side, some of the positions supporting the experiment say:

  • These types of A/B tests are conducted all the time by companies, advertisers, politicians, charities, and more to find out how best to make us use, buy, vote, or donate. I mean, I said we were Facebook’s product-testing guinea pigs way back in 2012.
  • Opt-ins would complicate the site and skew experiment results, and you can simply stop using products like Facebook if you don’t want to be a guinea pig.
  • Regulation would slow down innovation and give Facebook a bad rap, since people detest the NSA.
  • The backlash against Facebook’s study will reduce transparency about this kind of research, leading companies to keep conducting such experiments but not share their findings.

Read the full article here.