
Despite the distraction of a glorious summer weekend in Vermont, you’re likely to have heard the emerging rumblings of another Facebook data scandal over the past 48 hours. If you are responsible for the brand reputation of an organization that uses a Facebook page, you ought to be concerned.

Facebook’s Secret Mood Experiment

I’ll recap briefly with an excerpt from the A.V. Club report, in case you’re still in re-entry mode:

“Scientists at Facebook have published a paper showing that they manipulated the content seen by more than 600,000 users in an attempt to determine whether this would affect their emotional state. The paper, “Experimental evidence of massive-scale emotional contagion through social networks,” was published in The Proceedings Of The National Academy Of Sciences. It shows how Facebook data scientists tweaked the algorithm that determines which posts appear on users’ news feeds—specifically, researchers skewed the number of positive or negative terms seen by randomly selected users. Facebook then analyzed the future postings of those users over the course of a week to see if people responded with increased positivity or negativity of their own, thus answering the question of whether emotional states can be transmitted across a social network. Result: They can!”

[Image: PNAS cover]

The Ethical Questions

In addition to Facebook, the paper’s authors included researchers from Cornell University and the University of California, San Francisco. As many of you know, research universities and other organizations receiving any form of federal funding must abide by guidelines for human subject research, notably including research plan approval by an Institutional Review Board (IRB) and informed consent from participants. Absent federal funding, Facebook is not governed by such regulations, although the paper claims that the terms of Facebook’s Data Use Policy constituted informed consent.

Regardless of the researchers’ claims, the reality, as Katy Waldman of Slate put it, is that “Facebook intentionally made thousands upon thousands of people sad.” As anyone who deals with IRBs knows, anything that causes a change in psychological state, anything that might possibly cause harm to human subjects, is examined extremely closely. (As it should be.) It is not yet known whether minors were involved in the Facebook mood experiment.

Your Brand Reputation

The question for marketers is this: when do Facebook’s repeated privacy and data breaches become a liability for your brand?

This question will be more difficult for some organizations and companies than for others. At the moment, Facebook reaches a huge audience. The Vermont Department of Health and others who use social norms theory and a classic social marketing approach to public health communications have long been careful not to make tobacco or alcohol use look acceptable or “cool” in their advertising and communications. So too, I believe, must any nonprofit, public agency, or values-based business that advocates data privacy, child protection, trust, and transparency begin to examine whether using Facebook for its marketing communications represents an implied endorsement of behaviors diametrically opposed to its brand values.

Resources

Everything We Know About Facebook’s Secret Mood Manipulation Experiment (The Atlantic)

Facebook’s Unethical Experiment (Slate)

Facebook tinkered with users’ feeds for a massive psychology experiment (A.V. Club)

Even the Editor of Facebook’s Mood Study Thought It Was Creepy (The Atlantic)

————
Photo credit: Admerial Crunch, CC via Flickr (modified)