Facebook’s Controversial News Feed Emotions Study Draws Washington’s Attention

A study Facebook conducted in 2012 with Cornell University and the University of California-San Francisco, in which researchers randomly selected 689,003 Facebook users and adjusted the number of positive or negative stories that appeared in their News Feeds, has drawn a great deal of attention over the past couple of weeks, most of it negative, and now the government is getting involved.

Sen. Mark Warner (D-Va.) wrote a letter (embedded below) to the Federal Trade Commission, asking the agency to explore the potential ramifications of the study, as well as to consider what types of oversight may be necessary for behavioral studies conducted by social networks.

Warner said in a release announcing the letter:

I come from the technology world, and I understand that social media companies are looking for ways to extract value from the information willingly provided by their huge customer base. I don’t know if Facebook’s manipulation of users’ News Feeds was appropriate or not. But I think many consumers were surprised to learn that they had given permission by agreeing to Facebook’s terms of service. And I think the industry could benefit from a conversation about what are the appropriate rules of the road going forward.

Reactions from Facebook executives to criticisms over the study have varied, with Facebook Data Scientist Adam Kramer, one of the study’s co-authors, responding with this Facebook post:

The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.

Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04 percent of users, or 1 in 2,500), for a short period (one week, in early 2012). Nobody’s posts were “hidden,” they just didn’t show up on some loads of Feed. Those posts were always visible on friends’ Timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.

And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.
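Purely as an illustration of the kind of measurement Kramer describes (posts flagged when they contain an emotional word, and effects reported as emotional words per thousand words), here is a minimal Python sketch. The word lists, function names, and sample posts are hypothetical stand-ins for illustration only, not Facebook's actual dictionaries or tooling:

```python
# Illustrative sketch only: approximates the kind of measurement described above.
# The word lists below are hypothetical samples, not the dictionaries Facebook used.

POSITIVE_WORDS = {"happy", "great", "love", "awesome", "glad"}   # hypothetical
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "upset"}   # hypothetical

def contains_emotional_word(post: str) -> bool:
    """True if a post contains at least one emotional word, the criterion
    Kramer says was used to deprioritize posts in News Feed."""
    words = post.lower().split()
    return any(w in POSITIVE_WORDS or w in NEGATIVE_WORDS for w in words)

def emotional_words_per_thousand(posts: list[str]) -> float:
    """Count emotional words per 1,000 words across a set of posts, the unit
    in which the reported effect (about one word) was expressed."""
    total_words = 0
    emotional_words = 0
    for post in posts:
        for word in post.lower().split():
            total_words += 1
            if word in POSITIVE_WORDS or word in NEGATIVE_WORDS:
                emotional_words += 1
    return 1000 * emotional_words / total_words if total_words else 0.0

# Example: a few hypothetical posts from one user
posts = ["So happy about the weekend", "Traffic was terrible today", "Lunch at noon"]
print(emotional_words_per_thousand(posts))  # roughly 166.7 for this tiny sample
```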

Chief Operating Officer Sheryl Sandberg said last week:

This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated. And for that communication, we apologize. We never meant to upset you.

We take privacy and security at Facebook really seriously because that is something that allows people to share.

And one day after Sandberg’s comments, Head of Global Policy Management Monika Bickert chimed in:

You’ve pointed out a couple of interesting issues, and one is the tension between legislation and innovation. In the specific incident that you’re referring to — although I’m not really the best expert, and probably our public statements are the best source for information there — I believe that was a week’s worth of research back in 2012. And most of the research that is done on Facebook, if you walk around campus and you listen to the engineers talking, it’s all about, “How do we make this product better? How do we better suit the needs of the population using this product? And how do we show them more of what they want to see and less of what they don’t want to see?”

And that’s innovation. That’s the reason why when you look at Facebook or YouTube, you’re always seeing new features. And that’s the reason why if you have that annoying friend from high school who always posts pictures of their toddler every single day, you don’t see all those photos in your News Feed.

So it’s concerning when we see legislation that could possibly stifle that sort of creativity and that innovation. At the same time, if we want to make sure we don’t see that legislation, it’s incumbent upon us to make sure we’re transparent about what we’re doing and that people understand exactly why we’re doing what we do.

Warner’s letter to the FTC read:

I am writing to urge the FTC to fully explore the potential ramifications of the behavioral experiment performed by Facebook in 2012. Recent reports indicate that Facebook altered the News Feeds of almost 700,000 of its users to determine whether emotions and moods might be transmitted between users. As the collection and analysis of “big data” continues to increase, and as it assumes a larger role in the business plans of Internet-based companies, it is appropriate that we consider questions about what, if any, oversight might be appropriate, and whether best practices should be developed and implemented by the industry or by the FTC. Given the FTC’s leadership in this area, including your recent data brokers report and your previous oversight of Facebook’s privacy policies, I would be interested to know if this 2012 experiment violated Section 5 of the FTC Act or the 2011 consent agreement with Facebook.

According to reports, it is not clear whether Facebook users were adequately informed and given an opportunity to opt in or opt out. I also have concerns about whether or not Facebook responsibly assessed the risks and benefits of conducting this behavioral experiment, as well as the ethical guidelines, if any, that were used to protect individuals. According to Facebook, the effects of its News Feed manipulation were relatively modest, with as little as one-tenth of a percent of those studied showing any observable change in behavior. However, I am concerned that the exponential growth in the universe of social media consumers could place us on a slippery slope. Future studies like this, without proper oversight or appropriate review, could have a significant impact upon a large number of consumers.

The very fact that important questions remain unanswered highlights the lack of transparency around these business practices. For example, while Facebook may not have been legally required to conduct an independent ethical review of this behavioral research, the experiment invites questions about whether procedures should be in place to govern this type of research. To be certain, big data has the potential to help power economic activity and growth, while serving consumers in meaningful ways. Companies like Facebook may have to perform research on a broad scale in order to improve their products. However, because of the constantly evolving nature of social media, big data, and the Internet, many of these issues currently fall into uncharted territory.

I am not convinced that additional regulation is the answer. Public concerns may be more appropriately addressed through industry self-regulation. Given the FTC's role as the federal regulator with oversight of privacy and consumer-protection policies, I would be interested in your responses to the following questions:

  1. Does the FTC have a role to play in improving transparency, accountability, and consumer trust in industry’s use of big data?
  2. Are there better ways to educate consumers, or otherwise improve transparency, about the practices consumers agree to through their use of social media platforms? Are there incentives in place for companies to voluntarily create or consult with independent review boards, or to utilize other means of self-regulation, before conducting studies such as this? Additionally, are there incentives that could encourage the hiring or designation of chief privacy officers at social media companies, or the establishment of other credible internal review programs?
  3. Does the FTC make any distinction between passively observing user data and actively manipulating it? Should consumers be provided a more explicit option to opt in or opt out of such studies? Additionally, is it appropriate for any research findings to be shared with participants prior to public dissemination?
  4. Does the FTC or another federal entity require any additional regulatory authority or technology in order to monitor this type of data mining?

I believe this conversation will be useful both for industry, which will learn more about consumer expectations, and for consumers, who will benefit from a reminder about online privacy and the potential commercial uses of their personal information. I look forward to your timely response to these questions.

Readers: Will Facebook wind up in hot water in Washington, D.C., due to its controversial study?

[Embedded document: Sen. Mark Warner’s letter to the FTC, July 9, 2014]
