Our Take: Facebook’s Social Experiment


Cher Desautel
Partner


YOU MAY HAVE HEARD that Facebook participated in an experiment to test the relationship between incoming posts, users' emotions and the posts those users go on to make. Facebook changed the formula (algorithm) that determines what appears in almost 700,000 users' News Feeds to show them more positive or more negative posts, and then recorded whether those users went on to make positive or negative posts themselves.

A Question of Consent

When users sign up for Facebook, they agree to its data use policy, which reads, “[I]n addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

Facebook claims this is informed consent. That may stand up legally, but it doesn't pass the ethical sniff test for us. Facebook didn't give users the opportunity to opt in, and the data use policy never mentions that Facebook will alter what appears in their News Feeds.

Facebook wasn’t just “using the information” and making observations. It was manipulating the user experience, which falls under interventional research. One of the major principles of research is to do no harm. Intentionally manipulating someone’s mood through research without consent is unethical.

Many of us consider ourselves members of the Facebook community, not consumers of Facebook content that can be changed as easily as, say, broadcast content, where the relationship is one-way and the programmer is free to change the content just as we are free to watch it or not. The Facebook relationship, a two-way, highly participative exchange with our friends, is not supposed to be edited or manipulated so long as our posts stay within reasonable online guidelines.

Good Research Practices

You could argue that because Facebook is a private company, it is not bound by the federal Common Rule, which would require "a description of any foreseeable risks or discomforts to the subject."

This is a good reminder (as if we needed one) of why, when it comes to research, DHC always follows all the rules: not just the letter of the law, but the spirit of the law and the intent to work in a fully informed environment.

We know those regulations are there for a reason. We always disclose when people are being recorded and always make sure they understand that they have the option to remain confidential or to opt out entirely. That's leadership: doing the right thing, even when no one is looking or "making us" do the right thing.

Again, even if Facebook isn’t legally bound to conduct its research differently, its users deserve to be treated ethically.

What Now?

The story has been covered by The New York Times, Huffington Post and NPR, all expressing outrage that Facebook users have been treated like lab rats. What should Facebook do now? First, it needs to stop playing defense and go into full disclosure mode.

So what if Facebook thinks it's covered by the disclaimer? Facebook violated its users' trust. Now when I look at my News Feed, I can't be sure whether the posts about this friend breaking his leg and that friend missing her flight are showing up authentically, or whether they're part of an experiment.

Full disclosure is the only hope the company has of beginning to rebuild that trust. Explain what you were trying to do and why. What were the conclusions? Give people access to your findings.

And in the future, always give people the option. Many people (myself included) wouldn't have wanted to be part of the research. But then again, many people probably would have, and Facebook could have avoided this whole mess by simply asking for permission.
