In the era of big data and analytics, the conventional wisdom seems to be that businesses should try lots of different things, test the results and continuously optimize to develop new innovations and better customer experiences.

But some companies may be taking this advice a little too far.

The brouhaha started when Facebook admitted to a 2012 test in which it manipulated nearly 700,000 users’ news feeds to see how people responded after reading positive or negative stories, and then published a scientific paper on the results. Soon after, OkCupid confessed to something similar: an experiment that showed two people as a “strong match” when they were actually a weak one, to see how they reacted. “When we tell people they are a good match, they act as if they are, even when they should be wrong for each other,” writes co-founder Christian Rudder in a blog post. Smaller social networks, such as Imgur, Reddit, and Twitch, are also getting into the act with their Digital Ecologies Research Partnership (DERP).

In some cases, the experimentation goes even further. In Amsterdam, two designers set up a workspace for freelancers that started out with lots of perks and benefits, à la Google; it even included a pet rabbit. Over the course of a month, the perks (including the rabbit) gradually went away. The designers claimed the diminishing benefits were in response to user requests.

By the end of the month, the designers had converted the once-lush workspace to rows of numbered gray cubicles under the constant surveillance of a virtual boss who, appearing on an overhead screen, led the freelancers in a series of exercises every few hours, writes Fast Company. The designers also installed movement sensors, and when a person got up for, say, a cup of coffee, the virtual boss barked that it wasn’t break time yet.

Ironically, the designers reported that the workers were more productive without all the perks and benefits. But that’s not exactly the point. The real question is whether this sort of thing is okay: when does A/B testing cross the line into inappropriate social experimentation?

Social scientists have a long history of performing experiments on human subjects. Some of these experiments have led to valuable insights, such as the Stanford Prison Experiment (in which student volunteers were assigned roles as “prisoners” and “guards” to see how they would act) and the Milgram Experiment (in which many participants proved willing to administer what they believed were painful electric shocks to others when instructed to).

Now, with easy access to enormous populations, such as Facebook’s more than 1 billion users, social scientists see many opportunities. At the same time, social scientists today are bound by ethical guidelines for experimentation on human subjects; in fact, experiments like Milgram and Stanford Prison might not be allowed under modern guidelines. Essentially, if an experiment has the potential to change someone’s emotional state, researchers are supposed to obtain informed consent, consent that goes well beyond Facebook’s terms of use, which merely mention that such experimentation might happen.

Thanks in part to the backlash over the Facebook experiment—which even drew a Congressional letter to the Federal Trade Commission—work is now underway to develop ethical guidelines for online experimentation. The scientist who designed the Facebook experiment is involved in the effort, along with researchers from Microsoft Research, MIT, and Stanford, according to the New York Times.

“Such testing raises fundamental questions,” writes the Times. “What types of experiments are so intrusive that they need prior consent or prompt disclosure after the fact? How do companies make sure that customers have a clear understanding of how their personal information might be used? Who even decides what the rules should be? Existing federal rules governing research on human subjects, intended for medical research, generally require consent from those studied unless the potential for harm is minimal. But many social science scholars say the federal rules never contemplated large-scale research on Internet users and provide inadequate guidance for it.”

In addition, social network companies are starting to clean their own houses in terms of experimental ethics. Facebook is reportedly tightening up its experimental guidelines. And DERP specifically notes that it will only support research that “respects user privacy, responsibly uses data, and meets IRB [Institutional Review Board] approval.”

Facebook has been heavily criticized for its experiment, with 84 percent of surveyed users saying they had lost trust in the company, and 66 percent saying they had considered deleting their accounts because of it. Though it’s too soon to say what lasting effect the episode will have on people’s willingness to keep using Facebook, it suggests that a company’s reputation is one thing it might not want to experiment with.
