Once forced to conduct painstaking personal interviews with subjects, scientists can now sit at a screen and instantly play with the digital experiences of millions of Internet users. It is the frontier of social science experiments on people who may never even know they are subjects of study, let alone explicitly consent.
"This is a new era," said Jeffrey T. Hancock, a Cornell University professor of communication and information science. "I liken it a little bit to when chemistry got the microscope."
But the new era has brought some controversy with it. Professor Hancock was a co-author of the Facebook study in which the social network quietly manipulated the news feeds of nearly 700,000 people to learn how the changes affected their emotions. When the research was published in June, the outrage was immediate.
Now Professor Hancock and other university and corporate researchers are grappling with how to create ethical guidelines for this kind of research. In his first interview since the Facebook study was made public, Professor Hancock said he would help develop such guidelines by leading a series of discussions among academics, corporate researchers and government agencies like the National Science Foundation.
"As part of moving forward on this, we've got to engage," he said. "This is a giant societal conversation that needs to take place."
Scholars from MIT and Stanford are planning panels and conferences on the topic, and several academic journals are working on special issues devoted to ethics.
Microsoft Research, a quasi-independent arm of the software company, is a prominent voice in the conversation. It hosted a panel last month on the Facebook research with Professor Hancock and is offering a software tool to scholars to help them quickly survey consumers about the ethics of a project in its early stages.
Although the Federal Trade Commission, which regulates companies on issues like privacy and fair treatment of Internet users, declined to comment specifically on the Facebook study, the broader issues touch on principles important to the agency's chairwoman, Edith Ramirez.
"Consumers should be in the driver's seat when it comes to their data," Ms. Ramirez said in an interview. "They don't want to be left in the dark and they don't want to be surprised at how it's used."
Facebook, which has apologized for its experiment, declined to comment further, except to say, "We're talking with academics and industry about how to improve our research process."
Much of the research done by the Internet companies is in-house and aimed at product adjustments, like whether people prefer news articles or cat videos in their Facebook feeds or how to make Google's search results more accurate.
But bigger social questions are studied as well, often in partnership with academic institutions, and scientists are eager to conduct even more ambitious research.
The Facebook emotion experiment was in that vein. The brainchild of a company data scientist, Adam D. I. Kramer, but shaped and analyzed with help from Professor Hancock and another academic researcher, Jamie E. Guillory, it was intended to shed light on how emotions spread through large populations. Facebook deliberately changed the number of positive and negative posts in the subjects' news feeds over a week in January 2012, then looked at how the changes affected the emotional tone of the users' subsequent Facebook posts.
In another well-known experiment, Facebook sent voting reminders to 61 million American users on Election Day in 2010. Some users also saw a list of their friends who said they had already voted, and the researchers found that the specific social nudge prompted more of those people to go to the polls.
The study prompted some to suggest that Facebook had the power to sway election results.
More recently, the dating site OkCupid conducted experiments on its users, including one in which it hid profile text to see how it affected personality ratings, and another in which it told some daters they were a better or worse potential match with someone than the company's software had actually determined.