Wednesday, August 13, 2014
Today's New York Times features an article aptly titled (in the print version) "Under the Microscope." The article describes researchers' attempts to grapple with the ethical issues relating to projects such as Facebook's experiment on its users, about which we have written previously here and here. According to the article, researchers both at universities and at in-house corporate research departments are collaborating on processes to formulate ethical guidelines that will inform future research that makes use of users' information.
The article states that Facebook has apologized for its emotion experiment, in which it manipulated users' feeds to see if those users' own posts reflected the emotional tone of the posts they were seeing. It's not really clear that Facebook apologized for experimenting on its users. As quoted on NPR, here is what Facebook's Sheryl Sandberg said on behalf of the company:
This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated . . . . And for that communication we apologize. We never meant to upset you.
As the Washington Post noted, Sandberg did not apologize for the experiment itself. Seen in its full context, Sandberg's statement is more akin to OKCupid's in-your-face admission that it experiments on its users, about which Nancy Kim posted here.
But the Times article focuses on Cornell University's Jeffrey Hancock, who collaborated with Facebook on the experiment. He seems to have no regrets. For Hancock, researchers' ability to data mine is to his field what the microscope was to chemists. Or, one might think, what the crowbar was to people doing research in the field of breaking and entering. Hancock is now working with people at Microsoft Research and others to lead discussions to help develop ethical guidelines applicable to such research.
The Times quotes Edith Ramirez, Chair of the Federal Trade Commission, on the subject. She says:
Consumers should be in the driver’s seat when it comes to their data. . . . They don’t want to be left in the dark and they don’t want to be surprised at how it’s used.
By contrast, here is the Times's synopsis of Professor Hancock's views on how the ethical guidelines ought to be developed:
Companies will not willingly participate in anything that limits their ability to innovate quickly, he said, so any process has to be “effective, lightweight, quick and accountable.”
If the companies are subject to regulation before they can experiment on their users, it does not really matter whether or not they willingly participate. And the applicable standards have already been established under Institutional Review Board (IRB) rules. Significantly, as reported here in the Washington Post, although Professor Hancock works at Cornell, his participation in the Facebook study was not subject to Cornell's IRB review. In our previous posts, we have expressed our doubt that the Facebook study could survive IRB review (or that it yielded the information that it was supposedly testing for).
The Times article does not indicate that any of the people involved in devising rules for their own regulation have any expertise in the field of ethics. Why is letting them come up with their own set of rules in which they will "willingly participate" any better than expecting the wielders of crowbars to design rules for their safe deployment?