I have been thinking a lot about the mechanics of how the Facebook emotional manipulation study was conducted, reviewed, and accepted for publication. I have found it helpful to gather in one place all of the various claims about who did what and what forms of review it received. The relevant language is excerpted below.
What did the authors do?
PNAS authorship policy:
Authorship must be limited to those who have contributed substantially to the work. …
All collaborators share some degree of responsibility for any paper they coauthor. Some coauthors have responsibility for the entire paper as an accurate, verifiable report of the research. These include coauthors who are accountable for the integrity of the data reported in the paper, carry out the analysis, write the manuscript, present major findings at conferences, or provide scientific leadership to junior colleagues. Coauthors who make specific, limited contributions to a paper are responsible for their contributions but may have only limited responsibility for other results. While not all coauthors may be familiar with all aspects of the research presented in their paper, all collaborators should have in place an appropriate process for reviewing the accuracy of the reported results. Authors must indicate their specific contributions to the published work. This information will be published as a footnote to the paper. Examples of designations include:
- Designed research
- Performed research
- Contributed new reagents or analytic tools
- Analyzed data
- Wrote the paper
An author may list more than one contribution, and more than one author may have contributed to the same aspect of the work.
From the paper:
Author contributions: A.D.I.K., J.E.G., and J.T.H. designed research; A.D.I.K. performed research; A.D.I.K. analyzed data; and A.D.I.K., J.E.G., and J.T.H. wrote the paper.
Cornell press release:
… According to a new study by social scientists at Cornell, the University of California, San Francisco (UCSF), and Facebook, emotions can spread among users of online social networks.
The researchers reduced the amount of either positive or negative stories that appeared in the news feed of 689,003 randomly selected Facebook users, and found that the so-called “emotional contagion” effect worked both ways.
“People who had positive content experimentally reduced on their Facebook news feed, for one week, used more negative words in their status updates,” reports Jeff Hancock, professor of communication at Cornell’s College of Agriculture and Life Sciences and co-director of its Social Media Lab. …
Cornell University Professor of Communication and Information Science Jeffrey Hancock and Jamie Guillory, a Cornell doctoral student at the time (now at University of California San Francisco) analyzed results from previously conducted research by Facebook into emotional contagion among its users. Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” published online June 2 in Proceedings of the National Academy of Science-Social Science.
Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.
Adam Kramer’s statement for Facebook:
OK so. A lot of people have asked me about my and Jamie and Jeff’s recent study published in PNAS, and I wanted to give a brief public explanation. …
Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). … And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.
What did the IRB do?
PNAS IRB review policy:
Research involving Human and Animal Participants and Clinical Trials must have been approved by the author’s institutional review board. … Authors must include in the Methods section a brief statement identifying the institutional and/or licensing committee approving the experiments. For experiments involving human participants, authors must also include a statement confirming that informed consent was obtained from all participants. All experiments must have been conducted according to the principles expressed in the Declaration of Helsinki.
Susan Fiske’s email to Matt Pearce:
I was concerned about this ethical issue as well, but the authors indicated that their university IRB had approved the study, on the grounds that Facebook filters user news feeds all the time, per the user agreement. Thus, it fits everyday experiences for users, even if they do not often consider the nature of Facebook’s systematic interventions. The Cornell IRB considered it a pre-existing dataset because Facebook continually creates these interventions, as allowed by the user agreement.
Having chaired an IRB for a decade and having written on human subjects research ethics, I judged that PNAS should not second-guess the relevant IRB.
I regret not insisting that the authors insert their IRB approval in the body of the paper, but we did check that they had it.
Fiske’s email to Adrienne LaFrance:
Their revision letter said they had Cornell IRB approval as a “pre-existing dataset” presumably from FB, who seems to have reviewed it as well in some unspecified way. (I know University regulations for human subjects, but not FB’s.) So maybe both are true.
Cornell’s statement (again):
Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.
Kramer’s statement (again):
While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.
This post rolls up all of the major primary sources for the Facebook emotional manipulation study, along with selected news and commentary.
The paper:
- “Experimental evidence of massive-scale emotional contagion through social networks” as PDF and as HTML (received Oct. 23, 2013, approved March 25, 2014, publication date June 17, 2014)
The authors:
- Adam Kramer (Facebook)
- Jamie Guillory (UCSF Center for Tobacco Control Research and Education, previously Cornell)
- Jeffrey Hancock (Cornell Communications and Information Science)
Cornell:
- June 10 Press release announcing study, with correction re: funding source (original press release)
- June 30 Statement
- Human subjects policy
- IRB approval flowchart
- Policy on social media research
UCSF: (Guillory became affiliated with UCSF only after the study was conducted)
Facebook:
- Current Data Use Policy (last revised November 15, 2013)
- Data Use Policy as of the time of the study (revised September 23, 2011)
- June 29 statement by Adam Kramer (confirmed to be Facebook’s statement on the matter)
PNAS:
- Editorial policies including (ii) author credits and (vii) human subjects policies
- Editorial Expression of Concern
- Press comments by Susan Fiske (Princeton, PNAS editor for the study): to Adrienne LaFrance (“pre-existing dataset”), to Matt Pearce (“PNAS should not second-guess the relevant IRB”), to @ZLeeily (similar)
Previous Facebook studies:
- Effect of rainfall on emotional content
- Voter turnout (Nature, 2012)
- Social structure of networks (Physica A, 2011) (see also Michael Zimmer’s discussion)
- What Makes Us Click? Demonstrating Incentives for Angry Discourse with Digital-Age Field Experiments (Journal of Politics, 2012)
- Designing and Deploying Online Field Experiments (WWW: International World Wide Web Conference 2014)
News coverage:
- Reed Albergotti and Elizabeth Dwoskin, Facebook Study Sparks Soul-Searching and Ethical Questions (Wall Street Journal, June 30, 2014)
- Reed Albergotti, Facebook Experiments Had Few Limits (Wall Street Journal, July 2, 2014)
- Facebook emotion study examined by privacy commissioner (CBC, July 3, 2014)
- Jessica Corbett, New questions, few answers in Cornell’s Facebook experiment (Ithaca Voice, July 3, 2014)
- Lisa Fleisher, Irish Data Privacy Watchdog To Probe Facebook’s Research Methods (Wall Street Journal, July 10, 2014)
- Lorenzo Franceschi-Bicchierai, The Mystery of the Facebook Manipulation Study’s Military Connection (Mashable, July 2, 2014)
- Brian Fung, The journal that published Facebook’s psychological study is raising a red flag about it (Washington Post, July 3, 2014) (PNAS “does not intend to investigate the study further”)
- Samuel Gibbs, Facebook policy head says emotion experiments were ‘innovative’ (The Guardian, July 3, 2014)
- Vindu Goel, As Data Overflows Online, Researchers Grapple With Ethics (New York Times, August 12, 2014)
- Kashmir Hill, Facebook Manipulated 689,003 Users’ Emotions For Science (Forbes)
- Kashmir Hill, Facebook Doesn’t Understand The Fuss About Its Emotion Manipulation Study (Forbes)
- Kashmir Hill, Facebook Got Permission To Do ‘Research’ On Users 4 Months After Emotion Manipulation Study (Forbes)
- Kashmir Hill, Ex-Facebook Data Scientist: Every Facebook User Is Part Of An Experiment At Some Point (July 7, 2014)
- Kashmir Hill, Facebook’s Chief Critic Wants Government To Investigate Websites Turning Users Into Guinea Pigs (July 17, 2014)
- R. Jai Krishna, Sandberg: Facebook Study Was ‘Poorly Communicated’ (Wall Street Journal, July 2, 2014)
- Hannah Kuchler, UK data regulator probes Facebook over psychological experiment (Financial Times, July 1, 2014)
- Adrienne LaFrance, Even the Editor of Facebook’s Mood Study Thought It Was Creepy (The Atlantic)
- Robinson Meyer, Everything We Know About Facebook’s Secret Mood Manipulation Experiment (The Atlantic)
Commentary:
- David Auerbach, Here Are All the Other Experiments Facebook Plans to Run on You (Slate, June 30, 2014)
- Michael Bernstein, The Destructive Silence of Social Computing Researchers (Medium, July 7, 2014)
- Whitney Erin Boesel, Facebook’s Controversial Experiment: Big Tech Is the New Big Pharma (Time, July 3, 2014)
- danah boyd, What does the Facebook experiment teach us? (July 1, 2014)
- Cornelius Puschmann & Engin Bozdag, Staking out the unclear ethical terrain of online social experiments, Internet Policy Review (Nov. 26, 2014)
- Amy Bruckman, Annoying Internet Users in the Name of Science (July 8, 2014)
- Arthur Caplan and Charles Seife, Facebook Experiment Used Silicon Valley Trickery (NBC News, June 30, 2014)
- Nicholas Carr, The Manipulators: Facebook’s Social Engineering Project (Los Angeles Review of Books, Sept. 14, 2014)
- Chris Chambers, Facebook fiasco: was Cornell’s study of ‘emotional contagion’ an ethics breach? (The Guardian, July 1, 2014)
- Robert Chirgwin, Trick-cyclists rally round in defence of Facebook emoto-furtling study (The Register, July 2, 2014)
- Thomas Claburn, Facebook Researchers Toy With Emotions: Wake Up (Information Week, June 30, 2014)
- Kate Crawford, The Test We Can—and Should—Run on Facebook (The Atlantic, July 2, 2014)
- Jenny Davis, Facebook Has Always Manipulated Your Emotions (The Society Pages, June 30, 2014)
- Sebastian Deterding, Frame Clashes, or: Why the Facebook Emotion Experiment Stirs Such Emotion (Tumbling Conduct, June 29, 2014)
- Sebastian Deterding, The Facebook Loophole (Medium, July 1, 2014)
- Dan Diamond, The Outrage Over Facebook’s ‘Creepy’ Experiment Is Out-Of-Bounds — And This Study Proves It (Forbes, July 1, 2014)
- Robert Dingwall, On the ethics of Facebook – and drawing the right conclusions (Social Science Space, July 16, 2014)
- Ed Felten, Facebook’s Emotional Manipulation Study: When Ethical Worlds Collide (Freedom to Tinker, June 30, 2014)
- Ed Felten, Privacy Implications of Social Media Manipulation (July 1, 2014)
- Ed Felten, On the Ethics of A/B Testing (Freedom to Tinker, July 8, 2014)
- Tarleton Gillespie, Facebook’s algorithm — why our assumptions are wrong, and our concerns are right (Culture Digitally, July 4, 2014)
- Dan Gillmor, Being a Facebook ‘Lab Rat’ Is The Tradeoff We’ve Made (Talking Points Memo, July 2, 2014)
- David Gorski, Did Facebook and PNAS violate human research protections in an unethical experiment? (Science-Based Medicine, June 30, 2014)
- Noah Grand, Overstating and Understating the Influence of Facebook (Science of News, July 4, 2014)
- Mary Gray, When Science, Customer Service, and Human Subjects Research Collide. Now What? (Culture Digitally, July 9, 2014)
- Mary Gray et al., MSR Faculty Summit 2014 Ethics Panel Recap (transcript of July 14 panel discussing Internet research ethics post-Facebook experiment)
- James Grimmelmann, As Flies to Wanton Boys (The Laboratorium, June 28, 2014)
- James Grimmelmann, Illegal, Immoral, and Mood-Altering (Medium, July 23, 2014)
- Beki Grinter, That Facebook Study (Beki’s Blog, July 8, 2014)
- John M. Grohol, Emotional Contagion on Facebook? More Like Bad Research Methods (Psych Central)
- Mike Gurstein, Facebook Does Mind Control (Gurstein’s Community Informatics, July 1, 2014)
- Matthew Herper, Dear Facebook, Please Experiment On Me (Forbes, June 30, 2014)
- Stephanie Harriman and Jigisha Patel, The ethics and editorial challenges of internet-based research (BMC Medicine, July 15, 2014)
- Kashmir Hill, 10 Other Facebook Experiments On Users, Rated On A Highly-Scientific WTF Scale (Forbes, July 10, 2014)
- Kashmir Hill, After The Freak-Out Over Facebook’s Emotion Manipulation Study, What Happens Now? (July 10, 2014)
- Michael Hiltzik, Facebook on its mood manipulation study: Another non-apology apology (Los Angeles Times, July 2, 2014)
- Michael Hiltzik, Facebook’s user manipulation study: Why you should be very afraid (Los Angeles Times, June 30, 2014)
- David Hunter, Consent and ethics in Facebook’s emotional manipulation study (July 1, 2014)
- Alan Jacobs, the Empire strikes back (The New Atlantis, July 3, 2014) (with exchange in comments with Tal Yarkoni)
- Jeffrey P. Kahn, Effy Vayena, and Anna C. Mastroianni, Opinion: Learning as we go: Lessons from the publication of Facebook’s social-computing research (Proceedings of the National Academy of Sciences, September 23, 2014)
- Brian Keegan, The Beneficence of Mobs: A Facebook Apologia (BrianKeegan.com, July 2, 2014)
- Robert Klitzman, Did Facebook’s experiment violate ethics? (CNN, July 2, 2014)
- Maria Konnikova, Did Facebook Hurt People’s Feelings? (The New Yorker, July 3, 2014)
- Adrienne LaFrance, How Much Should You Know About How Facebook Works? (The Atlantic, August 20, 2014) (interview with Jeffrey Hancock)
- Clifford Lampe, Facebook Is Good for Science (Chronicle of Higher Education, July 8, 2014)
- Jaron Lanier, Should Facebook Manipulate Users? (New York Times, June 30, 2014)
- George Lawton, Why Is It Ethical Not to Test for Emotional Impact? (Torque, September 8, 2014)
- Natasha Lennard, OkCupid and Facebook Aren’t the Only Ones Manipulating You, but That’s No Excuse (Vice, July 29, 2014)
- Farhad Manjoo, The Bright Side of Facebook’s Social Experiments on Users (New York Times, July 2, 2014)
- Mike Masnick, Law Professor Claims Any Internet Company ‘Research’ On Users Without Review Board Approval Is Illegal (TechDirt, September 24, 2014)
- Brian Merchant, The Facebook Manipulations (Motherboard, July 3, 2014)
- Michelle Meyer, How an IRB Could Have Legitimately Approved the Facebook Experiment—and Why that May Be a Good Thing (The Faculty Lounge, June 29, 2014)
- Michelle Meyer, Misjudgements will drive social trials underground (Nature, July 16, 2014)
- Nick Montfort, The Facepalm at the End of the Mind (Post Position, July 13, 2014)
- Andrés Monroy-Hernández, A system designer’s take on the Facebook study – a response to danah boyd’s blog post (Social Media Collective, July 7, 2014)
- Frank Pasquale, Facebook’s Model Users (July 3, 2014)
- Frank Pasquale, Social Science in an Era of Corporate Big Data (Concurring Opinions, July 4, 2014)
- Chris Peterson, Nature and/of the News Feed (Medium, July 2, 2014)
- Jules Polonetsky & Omer Tene, The Facebook Experiment: Gambling? In This Casino? (Re/Code, July 2, 2014)
- Erica Portnoy, Facebook Study a Rare Public Reminder of Corporate Big Data’s Unaccountable Power (Equal Future, July 2, 2014)
- Nathaniel Poor, How to Circumvent Your IRB in 4 Easy Steps (/dev/culture, July 4, 2014)
- Galen Pranger, Why the Facebook Experiment is Lousy Social Science (Medium, August 28, 2014)
- Cornelius Puschmann and Engin Bozdag, All the world’s a laboratory? On Facebook’s emotional contagion experiment and user rights (Humboldt Institute, June 30, 2014)
- Emilee Rader, The effects of the “deep news feed” (Bitlab, July 2, 2014?)
- Scott Robertson, Facebook’s Going to Be OK, but Science Is Taking a Hit (Medium, July 9, 2014)
- Jay Rosen, Facebook’s controversial study is business as usual for tech companies but corrosive for universities (Washington Post, July 3, 2014)
- Jay Rosen, “I’ve been following the fallout …” (Facebook post, July 5, 2014)
- Jay Rosen, Jay Rosen to journalists and editors: ‘Facebook has all the power. You have almost none’ (World Editors Forum, July 10, 2014) (interview)
- Jay Rosen, Why Do They Give Us Tenure? (Jay Rosen’s PressThink, Oct. 25, 2014)
- Timothy Ryan, On the ethics of Facebook experiments (Washington Post, July 3, 2014)
- Michael Sacasas, The Facebook Experiment, Briefly Noted (The Frailest Thing, July 2, 2014)
- Matthew Salganik, After the Facebook emotional contagion experiment: A proposal for a positive path forward (Freedom to Tinker, July 7, 2014)
- Christian Sandvig, Corrupt Personalization (Multicast, June 26, 2014)
- Stuart Schechter and Cristian Bravo-Lillo, Using Ethical-Response Surveys to Identify Sources of Disapproval and Concern with Facebook’s Emotional Contagion Experiment and Other Controversial Studies (July 15, 2014)
- Zachary M. Schrag, A Bit of Historical Perspective on the Facebook Flap (Institutional Review Blog, June 30, 2014)
- David Ayman Shamma, Experiments, Data, and the Scientific Ecosystem. (Medium, July 8, 2014)
- Evan Selinger and Woodrow Hartzog, How to Stop Facebook From Making Us Pawns in Its Corporate Agenda (July 1, 2014)
- Micah Sifry, Why Facebook’s ‘Voter Megaphone’ Is the Real Manipulation to Worry About (TechPresident, July 3, 2014)
- Micah Sifry, Facebook Wants You to Vote on Tuesday. Here’s How It Messed With Your Feed in 2012. (Mother Jones, October 31, 2014)
- Daniel Solove, Facebook’s Psych Experiment: Consent, Privacy, and Manipulation (LinkedIn, June 30 2014)
- Jenny Stromer-Galley, Facebook Users or Lab Rats?: Ethical Research in the Age of Big Data (July 1, 2014)
- Zeynep Tufekci, Facebook and Engineering the Public (Medium, June 29, 2014)
- Duncan J. Watts, Stop complaining about the Facebook study. It’s a golden age for research (The Guardian, July 7, 2014)
- Duncan J. Watts, Lessons Learned From the Facebook Study (Chronicle of Higher Education, July 9, 2014)
- Dave Winer, About Facebook users and Facebook (Scripting News, July 4, 2014)
- D. Yvette Wohn, Emotion Contagion or Conforming to Social Norms? Are we misinterpreting Facebook’s psych experiment? (June 29, 2014)
- Janet Vertesi, The Real Reason You Should Be Worried About That Facebook Experiment (Time, July 2, 2014)
- Lee Vinsel, What’s Really Behind The Facebook Psyche Experiment Controversy (Taming the American Idol, July 2, 2014)
- Paul Voosen, In Backlash Over Facebook Research, Scientists Risk Loss of Valuable Resource (Chronicle of Higher Education, July 1, 2014)
- Paul Voosen, Big-Data Scientists Face Ethical Challenges After Facebook Study (Chronicle of Higher Education, December 15, 2014)
- Tal Yarkoni, In Defense of Facebook (Citation Needed, June 28, 2014)
- Shoshana Zuboff, Dark Facebook: Facebook’s Secret Experiment in Emotional Manipulation Provides a Fresh Glimpse of its Radical Politics and Absolutist Ambitions (The Summons, July 1, 2014)
- Michael Corey, A Sociologist Working at Facebook (OrgTheory.net, Jan. 14, 2014) (first-hand account of researcher working at Facebook)
- Andrew Ledvina, 10 ways Facebook is actually the devil (AndrewLedvina.com, July 4, 2014)
- Reimaging Facebook’s Emotional Contagion Study (The Orbital Eccentric, July 3, 2014) (redrawn version of results chart in study)
- Electronic Privacy Information Center, Complaint to Federal Trade Commission (July 3, 2014)
- American Psychological Association statement on informed consent (June 30, 2014)
- Letter from Sen. Mark R. Warner to Federal Trade Commission (July 9, 2014)
- James Grimmelmann and Leslie Meltzer Henry, Letter to Proceedings of the National Academy of Sciences (July 17, 2014)
- James Grimmelmann and Leslie Meltzer Henry, Letter to the Office for Human Research Protections (July 17, 2014)
- James Grimmelmann and Leslie Meltzer Henry, Letter to the Federal Trade Commission (July 17, 2014)
- James Grimmelmann and Leslie Meltzer Henry, Letter to Facebook (July 24, 2014), and Response from Edward Palmieri, Associate General Counsel, Privacy, Facebook (August 25, 2014)
- James Grimmelmann and Leslie Meltzer Henry Letter to OkCupid (July 30, 2014)
- James Grimmelmann and Leslie Meltzer Henry Letter to Maryland Attorney General Douglas F. Gansler (September 23, 2014)
OkCupid experiments:
- Christian Rudder, We Experiment On Human Beings! (OK Trends, July 28, 2014)
- David Auerbach, Big Data Is Overrated (Slate, July 30, 2014)
- David Banks, Don’t Opt Out, Take Back (Cyborgology, August 4, 2014)
- Jeff Bercovici, OkCupid’s Christian Rudder On Human Experiments And Getting Ugly People Dates (Forbes, Sept. 9, 2014)
- Christopher Caldwell, OkCupid’s venal experiment was a poisoned arrow (Financial Times, August 1, 2014)
- Tim Carmody, Why don’t OKCupid’s experiments bother us like Facebook’s did (Kottke.org, July 28, 2014)
- Tim Carmody, The problem with OKCupid is the problem with the social web (Kottke.org, August 1, 2014)
- Gregory Ferenstein, OkCupid draws illustration of users as guinea pigs, literally (VentureBeat, July 30, 2014)
- Brian Fung, OkCupid reveals it’s been lying to some of its users. Just to see what’ll happen. (Washington Post Switch Blog, July 28, 2014)
- Dan Gillmor, Is the internet now just one big human experiment? (The Guardian, July 29, 2014)
- Kashmir Hill, OkCupid Lied To Users About Their Compatibility As An Experiment (Forbes, July 28, 2014)
- Kashmir Hill, How OkCupid Informed Users They’d Been Part Of An Experiment (Forbes, July 29, 2014)
- Robert Howell, OK, Stupid–OK Cupid’s Ethical Confusion (The Daily Sabbatical, August 25, 2014)
- Selena Larson, Everyone’s A Lab Rat In OkCupid’s Labyrinth Of Love (ReadWrite, July 28, 2014)
- Dave Karpf, On the Ethics of A/B Testing (Shouting Loudly, July 31, 2014)
- Sam Machkovech, Did OKCupid’s dating-results experiment help an Arsian find love? (Ars Technica, July 30, 2014)
- Chadwick Matlin, Matchmaker, Matchmaker, Make Me A Spreadsheet (FiveThirtyEight, Sept. 9, 2014)
- Martin Robbins, Does OKCupid need our consent? (The Guardian, July 30, 2014)
- Jay Rosen, Listener’s guide to Christian Rudder explaining why OkCupid experimented with unwitting users (PressThink, August 3, 2014)
- Natasha Singer, OkCupid’s Unblushing Analyst of Attraction (N.Y. Times, Sept. 6, 2014)
- Casey Sullivan, OkCupid experiment may violate FTC rules on deceptive practices (Reuters, July 30, 2014)
- Charlie Warzel, OkCupid Data Scientist: “I’m Not Playing God” (BuzzFeed, July 28, 2014)
- Molly Wood, Looking for Love on the Web, as It Experiments With You (New York Times, July 28, 2014)
- Cat Zakrzewski, Why OKCupid’s Experiments Aren’t The Same As Facebook’s (TechCrunch, July 30, 2014)
Interviews with Christian Rudder about the OkCupid experiments:
- All Things Considered (July 29, 2014)
- BigThink (September 9, 2014)
- The Takeaway (July 30, 2014)
- TLDR (July 31, 2014) (Transcript)
Ashley Madison study:
- American Sociological Association Press Release (August 19, 2014)
- Belinda Luscombe, Cheaters’ Dating Site Ashley Madison Spied on Its Users (Time, August 19, 2014)
Facebook’s Modified Research Policy:
- Mike Schroepfer, Research at Facebook (Facebook Newsroom, October 2, 2014)
- Reed Albergotti, Facebook Tightens Oversight of Research (Oct. 2, 2014)
- Lorenzo Franceschi-Bicchierai, Legal Experts Unimpressed with Facebook Oversight Promises (Mashable, Oct. 3, 2014)
- Vindu Goel, Facebook Promises a Deeper Review of Its User Research (New York Times, October 2, 2014)
- Michelle Meyer, Facebook Announces New Research Policies (The Faculty Lounge, October 2, 2014)
- Jay Rosen, [untitled] (Ello, October 3, 2014)
Most recent update: 9:05 PM, Monday June 30
If you were feeling glum in January 2012, it might not have been you. Facebook ran an experiment on 689,003 users to see if it could manipulate their emotions. One experimental group had stories with positive words like “love” and “nice” filtered out of their News Feeds; another experimental group had stories with negative words like “hurt” and “nasty” filtered out. And indeed, people who saw fewer positive posts created fewer of their own. Facebook made them sad for a psych experiment.
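For concreteness, here is a minimal sketch, in Python, of the kind of mechanism the paper describes: users deterministically assigned to conditions based on user ID, and posts containing words from an emotion lexicon (the paper used LIWC 2007) silently omitted from the News Feed with some probability. The word lists, omission rate, and function names are illustrative assumptions of mine, not Facebook’s actual code.

```python
import hashlib
import random

# Toy word lists; the study used the much larger LIWC 2007 lexicon.
POSITIVE = {"love", "nice", "sweet"}
NEGATIVE = {"hurt", "ugly", "nasty"}

def condition(user_id):
    """Deterministically assign a user to a condition from their ID
    (the paper says participants were selected by user ID)."""
    digest = int(hashlib.sha256(str(user_id).encode()).hexdigest(), 16)
    return ("control", "reduce_positive", "reduce_negative")[digest % 3]

def filter_feed(user_id, posts, omit_prob=0.5):
    """Drop each 'emotional' post with some probability; the real study
    used per-user omission rates between 10% and 90%."""
    cond = condition(user_id)
    kept = []
    for post in posts:
        words = set(post.lower().split())
        targeted = (cond == "reduce_positive" and words & POSITIVE) or \
                   (cond == "reduce_negative" and words & NEGATIVE)
        if targeted and random.random() < omit_prob:
            continue  # silently dropped from this user's News Feed
        kept.append(post)
    return kept
```

What the sketch makes vivid is how little the “treatment” involves: no content is created or altered, some posts simply never appear.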
I first saw the story on Facebook, where a friend picked it up from the A.V. Club, which got it from Animal, which got it from the New Scientist, which reported directly on the paper. It’s exploding across the Internet today (e.g. MetaFilter), and seems to be generating two kinds of reactions: outrage and shrugs. I tend more towards anger; let me explain why.
Facebook users didn’t give informed consent: The study says:
[The study] was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.
The standard of consent for terms of service is low. But that “consent” is a legal fiction, designed to facilitate online interactions. (See Nancy Kim and Margaret Jane Radin’s books for more.) It’s very different from informed consent, the ethical and legal standard for human subjects research (HSR). The Federal Policy for the Protection of Human Subjects, a/k/a the Common Rule, requires that informed consent include:
(1) A statement that the study involves research, an explanation of the purposes of the research and the expected duration of the subject’s participation, a description of the procedures to be followed, and identification of any procedures which are experimental;
(2) A description of any reasonably foreseeable risks or discomforts to the subject; …
(7) An explanation of whom to contact for answers to pertinent questions about the research and research subjects’ rights, and whom to contact in the event of a research-related injury to the subject;
(8) A statement that participation is voluntary, refusal to participate will involve no penalty or loss of benefits to which the subject is otherwise entitled, and the subject may discontinue participation at any time without penalty or loss of benefits to which the subject is otherwise entitled.
Facebook’s actual Data Use Policy contains none of these, only general statements that “we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement” and “We give your information to the people and companies that help us provide, understand and improve the services we offer. For example, we may use outside vendors to … conduct and publish research.” Neither of these comes close to a “description of the procedures to be followed” or a “description of any reasonably foreseeable risks or discomforts,” and the Data Use Policy doesn’t even attempt to offer a contact for questions or an opt-out.
Federal law requires informed consent: To be sure, the Common Rule generally only applies to federally funded research, and Facebook is a private company. But that’s not the end of the story. The paper has three co-authors: Facebook’s Adam Kramer, but also Jamie Guillory from UCSF and Jeffrey Hancock from Cornell. UCSF and Cornell are major research universities and receive large sums of federal funding. Both of them have institutional review boards (IRBs), as required by the Common Rule: an IRB examines proposed research protocols to make sure they protect participants, obtain informed consent, and otherwise comply with ethical and legal guidelines.
I don’t know whether the study authors presented it to an IRB (the paper doesn’t say), but it strikes me as the sort of research that requires IRB approval. It further strikes me that the protocol as described is problematic, for the reasons described above. I don’t know whether I’m more afraid that the authors never obtained IRB approval or that an IRB signed off on a project that was designed to (and did!) make unsuspecting victims sadder.
The study harmed participants: The paper also argues:
[The study software] was adapted to run on the Hadoop Map/Reduce system (11) and in the News Feed filtering system, such that no text was seen by the researchers.
This claim misses the point. For an observational study, automated data processing is a meaningful way of avoiding privacy harms to research subjects. (Can robot readers cause a privacy harm? Bruce Boyden would say no; Samir Chopra would say yes.) But that is because in an observational study, the principal risks to participants come from being observed by the wrong eyes.
This, however, was not an observational study. It was an experimental study—indeed, a randomized controlled trial—in which participants were treated differently. We wouldn’t tell patients in a drug trial that the study was harmless because only a computer would ever know whether they received the placebo. The unwitting participants in the Facebook study were told (seemingly by their friends) for a week either that the world was a dark and cheerless place or that it was a saccharine paradise. That’s psychological manipulation, even when it’s carried out automatically.
This is bad, even for Facebook: Of course, it’s well known that Facebook, like other services, extensively manipulates what it shows users. (For recent discussions, see Zeynep Tufekci, Jonathan Zittrain, and Christian Sandvig.) Advertisers and politicians have been in the emotional manipulation game for a long time. Why, then, should this study—carried out for nobler, scientific purposes—trigger a harsher response?
One reason is simply that some walks of life are regulated, and Facebook shouldn’t receive a free pass when it trespasses into them simply because it does the same things elsewhere. Facebook Beacon, which told your Facebook friends what you were doing on other sites, was bad everywhere but truly ugly when it collided with the Video Privacy Protection Act. So here. What you think of Facebook’s ordinary marketing-driven A/B testing is one thing; what you think of it when it hops the fence into Common Rule-regulated HSR is quite another. Facebook has chosen to go walking in a legal and ethical minefield; we should feel little sympathy when it occasionally blows up. (That said, insisting on this line would simply drive future research out of the academy and into industry, where our oversight over it will be even weaker. Thus …)
A stronger reason is that even when Facebook manipulates our News Feeds to sell us things, it is supposed—legally and ethically—to meet certain minimal standards. Anything on Facebook that is actually an ad is labelled as such (even if not always clearly). This study failed even that test, and for a particularly unappealing research goal: We wanted to see if we could make you feel bad without you noticing. We succeeded. The combination of personalization and non-rational manipulation may demand heightened legal responses. (See, e.g., Ryan Calo, or my thoughts on search engines as advisors.)
The real scandal, then, is what’s considered “ethical.” The argument that Facebook already advertises, personalizes, and manipulates is at heart a claim that our moral expectations for Facebook are already so debased that they can sink no lower. I beg to differ. This study is a scandal because it brought Facebook’s troubling practices into a realm—academia—where we still have standards of treating people with dignity and serving the common good. The sunlight of academic practices throws into sharper relief Facebook’s utter unconcern for its users and for society. The study itself is not the problem; the problem is our astonishingly low standards for Facebook and other digital manipulators.
This is a big deal: In 2006, AOL released a collection of twenty million search queries to researchers. Like the Facebook study authors, AOL thought it was protecting its users: it anonymized the users’ names. But that wasn’t sufficient: queries like “homes sold in shadow lake subdivision gwinnett county georgia” led a reporter straight to user No. 4417749. Like Facebook, AOL had simply not thought through the legal and ethical issues involved in putting its business data to research purposes.
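The failure is easy to reproduce in miniature. A sketch, using invented log entries that echo the queries the New York Times reported: replacing screen names with opaque numbers keeps each user’s queries linked to one another, and the queries themselves do the identifying.

```python
from itertools import count

# Invented log entries echoing the reported AOL queries.
logs = [
    ("some_screen_name", "numb fingers"),
    ("some_screen_name", "60 single men"),
    ("some_screen_name", "landscapers in lilburn ga"),
]

ids, counter = {}, count(4417749)  # AOL substituted opaque numeric IDs
anonymized = []
for screen_name, query in logs:
    if screen_name not in ids:
        ids[screen_name] = next(counter)
    anonymized.append((ids[screen_name], query))

for uid, query in anonymized:
    print(uid, query)
# 4417749 numb fingers
# 4417749 60 single men
# 4417749 landscapers in lilburn ga
# The name is gone, but one person's searches still cluster under a
# single pseudonym, and together they read like a biography.
```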
The AOL search-query release became known as the “Data Valdez” because it was a vivid and instantly recognizable symbol of the dangers of poor data security. It shocked the public (and the industry) into attention, and put search privacy on the map. I predict, or at least I hope, that the Facebook emotional manipulation study will do the same for invisible personalization. It shows, in one easy-to-grasp lesson, both the power Facebook and its fellow filters hold to shape our online lives, and the casual disdain for us with which they go about it.
UPDATE: The study was presented to an IRB, which approved it “on the grounds that Facebook filters user news feeds all the time, per the agreement.” See @ZLeeily, with hat tips to Kashmir Hill and @jon_penney.
UPDATE: Kashmir Hill reports:
Professor Susan Fiske, the editor at the Proceedings of the National Academy of Sciences for the study’s publication, says the data analysis was approved by a Cornell Institutional Review Board but not the data collection. “Their revision letter said they had Cornell IRB approval as a ‘pre-existing dataset’ presumably from Facebook, who seems to have reviewed it as well in some unspecified way,” writes Fiske by email.
UPDATE: For much more on the IRB legal issues, see this detailed post by Michelle Meyer. She observes that the Common Rule allows for the “waiver or alteration” of informed consent for research that poses “minimal risk” to participants. The crucial issue there is whether the study “could not practicably be carried out without the waiver or alteration.” Meyer also has an extended discussion of whether the Common Rule applies to this research—following the Cornell restatement, it is much less clear that it does.
UPDATE: I’ve created a page of primary sources related to the study and will update it as more information comes in.
The Second Circuit’s decision in Authors Guild v. HathiTrust is out. This, as a reminder, is the offshoot of the Google Books litigation in which the Authors Guild inexplicably sued Google’s library partners. The trial judge, Harold Baer, held for the libraries in 2012 in a positively exuberant opinion:
I cannot imagine a definition of fair use that would not encompass the transformative uses made by Defendants’ MDP [Mass Digitization Project] and would require that I terminate this invaluable contribution to the progress of science and cultivation of the arts that at the same time effectuates the ideals espoused by the ADA.
The Second Circuit’s opinion drops the grand rhetoric, but otherwise the bottom line is basically the same: mass digitization to make a search engine is fair use, and so is giving digital copies to the print-disabled. The opinion on appeal is sober, conservative, and to the point; it is the work of a court that does not think this is a hard case.
On full-text search:
- Factor 1: “[T]he creation of a full‐text searchable database is a quintessentially transformative use” because it serves a “new and different function.” Authors write to be read, not to be searched.
- Factor 2: The nature of the copyrighted work fades into irrelevance for transformative uses.
- Factor 3: Since full-text search requires copying full books, the copying isn’t excessive in light of the use. True, HathiTrust makes four copies of each book, two live and two in tape backup, but those are appropriate precautions against Internet outages and natural disasters. (It’s nice to see a court recognize that strict copy-counting is a fool’s errand in light of modern IT; better to focus, as the court here does, on the uses those copies enable.)
- Factor 4: “[T]he full‐text‐search use poses no harm to any existing or potential traditional market … .” Book reviews do not substitute for sales of a book, even when they convince readers not to buy the book; so here. There is no lost licensing market because full-text search is not a substitute for books in the first place. (No citation to American Geophysical!) And while the Authors Guild says there’s a risk of a security breach, saying so doesn’t make it so: the harm from a hypothetical breach is pure speculation.
On print-disabled access:
- Factor 1: Providing access to the print-disabled is not transformative: “By making copyrighted works available in formats accessible to the disabled, the HDL [HathiTrust Digital Library] enables a larger audience to read those works, but the underlying purpose of the HDL’s use is the same as the author’s original purpose.” But providing such access is still a favored use: there is a national policy of promoting access, reflected in the Chafee Amendment and recognized by the Supreme Court.
- Factor 2: Irrelevant again, even though the use isn’t transformative. (Factor 2 never matters for published expressive works.)
- Factor 3: The scanned images—and not just the OCR’ed text—are useful to print-disabled readers. Some readers are print-disabled because they need greater magnification or stronger color contrast than paper provides, others because they can’t turn pages. Scanned images help them both. (It’s nice to see a court take the diversity of disabilities seriously; Dan Goldstein’s advocacy here clearly helped.)
- Factor 4: There is no market for selling books to the print-disabled; only a small percentage of books are published in accessible formats, and even for those, authors typically forgo their royalties. (The Authors Guild’s war against text-to-speech has come back to bite it.)
These holdings merely affirm the District Court’s conclusions, but they are still a big deal. The Second Circuit’s decisions are binding precedent in New York, the nation’s publishing capital, and are highly influential beyond. Five judges have now upheld the legality of scanning books to make a search engine; none has disagreed.
The other major points in the opinion all consist of declining to decide:
- The Authors Guild lacks standing to sue on behalf of its members. The case continues, thanks to the international organizations and the individual plaintiffs, but ouch. By pressing the Google Books cases, the Authors Guild has undercut its ability to take legal action on behalf of “authors” in general. In a real sense, it is legally weaker than when the case started.
- Preservation uses aren’t ripe for consideration because the court has already held that hanging on to four copies is fully justified by the operational demands of providing full-text search. That leaves only printing replacement copies for lost or damaged ones when they’re unavailable for purchase at a fair price, but since it’s not clear whether or when that would happen—let alone whether it would happen to one of the remaining plaintiffs’ books—the issue isn’t ripe to decide.
- Since Michigan has suspended the orphan works project (showing orphaned works to non-disabled patrons) and has no plans to reinstate it in the same form, those issues aren’t ripe either. The libraries dodged a bullet here; if they want to try again, it will be on terms of their choosing.
The opinion is a green light for library search engine digitization. It is an even greener light for making books and other works accessible to the disabled. And there was great rejoicing at the DPLA and the Internet Archive. There is not very much new in the opinion, but its very lack of novelty sends a strong signal that these uses are now clearly established.
What next? The Authors Guild could ask for rehearing, or petition for certiorari. I personally don’t like those odds, but I have never really understood the Guild’s decision-making process around this case, so who knows? The opinion sends a strong signal that the case against Google, also on appeal to the Second Circuit, is also likely to go in favor of scanning. At the very least, if the two cases are to be distinguished, it will have to be on narrow grounds: that Google makes commercial uses or shows snippets. Even that would provide clear guidance for digitizers. The holding may also cast a shadow on other search, education, and access cases, for example the Georgia State e-reserves case.
There are many reasons to love George Takei. But this is not one of them:
"Solar Frickin' Roadways"—I like the sound of that. Worth a look. Dare to dream, I say. https://t.co/cpnogIgeo8— George Takei (@GeorgeTakei) May 23, 2014
Takei’s tweet helped the Solar Roadways project’s Indiegogo fundraising campaign blow past its million-dollar goal. Their plan is to replace asphalt road surfaces with durable solar panels with embedded LEDs and sensors, turning highways into smart power-generating grids that can melt snow and give drivers safety warnings. Each individual hexagonal panel is capable of cranking out only a few watts, but if you do out the math on replacing all our highways with the panels, it comes out to a ludicrous sum, well more than the United States’s entire current energy consumption.
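The arithmetic behind that headline number is easy to check. A back-of-envelope sketch in Python; every input is a rough assumption of mine rather than a Solar Roadways figure, and the order of magnitude is the point:

```python
# Back-of-envelope check on the headline claim; all inputs are rough assumptions.
paved_area_m2 = 75_000 * 1_000_000  # ~75,000 km^2 of US paved surface (assumed)
insolation = 4.5                    # kWh per m^2 per day, rough US average
efficiency = 0.15                   # assumed panel efficiency, ignoring soiling and shading

twh_per_year = paved_area_m2 * insolation * efficiency * 365 / 1e9
US_ELECTRICITY_TWH = 4_000          # approximate annual US electricity consumption

print(f"~{twh_per_year:,.0f} TWh/yr, about {twh_per_year / US_ELECTRICITY_TWH:.0f}x US electricity use")
# ~18,000 TWh/yr -- several times what the US grid delivers
```

The multiplication really does produce an enormous number; the trouble is everything it leaves out.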
It pains me, then, to say that the idea behind Solar Roadways isn’t just crazy; it’s obviously crazy. It’s Troy Hurtubise crazy; it’s Dr. Evil crazy. All the hype around solar panels and LEDs simply disguises the fact that Solar Roadways fundamentally misunderstands what a road is.
A road is a system for distributing moving loads into the ground.
That is its one indispensable job: to allow people and vehicles to travel atop it while absorbing the forces they create. I don’t claim to be a civil engineer. But I know enough high-school physics to ask questions whose answers are nowhere to be found on the Solar Roadways site, in any technical documentation, or anywhere at all on the Internet (so far as I can tell). Solar Roadways has simply not attempted to address the bread-and-butter engineering problems that highway builders have spent decades dealing with. Here are a few, with a rough sketch of the forces involved after the list:
- What do the solar panels rest on? The sublayer beneath them has to be made of something. That something will receive the forces transmitted downwards through the panels, and that something will degrade over time. To fix it, you’ll have to remove and restore the panels.
- Cars and trucks will put their weight unevenly, on different parts of the panels. How resistant are the panels to bending? To shearing? These are different from the simpler question that Solar Roadways does discuss: how much weight they can take before being crushed.
- Cars and trucks will accelerate and brake and push against the air; they have to push off against something. Solar Roadways has thought about traction. But receiving the force from vehicles is only half the problem, because the panels must also transmit that force to the sublayer. How are they anchored to it? How will the anchors deal with the immense lateral force of a braking tractor-trailer?
- Water will get between and below the panels, especially in climates where the panels are supposed to melt snow. What happens when that water expands as it freezes? What happens when it freezes and unfreezes repeatedly?
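To put rough numbers on the forces these questions involve, here is one more back-of-envelope sketch; the figures are generic textbook values I am assuming, not anyone’s specification:

```python
# Rough scale of the loads a road surface must carry; generic assumed values.
axle_load_n = 80_000      # ~8-tonne legal truck axle, in newtons
tires_per_axle = 4
contact_patch_m2 = 0.05   # ~500 cm^2 contact patch per tire (assumed)

# Vertical contact pressure under one tire, repeated millions of times:
pressure_kpa = axle_load_n / tires_per_axle / contact_patch_m2 / 1_000
print(f"contact pressure ~{pressure_kpa:.0f} kPa")  # ~400 kPa

# Lateral shear during hard braking, assuming a friction coefficient of 0.7:
braking_kn = 0.7 * axle_load_n / 1_000
print(f"braking shear per axle ~{braking_kn:.0f} kN")  # ~56 kN into the anchors
```

Every panel, every joint, and every anchor has to survive loads on this scale, over and over, for years.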
I am not suggesting that these are insurmountable engineering challenges. We live in an age of near miracles. I am simply suggesting that they are challenges, and that they are obviously challenges. Not to see them fleshed out in the slightest is deeply discouraging, because it means that Solar Roadways is not approaching a gigantic engineering problem as an engineering problem. Even the Hyperloop—the Hyperloop!—came with a design document that tried (if not always successfully) to think through the engineering issues. When your futuristic transport technology is bigger vaporware than the Hyperloop, you have a problem.
It is not as though these are exotic problems, like building a quantum computer. These are familiar problems; they are the bread and butter of highway engineers. But no one has asked in detail, “What problems do road-builders currently solve, and how will solar roadways deal with these same issues?” Asphalt, for all its other issues, distributes moving loads quite well for its price point. To work as roads, Solar Roadways will need to replicate that success. Leave the solar panels and electronics aside for a moment: if building road surfaces out of thick glass were a good idea in its own right, we would be much more familiar with glass roads.
But assume that these issues are all, in the end, solvable. Will they be solvable at a reasonable cost? Almost certainly not. Solar power engineering faces its own significant design constraints. We are making progress in bringing the cost down, but this is still hard stuff. And these are completely different design constraints from the ones highway engineering faces. Why on earth would you insist on solving both sets of problems simultaneously in the same surface, if you didn’t have to?
There is no plausible future in which solar-panel roads make more sense than solar panels plus roads. There are plenty of other places to put solar panels, and plenty of other ways to make highways smarter. Solar sidewalks. Solar median strips. Maybe these are also terrible ideas. But they are unambiguously better ideas than solar roadways.
Science and engineering, done right, can be beautiful and amazing. But the inference doesn’t run in the other direction: something can be as cool and as awesome as solar frickin’ roadways and still not work as science. I understand the impulse that made people open their wallets to support this appeal to progressive technology as a solution to humanity’s self-inflicted woes. But science is the pursuit of truth, not truthiness, and solar roadways are scienciness, not science.