Image credit: Kevin Krejci
When it first emerged that Facebook had conducted psychological manipulation experiments on 600,000 of its users without their consent, lots of people I expected to be outraged basically shrugged. Facebook is essentially an advertising company, after all. And as anyone who’s ever caught an episode of Mad Men knows, advertisers have been hard at work figuring out how to manipulate our emotions for decades. That doesn’t make what Facebook did okay, but it places the company’s troubling actions within a much larger context—that of an entire industry premised on manipulating our feelings, all towards the end of getting us to spend money.
But now something much more sinister has emerged about the Facebook manipulation study: One of the researchers who worked on the project, Cornell University’s Jeffrey Hancock, has received Department of Defense funding through the Minerva Initiative for research aimed at identifying how ideas spread, towards the end of halting dissent and political movements.
SCG News identified the connection:
In the official credits for the study conducted by Facebook you'll find Jeffrey T. Hancock from Cornell University. If you go to the Minerva Initiative website you'll find that Jeffrey Hancock received funding from the Department of Defense for a study called "Cornell: Modeling Discourse and Social Dynamics in Authoritarian Regimes". If you go to the project site for that study you'll find a visualization program that models the spread of beliefs and disease.
Cornell University is currently being funded for another DoD study called "Cornell: Tracking Critical-Mass Outbreaks in Social Contagions" (you'll find the description for this project on the Minerva Initiative's funding page).
In June 2014 the Guardian published a must-read essay by Nafeez Ahmed about the militarization of the academy by way of military funding, paying particularly close attention to the field of anthropology. Ahmed described the DoD-funded Cornell project, undertaken by the same researcher who worked on the Facebook emotions study:
The project will determine "the critical mass (tipping point)" of social contagions by studying their "digital traces" in the cases of "the 2011 Egyptian revolution, the 2011 Russian Duma elections, the 2012 Nigerian fuel subsidy crisis and the 2013 Gazi park protests in Turkey."
Twitter posts and conversations will be examined "to identify individuals mobilised in a social contagion and when they become mobilised."
Facebook’s experiment in manipulating users’ emotions is disturbing enough on its own. Viewed within the context of the DoD’s related research into identifying influential movement leaders online, it becomes positively chilling.