The concept of “social engineering” is one that Western society has struggled with for at least a century. Karl Popper, the famous philosopher of science, wrote about the pitfalls of social engineering in The Open Society and Its Enemies and distinguished between its different forms. The field of cybernetics attempted to quantify and model mass behaviors. Better known, however, are its appearances in the literary world of science fiction.
Science fiction authors have generally depicted a pessimistic vision of worlds shaped by social engineering. Futuristic or space-traveling societies are often portrayed as tightly controlled, with little to no concern for governmental invasions of privacy. George Orwell, in his classic novel 1984, placed the reins of social engineering in the hands of a post-nuclear-apocalypse government. The dystopian outcome of such a scenario has become so well known and well feared that we simply call it Orwellian. Aldous Huxley’s Brave New World merges social and genetic engineering to demonstrate how genetics might enforce a socially engineered class structure.
One work that portrays a successful, beneficial form of social engineering is Isaac Asimov’s Foundation series. The series opens by introducing us to one Hari Seldon, master of a fledgling field called “psychohistory”. Seldon’s psychohistory converted everything known about the psychology of the masses into a set of mathematical operators and symbols in order to predict the course of galactic history. He used his knowledge to predict the fall of the Galactic Empire, followed by a 30,000-year interregnum. Those same equations, however, allowed Seldon to see the turning points in the historical flow, letting him chart a path through history that would reduce the time between galactic empires to a mere 1,000 years. He took fate into his own hands to direct the course of history for the next millennium in humanity’s favor. Even in this case, however, the precondition for success was that only those committed to the cause could ever be allowed to study the field in the future. While a less righteous soul might wield the knowledge for selfish interests, Seldon was more concerned that foreknowledge of events would itself adversely affect the flow of history.
With the advent of “Big Data”, social engineering may be getting its day in court. In the May/June 2014 issue of MIT Technology Review, Nicholas Carr published an article entitled “The Limits of Social Engineering”. Carr discusses the research being performed by Alex Pentland, the Toshiba Professor of Media Arts and Sciences at MIT. Pentland is an expert in data science, and he recently published a book called Social Physics about how the existence of “Big Data” has opened the door to making the social sciences a more empirical discipline, as physics is. (Hence the title of the book.) Computer networks give social scientists access to precise data about the movements, choices, and interactions of massive numbers of individuals. Pentland refers to the capture and analysis of this data as “reality mining”.
Pentland’s ambitious and exciting research does not stop at mere sociological observation. Just as physics lets us both observe and manipulate, Pentland believes that, through social physics, we can use technology to practice social engineering. In an experiment described in the article, Pentland and his associates used “social physics” to make extremely detailed observations about work productivity, energy levels, and interpersonal interactions at a bank’s call center. They then used their observations to show that tweaking the coffee-break schedule improved the call center’s productivity. Pentland suggests that using networks to incentivize behaviors can help corporations or governments tailor society to their liking.
Carr raises two very serious concerns with the idea of using Big Data to drive social engineering. The more obvious concern is the invasion of privacy. For this kind of research to succeed, researchers need access to very personal data from large numbers of subjects. Even if systems let users opt for whatever degree of privacy they prefer, corporations (and governments?) are unlikely to give up the advantages of having access to reams of private data.
Recent events have borne out this concern. In a study recently published in PNAS, researchers demonstrated that they were able to use Facebook News Feeds to affect the emotional states of Facebook users. There has been much debate (and some angry ranting) about whether the mention of “research” in Facebook’s Data Use Policy satisfies the requirement of informed consent for a study of this nature. Beyond the consent question, though, why aren’t we all enraged at Facebook? Facebook has demonstrated that it is happy to use our personal data for its own gain. Edward Snowden taught us that the NSA is mining everybody’s data, not just suspects’. This study is merely the one that has been revealed to us, and only because it was academic. Facebook (and others) are likely investing a great deal of time and money in manipulating our behaviors to their advantage. Are we prepared to sacrifice our privacy for the utility these services provide? Are the corporations (et al.) getting the better end of that deal? (It seems that both questions are answered with yes.)
The second concern Carr raises is that social physics is a reductionist analysis that doesn’t take into account any of the underlying social dynamics behind its observations. In his words: “Defining social relations as a pattern of stimulus and response makes the math easier, but it ignores the deep, structural sources of social ills.” Many of the underlying social and historical issues, such as class assignment and access to money and resources, limit the choices that people can make, undercutting a purely black-box approach to the data.
As I read Carr’s article, all of these science-fiction classics came to mind, as did the questions they raise. Most authors felt strongly that placing this kind of power in the hands of individuals or small groups does not lead to better societies; it merely produces societies that are most beneficial to the group in power. Online entities (such as Facebook) frequently condition their services on our sacrificing our right to privacy. We have become so accustomed to online services that we have forgotten what we sacrifice to use them. I find it frightening that Google can tailor ads to the contents of my email inbox.
The door to social engineering, be it harmful or helpful, can only be opened to the extent we are willing to share information with the world. Are we ready to virtually tear down the walls of our homes? Or is it already too late to go back?