The End of Theory in Digital Social Research?
Computer code, software and algorithms have sunk deep into the “technological unconscious” of our contemporary “lifeworld.” How might this affect academic research in the social sciences and the formation of the professional identities of academics? These are important questions for researchers working in Digital Media and Learning, asking us to consider how the digital devices and infrastructures that we study might actually be shaping our practices, shaping our production of knowledge, and shaping our theories of the world.
Professional work across the natural, human and social sciences is now increasingly mediated and augmented by computer-coded technologies. This is perhaps most obvious in the natural sciences and in developments such as the vast human genome database. As Geoffrey Bowker has argued in Memory Practices in the Sciences, such databases are increasingly viewed as a challenge to the idea of the scientific paper (with its theoretical framework, hypothesis and long-form argumentation) as the “end result” of science:
The ideal database should according to most practitioners be theory-neutral, but should serve as a common basis for a number of scientific disciplines to progress. … In this new and expanded process of scientific archiving, data must be reusable by scientists. It is not possible simply to enshrine one’s results in a paper; the scientist must lodge her data in a database that can be easily manipulated by other scientists.
The algorithmic techniques of sorting, ordering, classification and calculation associated with computer databases have become a key part of the infrastructures underpinning contemporary big science.
The coding and databasing of the world does not, though, end with big science. Across multiple disciplines “big data” are now being generated and mobilized by a variety of humanly operated as well as automated systems. Social scientific research, the humanities, and the production of knowledge and theory across disciplines are all now affected by software code, algorithms and the data they mediate.
For some enthusiastically wired commentators this is tantamount to “the end of theory”—the triumph of computers, data and quantification over disciplinary expertise:
This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.
While it may be sensible to caution against the “cyberbole” of such announcements, it is clear that these technologies and techniques are now increasingly interwoven with what academic researchers do. The emergence of approaches such as “digital sociology” and “digital social research” reflects disciplinary anxieties about the relevance of social science in the age of social media data and analytics.
A “redistribution of methods” in digital social research, for example, has involved social science researchers acknowledging the proliferation of new devices and formats for the documentation of social life. These include the use of Twitter and blogs to document everyday activities, the mobilization of search engine analytics to reveal massive population trends and social behaviours over time, the analysis of Instagram images to “zoom into” cultural and social patterns, and the study of social network formation on Facebook. These platforms enable the routine generation of data about social life and make available new forms of social data, analysis and visualization. The capacity to mobilize data graphically as visualizations and representations, or “database aesthetics,” amplifies the rhetorical, argumentative and persuasive function of data.
These developments have led to both optimistic and pessimistic visions of the future of social research. On the one hand, these technologies grant us much greater empirical, analytical and argumentative capacity; on the other, they seriously threaten established social research and concentrate data analysis and knowledge production in a few highly resourced research centres, including the R&D labs of corporate technology companies.
So who even does social research any more? Instead of social scientists, the new social experts of the social media environment are the “algorithmists” and big data analysts of Google, Facebook and Amazon, and of new kinds of data analysis start-ups, intermediaries and “policy labs” that work across the technology, social scientific and policy fields. Algorithmists are experts in the areas of computer science, mathematics, and statistics, as well as aspects of law, economics and social research, who can undertake big data analyses and predictions.
It is notable that Facebook, for example, has a Data Science Team that is responsible for “what Facebook knows” and can “apply math, programming skills, and social science to mine our data for insights that they hope will advance Facebook’s business and social science at large.” The team is run by Facebook’s “in-house sociologist” who is “confident that exploring this resource will revolutionize the scientific understanding of why people behave as they do.”
In the face of such developments utilizing complex programming and data to advance social scientific methods and understanding, then, what is the role of the scholar conducting independent research? The work of researchers in universities is already subject to “metricization” from an assortment of measuring and calculating devices including bibliometrics, citation indices, workload models, transparent costing data, research and teaching quality assessments, and commercial university league tables, many increasingly enacted via code, software and algorithmic forms of power.
These devices play a large part in the formation of academics' professional identities. Gary Hall, writing in an article titled “#MySubjectivation,” argues that, besides the metricization of the academy, corporate social media platforms such as Twitter and Facebook are also now influencing how academics create, perform and circulate research, knowledge and theory. Academics are encouraged to be self-entrepreneurial bloggers and Tweeters, utilizing social media platforms and open access publishing environments to extend their networks, drive up citations and promote their professional profiles. Hall articulates how today’s new media are constitutive of a particular emergent “epistemic environment.” The epistemic environment of “traditional” academic knowledge production was based on the Romantic view of single authorship and creative genius materialized in writing, long-form argumentation, and the publication of books.
New social media infrastructures, however, are reshaping the epistemic environment of contemporary scholarly knowledge production. This is subsequently affecting the ways in which higher education professionals think, act and identify themselves, and thus how they research, how they generate knowledge, and how they theorize and explain the world. As Gary Hall states it, the emerging epistemic environment:
invents us and our own knowledge work, philosophy and minds, as much as we invent it, by virtue of the way it modifies and homogenizes our thought and our behaviour through its media technologies.
To put it more bluntly, academics are becoming their data, as mediated through complex coded infrastructures and devices. Geoffrey Bowker has written that “if you are not data, you don't exist”; the same is true for academics. The result is new knowledge of the world that has been co-produced with software code and algorithms.
Whether we are confronting the “end of theory” as computer-coded software devices and sophisticated algorithms increasingly mediate, augment, and even automate academic practice and knowledge production remains an open question. Is academic work really being homogenized and manipulated by the media machines of Google and Facebook, and are disciplinary expertise and knowledge production being displaced to the “algorithmists” of private R&D labs and commercial technology firms?
For researchers in Digital Media and Learning the task is to be open and alert to the current redistribution of research across these new infrastructures, devices, experts and organizations, and to recognize how our knowledge, theories and understandings of the world we are studying are being mediated, augmented and even co-produced by software code and algorithmic power.
Banner image credit: infocux Technologies http://flic.kr/p/dSHr87