The most recent series of the popular conspiracy drama “Homeland” features a shadow intelligence agency dedicated to producing and circulating fake news and computational propaganda via fake social media user accounts. Run by a TV shock-jock whose authority seems to surpass even that of the CIA, and who bears an obvious resemblance to Steve Bannon, the agency is primarily staffed by young coders and programmers, who have been tasked with waging a secret information war against an incoming President-elect.
This plot line in “Homeland” dramatizes troubling current events: computer coding is now understood to play a significant role in political life, with massive implications for civic participation and citizenship.
The issue of how social media filters people’s exposure to political perspectives has become one of the defining debates in the wake of the Brexit referendum in the U.K. and the U.S. election, although the role of social media in political turbulence has been unfolding for several years.
An article in the tech culture magazine Wired on the day of the U.S. election even asked readers, uncharacteristically, to consider the “dark side of tech”:
Even as the internet has made it easier to spread information and knowledge, it’s made it just as easy to undermine the truth. On the internet, all ideas appear equal, even when they’re lies. … Social media exacerbates this problem, allowing people to fall easily into echo chambers that circulate their own versions of the truth. … Both Facebook and Twitter are now grappling with how to stem the spread of disinformation on their platforms, without becoming the sole arbiters of truth on the internet.
The social media researcher Jonathan Albright has carefully documented the emergence of a right wing fake news mega-network, arguing that it acts as a vast algorithmic, data-mining “micro-propaganda machine” via social media platforms.
As a consequence, “platforms like Twitter and Facebook now provide a structure for our political lives,” as Phil Howard, a sociologist of information and international affairs, has argued. He claims that social algorithms allow “large volumes of fake news stories, false factoids, and absurd claims” to be “passed over social media networks, often by Twitter’s highly automated accounts and Facebook’s algorithms.”
Fake news, post-truth alternative facts, computational propaganda and political bots are the defining digital media problems of our current time.
In response to these emerging challenges, a number of media and education scholars have begun to consider how we might educate young people to inhabit and make sense of such an environment, David Buckingham and danah boyd among them. Neither has easy solutions. boyd wonders whether existing approaches to media literacy have backfired. She says that addressing so-called fake news is “going to require a cultural change about how we make sense of information, whom we trust, and how we understand our own role in grappling with information. Quick and easy solutions may make the controversy go away, but they won’t address the underlying problems.”
My sense is that we might also need to rethink dominant approaches to teaching kids to code in this emerging context of fake news and computational propaganda.
As the “Homeland” plot indicates, coding skills can be put to all kinds of purposes. Media-literate young programmers with extreme political views may not have quite the power that “Homeland” suggests. But, as the brilliant and terrifying 2016 documentary film “Zero Days” has shown, the cyber-warfare plots of “Homeland” may not be far from the reality.
Which is why it strikes me that the learning to code movement needs to be clearer in its commitment to new forms of digital literacy and digital citizenship education. England was one of the first countries to embed coding in the curriculum for all schools. For many of its original advocates, knowing how computers, code and algorithms work would be valuable for informed citizenship.
The reality, though, is that coding in the curriculum, and many other learning to code schemes, have tended to overemphasize either economically valuable skills for the software engineering sector, or high-status academic computer science knowledge and skills. There has been far too little focus on enabling young people to appreciate the social consequences of code and algorithms.
In many ways, this may reflect the kind of professional culture of software engineering. As the media philosopher Ian Bogost has noted, software engineers often possess little knowledge of the wider social contexts in which their products will work:
An engineer is a professional who designs, builds, and maintains systems. But to engineer means skillfully, artfully, or even deviously contriving an outcome. To engineer is to jury-rig, to get something working more or less, for a time. Sufficiently enough that it serves an immediately obvious purpose, but without concern or perhaps even awareness of its longevity.
There is now a rising tide of concern that learning to code initiatives, like the software engineering sector, may have lost sight of the social effects of technical systems.
A recent newspaper piece ran with the headline “Don’t teach your kids coding, teach them to live online.” Its opening lines read:
“Don’t teach your kids coding,” says New York Times journalist Thomas Friedman. “Well – teach it if you want. But before you teach them coding, teach them digital civics: how to talk to one another on the internet, how to understand fact from fiction.” … Friedman identified a problem that education systems are only now beginning to wrestle with. Life is largely lived online, and schools do not prepare children for it. It’s not just about keeping them safe from predators, cyberbullies, porn and identity theft: it’s also about having an ethical framework, and the skills to assess the reliability of information.
In the U.K., the House of Lords has recently issued a similar assessment of British education, claiming it does not adequately prepare children to “thrive online.”
These concerns suggest that recent moves to embed computing, coding and the understanding of how algorithms work in the curriculum have not led to any real progress in educating young people to be critically aware consumers and producers of digital media.
“Homeland” is a dramatic exaggeration, for sure. But the unfolding political turbulence associated with growing youthful expertise in programming and algorithm design suggests an urgent need to revisit the purposes to which coding and computing schemes for kids are being put.
We need to get beyond the rather naïve and utopian ideal that if you can understand what an algorithm is and how to make one, then you can program the computer rather than it programming you. A different sort of knowledge is required to understand the social power of algorithms.
We also need to consider to what extent coding in schools should be synonymous with software development and engineering, and get realistic with young people about the intense digital labour conditions in the software sector — or even the psychopathic managers who run it.
Coding can, of course, be fun and creative for kids. But, it can also lead to harmful and destructive effects. It needs to be taught with a firmer commitment to exploring the social consequences and power of the products of programming.
If kids need to learn to code, it should be for digital citizenship, not to become complicit with computational propaganda.
Banner image credit: Christiaan Colen