The Need for Frank Discussions About Digital Identity and Trust

Over the past year, I have been writing profiles of core members of the design team developing the Center for Solutions to Online Violence and asking this group of educators to reflect on the lessons learned about abusive and threatening online behavior. This month, I spoke with Associate Professor Rebecca Richards, a rhetoric and writing specialist at St. Olaf College in Minnesota, about how she brings her experiences as a former public school teacher in urban and rural settings to her scholarly thinking about the challenges that teens face in negotiating their online lives.

“My appointment is in English, as well as in Women’s and Gender Studies and Media Studies, working at the intersection of those fields with digital communities, feminist communities, and queer communities,” Richards explained. However, she also described how her commitment to responding to “more news breaking about online harassment” was rooted in more than just her academic skillset. She characterized this call to action as “partially academic and mostly personal” and an obligation for any “human being who loves people.” She observed that it was extremely common to see “students and family members” in her own inner circles who were “challenged with how to help young people navigate murky terrain” as bullying moves online and ubiquitous digital communication leaves students few other ways to connect with peers.

Recently, technology companies like Facebook and Google have been tasking their engineers with developing systems that use machine learning algorithms to target abusive behavior and disrupt the networks of abusers. For example, the Association for Computational Linguistics hosted a Workshop on Abusive Language Online with activists, advocates, policy critics, and technologists to discuss the hard problems associated with automated detection involving natural language processing and the potential risks to free speech and open participation.

Richards expressed hesitation about adopting technological solutions too uncritically. “If you look at interpersonal abuse and violence without the digital aspect, what we know is that it takes many forms and that the experiences of people never follow a single algorithm. By asking ‘does this count?’ we might tick off a discursive box, but it is only a stopgap measure. When you create this category of digital abuse online, you often find types of violence that do not fit into a bureaucratic category, which can make it harder for people to come forward and identify their lived experiences and might lead them to be labeled as pathological, or unique to the individual.” Richards pointed to scholarship on emotional abuse and the obstacles to getting victims “the resources they need” in cautioning enthusiasts for “machine reading of online abuse” against formalizing “benchmarks that define online harassment, online violence, and cyberbullying.”

Although the term “rhetoric” is often miscast as indicative of superfluous or deceptive words, Richards suggested that framing approaches to online identity rhetorically could be very beneficial to students struggling with acceptance. She cited the work of Douglas Eyman as being influential in her own attempts to “put rhetoric back into a non-pejorative framework” and “move backward to reframe meaning-making in a more holistic way.” She suggested that the work of Judith Butler on gender construction and performativity could reclaim the exploration of behavior that might otherwise be dismissed as inauthentic. “I believe identity is not fixed. It is flexible and fluid, particularly for digital adolescents.” She described how teachers and mentors might be “still trying to map” student conduct without having “the best framework” for “figuring out your gender identity and playing with it” in a time when the dominant culture “reterritorializes the body.” She asserted that a digital rhetoric approach could “give new productive spaces” while “also allowing certain performances to fall by the wayside.” As she noted, “the CSOV is trying to address how people play with digital identities and what happens when that identity keeps circulating beyond your own control.”

“Because I see myself primarily as a teacher, I’d love to see the CSOV continue to develop more tools for teachers who are using digital communication as primary sources and as practice spaces. We need more teacher materials and teacher support to prevent instructors from inadvertently putting their students in harm’s way or using identity materials inappropriately in their teaching practice. One thing that brought me to the project was beginning my career in 2002 as a high school teacher during an explosion of social media and calls for educators to incorporate Web 2.0. There was so much pressure to produce multimodal texts and perform public rhetoric. But there was not a lot of time for K-12 teachers to reflect critically and collaborate when that push came, along with the exigency of benchmarks driven by certain kinds of initiatives.”

Richards admitted that even the most well-intentioned teachers can adopt problematic practices. “This has happened to me, when I was not quite prepared.” She argued that there can be many unintended consequences from “publishing a public blog, commenting on news articles, creating a hashtag, or encouraging analysis of a social media issue and circulating a document.”

As someone creating a database of materials about female heads of state and a research scholar analyzing the defeat of Hillary Clinton in the 2016 election, Richards worries about the harm done to young people by the campaign’s toxic combination of “sexist actions in digital spaces” and “a very superficial framing of what feminism looks like and what empowerment is and what female power should be.” She asserted that more elevated “debates and gendered performances of oratory” might often be “sound-bitten into memes,” as in the case of the Hillary Clinton “shimmy shake,” a moment that suggests women can only respond to sexism with a smile and by shaking it off.

In encouraging women and people of color to pursue STEM careers, Richards insisted that it was important to “navigate issues with honesty” and acknowledge “some choppy waters” for young people in underrepresented groups. “It’s about encouraging people’s skill sets while also giving them the lay of the land of what they are up against.” To explain her point, she used a metaphor from Sara Ahmed about a teacup thrown against a wall. “It breaks, and you have a broken teacup, but no one talks about the wall or why the teacup was thrown in the first place. In encouraging women and other minorities in STEM, we must also talk to them about the fact that there are walls and practices of exclusion that do harm. It is important for them to hear and for us to be transparent, which I try to be with the young people who I mentor. I would hope that K-12 teachers would do the same.”

“It all goes back to the basic challenge for grant applicants and the issue of ‘what is trust?’ and ‘what do we mean by trust?’ We need to talk to young people about what trust really is in their online activities and online writing when they don’t have interpersonal, face-to-face, longitudinal relationships.” Richards expressed her concern that too often young people “assume goodwill.”

“We need to have more frank communications about wanting to do something good for the world and engage in social movements, while mentoring young people in how to set limits on their own trust. Caring about social issues, feminism, environmentalism, and anti-racism should not be approached with a boundless sense of trust. Trust is something earned; it is sometimes slow, challenging work with affective and intellectual labor done in fits and starts, and that’s okay. In the era of digital education and technologies of learning, there’s a lot of speed in play, but I would like to see teachers, myself included, slow down. And, to do that, we need people — parents, students, administrators, and legislators — to support this shift of pace.”