I want to take a moment to examine how data collection has changed for those of us who teach and assess students.
In the digitally augmented classroom, we should be concerned with both corporate privacy and interpersonal privacy. While we have limited control over the corporate tracking and data collection that take place, we can allow varying levels of interpersonal privacy in the digital classroom. Making participation highly visible, down to seeing who contributed which line in a paper or which slide in a slideshow, carries echoes of the dreaded panopticon.
Often, when I speak with people in higher education who are moving toward digital or online learning, whether through individual projects or a learning management system, they get really excited about being able to better track student work. I’ve heard multiple times that the specific thing generating this excitement is the ability to “really be able to see who contributed what and how long they spent on it.” I am still in the process of trying to be okay with this sentiment, but so far I am failing. Often, the same people who are excited about their own ability to see student data and information get upset or overwhelmed when thinking about student privacy in relation to the unnamed super-villain company that stores or sells all the student data from learning management systems and web and mobile apps.
I keep asking myself, “What gives?” Here is where my thinking stands. Lurking has been removed from our digital experience by the massive amount of tracking and data collection that goes on. The same is true of online classroom spaces, perhaps even more so, because many of us dream of a classroom with measurable 100% participation, and digital media seem capable of forcing that to happen. There is a big question of privacy we need to ask about these spaces, though, as critically as we question the privacy policies of big companies. I couldn’t help but think about this lack of self-reflection while listening to the recent talk given at Data & Society by Elana Zeide on Student Privacy & Big Data, especially as it relates to lurking.
In privacy information management, private information is understood to be “information that makes people feel some level of vulnerability, thereby resulting in the desire to control further dissemination of that information” (Child and Starcher 2016). Entering a classroom space and being open to learning requires, at times, that a student release “private information.” Additionally, the student will produce data she or he will share: assignments, potentially grades, and other disclosures that happen in the learning space. At the same time, as digital tools and learning analytics get better at revealing gaps in learning, students must be informed of the scope of data being collected, how that information moves with them, and what the implications or purpose of that movement might be. We need to do this so students can have a sense of safety and control over what they want to control, and so we can create spaces that remain areas of productive lurking.
For anxious students who feel uncertain of their abilities in a course, the requirement to do their work digitally, with all the time and effort they put in tracked, might add to that anxiety. For students who are already at risk, the anxiety might be heightened, especially if they continue to underperform. The ability to be quiet, to lurk in a space, provides a level of comfort.
While the effects of lurking with regard to computer anxiety and information privacy concerns have been explored on social media sites (Osatuyi 2015), the concerns laid out in that paper also exist in the classroom. I imagine the link is so seamless because the move to the digital classroom is a move toward social pedagogy. The boundaries of privacy will expand as a community is created, but that does not mean there is an expectation of no boundaries. Tracking everything has the potential to create unanticipated boundaries for students: if all information might be used, then all of it might be used in a negative way, even when the intention is positive. When students are working at different skill levels, it seems unreasonable to assume equal participation. However, “vicarious learning” (Dennen 2008) might be taking place, a form of digitally enabled peer-to-peer learning through observation. This low-stakes engagement seems ideal for those at-risk students who might feel anxiety because they are underperforming or not as advanced as other students in the classroom space. Another thing the digital traces of student work do not capture is meaningful reflection time. Often in classroom spaces, we can pinpoint the students who will speak up and those who will stay quiet. A digital learning space is no guarantee that these roles will be reversed.
We mistake numbers for meaningful engagement. If we treat all recorded interaction as meaningful and assessable in learning spaces, we risk limiting participation by allowing all interactions, even frivolous or meaningless ones, to impact the final learning goal. Additionally, by prioritizing what can be seen and measured over potentially reflective lurking, we fail to honor the time a student might need to make learning meaningful for themselves and their learning goals.
Child, Jeffrey T., and Shawn C. Starcher. “Fuzzy Facebook privacy boundaries: Exploring mediated lurking, vague-booking, and Facebook privacy management.” Computers in Human Behavior 54 (2016): 483-490.
Dennen, Vanessa Paz. “Pedagogical lurking: Student engagement in non-posting discussion behavior.” Computers in Human Behavior 24, no. 4 (2008): 1624-1633.
Osatuyi, Babajide. “Is lurking an anxiety-masking strategy on social media sites? The effects of lurking and computer anxiety on explaining information privacy concern on social media platforms.” Computers in Human Behavior 49 (2015): 324-332.
Banner image: r2hox