Editor’s note: The following is a discussion between Data & Society Research Institute researchers Monica Bulger and Mikaela Pitcan and Jade Davis, associate director of Digital Learning Projects at LaGuardia Community College. In light of the travel ban and recent border demands to view social media accounts, the scholars discuss students who might be vulnerable in the new environment, as well as how this might be a moment for teachers and students to reconsider teaching practice and approaches to digital literacy. Monica: What prompted this interview is that we were talking the other day about unintended consequences of using personalized
I remember the first year I started teaching. It was exhilarating and confusing, and it led me to a mini-existential crisis of sorts, which I imagine happens often when you walk into a class full of faces staring at you, students who assume you have all the answers and the key to their future. Why else would they be there? I had a conversation with a dear friend and asked her what the point of teaching was. She said: “To remember that students don’t know what they don’t know, but that they are in that space to learn
As a researcher who actively engages in tech policy, Seda Gürses considers how a variety of actors may disrupt online wellbeing. She also brings an international perspective to her collaborative work. As part of the Trust Challenge team launching the Center for Solutions to Online Violence, Gürses contributes her expertise as a computer scientist and privacy advocate. Now based at Princeton, she previously held positions at New York University and the University of Leuven. In an interview with DML Central, Gürses mused about how her earliest digital literacy had been shaped by childhood experiences.
I want to take a moment to examine how data collection has changed for those of us who teach and assess students. In the digitally augmented classroom, there should be concern for both corporate privacy and interpersonal privacy. While we have limited control over the corporate tracking and data collection that take place, it is possible to allow varying levels of interpersonal privacy in the digital classroom. Making participation highly visible, down to seeing who contributed which line in a paper or which slide in a slideshow, brings in echoes of the dreaded panopticon. Often, when I speak to
Random autobiographical story: When in school, as part of our elocution classes, we had a dragon for a teacher, who used to prowl around with a menacing-looking wooden measuring scale as we obediently enunciated our words and practiced tongue twisters in an attempt to improve our diction and pronunciation. “She sells shellfish on the seashore,” the class chanted in a well-trained chorus, every voice a whimper, trying to find comfort in remaining anonymous. But in the middle of the group chanting, the scale would rise and there would be silence. One petrified kid was stared
Recent news reports have begun to reveal how various analytics companies are now data mining millions of children. The learning analytics company Knewton, for example, claims that 4.1 million students are now using its proficiency-based adaptive learning platform, which has served 3.5 billion total recommendations between May 2013 and May 2014 alone. The role of these predictive analytics platforms and recommender systems in education is increasingly causing political and parental concerns, largely related to privacy. Less acknowledged, however, is the increasingly autonomous and automated capacity of the software algorithms working in the background of these platforms.
I recently realized that it was time to move. My oldest son is 7, and he’d learned that “everything was on the Internet” from a schoolmate and wanted to see if our “house” was. We lived in a medium-sized apartment complex where the apartments all share the same address. We were the ground-level apartment, with a townhouse above us, but inside the complex. I put the address into Google, switched on Street View, and, much to my surprise, found I could use the little arrows to tour my apartment complex. When I made my way to our front
Around this time last year, thanks to the whistleblower Edward Snowden, U.S. citizens found out just how little online privacy they have. For those outside the U.S., the revelations were potentially even worse news, as they weren’t granted the protection of the U.S. Constitution. The Snowden revelations were, and are, shocking. We need to reform government and its relationship to the digital world we inhabit. This will likely be a slow process best achieved through democratic means. The chances of doing so are relatively good, as we have some of the best minds of our generation
The seduction of ‘Big Data’ lies in its promise of greater knowledge. The large amounts of data created as a by-product of our digital interactions, and the increased computing capacity to analyse them, offer the possibility of knowing more about ourselves and the world around us. Big Data promises to make the world less mysterious and more predictable. This is not the first time that new technologies of data have changed our view of the world. In the nineteenth century, statistical ‘objective knowledge’ supplanted the personal knowledge of upper-class educated gentlemen as the main way in which
Across the US, schools are back in session. My university started back last week, and, as I usually do, I spent the first day of class discussing the syllabus and explaining to students my expectations for them and the course. When I do this, I always devote some time to explaining the different digital tools we will use in the course and suggesting additional tools students can use to make their lives easier. In this additional-tools category, I like to emphasize the importance of backing up one’s data, and in one of my classes I suggested