I recently spent a week with my colleagues at the Digital Pedagogy Lab (DPL), an annual event bringing together teachers, administrators, designers, librarians, and other educators thinking very critically about 1) the purpose of education and the work they do within it and 2) the (often deeply troubling) relationship between technology and education. This was my third year attending DPL, and when the 2018 program was announced, I was delighted to see that the number of tracks had grown to include a week-long deep dive into digital privacy. As a cyberfeminist working in the digital learning field, I have considered this topic extremely important for some time. When I began a graduate program in international policy in 2013, I had already made a habit of covering my laptop webcam with a sticky note. At the time, most of my peers interpreted this behavior as unwarranted paranoia (except for the students in the nonproliferation and terrorism studies program). Since then, the relationship between surveillance capitalism and the outcome of the 2016 US presidential election has raised some consciousness around the dangers of ad-revenue-driven platforms and their predatory terms of use and privacy policies. But, even now, when I talk to friends and colleagues about the importance of privacy (I tend to bring it up — a lot), I’m often met with this familiar response:

“I have nothing to hide, so I don’t care.”

In that case, if you could just give me the keys to your house, as well as your Social Security number and your mother’s maiden name, that would be great.

We tend to think of privacy as something controlled at the individual level, but privacy is also deeply relational. For example, any time you download an app and allow it to access your contacts, that decision has implications for every person in your contact list. What does that data point mean for your Black Lives Matter activist friend, or your undocumented neighbor, or your trans colleague?

Or, specific to an education context, I’ve also heard:

“Students don’t care about privacy — they’re putting all their information out there, anyway.”

They’re actually not putting all their information out there. The data collected and sold by platforms typically includes:

  • given data — data that you provide/volunteer
  • extracted data — data that is taken from you without you volunteering it (your location, for example, anytime you open the Facebook app)
  • inferred data — assumptions that a platform makes about you based on the first two categories

At DPL, these were just a few of the privacy-related myths that we identified and debunked as a group. We were a small enough cohort that we were able to move through the week very conversationally, and our teachers gave us the space to ask hard questions and go down rabbit holes. This format was exactly what I had been craving, because I have found that understanding the digital privacy landscape is all about knowing 1) what questions to ask and 2) whom to ask. Answering one question begets ten new questions, and every day there is more to learn. This work is harrowing, and at times I have felt like the more I know, the more despair I feel (see this, and this, and this, and this). Therefore, building in action — steps one can take to combat the dangers — is essential to operating in this space. Throughout the week, we focused on incremental changes that can be made at the individual level (to be safer in the digital spaces within which we live and work), as well as how to center learners in conversations (and decisions) around privacy and security within our institutions.

One of my goals for the week was to become more effective at seeding conversations about digital privacy in my communities. How do I get relatives, friends, students, administrators, faculty — particularly those who are new to this topic, or those who don’t think privacy is important — in the room? How do I get them to stay in the room? At DPL, our teachers used a helpful framework to separate out:

  • myths (pervasive misconceptions about digital privacy, why it matters, and what people believe about it).
  • risks (the actual dangers).
  • strategies (how to protect against those risks).

As an example, let’s take one of the myths from above: “Students don’t care about privacy — they’re putting all their information out there, anyway.”

My colleague and teacher, Chris Gilliard, shared how he engages his students when they say they don’t care about their privacy online. He argues (and I agree with him) that students do, in fact, care. They might care differently, and they don’t know what they don’t know, but they care. The “students don’t care” myth is particularly dangerous because it ties into another myth we often hear from Silicon Valley: “privacy kills innovation.” In response to that, @funnymonkey said it best:

“If you claim that protecting privacy impairs your ability to innovate and weakens your business model, you fail to understand innovation, and your business might be predicated on exploitation.”

Chris asks his students: “What kind of online behavior is creepy?” Their responses reveal what they do care about — what behavior they find unacceptable. He then frames the conversation accordingly and points to examples of how platforms enact those problematic behaviors. He also does things like:

  • have students define “privacy” and “online identity”.
  • have students Google their names to see what comes up, then go through Google’s privacy settings with them to show what’s there (and what they can do about it).
  • have students investigate the mapping function of their smartphones (if they have them). If they don’t disable location services, their phones know where they’ve been and store that information for several months. And even when location services are disabled, plenty of location data can still be extracted.
  • problematize learning analytics by talking to students about how predictions about their academic performance are made.
  • talk to students about information asymmetry: Ask them how they feel about platforms knowing a lot more about them than they know about the platforms and their black box algorithms. How do they feel about their information being shared (and sold) inappropriately?

As I continue learning and processing my week at DPL, I endeavor to cling tightly to hope through conversations and action.
