There’s an interesting paper by Becky Kazansky, ‘It depends on your threat model’: the anticipatory dimensions of resistance to data-driven surveillance. The author critiques ‘anticipatory data practices’, a collection of techniques that include my own work, as presented to civil society activists. It opens: “While many forms of data-driven surveillance are now a ‘fact’ of contemporary life amidst datafication, obtaining concrete knowledge of how different institutions exploit data presents an ongoing challenge, requiring the expertise and power to untangle increasingly complex and opaque technological and institutional arrangements.”
Her work advances my thinking and the concerns that I’ve had since our experiment with threat modeling the privacy of Seattle residents. In particular, it shows some of the limits of the Four Question Framework. I see two particular limits with the questions “what are we working on” and “what are we going to do about it”.
First, normal people don’t “work on” technology the way technologists do. They’re working on living their lives, and hoping technology helps. Most people don’t have a good way to understand the technology they’re using, and especially not at reasonable cost. Much of the technology which influences our lives and our privacy and security is outside of our control — for example, the electronic locks on an apartment building, the RFID readers on a bus, the cameras and microphones on telephone poles. That’s technology working on us, not us working on technology.
Second, the options available to normal people are often exceptionally limited: phones are tracked by phone companies despite anything we do (except possibly putting them in airplane mode, which seems to be getting more “nuanced” every year). Living without a phone or without Facebook is hard, and even if you’re not on Facebook, Facebook maintains a shadow profile of you. We might spend hours configuring apps, “opting out”, protecting ourselves, only to discover that the latest iOS update turns Bluetooth or iCloud back on. If you use a Pi-Hole to protect your home from the devices in it, things randomly break in hard-to-debug ways. You can’t keep up.
Her paper is thoughtful and challenging. We need to do better at helping real people navigate the world and protect themselves. There’s an interesting approach in Apple’s recent guidance, Device and Data Access when Personal Safety is At Risk. (I hope to have more to say about that soon.)
I do think that some degree of modeling and analysis is helpful for protecting normal folks, and even activists, who want to have a digital life. But the tooling we have isn’t yet what they need.
Photo: Jamie Street.