Privacy Is Not Your Responsibility

Opinion

The idea that you have control is an insidious illusion.


Privacy invasion is the subject of an excellent article last week from The Times’s newsroom about an app from the University of Alabama that’s “using location-tracking technology from students’ phones” at football games “to see who skips out and who stays.” Basically: Alabama’s football team routinely obliterates weak competitors at home games; students leave the stadium early to go drink; the school wants them to stay. So they built an app that issues rewards points for staying, which, in turn, gives users access to better tickets for the big games at the end of the year.

It’s an interesting example of a truly consenting privacy invasion. The students know their location is being monitored and are getting something in return. This ostensibly straightforward exchange of goods and services is probably why Alabama’s athletic director told The Times that “privacy concerns rarely came up when the program was being discussed with other departments and student groups.”

It may seem as if there are no real losers here, but as Adam Schwartz, a lawyer for the Electronic Frontier Foundation, said in the article, the app sets a bad precedent by offering an incentive for students to give up their privacy. “A public university is a teacher, telling students what is proper in a democratic society,” he argued.

I’ve been thinking a lot about Schwartz’s critique in recent days and how it could apply broadly to Silicon Valley, advertisers, and those who seek to exploit data. Whether it’s downloading an app or monitoring browsing behavior with cookies or pixel trackers, the decision to use this technology is framed as something you opt into. It’s a choice. But when technology is everywhere, it’s hard to know how much free will we really have. Over at OneZero, Colin Horgan makes this argument eloquently:

The more indispensable an internet connection becomes, the less choice we have, which means we have less and less autonomy, a key element of being capable of exercising control over our lives. It’s difficult to expect someone to gain control over something they’ve never had the option to control in the first place.

Privacy is too often framed around choice and consent by those who are doing the invading. It puts the onus on the user. But the more I write and report on the issue, the more unfair this “personal responsibility” frame seems. We frequently present privacy as something you can protect on your own (see: this letter’s “tips of the week” section). But tips like using password managers or ad blockers often feel like applying a Band-Aid to a gunshot wound. Even with privacy best practices, engaging with most technology and living life online means having your data exploited constantly — there’s no choice.

To really change the privacy conversation, the burden of data protection needs to move from individuals to institutions. Horgan argues that to do this, we need to shed the language of traditional tech marketing, which depicts users as being in control of their data (meanwhile, it’s the technology that’s described as “vulnerable” when it is hacked, exposing our information). “We need to realize it’s not technology that is vulnerable; it’s us,” he writes. “Every discussion of privacy has to begin there: We have no control. We are powerless and alone.”

That illusion of control is insidious.

The University of Alabama app may seem a minor thing, but it’s representative of the way bigger institutions offer incentives to exploit the very people they ought to protect. We deserve better. In the University of Alabama’s case, this means not putting students in a position where they’ll be rewarded with better seats at bowl games for sacrificing their location data. For tech companies, it means not making us hand over our data as a requirement for participation in today’s internet.
