If I were to rewrite the iconic 1983 Police hit “Every Breath You Take” for the information age, it might go something like this:
Every pic you take,
Every move you make,
Every post you make,
Every word you say,
They’ll be watching you.
Oh can’t you see,
They’re using more than cookies.
They’re with you on every road,
On every app you download.
You clicked “I agree”,
And now they see what you see,
They’re tracking you,
And all your pictures too,
They’re always watching you.
Now before you accuse me of conspiracy theories and hand me a brand new tinfoil hat, it’s important to understand that I’m not the only one who feels this way. Zeynep Tufekci, a person who is much smarter than I am and who is an associate professor at the University of North Carolina’s School of Information and Library Science, just published a compelling op-ed in the New York Times about this very issue. Her piece was inspired by Strava, the run- and bike-route tracking app. You see, Strava inadvertently found itself in a bit of hot water when it was discovered that you could reverse engineer the locations of secret military and CIA bases by identifying the running routes of service members.
Tufekci rightly writes in her op-ed about the dangers of the large-scale data collection that represents the monetization strategy for most of the social web. She argues that, combined with the opaque machine learning algorithms that constantly collect and analyze that data, even if users fully read and consent to the terms of service (and who actually reads those anyway?), the terms can’t account for all possible uses of data and all possible breaches of privacy. She writes:
Part of the problem with the ideal of individualized informed consent is that it assumes companies have the ability to inform us about the risks we are consenting to. They don’t. Strava surely did not intend to reveal the GPS coordinates of a possible Central Intelligence Agency annex in Mogadishu, Somalia — but it may have done just that. Even if all technology companies meant well and acted in good faith, they would not be in a position to let you know what exactly you were signing up for.
Now let that sink in for a moment: even the companies themselves may not be able to know all of the consequences of their data collection. So how can they protect user data from threats they cannot possibly predict?
The more we rely on apps for parts of our lives that used to belong to the private sphere – fitness tracking, directions to a friend’s house, personal messages, television, meal planning, and so on – the more data we offer to a variety of companies, many of whom may share our data or use it, deliberately or otherwise, for purposes we have no say over. So while some people may choose to give up their data in some contexts for the use of a specific app or platform, it’s hard to make that choice if you don’t know exactly what you are consenting to. Would you consent to giving the location of your children away to just anyone? Would you consent to someone using a camera in your house to see when you are sleeping? Would you consent to someone listening in on all your conversations, including the private ones? If the answer is no, or even “maybe not under certain circumstances,” then you may just want to remove most apps (including Facebook) from your phone, because there is evidence that this sort of privacy breach is already happening regularly.
In light of all of this, Tufekci suggests strict controls and regulations in favor of the public – rules that dictate what data may be stored, and that ensure data is only stored when it is in the vested interest of the consumer rather than the company. It will take concerted effort and activism by consumers for this to occur, but I agree that it’s in all of our best interests to demand this from technology companies before we’re no longer in a position to do so.