The internet of things: it can help us manage our energy use even when we're away from home, let us admit visitors remotely, or make receiving packages easier. It can help us monitor our families, our home security, or our grocery use... and it can help others monitor us.
A recent Gizmodo investigation, for example, revealed that Amazon's smart doorbell/home security system Ring had major security vulnerabilities despite a company pledge to protect user privacy. Gizmodo was able to uncover the locations of thousands of Ring devices in a randomly chosen area of Washington, DC. While only the Ring users who chose to use the Neighbors app were exposed, this still represents a major vulnerability, ripe for exploitation.
Pop quiz: What do climate change and social media privacy have in common?
If you said, "a distracting and inaccurate focus on individual actions," you're correct! Congratulations! Pat yourself on the back and pour yourself a congratulatory beer, glass of wine, coffee, or soda.
Note: This post is for Laura, by request – hi Laura!
Note 2: This post is far more philosophical than I normally go, but I thought, what the heck, why not have fun with it?
Your online shadow.
Everyone has one. Even if you take care to not use platforms like Facebook, it’s highly likely that you have a shadow profile following you around the internet.
Platforms like Facebook and Google do it best. They collect all the data you and your friends or colleagues give them when you use their free services (but Google Maps is SO CONVENIENT!) and combine it with collected data about the other sites you visit (even after you log out), where you're logging in from, and whether you're on a mobile device or a computer. They combine all of this data and start to make predictions about you, which are either confirmed or adjusted depending on your online habits and the habits of those you are connected to. This is precisely why so many people think their phones are secretly listening to them and then delivering ads based on something they said. Your phone is not listening to you. It's more troubling than that. Your shadow profile has revealed your secrets (she's not very discreet!).
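To see why no microphone is needed, here's a deliberately tiny sketch (my own toy example, not any platform's real code) of the basic idea: pool signals from you and the people you're connected to, and predict interests you never expressed yourself.

```python
# Toy "shadow profile" prediction: a topic you never searched for can
# still be attributed to you if enough of your connections signal it.
from collections import Counter

def predict_interests(own_signals, friends_signals, threshold=2):
    """Guess likely interests by counting how often a topic appears
    across your own signals and those of your connections."""
    votes = Counter(own_signals)
    for friend in friends_signals:
        votes.update(friend)
    return {topic for topic, count in votes.items() if count >= threshold}

# You never searched for "camping gear", but two friends did --
# so the profile predicts it for you anyway.
me = ["running shoes", "coffee"]
friends = [["camping gear", "coffee"], ["camping gear"]]
print(sorted(predict_interests(me, friends)))  # ['camping gear', 'coffee']
```

Real systems are vastly more sophisticated, of course, but the principle is the same: the prediction comes from aggregation across your network, not from eavesdropping.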
Researchers at MIT, who are at the forefront of autonomous vehicle technology, have noticed that, paradoxically, when a little bit of assistive technology is added to a car, drivers become less safe. In other words, when people feel like technology is behind the wheel, they are more likely to be distracted drivers, and thus many of the autonomous technologies that are intended to make people safer actually do the opposite.
This is a classic unintended consequence of technology, like the ones described by Edward Tenner in his 1997 book Why Things Bite Back. To combat this issue, the smart folks at MIT decided to put a human-facing camera in the vehicle, which would look for distracted driving and compensate accordingly, as seen in this YouTube video. Rather than asking what social and psychological reasons drive people to engage in distracted driving, so that those reasons might be minimized, the best solution was determined to be adding another layer of technological assistance. Technology to solve the problem created by technology.
Over the last two years or so, the Canadian Government has been openly exploring how some government processes, such as the processing of lower-risk or routine immigration files, can be made more efficient through the use of AI (machine learning) algorithmic processes.
The good news is that the adoption of these systems has so far been guided by a digital framework which includes making the processes and software open by default whenever possible. These guidelines hint at a transparency that is necessary to mitigate algorithmic bias.
I usually only post to this blog once per week, but a news story caught my eye today since it concerns my sector (higher education), my country (Canada) and my passion (technology critique).
Mount Royal University in Calgary, Alberta is going to be the first organization in Canada to install an AI system for the purposes of security. This system consists of a network of cameras and a machine learning algorithm that spends the first few weeks learning what "normal" movement looks like on campus, then uses that baseline to detect if there might be a security issue. Deviations from normal, in this case, signal a potential "threat," or at least an event worth looking into. As described by the Vice-President of Security Management in a recent CBC article:
“when that pattern breaks, what it does, that screen comes to life and it shows the people in the security office where the pattern is now different and then it’s up to a human being to decide what to do about it,”
The Dunning-Kruger effect refers to a type of cognitive bias in which people assess their own knowledge of a topic or subject area as being greater than it actually is. Psychologists note that it tends to occur most frequently in people with a small amount of knowledge on a topic. In other words, it takes a certain amount of knowledge before we can actually know how little we know.
In developers' conferences and earnings calls, the biggest of the big tech companies are trying to develop unique value propositions that paint them as friendly, responsive, and attuned to the needs of their customers. Then the mainstream technology media (often overworked, understaffed, and reliant on the good graces of big tech for continued access to stories) generally reports these messages at face value. News in the last week focused on Facebook's pivot toward community groups, Google's exciting universal translator, or Amazon's claim that small- and medium-sized business partners made an average of $90K last year through its platform.
The beauty of participatory and social media has always been its ability to connect people. That is also the great evil of these platforms.
Social media allows what Barry Wellman calls networked individualism. In contrast to geographic or familial communities where we are brought together through accidents of fate like where or with whom we were born, networked individuals are not forced to conform to community norms to fit in. Instead, they can use network technologies to maintain their individual quirks and find others who share their unique interests and ideals in online communities. This is a beautiful and terrifying vision.