Digital Surveillance Is Class Warfare

(MOTHERBOARD) The many phones, laptops, and sundry gadgets most of us use every day track our location, our habits, and our preferences. The privacy threats may be pervasive, but not everybody is affected equally: the poor and marginalized get the worst of it.

That’s according to a new study from researchers at the Data & Society Research Institute, which surveyed 3,000 US adults about their online habits. It found that people living in households making under $20,000 a year use a smartphone as their main internet-browsing device far more often (63 percent) than people in wealthier households (21 percent). That finding alone doesn’t paint a complete picture: a separate 2017 Pew Research study also showed that black and Hispanic people are more reliant on their phones than white people.

The problem with this, though, is that your phone is a little spy in your pocket, much more so than the average laptop. If you’re not meticulous with your privacy settings, apps can send all sorts of personal information back to their corporate makers, including your location. The poor who rely on smartphones are thus more exposed to this surveillance.

“Thanks to reduced access to certain privacy controls on an app versus its desktop version, and these devices’ vulnerability to data collection via location sharing and in-store tracking, a greater amount of data is collected on marginalized groups,” Data & Society researcher and study co-author Mary Madden said in an interview.

The survey didn’t tackle why this disparity in phone use exists, but one likely reason is cost: many providers offer sign-up deals that include a cheap smartphone, while a desktop still costs hundreds of dollars and requires a home internet connection. The effect is systemic: wealth disparity pushes people toward a technology that further exposes them to the very economic forces that disadvantage them.

Data collected from poor and marginalized people can be used to target ads at people in precarious circumstances, Madden said (Facebook recently presented research to advertisers on how the platform can identify depressed teens), or to train algorithms that could eventually be used to discriminate against those same groups, for example when setting insurance premiums or deciding bail eligibility. A growing body of research shows that bias in data flows upward, so that decision-making algorithms end up reproducing human prejudices.
