Our personal data, and the ways private companies harvest and monetize it, play an increasingly powerful role in modern life.
Throughout 2023, corporations have continued to collect our personal data, sell it to governments, use it to draw inferences about us, and exacerbate existing structural inequalities across society.
Earlier this year, we filed comments with the U.S. National Telecommunications and Information Administration addressing the ways that corporate data surveillance practices cause discrimination against people of color, women, and other vulnerable groups.
That is why data privacy legislation is civil rights legislation.
In early October, a bad actor claimed they were selling stolen data from the genetic testing service 23andMe.
When it comes to corporate data surveillance, users' incomes can alter their threat models.
Lower-income people are often less able to avoid corporate harvesting of their data: some lower-priced technologies collect more data than their pricier counterparts, while others come with malicious programs pre-installed.
Lower-income people may suffer the most from data breaches, because it costs money and takes considerable time to freeze and monitor credit reports, and to obtain identity theft prevention services.
Disparities in whose data is collected by corporations lead to disparities in whose data is sold by corporations to government agencies.
As we explained this year, even the U.S. Director of National Intelligence thinks the government should stop buying corporate surveillance data.
Structural inequalities affect whose data is purchased by governments.
When government agencies have access to the vast reservoir of personal data that businesses have collected from us, bias is a likely outcome.
This year we've also repeatedly blown the whistle on the ways that automakers stockpile data about how we drive, and about where self-driving cars take us.
There is an active government and private market for vehicle data, including location data, which is difficult if not impossible to de-identify.
Police have seized location data about people attending Black-led protests against police violence and racism.
Further, location data can have a disparate impact on consumers who may be penalized simply for living in a particular neighborhood.
Technologies developed by businesses for governments can yield discriminatory results.
Earlier this year, the Government Accountability Office published a report highlighting the inadequate and nonexistent rules for how federal agencies use face recognition, underlining what we've said over and over again: governments cannot be trusted with this flawed and dangerous technology.
Developments throughout 2023 affirm that we need to reduce the amount of data that corporations can collect and sell to end the disparate impacts caused by corporate data processing.
The pervasive ecosystem of data surveillance is a civil rights problem, and as we head into 2024 we must continue thinking about data privacy and civil rights as parts of the same problem.
Published on www.eff.org on Sun, 24 Dec 2023.