
Inside a Misfiring Government Data Machine


Last week, WIRED published a series of in-depth, data-driven stories about a problematic algorithm that the Dutch city of Rotterdam deployed with the aim of rooting out welfare fraud.

Working with Lighthouse Reports, a European organization specializing in investigative journalism, WIRED gained access to the inner workings of the algorithm under freedom-of-information laws and explored how it assesses who is most likely to commit fraud.

We found that the algorithm discriminates on the basis of ethnicity and gender, unfairly giving women and minorities higher risk scores, which can lead to investigations that cause substantial harm to claimants' personal lives. An interactive article delves into the guts of the algorithm, taking you through two hypothetical examples to show that while race and gender are not among the factors fed into the algorithm, other data, such as a person's Dutch language proficiency, can act as proxies that enable discrimination.
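To make the proxy effect concrete, here is a minimal, hypothetical sketch of the mechanism. It is not the actual Rotterdam model: the features, weights, and threshold below are invented purely for illustration. The point is that a language-proficiency variable can shift a risk score, and trigger an investigation, even though no protected attribute ever appears among the model's inputs.

```python
# Toy sketch of "proxy" discrimination in a risk-scoring model.
# NOT the Rotterdam model: features, weights, and threshold are invented.

# Hypothetical model weights; note that no protected attribute appears here.
WEIGHTS = {
    "speaks_dutch_fluently": -0.8,   # fluency lowers the score
    "years_on_benefits": 0.3,
    "missed_appointments": 0.5,
}
BIAS = 1.0
INVESTIGATION_THRESHOLD = 1.5  # invented cutoff for flagging a claimant


def risk_score(claimant: dict) -> float:
    """Linear score over observed features only (no race or gender input)."""
    return BIAS + sum(WEIGHTS[f] * claimant.get(f, 0) for f in WEIGHTS)


# Two claimants identical except for Dutch fluency. If fluency correlates
# with migration background, the score gap reproduces that bias even though
# ethnicity was never an input.
fluent_claimant = {"speaks_dutch_fluently": 1, "years_on_benefits": 2, "missed_appointments": 1}
non_fluent_claimant = {"speaks_dutch_fluently": 0, "years_on_benefits": 2, "missed_appointments": 1}

for label, person in [("fluent claimant", fluent_claimant), ("non-fluent claimant", non_fluent_claimant)]:
    score = risk_score(person)
    print(f"{label}: score={score:.2f}, flagged={score >= INVESTIGATION_THRESHOLD}")
```

In this invented example, the non-fluent claimant crosses the investigation threshold while an otherwise identical fluent claimant does not, which is the kind of indirect disparity the interactive article walks through.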

The project shows how algorithms designed to make governments more efficient, and often heralded as fairer and more data-driven, can covertly amplify societal biases. The WIRED and Lighthouse investigation also found that other countries are testing similarly flawed approaches to finding fraudsters.

“Governments have been embedding algorithms into their systems for years, whether it’s a spreadsheet or some fancy machine learning,” says Dhruv Mehrotra, an investigative data reporter at WIRED who worked on the project. “But when an algorithm like this is applied to any form of punitive and predictive law enforcement, it becomes massively impactful and pretty scary.”

The impact of Rotterdam's algorithm-driven scrutiny can be severe, as seen in the case of a mother of three who faced interrogation.

But Mehrotra says the project was only able to highlight such injustices because WIRED and Lighthouse had a chance to examine how the algorithm works; countless other systems operate with impunity under cover of bureaucratic darkness. He says it is also important to recognize that algorithms such as the one used in Rotterdam are often built on top of systems that are inherently unfair.

“Often, algorithms are just optimizing an already punitive approach to welfare, fraud, or policing,” he says. “You don’t want to say that if the algorithm were fair, everything would be fine.”

It is also important to recognize that algorithms are becoming increasingly common at all levels of government, yet their operations are often completely hidden from those most affected.

Another investigation, which Mehrotra carried out in 2021 before he joined WIRED, showed how crime-prediction software used by some police departments unfairly targeted Black and Latinx communities. In 2016, ProPublica revealed shocking biases in algorithms used by some courts in the United States to predict which criminal defendants are at the highest risk of reoffending. Other problematic algorithms determine which schools children attend, recommend whom companies should hire, and decide which families' mortgage applications are approved.

Of course, many companies also use algorithms to make important decisions, and these are often even less transparent than those in government. There is a growing movement to hold companies accountable for algorithmic decision-making, and a push for legislation that requires greater visibility. But the problem is complex, and making algorithms fairer can sometimes, perversely, make things worse.
