ROBO SPACE
Amid reckoning on police racism, algorithm bias in focus
By Rob Lever, with Manon Billing in Paris
Washington (AFP) July 5, 2020

A wave of protests over law enforcement abuses has highlighted concerns over artificial intelligence programs such as facial recognition, which critics say may reinforce racial bias.

While the protests have focused on police misconduct, activists point out flaws that may lead to unfair applications of technologies for law enforcement, including facial recognition, predictive policing and "risk assessment" algorithms.

The issue came to the forefront recently with the wrongful arrest in Detroit of an African American man based on a flawed facial recognition match that identified him as a robbery suspect.

Critics of facial recognition use in law enforcement say the case underscores the pervasive impact of a flawed technology.

Mutale Nkonde, an AI researcher, said that even though the idea of bias and algorithms has been debated for years, the latest case and other incidents have driven home the message.

"What is different in this moment is we have explainability and people are really beginning to realize the way these algorithms are used for decision-making," said Nkonde, a fellow at Stanford University's Digital Society Lab and the Berkman-Klein Center at Harvard.

Amazon, IBM and Microsoft have said they would not sell facial recognition technology to law enforcement without rules to protect against unfair use. But many other vendors offer a range of technologies.

- Secret algorithms -

Nkonde said the technologies are only as good as the data they rely on.

"We know the criminal justice system is biased, so any model you create is going to have 'dirty data,'" she said.

Daniel Castro of the Information Technology & Innovation Foundation, a Washington think tank, said, however, that it would be counterproductive to ban a technology which automates investigative tasks and enables police to be more productive.

"There are (facial recognition) systems that are accurate, so we need to have more testing and transparency," Castro said.

"Everyone is concerned about false identification, but that can happen whether it's a person or a computer."

Seda Gurses, a researcher at the Netherlands-based Delft University of Technology, said one problem with analyzing the systems is that they use proprietary, secret algorithms, sometimes from multiple vendors.

"This makes it very difficult to identify under what conditions the dataset was collected, what qualities these images had, how the algorithm was trained," Gurses said.

- Predictive limits -

The use of artificial intelligence in "predictive policing," which is growing in many cities, has also raised concerns over reinforcing bias.

The systems have been touted as a way to make better use of limited police budgets, but some research suggests they increase deployments to communities that have already been identified, rightly or wrongly, as high-crime zones.

These models "are susceptible to runaway feedback loops, where police are repeatedly sent back to the same neighborhoods regardless of the actual crime rate," said a 2019 report by the AI Now Institute at New York University, based on a study of 13 cities using the technology.

These systems may be skewed by "biased police data," the report said.
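The feedback loop the report describes can be illustrated with a minimal simulation. The sketch below is not drawn from the AI Now study; it simply assumes two neighborhoods with identical underlying crime rates, gives one of them a larger historical arrest count, and allocates patrols in proportion to past arrests. All numbers are assumptions chosen for illustration.

```python
import random

# Minimal sketch (not from the AI Now report): two neighborhoods with the
# SAME true crime rate, but neighborhood 0 starts with more recorded arrests.
# Patrols are allocated in proportion to past arrests, and more patrols mean
# more recorded incidents, so the initial imbalance feeds on itself.
true_crime_rate = [0.10, 0.10]      # identical underlying rates (assumption)
recorded = [30, 10]                 # historical arrest counts (assumption)
total_patrols = 100

random.seed(0)
for year in range(5):
    share0 = recorded[0] / sum(recorded)
    patrols = [round(total_patrols * share0),
               total_patrols - round(total_patrols * share0)]
    for n in (0, 1):
        # each patrol records an incident with probability equal to the true rate
        recorded[n] += sum(random.random() < true_crime_rate[n]
                           for _ in range(patrols[n]))
    print(f"year {year}: patrols={patrols} recorded={recorded}")
```

Run over a few simulated years, neighborhood 0 keeps drawing a growing share of patrols and of recorded incidents, even though the true crime rates never differ, which is the runaway behavior the researchers warn about.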

In a related matter, an outcry from academics prompted the cancellation of a research paper that claimed facial recognition algorithms could predict, with 80 percent accuracy, whether someone was likely to become a criminal.

- Robots vs humans -

Ironically, many artificial intelligence programs for law enforcement and criminal justice were designed with the hope of reducing bias in the system.

So-called risk assessment algorithms were designed to help judges and others in the system make unbiased recommendations on who is sent to jail, or released on bond or parole.

But the fairness of such a system was questioned in a 2019 report by the Partnership on AI, a consortium whose members include tech giants Google and Facebook, as well as organizations such as Amnesty International and the American Civil Liberties Union.

"It is perhaps counterintuitive, but in complex settings like criminal justice, virtually all statistical predictions will be biased even if the data was accurate, and even if variables such as race are excluded, unless specific steps are taken to measure and mitigate bias," the report said.

Nkonde said recent research highlights the need to keep humans in the loop for important decisions.

"You cannot change the history of racism and sexism," she said. "But you can make sure the algorithm does not become the final decision maker."

Castro said algorithms are designed to carry out what public officials want, and the solution to unfair practices lies more with policy than technology.

"We can't always agree on fairness," he said. "When we use a computer to do something, the critique is leveled at the algorithm when it should be at the overall system."

