
IBM quits facial recognition, joins call for police reforms

9th June 2020
"IBM is getting out of the facial recognition business, saying it’s concerned about how the technology can be used for mass surveillance and racial profiling."

IBM is getting out of the facial recognition business, saying it’s concerned about how the technology can be used for mass surveillance and racial profiling.

Ongoing protests responding to the death of George Floyd have sparked a broader reckoning over racial injustice and a closer look at the use of police technology to track demonstrators and monitor American neighborhoods.

IBM is one of several big tech firms that had earlier sought to improve the accuracy of their face-scanning software after research found racial and gender disparities. But its new CEO is now questioning whether it should be used by police at all.

“We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies,” wrote CEO Arvind Krishna in a letter sent Monday to U.S. lawmakers.

IBM’s decision to stop building and selling facial recognition software is unlikely to affect its bottom line, since the tech giant is increasingly focused on cloud computing while an array of lesser-known firms have cornered the market for government facial recognition contracts.

“But the symbolic nature of this is important,” said Mutale Nkonde, a research fellow at Harvard and Stanford universities who directs the nonprofit AI For the People.

Nkonde said IBM shutting down a business “under the guise of advancing anti-racist business practices” shows that it can be done and makes it “socially unacceptable for companies who tweet Black Lives Matter to do so while contracting with the police.”

Krishna’s letter was addressed to a group of Democrats who have been working on police reform legislation in Congress fueled by the mass protests over Floyd’s death. The sweeping reform package could include restrictions on police use of facial recognition.

The practice of using a form of artificial intelligence to identify individuals in photo databases or video feeds has come under heightened scrutiny after researchers found racial and gender disparities in systems built by companies including IBM, Microsoft and Amazon.

IBM had previously tested its facial recognition software with the New York Police Department, although the department has more recently used other vendors. It’s not clear if IBM has existing contracts with other government agencies.

Many U.S. law enforcement agencies rely on facial recognition software built by companies less well known to the public, such as Tokyo-based NEC or the European companies Idemia and Cognitec, according to Clare Garvie, a researcher at Georgetown University’s Center on Privacy and Technology.

A smaller number have partnered with Amazon, which has attracted the most opposition from privacy advocates since it introduced its Rekognition software in 2016.

Krishna’s letter called for police reforms and noted that “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling” and human rights violations.

Civil liberties advocates have raised concerns in recent weeks about the use of surveillance technology to monitor protesters or to enforce rules set to curb the coronavirus pandemic.

Even before the protests, U.S. senators this year had been scrutinizing New York facial recognition startup Clearview AI following investigative reports about its practice of harvesting billions of photos from social media and other internet services to identify people.

Joy Buolamwini, a researcher at the Massachusetts Institute of Technology whose research on facial recognition bias helped spur IBM’s re-examination of the technology, said Tuesday she commends the congressional police reform package for seeking restrictions on the use of police body cameras to scan people’s faces in real-time.

But she said lawmakers can go further to protect people from having governments scan their faces on social media posts or in public spaces without their knowledge.

“Regardless of the accuracy of these systems, mass surveillance enabled by facial recognition can lead to chilling effects and the silencing of dissent,” Buolamwini wrote in an email sent from Boston’s city hall, where she was testifying in support of a proposed ban on facial recognition use by municipal agencies. San Francisco and several other U.S. cities have enacted similar bans over the past year.


Compiled by: Debashish S Neupane