Civil rights advocates say laws need to catch up with AI technology

LEILA FADEL, HOST:

Artificial intelligence has led to huge leaps in surveillance technology in recent years. Many experts are worried that it's gotten so good it threatens to upend the balance of power between citizens and their government. NPR's Geoff Brumfiel has more on why some experts think the law needs to catch up with the tech.

GEOFF BRUMFIEL, BYLINE: If you're curious whether artificial intelligence could be used to build a dystopian surveillance state where the government can track your every move, the answer is yes. And we know this because one government has already done it: China.

DARREN BYLER: The state has invested around $100 billion to build these systems in Xinjiang.

BRUMFIEL: Darren Byler is an anthropologist at Simon Fraser University who's worked for over a decade in the Chinese province of Xinjiang. It's home to millions of people who are part of the Uyghur ethnic group. Beijing considers Uyghurs a threat, and they've installed a massive surveillance network to track them. Byler's seen it in action.

BYLER: There are automated, sort of passive camera systems that are just watching all the time and watching movement. And you don't even realize that you're being watched.

BRUMFIEL: The exact details of how the system works aren't known, but Byler says based on academic papers in Chinese law enforcement journals and interviews he's done, it's clear that AI plays a big role. It's used to watch things like cell phones and license plates and recognize patterns in people's behaviors.

BYLER: They can really track the registered population of the region, which is 25 million people, everywhere that they are. And so, you know, you can plug in their name into the system and see where they are at any point in time. And then you can roll it back and see where they were the day before and the day before that.

BRUMFIEL: And one of the most important AI tools underpinning this giant surveillance apparatus is facial recognition. Now, you might not even think of facial recognition as AI, but it is. Modern facial recognition tools, like the one that unlocks your phone, are actually powered by machine-learning algorithms that were developed just a few years before things like ChatGPT. For the past decade, companies have been building ever more powerful facial recognition software. In China, it's being used by the state security apparatus, but in the West, it's increasingly being used by law enforcement.

JOSEPH COURTESIS: It's not every case, but I think it's becoming more and more routine. You know, the amount of images that we have access to now, policing has changed.

BRUMFIEL: Joseph Courtesis is a retired New York City police inspector who ran the city's real-time crime center. Courtesis says New York police don't use facial recognition anything like the way it's used in China. There are lots of departmental rules. They only use it to generate leads, not as evidence in court. Policy says they're only supposed to use footage related to crimes, and they compare faces mostly to images in a mug shot database.

COURTESIS: These were images of individuals who have - had been arrested for fingerprintable offenses and their images were lawfully obtained.

BRUMFIEL: Courtesis now consults for one of the nation's largest facial recognition companies. It's called IDEMIA. Their algorithms are among the best out there.

COURTESIS: These algorithms are performing over 99% accurate.

BRUMFIEL: That's according to an evaluation by the National Institute of Standards and Technology, which found IDEMIA's face recognition technology to be 99.88% accurate, though real-world factors like camera resolution and lighting mean it won't work that well in every scenario. Still, he argues, the software has huge advantages over the old system, which often involved humans perusing giant books of mug shots and trying to pick out the perp based on memory.

COURTESIS: We had archaic ways of doing it, often led to dead ends. It was probably not very accurate at all. And may - I'm not going to say has, but may have contributed to wrong identifications.

BRUMFIEL: But as China's surveillance system shows, facial recognition can be a lot more than just an upgrade to using mug shots and lineups. Nate Freed Wessler is with the American Civil Liberties Union. He says especially as AI continues to improve, this technology is giving the police the ability to surveil citizens at a scale they've never had before.

NATE FREED WESSLER: I'm concerned. I'm deeply concerned about the potential of this technology to really upset the kind of balance of power between us the people and the government that we've long expected.

BRUMFIEL: The ACLU opposes all uses of facial recognition by law enforcement, but it's particularly worried about live facial recognition. AI facial recognition is now so powerful it can enable mass real-time tracking. Several U.S. police departments have experimented with the live technology, though apparently not on a wide scale. But elsewhere, it is starting to be used. This spring in London, England, police deployed a live facial recognition system in some public spaces. So far, that has led to two arrests, according to department statistics, out of more than 80,000 faces scanned. The system didn't identify the faces of most of the people it saw, but in theory it could.

WESSLER: Police have never had that capability across the entire U.S. population, and that's the specter that we're worried about.

BRUMFIEL: Only a handful of states and municipalities have laws preventing live tracking or putting any limits on facial recognition, says Clare Garvie with the National Association of Criminal Defense Lawyers. She says more cities and states need to seriously consider how to regulate facial recognition's use by police.

CLARE GARVIE: Where are we drawing the lines? Where do we build inefficiencies back into a technology that has created massive efficiency in law enforcement?

BRUMFIEL: Should police be able to surveil a protest to learn more about who shows up? Should they be able to stake out a house with a camera and identify everyone who enters and leaves? These are real scenarios. And Garvie argues that the technology could challenge fundamental constitutional rights to privacy, assembly and association.

GARVIE: We are always putting the technological cart before the legal horse, so to speak, and the consequence is on real people.

BRUMFIEL: Garvie doesn't think the U.S. is headed towards a dystopian surveillance state anytime soon, but she says the best way to keep it that way is to pass strong laws that regulate AI for surveillance.

Geoff Brumfiel, NPR News.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.
