About 150 University of Michigan students and community members gathered at the Annenberg Auditorium Monday evening for a discussion with Kashmir Hill, technology reporter for The New York Times. In conversation with Shobita Parthasarathy, director of the Science, Technology, and Public Policy program, Hill discussed the consequences of technological innovation on consumer privacy.
Hill, whose writing has appeared in The New Yorker and The Washington Post, joined The New York Times in 2019 after working at companies including Gizmodo Media Group and Forbes Magazine. Her reporting centers on the consequences of emerging technologies, including facial recognition, artificial intelligence surveillance and genetic analysis.
The conversation centered on Hill’s recently published book, “Your Face Belongs to Us,” which documents the rise of technology startup Clearview AI and the dangerous implications of its facial recognition product.
The book’s focus was shaped by an anonymous tip Hill received claiming that Clearview AI had scraped billions of photos from websites to create a facial recognition tool, allowing people to upload a photo of an individual and instantly see all available online photos of that person. The company began selling its product to the police, and, according to the tip Hill received, hundreds of state government departments across the country had adopted the software.
“There was this legal memo from a lawyer that they had hired to basically write a memo for police reassuring them that it was legal to use the tool,” Hill said. “I’m reading this and I’m just like, how does this exist? The company was called Clearview AI. I had never heard of it. No one I talked to in the startup industry had heard of them.”
Big tech companies including Google and Facebook developed facial recognition technology as early as 2011. Hill said that, unlike Clearview AI, Google and Facebook made the conscious decision not to release it over concerns about malicious applications.
“This was the one thing where they said, ‘There’s a line that we don’t want to cross,’ and what set Clearview apart was not that they were better at the technology,” Hill said. “It was that they were willing to cross that ethical line.”
Hill and Parthasarathy touched on the Detroit Police Department’s use of facial recognition software to identify criminal suspects since 2017. Hill explained that Detroit first came to her attention when she heard about a man named Robert Williams, who was arrested for larceny in January 2020 despite having never stolen anything.
“(Williams) had been identified by the facial recognition system,” Hill said. “They’d run his image, and they had done very little additional police work. The shoplifting crime he’d been arrested for was somebody who went into the Shinola in downtown Detroit and had stolen watches. And so they said, ‘OK, this proves he must be involved. He sold a watch once, and it’s a face recognition match,’ and that’s how he ended up being arrested.”
Since Williams had an alibi, the charges against him were dropped, and he later sued the Detroit Police Department. Williams’s case is one of many like it, exemplifying the potential harms of facial-recognition software in policing.
“I think a lot of people like the idea of facial recognition being used to solve crimes if it is a reliable method and if it’s being used in a responsible way,” Hill said. “The thing is, it just never stops there.”
Hill continued with a hypothetical example of facial recognition technology being used outside of police departments, explaining how the technology can easily encroach further on consumer privacy.
“Are you comfortable with (the) idea of having a private dinner, you’re gossiping, or you’re sharing some private, intimate information, and somebody next to you gets interested and then snaps a photo of you and now knows who you are and understands the context of your conversation?” Hill said. “I mean, right now, that is possible, and we just don’t have any laws or regulations against something like that.”
When asked how the U.S. government can help protect consumer information, Hill pointed to Illinois’ biometric privacy law as a good starting point. The law’s private right of action holds companies accountable for using an individual’s biometric information without consent by imposing fines of up to $5,000 per violation.
“It’s an effective way to make sure that companies aren’t using your face print or your voice print, which is a real concern now with generative AI, without consent,” Hill said. “I think something like that should really be a nationwide law, or every state should pass something like that to protect people.”
Both Parthasarathy and Hill noted that Europe, unlike the United States, has been far more proactive about safeguarding consumer privacy. The European Union has passed comprehensive privacy laws such as the General Data Protection Regulation, while the United States still lacks a federal privacy law.
“We’re a step ahead when developing the technology, and they’re a step ahead in developing the laws,” Hill said. “I wish we would adopt the laws as much as they adopt the technical dimensions.”
In an interview with The Michigan Daily after the event, LSA sophomore Jiyi Hong said the event revealed invasions of privacy she was previously not aware of.
“The Madison Square Garden thing about how they facial recognize everyone … I thought was crazy,” Hong said. “I’ve been to Madison Square Garden myself. … I didn’t know how affected I would be by that.”
Business sophomore Cynthia Li said in an interview with The Daily that Hill’s insights will have a pronounced influence on her technology-related decision making moving forward.
“You don’t think about how car companies can track your information and then sell it to insurance companies to see how much they could charge you,” Li said. “Will it change whether or not I buy a car? Probably not. But it makes you think about which car company (will) track more.”
Daily News Contributor Akshara Karthik can be reached at karthika@umich.edu.