TSA’s facial recognition technology raises questions about bias and security




The Transportation Security Administration’s (TSA) trial rollout of biometric facial recognition technology at airport security checkpoints has raised questions about the risks it poses to travelers’ privacy and the possibility of discrimination.

The TSA’s program gives passengers the option to insert their ID into a machine while looking at a camera until a screen below flashes “Photo Complete.” The passenger then proceeds through the checkpoint. According to Jason Lim, the TSA’s manager of identity management, the images are then deleted, and the camera only turns on when a person places their ID card in the scanner.

The TSA is testing the technology at 16 major airports, and the agency insists that the program is voluntary. However, in a March 14 interview with Kyle Arnold of The Dallas Morning News, TSA Administrator David Pekoske said that if the TSA gets its way, biometric screening technology won’t end up being optional.

But even without mandatory facial recognition, fear of delays or poor treatment by TSA employees can lead travelers to submit when they’d rather not. “When there’s a power imbalance between parties, consent isn’t really possible,” says Meg Foster, a justice fellow at the Georgetown Law Center on Privacy and Technology. “How does TSA expect them to see and read an inconspicuous notice, let alone tell a TSA agent that they want to opt out of facial recognition? Especially when it may not be clear to them what the consequences of opting out will be.”

The TSA emphasizes that there is no “discernible difference in the algorithm’s ability to recognize passengers based on things like age, gender, race and ethnicity,” thanks to its use of advanced camera technology. But earlier research paints a different picture. In a 2018 study of three commercial gender classification algorithms by the Gender Shades Project at the Massachusetts Institute of Technology, researchers found that “darker-skinned females are the most misclassified group.” An independent assessment by the National Institute of Standards and Technology in 2019 also found that facial recognition technologies are “least accurate on women of color.”

In addition, the TSA puts travelers’ privacy at risk by collecting personal data and sending it to the Department of Homeland Security, even though the TSA claims the data is anonymized, encrypted, and eventually deleted.

The Cybersecurity and Infrastructure Security Agency noted last year that the federal government is failing to recruit enough cyber talent, leaving government information systems vulnerable to hacks. The last few years have seen numerous data breaches at federal agencies, including a breach of the US Department of Transportation just last week.

As William Owen, the communications director at the Surveillance Technology Oversight Project, explains: “Even if the plan is to eventually delete all photos and IDs, the data collection during the current pilot program leaves more than enough time for the most sensitive passenger information to be put at risk.”

These problems led a group of five senators to write a letter to Pekoske in February of this year about the TSA’s “alarming use of facial recognition technology” at US airports. “Americans’ civil rights are under threat when the government deploys this technology on a massive scale, without sufficient evidence that the technology is effective on people of color and does not violate Americans’ right to privacy,” the letter said.

Sens. Jeff Merkley (D–Ore.), Cory Booker (D–NJ), Bernie Sanders (I–Vt.), Ed Markey (D–Mass.), and Elizabeth Warren (D–Mass.) called for an immediate “halt” of the TSA’s tech deployment until questions about discrimination, transparency and data retention are answered. Similarly, Jeramie Scott of the Electronic Privacy Information Center recommended an external audit to determine “that the technology does not disproportionately affect certain groups and that the images are deleted immediately.”

Like the rest of the TSA’s screening procedures, facial recognition is just another piece of security theater. Requiring passengers to partially undress, limiting how much shampoo they can fly with, and subjecting them to random searches have not made the skies any safer. Using an algorithm to match their faces to government identification won’t either. But unlike those other indignities, it will potentially expose them to a government data breach.
