I Opted Out of Facial Recognition at the Airport—It Wasn’t Easy

The announcement came as we began to board. Last month, I was at Detroit’s Metro Airport for a connecting flight to Southeast Asia. I listened as a Delta Air Lines staff member informed passengers that the boarding process would use facial recognition instead of passport scanners.

As a privacy-conscious person, I was uncomfortable boarding this way. I also knew I could opt out. Presumably, most of my fellow fliers did not: I didn’t hear a single announcement alerting passengers how to avoid the face scanners.

To figure out how to do so, I had to leave the boarding line, speak with a Delta representative at their information desk, get back in line, then request a passport scan when it was my turn to board. Federal agencies and airlines claim that facial recognition is an opt-out system, but my recent experience suggests they are incentivizing travelers to have their faces scanned—and disincentivizing them from sidestepping the tech—by failing to clearly communicate alternative options. Last year, a Delta customer service representative reported that only 2 percent of customers opt out of facial recognition. It’s easy to see why.

As I watched traveler after traveler stand in front of a facial scanner before boarding our flight, I had an eerie vision of a new privacy-invasive status quo. With our faces becoming yet another form of data to be collected, stored, and used, it seems we’re sleepwalking toward a hyper-surveilled environment, mollified by assurances that the process is undertaken in the name of security and convenience. I began to wonder: Will we only wake up once we no longer have the choice to opt out?

Until we have evidence that facial recognition is accurate and reliable—as opposed to simply convenient—travelers should avoid the technology where they can.

The facial recognition plan in US airports is built around the Customs and Border Protection Biometric Exit Program, which uses face-scanning technology to verify a traveler’s identity. CBP partners with airlines—including Delta, JetBlue, American Airlines, and others—to photograph each traveler while boarding. That image is compared to one stored in a cloud-based photo-matching service populated with photos from visas, passports, or related immigration applications. The Biometric Exit Program is used in at least 17 airports, and a recently released Department of Homeland Security report states that CBP anticipates having the ability to scan the faces of 97 percent of commercial air passengers departing the United States by 2023.

This rapid deployment of facial recognition in airports follows a 2017 executive order in which President Trump expedited former President Obama’s efforts to use biometric technology. The Transportation Security Administration has since unveiled its own plan to improve partnership with CBP and to introduce the technology throughout the airport. The opportunity for this kind of biometric collection infrastructure to feed into a broader system of mass surveillance is staggering, as is its ability to erode privacy.

Proponents of these programs often argue that facial recognition in airports promotes security while providing convenience. But abandoning privacy should not be a prerequisite for achieving security. And in the case of technology like facial recognition, the “solution” can quickly become a deep and troubling problem of its own.

For starters, facial recognition technology appears incapable of treating all passengers equally at this stage. Research shows that it is particularly unreliable for gender and racial minorities: one study, for example, found a 99 percent accuracy rate for white men, while the error rate for darker-skinned women reached up to 35 percent. This suggests that facial recognition could actually increase the likelihood that women and people of color are unfairly targeted for additional screening.
