Why EFF Doesn’t Support Bans On Private Use of Face Recognition
Deeplinks 2021-01-20
Summary:
Government and private use of face recognition technology each present a wealth of concerns. Privacy, safety, and amplification of carceral bias are just some of the reasons why we must ban government use.
But what about private use? It, too, can exacerbate injustice, including when deployed by police contractors, retail establishments, business improvement districts, and homeowners.
Still, EFF does not support banning private use of the technology. Some users may choose to deploy it to lock their mobile devices or to demonstrate the hazards of the tech in government hands. So instead of a prohibition on private use, we support strict laws to ensure that each of us is empowered to choose if and by whom our faceprints may be collected. This requires mandating informed opt-in consent and data minimization, enforceable through a robust private right of action.
Illinois has had such a law for more than a decade. This approach properly balances the human right to control technology—both to use and develop it, and to be free from other people’s use of it.
The menace of all face recognition technology
Face recognition technology requires us to confront complicated questions in the context of centuries-long racism and oppression within and beyond the criminal system.
Eighteenth-century “lantern laws” requiring Black and Indigenous people to carry lanterns to illuminate themselves at night are one of the earliest examples of a useful technology twisted into a force multiplier of oppression. Today, face recognition technology enables covert bulk collection of biometric data on a scale that was inconceivable less than a generation ago.
Unlike our driver’s licenses, credit card numbers, or even our names, we cannot easily change our faces. So once our biometric data is captured and stored as a face template, it is largely indelible. Furthermore, as a CBP vendor found out when the face images of approximately 184,000 travelers were stolen, databases of biometric information are ripe targets for data thieves.
Face recognition technology chills fundamental freedoms. As early as 2015, the Baltimore police used it to target protesters against police violence. Its threat to essential liberties extends far beyond political rallies. Images captured outside houses of worship, medical facilities, community centers, or our homes can be used to infer our familial, political, religious, and sexual relationships.
Police have unparalleled discretion to use violence and intrude on liberties. From our nation’s inception, essential freedoms have not been equally available to all, and the discretion vested in law enforcement only accentuates these disparities. One need look no further than the Edmund Pettus Bridge or the Church Committee. In 2020, as some people took to the streets to protest police violence against Black people, and others came out to protest COVID-19 mask mandates, police enforced their authority in starkly different manners.
Face recognition amplifies these police powers and aggravates these racial disparities.
Each step of the way, the private sector has contributed. Microsoft, Amazon, and IBM are among the many vendors that have built face recognition for police (though in response to popular pressure they have temporarily ceased doing so). Clearview AI continues to process faceprints of billions of people without their consent in order to help police identify suspects. Such companies ignore the human right to make informed choices over the collection and use of biometric data, and join law enforcement in exacerbating these harms.
Link:
https://www.eff.org/deeplinks/2021/01/why-eff-doesnt-support-bans-private-use-face-recognition