FTC Rightfully Acts Against So-Called “AI Weapon Detection” Company Evolv
Deeplinks 2024-12-06
Summary:
The Federal Trade Commission has entered a settlement with self-styled “weapon detection” company Evolv to resolve the FTC’s claim that the company “knowingly” and “repeatedly” engaged in “unlawful” acts of making misleading claims about its technology. Essentially, Evolv’s technology, which is deployed in schools, subways, and stadiums, does far less than the company has been claiming.
The FTC alleged in their complaint that despite the lofty claims made by Evolv, the technology is fundamentally no different from a metal detector: “The company has insisted publicly and repeatedly that Express is a ‘weapons detection’ system and not a ‘metal detector.’ This representation is solely a marketing distinction, in that the only things that Express scanners detect are metallic and its alarms can be set off by metallic objects that are not weapons.” A typical contract for Evolv costs tens of thousands of dollars per year—five times the cost of traditional metal detectors. One district in Kentucky spent $17 million to outfit its schools with the software.
The settlement requires Evolv to notify the many schools that use this technology to keep weapons out of classrooms that they are allowed to cancel their contracts. It also blocks the company from making any representations about its technology’s:
- ability to detect weapons
- ability to ignore harmless personal items
- ability to detect weapons while ignoring harmless personal items
- ability to ignore harmless personal items without requiring visitors to remove any such items from pockets or bags
The company also is prohibited from making statements regarding:
- Weapons detection accuracy, including in comparison to the use of metal detectors
- False alarm rates, including comparisons to the use of metal detectors
- The speed at which visitors can be screened, as compared to the use of metal detectors
- Labor costs, including comparisons to the use of metal detectors
- Testing, or the results of any testing
- Any material aspect of its performance, efficacy, nature, or central characteristics, including, but not limited to, the use of algorithms, artificial intelligence, or other automated systems or tools.
If the company can’t say these things anymore…then what do they even have left to sell?
There’s a reason so many people accuse artificial intelligence of being “snake oil.” Time and again, a company takes public money to deploy “AI” surveillance, only for taxpayers to learn the technology does no such thing. “Just walk out” stores actually relied on people watching you on camera to determine what you purchased. Gunshot detection software that relies on a combination of artificial intelligence and human “acoustic experts” to purportedly identify and locate gunshots “rarely produces evidence of a gun-related crime.” There’s a lot of well-justified suspicion about what’s really going on within the black box of corporate secrecy in which artificial intelligence so often operates.
Even when artificial intelligence used by the government isn’t “snake oil,” it often does more harm than good. AI systems can introduce or exacerbate harmful biases that have massive negative impacts on people’s lives. AI systems have been implicated in falsely accusing people of welfare fraud and in increasing racial bias in jail sentencing, policing, and crime prediction.
Link:
https://www.eff.org/deeplinks/2024/12/ftc-rightfully-acts-against-so-called-ai-weapon-detection-company-evolv