From Dangerous Minds:
“A school in Florida was forced into shutdown after an AI-based weapon detection system mistakenly triggered an entire campus lockdown by mistaking a clarinet for a firearm.”
The software was ZeroEyes, which includes a human review step as a safeguard against false positives. But in this case (as in the Maryland chip case) the human reviewers failed to recognize that the “gun” wasn’t a gun.
So while this was a failure of AI weapons-detection software, it was equally a failure of the human reviewers meant to catch its mistakes.
