My city's new crime prediction AI flagged my block for extra patrols based on a single noise complaint

It took me 3 months and a formal public records request to get the data audit trail. The system used a 5-year-old arrest record from a previous tenant as a 'predictor', and weighted a single neighbor's loud-party call as a 'pattern'. This isn't just a glitch; it's a design choice that punishes people for where they live. How can we trust these tools when their core logic is built on biased historical data? Has anyone else had to fight a bad algorithmic decision from their local government?
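To make it concrete, here's a toy sketch of the kind of logic the audit trail suggested. Every weight, function name, and threshold here is my own invention for illustration, not the vendor's actual code:

```python
# Hypothetical sketch of an address-keyed risk scorer (all weights
# invented): the score attaches to an ADDRESS, not a person.
from datetime import date

WEIGHTS = {"arrest": 3.0, "noise_complaint": 1.0}

def risk_score(events):
    """Sum weighted events attached to an address."""
    # No time decay and no occupant check: a 5-year-old arrest by a
    # previous tenant counts the same as yesterday's incident.
    return sum(WEIGHTS[kind] for kind, _when in events)

history = [("arrest", date(2019, 6, 1)),             # previous tenant
           ("noise_complaint", date(2023, 12, 20))]  # one party call
print(risk_score(history))  # 4.0, enough to trip a patrol flag
```

The point is that nothing in a scheme like this asks *who* generated the record or *when*, which is exactly how my block got flagged.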
3 comments

wyatt_green31
Ugh, my town's parking app did the same thing.
5
iris574
19d ago
Call the city's finance office directly (they can fix it).
6
claire_sullivan
Look, the system has to start somewhere. Using old data is just how these things learn, and a noise complaint plus past trouble at an address is a logical flag. It's not about punishing you, it's about using all available info to guess where cops might be needed. If we only used perfect, brand new data, the system would never make a single prediction. The whole point is to stop crime before it happens, even if that means a few extra patrols on your block.
3