Body camera video equivalent to 25 million copies of “Barbie” is collected but rarely reviewed. Some cities are looking to new technology to examine this stockpile of footage to identify problematic officers and patterns of behavior.
Yeah, I share the same concerns about the “AI”, but this sounds like a good thing. It’s going through footage that wasn’t going to be looked at (because there wasn’t a complaint or investigation), and it’s flagging things that should be reviewed. It’s a positive step.
What we should look into for this program is:
how the flags are being set, and what kind of interaction will warrant a flag
what changes are made to training as a result of this data
how privacy is being handled, and where the data is going (e.g. don’t use this footage to train some model, especially because not every interaction happens out in public)