As part of public conversation about the traffic death of #JaahnaviKandula, it has emerged that the Seattle Police Department had been running an initial trial of an AI product for scanning body camera audio for significant words. The product is called Truleo. The initial contract was canceled a few days after the police cruiser collision that killed Jaahnavi.
The laughter on the video is horrifying. RantWoman keeps trying to keep in mind traumatizing events, frustration at seeing how some situations are handled, and the value, painful as it is, of being able to analyze many aspects of a horrific situation.
RantWoman admits to a certain reflexive thought that if the president of the police union hates a product, it is probably doing something valuable for everyone, community and police alike.
RantWoman has written elsewhere about claims that the laughter was about what sometimes happens when lawyers get hold of a situation. RantWoman just imagines that lawyerly phrasing like "not in SPOG contract" probably had something to do with the abrupt cancellation of the Seattle PD's contract with Truleo.
RantWoman is definitely not swimming in every stream of public conversation about use of this technology. Nor has RantWoman looked back to find any media or city council discussion of this product and the trial program. RantWoman also pretty ruthlessly prefers to balance the power of AI with severe skepticism about both built-in biases and the risk of asking the wrong questions of data.
That said, RantWoman also thinks Seattle and the police union SHOULD agree to careful testing of the value and tradeoffs of this software. RantWoman has no idea whether the product has any competitors, but sure, someone should check.
As a public service, RantWoman is wading into Truleo's website, fully conscious that it is MARKETING material.
Quoting directly from the website:
"Truleo is an audio analytics company that leverages AI-driven analysis to analyze police body camera footage and generate greater efficiencies and deeper insights for law enforcement teams. We are a group of ambitious people working on audacious technology.
"We are proud of the culture we’ve created across the United States. Our team is made up of dynamic, humble engineers, data scientists, sales people and business leaders. Our culture is comprised of seven core principles: curiosity, optimism, candor, work ethic, empathy, self awareness and integrity. Success at Truleo is beyond technical aptitude. Those who demonstrate these values will find themselves surrounded by like-minded individuals ready to tackle challenges together."
The website boasts testimonials from a number of different police agencies and some statistics about reduced sergeant time reviewing videos, greater "compliance," and better explanations by officers of what is happening.
"Compliance" for RantWoman at first conjures up images of South Park's Cartman and "You must respect my authoritay." RantWoman recognizes that is not the most trusting and respectful place to start, but RantWoman imagines others in the community start out from even more hostile perspectives.
In any case, RantWoman made it through a case study and has a couple comments just to start the conversation.
Screen reader users may need to OCR this.
Again, RantWoman realizes this is MARKETING. However, one big takeaway for RantWoman from the case studies was the relationship between better officer communication and improved compliance. That is a formulation RantWoman can at least start with.
Here are a couple videos courtesy of the Google.
CNN Reporting: Could AI review help prevent...?
RantWoman is NOT reposting video of Tyre Nichols' death.
--In a city like Seattle, police officers have all kinds of different accents. That is as it should be. Unfortunately, every speech-to-text algorithm RantWoman has ever dabbled with does a horrible job with any kind of accent. So RantWoman would at a minimum want to know what kind of language corpus the system is trained on, whether it has flags for detecting conversation that is not in English, and something about how machine learning over time can be expected to improve the accuracy of system output.
--RantWoman would like to see how the software might impact complaints. It is disheartening for the public when the current accountability process allows officers to rack up many "unsustained" complaints rather than producing improved officer behavior.
--Does the software detect things that are not in English? The CEO's demo says the software only addresses officer voices. RantWoman actually thinks it would be valuable to improve officers' abilities to identify whether someone is struggling in English, speech patterns that might indicate some kind of disability, and whether the other party or parties might be under the influence of some substance. The disability angle could be important: people with disabilities represent a depressingly high percentage of police encounters gone tragically bad.
--RantWoman is not unconcerned about the ACLU and considerations of picking up civilian voices. RantWoman expects that defense attorneys are capable of addressing issues for anything criminal that emerges. Privacy concerns are trickier because recognizing issues specific to different members of the public is also valuable in efforts to improve how police handle different kinds of encounters.
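To make the accent concern in the bullets above concrete, here is a toy sketch of the transcribe-then-flag approach such products generally imply: transcribe the audio, then scan the transcript for significant phrases. This is NOT Truleo's actual pipeline; the phrase list and the example transcripts are invented purely for illustration. The point is that keyword flagging is only as good as the transcription underneath it, so an accent- or noise-garbled transcript can silently slip past the flags.

```python
# Hypothetical sketch of a transcribe-then-flag pass.
# Phrase list and transcripts are invented for illustration only.

FLAGGED_PHRASES = ["stop resisting", "limited value", "calm down"]

def flag_phrases(transcript: str, phrases=FLAGGED_PHRASES) -> list[str]:
    """Return the flagged phrases found in a lowercased transcript."""
    text = transcript.lower()
    return [p for p in phrases if p in text]

# A clean transcript of the audio gets flagged...
accurate = "Yeah, just write a check. Limited value, anyway."
# ...but a plausible mis-transcription of the same audio does not.
garbled = "Yeah, just write a check. Limitless values, anyway."

print(flag_phrases(accurate))  # ['limited value']
print(flag_phrases(garbled))   # []
```

The two outputs differ only because one word was transcribed wrong, which is exactly why the training corpus and accent handling questions above matter.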
Anyway, RantWoman wishes the software could have a more extensive trial.
And no amount of AI audio analysis software is going to change the nasty physics of high vehicle speed, vehicle size, pedestrian inability to respond to whatever indications there were of an oncoming police vehicle. But MAYBE there are things that could be learned.