After the Christchurch massacre was freely streamed on Facebook, our Californian overlords gave bodycams to cops to better train their AI. Should be fine.
The Christchurch massacre was awful enough; having the shooter livestream the event was both hideous and (awfully) the completely logical next step in the visceral spectrum of revealed violence.
Halting the stream should have been a job for Facebook's automatic censor, removing the footage before I saw things I shouldn't have seen.
The reason it wasn't, according to Facebook, is that the video was shot in first person.
“This was a first-person shooter video, one where we have someone using a GoPro helmet with a camera focused from their perspective of shooting,” they explained, adding that it “was a type of video we had not seen before.”
In an effort to stop a repeat, Facebook is sending their AI out on the beat with the British bobbies of the Metropolitan Police. Facebook will be supplying the rozzers with bodycams, on the proviso that they give Facebook the footage shot during gun range exercises and training programs.
“With this initiative, we aim to improve our detection of real-world, first-person footage of violent events,” Facebook wrote in a news release, “and avoid incorrectly detecting other types of footage such as fictional content from movies or video games.”
Facebook is also apparently keen on introducing the program within the United States.
Neil Basu, the Met's Assistant Commissioner for Specialist Operations, mused in a statement that “…the technology Facebook is seeking to create could help identify firearms attacks in their early stages and potentially assist police across the world in their response to such incidents.”
Basu also believes that if Facebook’s algorithm gets better at identifying (and stopping) livestreams of attacks, it could “significantly help prevent the glorification of such acts and the promotion of the toxic ideologies that drive them.”
True, just as long as Facebook doesn't get hold of any footage recorded beyond those training exercises; the prospect of the long arm of the law controlled by Facebook has me reaching for the smelling salts and my largest alfoil hat.