In a move to spare police officers the trauma, artificial intelligence is set to take on the gruelling task of scanning suspects’ phones and computers for images of child abuse.
The system, seen as a more humane approach for British police, is being developed by the Metropolitan Police’s (Met) digital forensics department to sort disturbing images, including those involving child abuse and pedophilia. According to several reports, manually sorting through such images has caused officers trauma that takes a long time to heal.
“We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans… You can imagine that doing that for year-on-year is very disturbing… With the help of cloud storage, AI could be trained to detect abusive images within two to three years,” said Mark Stokes, the Met’s head of digital and electronics forensics.
Reportedly, the Met is already drawing up plans to move its sensitive data to cloud providers such as Amazon Web Services, Google or Microsoft.
The Met already employs image recognition software that can detect contraband such as guns, drugs and cash. But reportedly, the software struggles to reliably identify indecent images and videos containing nudity. Stokes says one of the reasons could be desktop wallpapers. “For some reason, lots of people have screen-savers of deserts and it [the software] picks it up thinking it is skin colour,” he said.
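The desert false positives Stokes describes are easy to reproduce with simple colour-based heuristics. The sketch below is purely illustrative and is not the Met’s software: the thresholds, the flagging rule and the file name are assumptions, chosen only to show how a crude skin-tone pixel test treats sandy desert tones much like human skin.

```python
# Illustrative sketch only: a naive skin-colour heuristic of the kind that can
# mistake a desert wallpaper for skin. Thresholds and the flagging rule are
# assumptions for demonstration, not any real forensic tool's logic.
from PIL import Image
import numpy as np


def skin_pixel_ratio(path: str) -> float:
    """Return the fraction of pixels whose RGB values fall in a crude 'skin-tone' range."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Very rough rule: warm, reddish pixels with r > g > b. Sandy desert tones
    # satisfy this condition just as easily as human skin does.
    skin_like = (r > 95) & (g > 40) & (b > 20) & (r > g) & (g > b) & ((r - b) > 15)
    return float(skin_like.mean())


if __name__ == "__main__":
    ratio = skin_pixel_ratio("desert_wallpaper.jpg")  # hypothetical input file
    if ratio > 0.3:  # arbitrary threshold for this sketch
        print(f"Flagged: {ratio:.0%} of pixels fall in the skin-tone range")
```

A learned classifier trained on labelled examples, rather than a fixed colour rule like this one, is the kind of improvement the Met is reportedly aiming for.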
Last year, the same Met software trawled through 53,000 devices for incriminating evidence.
In 2013, the charity Terre des Hommes created a computer-generated avatar called “Sweetie” to identify more than 1,000 adults who were willing to pay children in developing countries to perform sexual acts in front of a webcam.