It turns out that child abuse photos are one of those things that “stays on the internet forever.”
This week, a New York Times report detailed the story of two sisters whose father did the unimaginable – posted horrific photos and videos of them being drugged and raped when they were just 7 and 11 years old. Although the person responsible is in jail and had his computer seized, photos of these two women have been found in “over 130 child sexual abuse investigations involving mobile phones, computers, and cloud storage accounts.”
How Companies Contribute to the Problem
They are part of the first generation whose parents’ abuses can haunt them throughout their lives because of technology. Degenerate individuals save and store these photos on Google Drive, Dropbox, Microsoft OneDrive, and any other photo-sharing and storage apps they can use.
The lax security on these sites (i.e., they are hackable) is often exploited by other criminals.
Although some companies have begun cracking down in recent years, with more than 45 million photos flagged this year as depicting the sexual abuse of minors, it’s not enough.
To this day, those who were victimized as children struggle to hide their identities because they are recognizable. Lawyers can get their videos taken down from public sites once they are noticed and flagged, but because child abuse photos are illegal, individuals who find them save them to private stores and share them whenever they want.
“But the same industry has consistently failed to take aggressive steps to shut it down, an investigation by The New York Times found. Approaches by tech companies are inconsistent, largely unilateral and pursued in secret, often leaving pedophiles and other criminals who traffic in the material with the upper hand,” according to the NYT.
What Tech Companies Can Do About Child Abuse Photos
According to experts, the technology to find these photos and stop them from circulating exists. Newly detected photos can be matched against databases of known illegal material, including images and videos recovered from computers seized by police departments.
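The matching described above is essentially a hash lookup: each file is reduced to a fingerprint and compared against a database of fingerprints of known illegal material. As a minimal sketch, the example below uses SHA-256 exact matching; real deployed systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the `KNOWN_HASHES` set here is a hypothetical stand-in for a law-enforcement database.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images, as might
# be supplied by law enforcement. (This value is the SHA-256 of the bytes
# b"test", used purely for illustration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_sha256(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_known_material(path: str) -> bool:
    """Flag a file whose fingerprint appears in the known-material database."""
    return file_sha256(path) in KNOWN_HASHES
```

Note the design trade-off this sketch glosses over: an exact cryptographic hash misses any re-encoded or cropped copy, which is precisely why production systems rely on perceptual hashing instead.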
But most companies don’t use it.
Amazon’s cloud storage services don’t use any detection methods at all – they won’t scan photos on their platforms. Neither does Apple. Apple and Facebook each offer encrypted messaging apps, making it impossible to detect illegal photos even as they are shared right on the platform. Dropbox, Google, and Microsoft scan images only when they are shared, not while they sit in storage.
Pedophiles meet on various chat apps and share photos. Sometimes authorities manage to infiltrate these groups and track down the poster. Most of the time, though, the companies whose platforms are being used to distribute child pornography do little to intervene.
Snapchat and Yahoo apparently scan photos but not videos.
All reports show that the number of photos featuring underage victims is exploding. Much of this content is already in law enforcement databases that companies could access and work with.
Let’s hope that this terrible consequence of the digital world can be avoided as companies begin to take note of this issue.