The two sisters live in fear of being recognised. One grew out her bangs and took to wearing hoodies. The other dyed her hair black. Both avoid looking the way they did as children.

Ten years ago, their father posted explicit photos and videos of them on the internet, when they were just 7 and 11. Many captured violent assaults, including him and another man drugging and raping the 7-year-old.

The men are now in prison, but in a cruel consequence of the digital era, their crimes are finding new audiences. The two sisters are among the first generation of child sexual abuse victims whose anguish has been preserved on the internet.

This year alone, photos and videos of the sisters were found in over 130 child sexual abuse investigations.

The digital trail of abuse haunts the sisters relentlessly, they said, as does the fear of a predator recognising them from the images.

“That’s in my head all the time — knowing those pictures are out there,” said E., the older sister, who is being identified only by her first initial to protect her privacy. “Because of the way the internet works, that’s not something that’s going to go away.”

Horrific experiences like theirs are being recirculated across the internet because search engines, social networks and cloud storage are rife with opportunities for criminals to exploit.

The scope of the problem is only starting to be understood because the tech industry has been more diligent in recent years in identifying online child sexual abuse material, with a record 45 million photos and videos flagged last year.

But the same industry has consistently failed to take aggressive steps to shut it down, an investigation by The New York Times found.

The companies have the technical tools to stop the recirculation of abuse imagery by matching newly detected images against databases of the material. Yet the industry does not take full advantage of the tools.

Amazon, whose cloud storage services handle millions of uploads and downloads every second, does not even look for the imagery. Apple does not scan its cloud storage and encrypts its messaging app, making detection virtually impossible. Dropbox, Google and Microsoft’s consumer products scan for illegal images, but only when someone shares them, not when they are uploaded.

And other companies, including Snapchat and Yahoo, look for photos but not videos.

Facebook thoroughly scans its platforms, accounting for over 90% of the imagery flagged by tech companies last year, but the company is not using all available databases to detect the material. And Facebook has announced that the main source of the imagery, Facebook Messenger, will eventually be encrypted, vastly limiting detection.

Tech companies are far more likely to review photos, videos and other files on their platforms for facial recognition, malware detection and copyright enforcement. But some businesses said that looking for abuse content is different because it can raise significant privacy concerns.

The main method for detecting the illegal imagery was created in 2009 by Microsoft and Hany Farid, now a professor at the University of California, Berkeley. The software, known as PhotoDNA, enables computers to recognise photos, even altered ones, and compare them against databases of known illegal images. Almost none of the photos and videos detected last year would have been caught without systems like PhotoDNA.
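
As a rough illustration of how this kind of fingerprint matching works (this is not PhotoDNA, which is proprietary), the sketch below uses the open-source imagehash library; the set of known fingerprints and the distance threshold are hypothetical placeholders for what, in a real system, would be a clearinghouse database and a carefully tuned cutoff.

```python
# Illustrative only: open-source perceptual hashing as a stand-in for PhotoDNA.
# The fingerprint set and threshold below are hypothetical placeholders.
from PIL import Image
import imagehash

# In a real system this would be a database of fingerprints supplied by a
# clearinghouse; here it is a single hard-coded example value.
KNOWN_FINGERPRINTS = {imagehash.hex_to_hash("a1b2c3d4e5f60718")}

# A small Hamming-distance tolerance is what lets slightly altered copies
# (recompressed, resized, lightly cropped) still match.
MAX_DISTANCE = 5

def matches_known_imagery(path: str) -> bool:
    """Fingerprint an uploaded file and compare it against the known set."""
    fingerprint = imagehash.phash(Image.open(path))
    return any(fingerprint - known <= MAX_DISTANCE for known in KNOWN_FINGERPRINTS)
```

Matching of this kind can only find imagery that has already been fingerprinted, which is why the databases discussed below matter so much.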

But this technique is limited because no single authoritative list of known illegal material exists, allowing countless images to slip through the cracks. The most commonly used database is kept by a federally designated clearinghouse, which compiles digital fingerprints of images reported by U.S. tech companies. Other organisations around the world maintain their own.

Even if there were a single list, however, it would not solve the problems of newly created imagery flooding the internet or the surge in livestreaming abuse.

For victims like E. and her sister, the trauma of the constantly recirculating photos and videos can have devastating effects. Their mother said both sisters had been hospitalised for suicidal thoughts.

And because online offenders are known to seek out abused children, even into adulthood, the sisters do not speak publicly about the crimes against them. Their emotional conversations with the Times were the first time they had spoken about the abuse with journalists.

“You get your voice taken away,” E. said. “Because of those images, I don’t get to talk as myself. It’s just like, Jane Doe.”

Searching for Abuse

Joshua Gonzalez, a computer technician in Texas, was arrested this year with over 400 images of child sexual abuse on his computer, including some of E. and her sister.

Gonzalez told authorities that he had used Microsoft’s search engine, Bing, to find some of the illegal photos and videos.

Microsoft had long been at the forefront of combating abuse imagery, even creating the PhotoDNA detection tool a decade ago. But many criminals have turned to Bing as a reliable tool of their own.

A report in January commissioned by TechCrunch found explicit images of children on Bing using search terms like “porn kids.” In response to the report, Microsoft said it would ban results using that term and similar ones.

The Times created a computer program that scoured Bing and other search engines. The automated script repeatedly found images — dozens in all — that Microsoft’s own PhotoDNA service flagged as known illicit content. Bing even recommended other search terms when a known child abuse website was entered into the search box.

While the Times did not view the images, they were reported to the National Center for Missing and Exploited Children and the Canadian Centre for Child Protection, which work to combat online child sexual abuse.

Similar searches by the Times on DuckDuckGo and Yahoo, which use Bing results, also returned known abuse imagery. In all, the Times found 75 images of abuse material across the three search engines before stopping the computer program.

Both DuckDuckGo and Yahoo said they relied on Microsoft to filter out illegal content.

After reviewing the Times’ findings, Microsoft said it uncovered a flaw in its scanning practices and was re-examining its search results. But subsequent runs of the program found even more.

Child abusers are well aware of Bing’s vulnerabilities. Paedophiles have used Bing to find illegal imagery and have also deployed the site’s “reverse image search” feature, which retrieves pictures based on a sample photo.

Separate documentation provided by the Canadian centre showed that images of child sexual abuse had also been found on Google. One image depicts a woman touching the genitals of a naked 2-year-old girl. Google declined to take down the photo, stating in an email to the Canadian analysts that while it amounted to paedophilia, “it’s not illegal in the United States.”

When the Times later asked Google about the image and others identified by the Canadians, a spokesman acknowledged that they should have been removed, and they subsequently were. The spokesman also said that the company did not believe any form of paedophilia was legal, and that it had been a mistake to suggest otherwise.

A week after the images were removed, the Canadian centre reported two additional images to Google. Google told the Canadian centre that neither image met “the reporting threshold” but later agreed to remove them.

“It baffles us,” said Lianna McDonald, the centre’s executive director.

Criminals Everywhere

The problem is not confined to search engines.

Paedophiles often leverage multiple technologies and platforms, meeting on chat apps and sharing images on cloud storage.

The digital trail that has followed one young abuse victim, a girl who was raped by her father over four years starting at age 4, is sadly representative of the pattern.

The girl, now a teenager, does not know that footage of her abuse is on the internet. Her mother and stepfather wish it would stay that way.

Sex offenders frequently share photos and videos of the girl’s abuse. When the images are detected, the FBI notifies the girl’s family or their lawyer. Over the past four years, they have received over 350 notifications.

When the girl turns 18, she will become the legal recipient of reports about the material. At that point, her mother and stepfather hope, she will be better able to handle the news. They also hold out hope that the tech companies will have managed to remove the images from the internet by then.

It has been 10 years since PhotoDNA was developed at Microsoft, yet the industry’s efforts to detect and remove known illegal photos remain uneven and cloaked in secrecy.

The industry’s response to video content has been even more wanting. There is no common standard for identifying illegal video content, and many major platforms do not even scan for it.

Tech companies have known for years that videos of children being sexually abused are shared on their platforms. One former Twitter employee described gigabytes of illegal videos appearing more quickly than they could be taken down on Vine, the video service since shuttered by Twitter.

That was in 2013, when fewer than 50,000 videos were reported. Last year, tech companies referred more than 22 million videos to the National Center for Missing and Exploited Children, the nonprofit clearinghouse mandated by the federal government to act as a repository for the imagery.

Efforts to tackle the urgent problem of video content have run into roadblocks of the companies’ own making. Google, for example, developed video-detection technology that it makes available to other companies, and Facebook also has a system. But the two cannot share information because the fingerprints generated by each technology are not compatible.
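
To make that incompatibility concrete, here is a minimal sketch using two open-source perceptual hashes as stand-ins (neither is Google’s or Facebook’s actual system): fingerprints produced by different algorithms cannot be meaningfully compared, so a match list built with one cannot be checked by the other.

```python
# Illustrative only: two different open-source hash algorithms standing in for
# the incompatible proprietary fingerprinting systems described above.
from PIL import Image
import imagehash

frame = Image.open("frame.jpg")  # hypothetical still taken from a video

fingerprint_a = imagehash.phash(frame)  # "system A" fingerprint
fingerprint_b = imagehash.dhash(frame)  # "system B" fingerprint

# Each fingerprint only matches others produced by the same algorithm.
# Comparing across algorithms yields a number, but it carries no meaning,
# so one company's match list is useless to the other without a shared format.
print(fingerprint_a == fingerprint_b)  # almost always False, even for the same frame
print(fingerprint_a - fingerprint_b)   # a Hamming distance with no real significance
```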

In 2017, the tech industry approved a process for sharing video fingerprints to make it easier for all companies to detect illicit material. But the plan has gone nowhere.

The lack of action across the industry has allowed untold videos to remain on the internet. Of the center’s 1.6 million fingerprints, less than 3% are for videos.

None of the largest cloud storage platforms scan for abuse material when files are uploaded.

While the files may be scanned later — when users share them, for example — some criminals have avoided detection by sharing their account logins rather than the files themselves.

A spokesman for Amazon, which does not scan for abuse imagery whatsoever, said that the “privacy of customer data is critical to earning our customers’ trust.” Microsoft said its Azure cloud service also did not scan for the material, citing similar reasons.

Several digital forensic experts and law enforcement officials said the companies were being disingenuous in invoking privacy.

An Uncertain Future

A heinous case in Pennsylvania warns of a tsunami of new, hard-to-detect abuse content through livestreaming platforms.

More than a dozen men from around the world were logged in to the business conference software Zoom to watch a livestream of a man sexually assaulting a 6-year-old boy.

None of the major tech companies is able to detect, much less stop, the livestreaming through automated imagery analysis.

And while Facebook, Google and Microsoft have said they are developing technologies that will find new photos and videos on their platforms, it could take years to reach the precision of fingerprint-based detection of known imagery.

Men in the Pennsylvania case were caught in 2015 only because Janelle Blackadar, a detective constable with the Toronto police, discovered the broadcast while conducting an undercover investigation. The detective recorded the stream using screen-capturing technology and within hours alerted Special Agent Austin Berrier of Homeland Security Investigations.

The 6-year-old boy was rescued the next day, and 14 men from multiple states have since been sentenced to prison.

In January, the assailant, William Byers Augusta, 20, received a sentence of up to 90 years.
