PhotoDNA scans images for child abuse
Robert Lemos, SecurityFocus 2009-12-18

Internet service providers may have better success at scanning their networks to actively seek out illicit images of child abuse, thanks to technology donated by Microsoft and Dartmouth College.

On Wednesday, the software giant and the well-known college announced that they had developed a software program to match modified images to the original by using a form of robust hashing that can ignore certain types of changes, such as resizing, cropping and the inclusion of text. The team donated the program, dubbed PhotoDNA, to the National Center for Missing and Exploited Children.
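To give a sense of the general idea behind robust (perceptual) hashing, the sketch below implements a simple "average hash." This is not PhotoDNA's actual algorithm, and the file names and threshold are purely hypothetical; it only illustrates how a fingerprint can survive resizing and other mild edits, unlike a cryptographic hash.

```python
# A minimal sketch of robust hashing (an "average hash"), NOT PhotoDNA's
# actual algorithm: shrink the image to a small grayscale grid so that
# resizing and mild edits leave the fingerprint largely unchanged.
from PIL import Image


def average_hash(path, grid=8):
    """Return a 64-bit fingerprint for the image at `path`."""
    img = Image.open(path).convert("L").resize((grid, grid))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # Each bit records whether a cell is brighter than the overall average.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests a derived copy."""
    return bin(h1 ^ h2).count("1")


# Hypothetical usage: flag an image whose fingerprint is close to a known one.
# known = average_hash("known_image.jpg")
# candidate = average_hash("resized_copy.jpg")
# if hamming_distance(known, candidate) <= 5:
#     print("possible match")
```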

The NCMEC will make the program available to ISPs to detect the "worst of the worst" in child pornography -- those images that show pre-pubescent children being sexually abused, said Ernie Allen, CEO and president of the NCMEC.

The intent is to "use the technology very narrowly and very specifically," Allen said.

The agreement follows a number of other initiatives in fighting child abuse online. In June 2008, three ISPs signed an agreement with the New York State Attorney General's office to police their networks for child pornography and donate money to the state and the NCMEC to fund investigations. In 2007, MySpace agreed with the attorneys general of more than 40 states to turn over information regarding sex offenders on its network.

While law enforcement has successfully prosecuted hundreds of cases of possession and distribution of illicit images, a small number of cases have underscored the risk of overzealous prosecution. In one case, a Massachusetts government agency fired and reported one of its workers for having child pornography on his laptop, but a later investigation showed that the machine, which lacked functioning antivirus software, had been compromised and subsequently filled with illicit images.

Microsoft has already tested the software on its networks and plans to roll out the tool to scan public sources for images of child pornography, said Brad Smith, senior vice president and general counsel at the software giant.

"It is not enough to catch the perpetrators, we have to stop the images to prevent the subjects from being a victim again," Smith said.

While Microsoft will scan public sources for matches to a small database of the worst abuse images, the software giant will not scan private data or communications, Smith said. ISPs, the government and privacy advocates should discuss the legal and policy issues of such scanning, he said.

Combating child pornography is a major priority for law enforcement, and the number of abuse images detected has grown significantly, according to the NCMEC. Since 2003, the organization has viewed and analyzed 30 million images classified as child pornography, the group claims. Allen predicted that the group will deal with another 9 million in 2010.

Much of the increase in child pornography is due to the Internet's ability to allow communities to form among traders of child pornography, he said.

"They (the criminals) no longer view themselves as aberrant," Allen said. "We made enormous progress on the commercial side ... but it has migrated to the noncommercial side."

In the latest announcement, a large-scale test of the PhotoDNA tool produced fewer than one false positive for every billion images scanned, said Hany Farid, a professor of computer science at Dartmouth and co-developer of PhotoDNA. In addition, the software recognizes about 98 percent of images derived from those in its database.

"We tested it over billions and billions of images," he said. "We tried very hard to make it very efficient ... and to minimize the false alarm rate."

If you have tips or insights on this topic, please contact SecurityFocus.

