Using Technology to Prevent the Distribution of Child Pornography

The 2004 decision in Center for Democracy & Technology v. Pappert sheds light on the complications that still surround the use of technology to reduce the consumption of child pornography.

Government filtering as a mechanism to combat child pornography 

In 2003, the plaintiffs in Pappert – the Center for Democracy and Technology, a nonprofit corporation concerned with Internet public interest issues; the American Civil Liberties Union of Pennsylvania, a membership association concerned with civil rights protection; and Plantagenet, Inc., an Internet service provider – asserted that Pennsylvania’s Internet Child Pornography Act violated the First Amendment by restricting users’ access to innocent websites. Gerald J. Pappert, the defendant, served as Attorney General of the Commonwealth of Pennsylvania at the time the litigation was initiated. The plaintiffs specifically argued that the technical limitations of the available methods of complying with the Act led Internet service providers (“ISPs”) to block a large number of innocent websites. This overblocking, according to the plaintiffs, violated the First Amendment. Pappert, 337 F. Supp. 2d at 610.

By way of background, in 2002 Pennsylvania enacted the Internet Child Pornography Act, which required ISPs to remove or disable access to child pornography accessible through their services after notification by the Attorney General. ISPs could comply with such a notification by implementing filters that blocked the identified content. The Act contemplated three possible types of filters: DNS filtering, IP filtering, and URL filtering. DNS filtering blocks requests for an entire domain name, including every subpage under it. IP filtering is no more precise: because shared IP addresses are prevalent, blocking a single address blocks every innocent site hosted at that address, and because IP addresses are not static, an address on the blocking list may later be reassigned to innocuous content unless the list is updated. URL filtering, which targets individual web addresses rather than entire domains or servers, is more precise, but the evidence at trial showed it was not a practical option for most ISPs.
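To make the overblocking mechanics concrete, the following sketch (in Python, using hypothetical domain names and a reserved documentation address chosen purely for illustration) contrasts the two approaches: a DNS filter blocks one targeted domain in its entirety, while an IP filter also sweeps in every unrelated site hosted at the shared address.

    # Hypothetical hosting table: three unrelated sites share one server,
    # as is common with virtual hosting. Only one of them is targeted.
    HOSTING = {
        "target-site.example":   "203.0.113.7",
        "innocent-blog.example": "203.0.113.7",  # shares the server
        "family-store.example":  "203.0.113.7",  # shares the server
    }

    BLOCKED_DOMAINS = {"target-site.example"}  # DNS filter: whole domain
    BLOCKED_IPS = {"203.0.113.7"}              # IP filter: whole server

    def dns_filtered(domain: str) -> bool:
        # Blocks the targeted domain (and every subpage under it).
        return domain in BLOCKED_DOMAINS

    def ip_filtered(domain: str) -> bool:
        # Blocks every domain that resolves to a blocked address.
        return HOSTING[domain] in BLOCKED_IPS

    for d in HOSTING:
        print(f"{d}: dns={dns_filtered(d)} ip={ip_filtered(d)}")
    # All three sites come back IP-blocked, though only one was targeted.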

Moreover, as was uncovered over the course of the litigation, savvy Internet users could easily defeat the DNS filters. Publishers of blocked content could evade them by using an IP address in place of the domain name in a URL or by changing a portion of the domain name. Users could likewise avoid the filters by manually configuring their computers to use a DNS server not controlled by their ISP. Filters could also be avoided with an anonymous proxy server, or a service such as Proxify, which routes all requests through the proxy and then forwards them to the desired website.
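The court’s finding about alternate DNS servers rests on an elementary technical point: nothing obliges a computer to ask its ISP’s resolver. A minimal sketch, assuming the third-party dnspython package and Google’s public resolver at 8.8.8.8 (the domain name here is a placeholder), shows how a lookup simply routes around an ISP-level DNS filter:

    # Illustrates the evasion the court described: querying a resolver the
    # ISP does not control, so the ISP's DNS filter never sees the lookup.
    # Assumes the third-party dnspython package (pip install dnspython).
    import dns.resolver

    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = ["8.8.8.8"]  # a public resolver, not the ISP's

    answer = resolver.resolve("example.com", "A")
    for record in answer:
        print(record.address)  # the returned IP can then be used directly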

Plaintiffs argued that, “due to the technical limitations of the methods used by ISPs to comply with the Act, the efforts of ISPs to disable access to child pornography in response to requests by the Attorney General have led to the blocking of more than one and a half million innocent websites not targeted by the Attorney General.” Pappert, 337 F. Supp. 2d at 610. That the Act, on its face, did not burden protected speech was irrelevant, the plaintiffs asserted; the fact that the State forced private actors – the ISPs – into inadvertently blocking innocent websites was enough.

The State rejected this interpretation and argued instead that the overblocking did not violate the First Amendment because it resulted from decisions made by the ISPs, not by State actors. According to the defendant, ISPs had several options for disabling access to child pornography, and the choice to use filtering was entirely the ISPs’ own.

The Pappert decision

Ultimately, the court granted the plaintiffs’ motion and declared Pennsylvania’s Internet Child Pornography Act unconstitutional. Based on the evidence presented by the parties at trial, the court concluded that, given the state of the technology at the time, the Act could not be implemented without excessive blocking of innocent speech in violation of the First Amendment. Pappert, 337 F. Supp. 2d at 611.

The court found that the means were not narrowly tailored to achieve the Act’s goals. The decision relied on the fact that more than one million innocent websites were blocked in an attempt to block fewer than 400 child pornography websites, without any effort to avoid burdening the protected speech on innocent websites. This ratio was insufficient to uphold the Act: “Although the inference could be drawn that making it more difficult to access child pornography reduces the incentive to produce and distribute child pornography, this burden on the child pornography business is not sufficient to overcome the significant suppression of expression that resulted from the implementation of the Act.” Pappert, 337 F. Supp. 2d at 655.

In reaching this decision, the court relied on the Supreme Court’s 2000 decision in United States v. Playboy Entertainment Group. Playboy involved a First Amendment challenge to a federal statute that required cable operators providing sexually oriented programming to fully scramble the images, block the channels, or limit the hours of transmission of such programming. The Supreme Court held the statute unconstitutional because the government failed to establish that the two methods of compliance identified in the challenged section were the least restrictive means of achieving the government’s goal. Pappert, 337 F. Supp. 2d at 650.

Like the statute analyzed in Playboy, the Internet Child Pornography Act gave ISPs discretion to choose a method of compliance. However, as in Playboy, the only reasonable options available to the ISPs inevitably blocked protected speech to a significant degree. Pappert, 337 F. Supp. 2d at 651. The court reasoned that because there were no viable alternatives for complying with an order to remove or disable access to child pornography, it was as if the State itself had prescribed those methods – as if the means of complying with the Attorney General’s notifications were written into the statute. The restriction on free speech was therefore, in that sense, imposed by the legislature rather than chosen by private actors.

There were strong arguments for the court to apply either strict or intermediate First Amendment scrutiny; however, the court did not have to choose between the two because, even under the less demanding standard – intermediate scrutiny – the Act did not pass constitutional muster. Pappert, 337 F. Supp. 2d at 655. The court reasoned that, to survive this level of scrutiny, a regulation must “further an important government interest unrelated to the suppression of free expression and the incidental restriction on first amendment freedoms must be no greater than is essential to the furtherance of that interest.” Id.

Even though the prevention of child exploitation is a State interest unrelated to the suppression of free expression, the Act worked a significant suppression of innocent websites. Furthermore, the State produced no evidence that implementation of the Internet Child Pornography Act had actually reduced child sexual abuse. Consequently, the Act was declared unconstitutional.

Scope of the problem today

When Pappert was decided, child pornography was already a serious problem; today the child pornography industry has expanded tremendously. The National Center for Missing and Exploited Children reviewed 22 million images and videos of suspected child pornography in its victim identification program in 2013 – a more than 5,000% increase from 2007. In another study, the international children’s rights organization Terre des Hommes created a computer-generated avatar of a 10-year-old girl in order to measure the level of attention she would attract online. The results were disturbing: approximately 20,000 men contacted her in ten weeks, and more than 1,000 of them offered her money to take her clothes off in front of her webcam.

These statistics help explain why child pornography laws are among the most restrictive speech laws in nearly all countries. Merely possessing pornographic images of children on a personal computer can result in imprisonment under many countries’ laws. U.S. law prohibits every visual representation of a minor engaged in sexually explicit conduct and criminalizes every form of its distribution.

Indeed, U.S. federal law defines child pornography broadly as any visual depiction of sexually explicit conduct involving the use of a minor engaging in such conduct. It criminalizes knowingly producing, distributing, receiving, or possessing with intent to distribute, a visual depiction of any kind – including a drawing, cartoon, sculpture, or painting – that depicts a minor, or an image that is or appears to be of a minor, engaging in sexually explicit conduct.

Technological updates

In light of this growing problem and the broad scope of the federal laws aimed at combating it, companies continue to develop better filters built on new automated technologies. Today’s filters are more narrowly tailored in order to avoid the constitutional problems that arose in Playboy and Pappert.

For instance, in 2012 Microsoft made its PhotoDNA tool available to law enforcement agencies. PhotoDNA computes a digital signature of each image that can be matched against a database of known images of sexual abuse, helping organizations automatically detect and report the distribution of child exploitation images. The tool has already led to the arrest of a Pennsylvania man charged with receiving and sharing child abuse images on Microsoft’s OneDrive service.

In August 2014, Google implemented similar filters for child pornography, under which “each child sexual abuse image is given a unique digital fingerprint which enables [their] systems to identify those pictures, including in Gmail.” Soon after its implementation, a cyber-tip generated by Google and sent to the National Center for Missing and Exploited Children led to the arrest of a 41-year-old Houston man charged with possessing child pornography.
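Both tools follow the same basic workflow: compute a compact fingerprint of an image, then compare it against a database of fingerprints of known abuse images. The actual PhotoDNA and Google algorithms are proprietary; the Python sketch below substitutes a simple 8x8 average hash (assuming the Pillow imaging library) as a stand-in fingerprint, matched by Hamming distance so that minor alterations to an image do not defeat the match:

    # Illustrative only: not PhotoDNA's or Google's algorithm. A simple
    # perceptual "average hash" stands in for the proprietary fingerprint.
    from PIL import Image  # pip install Pillow

    def average_hash(path: str) -> int:
        """64-bit fingerprint: grayscale, shrink to 8x8, then set each
        bit to 1 if that pixel is brighter than the image's mean."""
        img = Image.open(path).convert("L").resize((8, 8))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def matches_known(h: int, known: set[int], max_distance: int = 5) -> bool:
        """Flag a match if the fingerprint is within max_distance bits
        (Hamming distance) of any fingerprint in the known database."""
        return any(bin(h ^ k).count("1") <= max_distance for k in known)

    # Hypothetical usage: known_db would hold fingerprints of verified
    # images; a match would trigger an automated report.
    # known_db = {average_hash(p) for p in verified_image_paths}
    # if matches_known(average_hash("upload.jpg"), known_db): report(...)

Because the comparison tolerates a few flipped bits, resized or lightly edited copies of a known image still match, which is the property that lets these systems detect redistribution rather than only exact duplicates.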

As companies continue to make strides in confronting this growing problem, the question of how best to develop these filters remains open.