In May 2007, at the Where 2.0 conference in San Jose, Google launched Street View, a program allowing users to navigate panoramic images of thousands of street-level locations. Promoting mapping efficiency through technological innovation, Google aspired to “provide users with a rich, immersive browsing experience in Google Maps, allowing a greater understanding of a specific area.” The new product enabled any individual to type in a particular address, click on a Street View link, and see a clear, detailed panoramic view of almost any location of their choosing, as though actually standing on the street and viewing it in real time.
In order to gather the images for panoramic display, Google equipped thousands of cars with mounted cameras that collected photographs, 3D imagery, and local WiFi network data. The photographs and the 3D building imagery were obtained to improve directional clarity on Google Maps through detailed depictions of objects like street and traffic signs. The cameras were not selective, and any individual within range could be photographed. Google also collected WiFi network information, including publicly broadcast SSID information (i.e., network names) and MAC addresses (the unique identifiers assigned to WiFi routers), to further improve location-based services such as search and maps.
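How SSIDs and MAC addresses improve location services can be illustrated with the standard WiFi-positioning idea: if the coordinates of routers (identified by MAC address) are already known from prior drive-by scans, a device that observes some of those routers can be located near them. The sketch below is a deliberately simplified centroid estimate, not Google's actual method; the database contents and function names are hypothetical, and real systems weight by signal strength among other refinements.

```python
# Hypothetical database mapping router MAC addresses to coordinates
# recorded during earlier drive-by scans (illustrative values only).
KNOWN_APS = {
    "00:1a:2b:3c:4d:5e": (37.4219, -122.0841),
    "66:77:88:99:aa:bb": (37.4221, -122.0845),
}

def estimate_location(observed_macs):
    """Estimate position as the centroid of observed, known routers.

    A minimal sketch of WiFi positioning; production systems also use
    signal strength, router density, and statistical models.
    """
    points = [KNOWN_APS[m] for m in observed_macs if m in KNOWN_APS]
    if not points:
        return None  # no known routers in range
    lat = sum(p[0] for p in points) / len(points)
    lon = sum(p[1] for p in points) / len(points)
    return (lat, lon)
```

Note that only the broadcast identifiers (SSID, MAC) are needed for this service; the payload traffic later at issue plays no role in positioning.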
It was not long before this momentum for advanced location mapping was met with serious privacy concerns from the broader community. Immersive Media, a Canadian company, provided the photographs to Google without any provisions for blurring sensitive objects like people’s faces or license plates, and numerous websites popped up displaying revealing and personal photographs of individuals’ activities within their private properties. Google responded that such images were no different from what people would see if they were simply walking down the street and provided “easily accessible tools for flagging inappropriate sensitive imagery for review and removal.”
Privacy law in the United Kingdom required Google to address such concerns through strict provisions forcing it to blur the faces of all people depicted in Street View photographs. To satisfy these robust privacy demands, the blurring mandate extended to any facial depiction linked to a real person, from pictures of Colonel Sanders at KFC restaurants to the statues of kings, queens, and politicians that ornament UK streets and towns.
In May of this year, the Data Protection Authority (DPA) in Hamburg, Germany requested to audit Google’s WiFi data collected by its Street View cars. The DPA entreated Google to re-examine all of the information ever collected via WiFi network scans to determine whether or not personal information was obtained. Google, contrary to its original suppositions, discovered it had collected massive amounts of payload data (approximately 600GB) from non-password-protected WiFi networks. Such information included, inter alia, searches, personal emails, and chats. Upon discovery of the mistake, Google immediately filtered the sensitive personal information within its network, making it inaccessible, and reached out to regulators throughout the world to determine the best way to destroy the private data. Additionally, Google immediately stopped its Street View cars from collecting WiFi network information and contracted a third party to examine exactly what type of data its software had collected so that it could be disposed of properly.
Despite Google’s assurances to isolate and destroy the sensitive data, privacy advocates responded to the breach through public criticism and legal avenues. The Electronic Frontier Foundation (EFF) released a report criticizing Google as “too mature to be making rookie privacy mistakes” and further stated that when a company is “in the business of collecting and monetizing other people’s personal data — as Google and so many other internet businesses are — clear standards and comprehensive auditing are essential to protect against improper collection, use or leakage of private information.” Additionally, public officials across the globe expressed concern with the data collected, and pressure from Germany, France, and Spain forced Google to hand over collected information to government officials within those countries.
Privacy International, a human rights watchdog organization, placed further pressure on Google by publishing a report which detailed Google’s methodology for collecting and storing data. According to the report, “gslite,” the program used to collect information via WiFi networks, could distinguish encrypted from unencrypted data. Once the program determined the status of the collected data, it discarded the encrypted data and recorded the unencrypted information on the hard drive of the mapping computer inside Google’s Street View vehicle. Because the gslite program made this distinction, Privacy International alleged that Google’s collection of payload data was not a mistake.
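The behavior the report attributes to gslite can be sketched in a few lines. Gslite's source is not public, so the frame representation and function below are hypothetical; the sketch only illustrates the described logic of checking each captured frame's encryption status and retaining unencrypted payloads.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    # Hypothetical stand-in for a captured 802.11 data frame.
    encrypted: bool   # e.g., derived from the frame's Protected Frame bit
    payload: bytes

def retain_unencrypted(frames):
    """Discard encrypted frames and keep unencrypted payloads for storage.

    Illustrative only: gslite's actual implementation is not public;
    this mirrors the behavior described in Privacy International's report.
    """
    return [f.payload for f in frames if not f.encrypted]
```

The significance for Privacy International's allegation is that this filtering is an affirmative choice: software that merely logged raw radio captures would retain both kinds of frames, whereas distinguishing them and storing only the readable ones requires deliberate logic.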
The number of lawsuits against Google grew rapidly, and Google subsequently filed a motion with the US Judicial Panel on Multidistrict Litigation to consolidate the lawsuits into one case, as almost all of the complaints asserted some claim under the federal Wiretap Act. Google was concerned about conflicting pretrial rulings “especially with respect to the proper scope and extent of discovery, class certification and other factual and legal matters” and supported joining the complaints because “all of the claims make very similar factual allegations, and thus any necessary discovery will be of common facts.”
The Federal Trade Commission launched an investigation to determine whether Google’s recording of the unencrypted data violated federal privacy law. The investigation was halted in October when Google vowed to strengthen privacy controls by hiring an extensive privacy team, training employees in an advanced security awareness program, and requiring internal engineers to maintain privacy design documents detailing the techniques used when collecting user data.
This October, the Electronic Privacy Information Center, a privacy advocacy group, filed a complaint with the Federal Communications Commission alleging that Google broke federal privacy law regarding “electronic eavesdropping” and possibly violated Section 705 of the Communications Act. Following the filing of the complaint and Google’s public disclosure of the mistake, FCC Enforcement Bureau Chief Michele Ellison stated that the “Enforcement Bureau is looking into whether these actions violate the Communications Act,” confirming an FCC investigation into Google’s data interception process. The FCC investigation is largely bolstered by global concerns, as countries in Europe, which have stricter privacy laws than the United States, are increasingly concerned with Google’s acquisition of sensitive personal data. The investigation is currently underway.