Blog

  • Petros Vinis Blog Post

    Petros Vinis

    Information Privacy Law

    Professor Ira Rubinstein

    March 23, 2017

In October 2015, a local judge in Kentucky cleared the so-called “drone slayer” of all criminal charges: a man who had shot down a drone over his backyard, claiming he was protecting the privacy of his sunbathing daughter. Meanwhile, in December 2015, CNN obtained the first-ever exemption from the Federal Aviation Administration (FAA) to operate a drone “over people,” even as the FAA has long resisted pressure to include privacy-related rules in its recommended legal framework.

Far from being a distant-future possibility, the market for recreational drones is growing rapidly, with a predicted value of around $4bn over the next five to ten years. As for commercial use, Amazon has announced Amazon Prime Air, a service offering its subscribers delivery by drone within an hour. Despite these developments and predictions, however, a consistent, fully fledged regulatory framework addressing the privacy implications of drones does not currently exist.

Most legislatures around the globe have emphasized the safety aspects of drones: the UK has drafted a “Dronecode” for hobbyists and mandated that operators of commercial drones undergo flight tests. In the US, by contrast, there are no legal provisions regulating recreational drones, with the FAA only maintaining a registry of commercial drone users.

But what about the privacy implications of drones, especially seen through the lens of Florida v. Riley and the notion of a “reasonable expectation of privacy” when exposed to public view? The Court in that case relied on the public’s legitimate use of navigable airspace and on the “naked-eye observation” argument to deny any liability on the part of the state authorities that surveilled the defendant’s property. Does that mean that as long as a drone flies within FAA-approved airspace, the reasonable expectation of privacy dissipates?

Moreover, Kyllo v. United States held that the use of a surveillance device not “in general public use” to explore details of the home constitutes a search. Following that logic, even if drones are ultimately considered highly intrusive, can their predicted ubiquity absolve operators of any form of liability? Finally, these cases address only instances of governmental intrusion; the only relevant precedent confronting an invasion of privacy by a private individual is currently the Bullitt County District Court ruling in the “drone slayer” case.

If CCTV and surveillance cameras can claim the countervailing benefit of preserving public security, what counter-justification exists to allow drones and their attendant intrusion into privacy for the sake of recreation? An international legal framework is needed, perhaps paired with technical protocols, such as geofencing, that block GPS navigation when a drone approaches the private sphere of other individuals.
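The geofencing protocol suggested above can be sketched in a few lines. This is an illustrative toy, not any vendor’s actual no-fly system; the coordinates, buffer radius, and function names are all hypothetical. The idea is simply to compute the drone’s distance to each protected location and refuse navigation inside a privacy buffer.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def navigation_allowed(drone, protected_zones):
    """Return False if the drone is inside any zone's privacy buffer radius."""
    lat, lon = drone
    return all(
        haversine_m(lat, lon, z_lat, z_lon) > radius_m
        for z_lat, z_lon, radius_m in protected_zones
    )

# Hypothetical backyard with a 100 m privacy buffer around it.
zones = [(38.05, -85.70, 100.0)]
print(navigation_allowed((38.0500, -85.7001), zones))  # a few meters away: False
print(navigation_allowed((38.06, -85.70), zones))      # roughly 1.1 km away: True
```

A real implementation would pull protected zones from a shared registry and enforce the check in the drone’s firmware rather than in application code, which is precisely why an agreed technical protocol would be needed.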

    Related articles:

    https://www.wired.com/2017/02/sky-net-illegal-drone-plan/

    http://www.slate.com/articles/technology/future_tense/2016/05/drone_privacy_is_about_much_more_than_sunbathing_teenage_daughters.html

    http://www.theverge.com/2015/10/28/9625468/drone-slayer-kentucky-cleared-charges

    https://www.stanfordlawreview.org/online/the-drone-as-privacy-catalyst/

    http://www.ibanet.org/Article/NewDetail.aspx?ArticleUid=BFAA24D5-9C1F-439E-9091-5F002CFB0CEB

  • PRG Meeting Announcements, March 22

    Further reading on today’s discussion topics:

    Thanks to Amanda Levendowski and Hugo Zylberberg for providing these links.

  • Emily Poole Blog Post

    Emily Poole

    Information Privacy

    Prof. Ira Rubinstein

Facial recognition Facebook app hoax terrifies the internet

    March 14, 2017

    http://www.telegraph.co.uk/technology/2017/03/14/creepy-facial-recognition-app-users-find-strangers-facebook/

Earlier this week, news circulated on the Internet that a new app, Facezam, was able to identify strangers in a photograph using facial recognition technology (FRT). While the news turned out to be a hoax, such a use of FRT is not at all far-fetched. In fact, FRT was tested on a crowd as early as 2001, when the technology was deployed in an attempt to detect security risks at the Super Bowl, and Facebook uses FRT to suggest photo tags for its millions of users. According to one estimate, the global facial recognition market is expected to rise to $9.6 billion by 2022, owing to its increasingly frequent use in both the commercial and law enforcement contexts.[1] Today, the government possesses a database of millions of photos, which it uses for law enforcement purposes, and millions more photos are posted to Facebook each day, creating an almost endless supply of photos to which FRT can be applied.
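The matching step at the heart of such identification systems can be sketched minimally. This is an illustrative outline, not Facezam’s or Facebook’s actual pipeline, and the names, vectors, and threshold are hypothetical: a face is reduced to a numeric embedding vector, and a query face is matched to the gallery entry whose embedding is most similar, provided the similarity clears a threshold.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(query, gallery, threshold=0.8):
    """Return the name of the closest gallery face above the threshold, else None."""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(query, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy 3-dimensional embeddings; real systems use 128+ dimensions from a neural net.
gallery = {"alice": [0.9, 0.1, 0.2], "bob": [0.1, 0.95, 0.1]}
print(identify([0.88, 0.12, 0.25], gallery))  # prints: alice
```

The privacy concern follows directly from the structure of the code: once a large labeled gallery exists (for example, tagged social media photos), identifying a stranger is just one similarity search away.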

    The rise of this technology poses many interesting privacy concerns, but the legal framework surrounding its use is far from clear. Fourth Amendment case law suggests that a person has either no or very little objective expectation of privacy when in public (owing to doctrines such as plain view, the third party doctrine, and the original Katz test). This suggests that a person would have a difficult time pushing back on the use of FRT when it is used to identify them while they are in public. On the other hand, recent cases have recognized that old frameworks might need to be adjusted in light of new technologies that have the potential to reveal an incredible amount of information about an individual (see O’Connor’s concurrence in Riley, as well as discussion of the Mosaic Theory in the Jones case). Many commentators support this movement in the courts, suggesting that the Supreme Court should abandon a rigid Fourth Amendment analysis in favor of a more flexible approach that focuses on the nature of the surveillance.[2]

    Such tension in the doctrine may provide a potential source of hope for those who see the rise of FRT as a serious threat to privacy. Indeed, it is not hard to imagine a day in the near future when the use of FRT is so pervasive that it becomes impossible to walk outside without being recognized. One might argue, however, that such a growing threat to privacy may actually become enough to trigger a change in doctrine. Considering both recent Fourth Amendment case law, as well as cases in the First Amendment context that recognize the harmful chilling effects of intrusive surveillance (see the Philadelphia Yearly Meeting case), the courts may actually come to realize that action is necessary if the notion of anonymity is not to go extinct.

    Related links:

    https://www.usnews.com/news/articles/2014/07/08/fbi-may-seek-facebook-data-for-facial-recognition

    http://www.npr.org/sections/alltechconsidered/2016/10/25/499176469/it-aint-me-babe-researchers-find-flaws-in-police-facial-recognition

    https://www.scientificamerican.com/article/biometric-security-poses-huge-privacy-risks/

    [1] https://www.alliedmarketresearch.com/facial-recognition-market

    [2] See, e.g., Susan Freiwald, First Principles of Communications Privacy, 2007 Stan. Tech. L. Rev. 3, 9.

  • Leland Chang Blog Post

    Leland Chang

Information Privacy Law

    Professor Ira Rubinstein

    March 21, 2017

Alexa, Amazon’s internet-connected home assistant: man’s new best friend or law enforcement’s greatest spy?

Prosecutors in Arkansas have issued a warrant compelling Amazon to hand over data Alexa may have gathered from its owner, James Bates, a murder suspect. Amazon refused and filed a motion to quash the search warrant; in a statement, the company said it “will not release customer information without a valid and binding legal demand properly served on us. Amazon objects to overbroad or otherwise inappropriate demands as a matter of course.”

This case puts the spotlight on several interesting issues. First are the implications of “always on” machines and the data they gather. Amazon insists “always on” is a misnomer: while Alexa is always listening for the programmed wake words that activate it, the inputs it receives prior to activation are neither uploaded to the cloud nor recorded. However, as technology advances and more microphones are built into more devices connected through the Internet of Things, data leakage seems increasingly likely. Second is how this changes a consumer’s reasonable expectation of privacy, especially in the sanctity of one’s own home (which lies at the core of Fourth Amendment protection). Data is collected from physical conduct inside the house, but it is also stored in the hands of a third party. Third is the precedent this case sets for digital rights. Amazon objects to a warrant it deems “overbroad,” but what, then, is the standard that prosecutors must meet? Technology companies like Amazon must learn to thread the needle between complying with legitimate warrants that will bear relevant evidence and protecting the data and rights of their consumers.

The prosecutor intends to file a response to Amazon’s motion. Even though this case is about a murder, tech companies, privacy experts, and digital rights enthusiasts would be wise to follow it closely.

     

    Related Links

  • Joyce Chang Blog Post

    Joyce Chang

    Information Privacy Law

Professor Ira Rubinstein

    March 21, 2017

    As part of a broader government reaction to recent eruptions of deadly violence in the region of Xinjiang, Chinese authorities have ordered all drivers there to install a Chinese-made satellite navigation system in their vehicles. Under this compulsory measure, all private, secondhand, and government vehicles as well as heavy vehicles such as bulldozers and big rigs in Bayingolin Mongol Autonomous Prefecture must install the navigation system by June 30, 2017. Drivers who refuse to do so will not be allowed to buy fuel at gas stations.

    According to official announcements, the new requirement is intended to help the government “ensure social security and safety and promote social stability and harmony.” More specifically, the rule is aimed at helping authorities track people in a vast but sparsely populated region where ethnic tensions have given rise to regular terrorist attacks. Government officials have pointed to cars as a key means of transport for terrorists and a consistent weapon of choice when justifying the need to monitor and track all vehicles in the area.

    Because this new measure will eventually affect hundreds of thousands of vehicles in the prefecture, the government will be able to add a large amount of personal data by way of tracked vehicle movements to its existing records of its citizens. The scope of this measure greatly increases the reach of government surveillance. The government’s ability to access and use the location and movement data is also guaranteed by the fact that the vehicle-tracking program will use China’s homegrown Beidou satellite navigation system instead of the U.S. Global Positioning System (GPS).

The intrusiveness of location tracking, especially permanent, long-term monitoring of location and movement, is apparent, but individual privacy in China consistently cedes ground to security concerns. The issue is not limited to China, as governments around the world struggle to strike a balance between privacy and security. However, given China’s ability to pass and enforce security measures with relative ease, and its recent investments in both low-tech and high-tech methods of surveillance, it seems only a matter of time before there is little individual privacy, if any, left in the country.

    Sources:

    http://www.bbc.com/news/world-asia-china-39038364

    https://www.nytimes.com/2017/02/24/world/asia/china-xinjiang-gps-vehicles.html?_r=0

    https://chinadigitaltimes.net/2017/02/gps-car-tracking-military-rallies-follow-xinjiang-attack/

    https://www.theguardian.com/world/2017/feb/21/china-orders-gps-tracking-of-every-car-in-troubled-region

  • Alex Siegel Blog Post

    Alex Siegel

    Information Privacy Law

    Professor Ira Rubinstein

    March 21, 2017

    Judge Gorsuch and the Fourth Amendment

    United States Supreme Court nominee Neil Gorsuch is perhaps best known for being two things: a controversial replacement for President Barack Obama’s candidate, Merrick Garland, and a judge selected precisely because his jurisprudential philosophy hews strongly conservative. However, when it comes to the Fourth Amendment – an especially unsettled body of law given advancements in modern technology – Gorsuch’s record has proven less predictable than his generally originalist philosophy might suggest.

    While Gorsuch has sided with the government more often than not (which is the case for most appellate judges across the ideological spectrum), his record includes various instances in which Gorsuch sided with citizens against unlawful government searches. Gorsuch has diverged from a traditionally conservative law-and-order approach to the Fourth Amendment by siding with a child pornography trafficker (United States v. Ackerman) and a methamphetamines user (United States v. Carloss) who were both subjected to searches that Gorsuch found objectionable.

    Moreover, while Gorsuch has extended Justice Scalia’s common law trespass approach to the Fourth Amendment with respect to searches of homes and personal property, he has favored something closer to a totality of the circumstances test when ruling on Terry stops.

Scalia believed the reasonable-expectation-of-privacy test developed in Katz v. United States was an addition to the common law trespass test, not a substitute for it. Gorsuch has held similarly in cases where personal property has been subjected to an alleged search, applying the trespass test in both the physical (Carloss) and digital (Ackerman) realms.

    However, in United States v. Nicholson, Gorsuch dissented from the majority’s view that an officer’s mistake of law could not justify a Terry stop. He advocated for a case-by-case approach to determine whether the government had acted reasonably given the circumstances, taking into account the possibility of human error. Gorsuch’s minority approach, later adopted by the Supreme Court in Heien v. North Carolina, suggests a reluctance to use an originalist method to limit government discretion with respect to Terry stops. Gorsuch has regularly sided with law enforcement in their use of stops and seems more protective of the current doctrine, with all its discretion, than he is of traditional common law notions of trespass.

While Gorsuch’s Fourth Amendment jurisprudence certainly indicates a law-and-order approach, his views aren’t as consistently conservative as Scalia’s were, and his record doesn’t indicate an interest in developing a unified originalist approach to the Fourth Amendment. Unlike Scalia, whose skepticism of law enforcement searches was largely channeled through the trespass test he applied in Florida v. Jardines and United States v. Jones, Gorsuch has proved willing to view law enforcement searches with some level of skepticism even outside that framework.

    Sources:

    https://www.nytimes.com/2017/02/02/us/politics/neil-gorsuch-supreme-court-fourth-amendment.html?_r=0

    https://www.stanfordlawreview.org/online/spotlight-fourth-amendment/

    http://www.scotusblog.com/2017/03/gorsuch-fourth-amendment/

  • Christian Abouchaker Blog Post

    Christian Abouchaker

    Information Privacy Law

    Professor Ira Rubinstein

    March 21, 2017

    Fourth Amendment Protection and “Smart” Homes

    While the use of smart meter technology in homes across the country offers notable benefits with respect to energy monitoring and cost reduction, it also gives rise to important privacy concerns.

In Naperville Smart Meter Awareness v. City of Naperville, a federal district court in Illinois held that there is no reasonable expectation of privacy in data collected by smart meter devices, and that such data falls outside the scope of Fourth Amendment protection. In this case, the Naperville Smart Meter Awareness association (NSMA) alleged that the City’s installation of smart meters constituted an unreasonable search and an invasion of privacy under the Fourth Amendment. Smart meters collect energy-use data at high frequencies, typically one reading every 5, 15, or 30 minutes, providing aggregate measurements of a household’s electrical usage. NSMA further alleged that smart meters can capture detailed information about electricity usage, such as remote daily tracking of the time patterns and power loads associated with that usage, thereby revealing personal details of a person’s private life. The court nevertheless held that NSMA members had no expectation of privacy in the aggregate measurements of their electrical usage, basing its decision on the presumption that data collected by smart meters is no more informative than data collected by analog meters.

This case is currently on appeal to the U.S. Court of Appeals for the Seventh Circuit. EFF and Privacy International have requested leave to file a brief addressing the broader impacts of the district court’s decision. In their brief, EFF and Privacy International argue that smart meter data constitutes “intimate information regarding a person or family’s private, in-home activities,” given the time granularity of such data (i.e., a reading every 15 minutes, or 2,880 readings in a 30-day month). Given the intimate nature of this information, they argue, smart meter data should be afforded the utmost Fourth Amendment protection. Further, the brief presents data from a normative inquiry into Americans’ privacy expectations surrounding data about their in-home activities, which indicates that Americans are particularly concerned about the privacy of data tied to their homes. Based on these findings, and given that certain states have enacted laws protecting data collected via smart meters (e.g., Cal. Pub. Util. Code §§ 8380–8381 prohibits utilities from sharing or disclosing customers’ consumption data to third parties without consent and requires “reasonable security procedures,” including encryption, for consumers’ electricity usage data), EFF and Privacy International contend that there is a reasonable expectation of privacy in smart meter data.
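The reading counts behind the granularity argument follow from simple arithmetic, which a quick sanity check confirms:

```python
def readings_per_month(interval_minutes, days=30):
    """Number of smart-meter readings in a month at a fixed sampling interval."""
    per_day = 24 * 60 // interval_minutes  # readings per day
    return per_day * days

# One reading every 15 minutes means 96 per day, or 2,880 in a 30-day month.
for interval in (5, 15, 30):
    print(interval, readings_per_month(interval))  # prints 8640, 2880, and 1440
```

At the 5-minute interval some meters support, that is well over 8,000 data points per household per month, which is what gives the data its revealing, fine-grained character.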

    Considering that more than 40% of American households currently have a smart meter, and that this figure is expected to reach 80% by 2020, the outcome of this case will have significant implications for the privacy of Americans.

    Sources:

    https://www.eff.org/deeplinks/2017/03/illinois-court-just-didnt-get-it-we-are-entitled-expect-privacy-our-smart

    https://www.eff.org/document/naperville-smart-meter-awareness-v-naperville-eff-and-privacy-international-amicus-brief

  • Mason Fitch Blog Post

    Mason Fitch

    Blog Post

    Professor Ira Rubinstein

    March 21, 2017

    The California legislature is considering a bill that would remove California’s leading privacy protections—passed in a bill dubbed “CalECPA”—from school halls. A.B. 165, introduced by Assemblymember Jim Cooper, is short: all it says is that CalECPA does not apply to local educational agencies or individuals acting on their behalf.

    CalECPA, heralded as the nation’s best digital privacy law, prohibits a government entity from compelling the production of or access to electronic communication information without a warrant. The protections apply to both data and metadata.  Largely, CalECPA extends the privacy protections afforded to physical belongings to the digital arena.

As the EFF and other privacy watchdogs have pointed out, passage of A.B. 165 would have dangerous implications for California’s students and parents. As described more fully in the linked article below, removing CalECPA protections from schools would mean that any teacher, administrator, or staff member could conduct an almost unlimited search of a student’s digital presence. It hardly needs pointing out that our digital devices contain an extraordinary amount of extremely sensitive information. Gone are the days when the most embarrassing thing a teacher could do was read aloud the crumpled note you passed to your friend across the aisle; the passage of A.B. 165 would give school employees access to anything from your geolocation history to your health information, not to mention personal messages and pictures.

    The disappearing communications platform Snapchat, already immensely popular among students, may become even more popular (and necessary) should A.B. 165 gain passage in the California legislature.

    Even more troubling is the bill’s application to individuals acting on behalf of the educational organization. A.B. 165 would allow on-campus police officers to search students’ digital devices, and there are no limitations on how that information is shared. An undocumented student’s status may be revealed through such a search, and there’s nothing to stop the person conducting the search from sharing that information with federal officials.

Schools often occupy a special zone when it comes to student privacy, given particular concerns about student safety and development. Every individual, however, maintains an interest in some modicum of privacy; giving school officials unrestricted access to students’ digital lives, which are often inseparable from their physical lives, may be a step too far.

    https://www.eff.org/deeplinks/2017/03/dangerous-california-bill-would-leave-students-and-teachers-vulnerable-warrantless

  • PRG News Roundup: March 8th

    by Caroline Alewaerts

Wikileaks released documents describing the software and tools used by the CIA to hack various computing devices. The leak notably reveals that the CIA can break into smartphones and access messages before and after their transmission, rendering the encryption features of WhatsApp, Telegram, and Signal irrelevant. The CIA can also hack into an internet-connected TV to record conversations. Tech companies have already indicated that they are working on fixing, or have already fixed, the vulnerabilities the CIA used to break into their products.

Researchers at MIT’s Laboratory for Information and Decision Systems (LIDS) recently developed the Synthetic Data Vault, a system that uses machine learning to automatically create artificial (synthetic) data from a “real” database. Their research suggests that using artificial data to develop data science algorithms and models produces substantially the same results as real data without compromising privacy.
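The underlying idea can be sketched minimally. This toy is not the Synthetic Data Vault’s actual model (which learns much richer structure, including cross-column dependencies); it only illustrates the principle: fit the statistical shape of the real table, here just per-column means and standard deviations, then release rows sampled from the fitted distribution instead of real records.

```python
import random
import statistics

def fit_columns(rows):
    """Learn per-column mean and standard deviation from the real data."""
    cols = list(zip(*rows))
    return [(statistics.mean(c), statistics.stdev(c)) for c in cols]

def sample_synthetic(params, n, seed=0):
    """Draw n artificial rows from the fitted per-column Gaussians."""
    rng = random.Random(seed)
    return [[rng.gauss(mu, sigma) for mu, sigma in params] for _ in range(n)]

# Hypothetical "real" table of (age, income) pairs. Synthetic rows mimic the
# distribution without reproducing any actual record.
real = [[34, 52000], [41, 61000], [29, 48000], [55, 90000], [47, 75000]]
params = fit_columns(real)
synthetic = sample_synthetic(params, 1000)
print(round(statistics.mean(r[0] for r in synthetic), 1))  # close to the real mean age of 41.2
```

Because no synthetic row corresponds to a real person, algorithm developers can work with the generated table while the original records stay locked away.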

    Republicans have recently introduced a resolution to repeal the FCC’s “Protecting the Privacy of Customers of Broadband and Other Telecommunications Services” rule and to prevent the FCC from adopting similar regulation in the future. The rule was adopted last year and aimed at increasing transparency, choice, and security of customer data.

    The NIH announced a new grant opportunity intended to help organizations encourage patient participation in the All of Us Research Program and the precision medicine biobank. This biobank will store millions of biospecimens and other healthcare data for precision medicine research, and will be the largest biobank in the world. Mayo Clinic, which is the organization in charge of creating the biobank, has however not yet revealed which specific measures it intends to put in place to protect the privacy and security of the biobank.

New Jersey recently decided to replace cash bail hearings with an algorithm designed to evaluate the risk of releasing a defendant before trial. The algorithm does not replace judicial discretion, however, and the computer-generated score is meant to be used only as a guide.

Announcement: the NYU Information Law Institute and the Department of Media, Culture, and Communication will host the “International Workshop on Obfuscation: Science, Technology, and Theory” on April 7 and 8. More information about the workshop and how to register is available here. The organizers are also looking for students interested in helping organize the event.

  • Jorge Peniche Baqueiro Blog Post

    Jorge Peniche Baqueiro

    Information Privacy Law

    Ira S. Rubinstein

    March 7, 2017

The EU and US approaches to privacy: the battle could escalate further, but may also find some convergence

Yale law professor James Q. Whitman has described the differences between the US and European approaches to privacy law as a clash with deep roots. The core of the conflict, he argues, lies in the weight these cultures respectively give to the fundamental values of liberty and dignity, a matter deeply bound up with their particular historical experiences, sufferings, and traumas.

The distinction is not merely theoretical, however. It has provoked tensions, costly litigation, and trade battles over the last decades as transatlantic data traffic has skyrocketed. The battle may have reached a new stage last year with the adoption, by the European Parliament and the Council, of Regulation (EU) 2016/679. The General Data Protection Regulation (GDPR) will take effect on May 25, 2018, repealing the former Directive 95/46/EC.

Experts in EU law know that the chosen legal design is not merely a matter of semantics in the use of the word “regulation” instead of “directive.” Unlike a directive, a regulation is directly binding on the Member States, and the GDPR aims to create a more unified framework that replaces the patchwork of domestic laws enacted to implement the former directive.

This battle has seen some remarkable episodes and also some interesting truces. First, to guarantee adequate levels of protection when personal data is sent to “third countries” outside the European Economic Area, the US-EU Safe Harbor Framework was developed between 1998 and 2000. The European Commission then issued a crucial decision endorsing the “safe harbor scheme,” under which US companies certified as meeting EU requirements were allowed to receive data transferred from the EU. In 2015, however, the European Court of Justice held that the Safe Harbor decision was invalid. As a consequence, the EU-US Privacy Shield was announced by both sides last year in order to provide stronger protections.

The GDPR introduces significant novelties and constitutes a milestone towards more robust protection, including, to mention a few, a broader scope of application for data controllers established outside the Union and stricter “valid consent” requirements. But as the effective date approaches and ongoing litigation is being discussed in American courts, some have raised concerns about a coming storm on the horizon.

As Ricci Dipshan pointed out last February on the legal news website Law.com, litigation-related international data transfers will face new perils when personal data must be moved from the EU to the US for use in e-discovery.

In short, the GDPR forces e-discovery practitioners in the US to target the data subject to discovery in a narrow fashion. This imperative runs against the common US practice of taking wholesale data sets and moving them into the e-discovery process. Proportionality is the new king of the hill.

Practitioners Christian Schröder, Jeffrey McKenna, and Renne Phillips have sailed into the GDPR sea in search of options. They argue that Articles 46 and 49 provide the most useful mechanisms for transfers to the US during discovery. EU Standard Contractual Clauses (SCCs), as proposed by the EU Commission, could be a good alternative for facilitating data transfers for smaller companies or one-off transfers. On the bright side, since Article 49(1) does not include a restriction commonly found in domestic implementing legislation, there seems to be room to argue in favor of pre-trial discovery, as opposed to transfers being allowed only for “pending litigation” rather than a mere controversy between the parties.

     

Although the main recommendation is a careful case-by-case assessment, which itself seems to further the deterrent goal pursued by the GDPR, Brian Corbin, assistant general counsel of legal discovery management at JPMorgan Chase & Co., notes that there is nothing new under the sun: the similar proportionality requirement introduced by the 2015 amendments to the Federal Rules of Civil Procedure overlaps enough to give US practitioners and companies a good starting point for approaching data collection under the GDPR.

More episodes are likely to come in the battle over how privacy law is understood in the US and the EU. Still, there also seems to be a point of convergence, one that would indeed benefit citizens.

For more information:

    http://www.lexology.com/library/detail.aspx?g=27ae467a-e2ed-4efc-ba4d-16d74c95e661

    http://www.law.com/sites/almstaff/2017/02/06/the-storm-on-the-horizon-4-things-to-know-in-prepping-for-general-data-protection-regulation/?slreturn=20170206133625