Category: Uncategorized

  • Privacy Blog (1)

    By: Maggie Kornreich

    Professor Rubinstein

    March 24, 2014

    http://www.natlawreview.com/article/health-apps-and-hipaa-ocr-publishes-new-guidance-health-app-developers

    This article addresses whether mobile device applications are subject to HIPAA regulations. In February, the Department of Health and Human Services’ Office for Civil Rights (OCR) released Health App Use Scenarios & HIPAA to examine whether HIPAA applies to apps that “collect, store, manage, organize, or transmit health information.”

    The Health App Guidance provides six scenarios and determines whether HIPAA would apply to the app developer in each instance. The first scenario involves a consumer who downloads a health app and provides it with her personal information in order to organize that information without involving her healthcare providers. Here, the consumer is not a covered entity or business associate, so the app developer is not subject to HIPAA. The second scenario involves a consumer who downloads a health app to manage a chronic condition; she pulls data from her doctor’s electronic health record, along with her own information, into the app. The consumer is not a covered entity or business associate, and the healthcare provider did not hire the app developer for the service, so the developer is not subject to HIPAA. The third scenario involves a consumer who downloads an app to track diet and exercise after her doctor recommends it, and who sends a report to the doctor before the next appointment. Because the doctor did not hire the app developer, the developer is not subject to HIPAA.

    The fourth scenario involves a consumer who downloads an app to manage a chronic condition, where the app developer and the healthcare provider have entered an interoperability agreement, at the consumer’s request, to exchange the consumer’s information. The consumer inputs her own information into the app. The developer is not subject to HIPAA because it is not creating, maintaining, or transmitting personal health information on behalf of a covered entity or business associate. In the fifth scenario, a healthcare provider contracts with the app developer for patient management services and instructs patients to use the app. Here, because the provider is a covered entity and the developer is its business associate, the developer is subject to HIPAA. The sixth scenario involves a health plan that offers a health app allowing members to store health records, check the status of claims, and track their wellness information, which the health plan then analyzes. The developer is a business associate and the health plan is a covered entity, so the developer is subject to HIPAA.
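    The common thread running through all six scenarios can be reduced to a rule of thumb. The sketch below is an illustrative simplification, not OCR’s actual test and certainly not legal advice; the function name and boolean inputs are invented for this example.

    ```python
    # Simplified rule of thumb drawn from OCR's six scenarios:
    # a developer is subject to HIPAA only when it handles health
    # information *on behalf of* a covered entity or business associate.
    # This is an illustration, not OCR's actual legal test.

    def developer_subject_to_hipaa(handles_health_info: bool,
                                   on_behalf_of_covered_entity_or_ba: bool) -> bool:
        return handles_health_info and on_behalf_of_covered_entity_or_ba

    # Scenario 1: consumer enters her own data for her own use -> not subject
    print(developer_subject_to_hipaa(True, False))   # False
    # Scenario 5: provider hires the developer for patient management -> subject
    print(developer_subject_to_hipaa(True, True))    # True
    ```

    Scenarios two through four fall out the same way: in each, the developer handles health data, but only at the consumer’s direction, so the second condition fails.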

    This article is interesting and informative because it outlines the instances in which a developer or company will be subject to HIPAA. This is increasingly important as people rely on their phones, and the apps on them, for most if not all of their personal affairs. It is also significant in that it highlights the contexts in which people share health information, which many deem to be extremely private, in electronic form.

  • Information Privacy: Blog Post (Panel 6)

    By: Harry Grabow

    http://www.marketplace.org/2016/02/10/business/new-frontier-voter-tracking

    http://fusion.net/story/268108/dstillery-clever-tracking-trick/

    http://www.usatoday.com/story/news/politics/onpolitics/2016/02/08/company-tracked-iowa-caucusgoers-phones/80005966/

    In the first United States presidential campaign since the 2013 Snowden revelations, candidates have expressed varying opinions of both the man and the reality he exposed: some have mildly praised his impact on the privacy discourse while questioning the legality and wisdom of his conduct, while others have gone as far as insisting that he is a traitor, or even an active Russian spy, prompting a response from the Kremlin itself.

    However, while the candidates’ focus remains on the legitimate national security and civil liberties implications of government surveillance, another use of data collection is taking the campaign season by storm: the creation of voter profiles based on the mobile device advertising profiles of individuals at polling places. In a domain traditionally served by phone-based public polling and on-site exit polls, digital advertising company Dstillery engineered a way – creative and creepy in equal measure – to track voter preferences by matching the mobile device identifiers present at the geographic locations of caucus sites to the digital advertising profiles of the devices’ owners. Specifically, the company monitored the real-time ad bidding that occurs each time a mobile device opens an app or website, captured the device identifier associated with the serving of an ad, and then researched additional characteristics associated with that device.
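    The matching technique described above amounts to a join on the device identifier: devices seen at caucus-site locations are looked up in an existing store of advertising profiles. The sketch below illustrates that join under invented data; all identifiers, field names, and the `match_profiles` helper are hypothetical, since Dstillery’s actual pipeline is proprietary.

    ```python
    # Hypothetical sketch of the device-matching technique described above.
    # All data and identifiers are invented for illustration.

    # Device IDs observed (via real-time ad bids) at caucus-site locations
    devices_at_caucus_sites = {
        "device-001": "rubio-site-A",
        "device-002": "trump-site-B",
        "device-003": "rubio-site-A",   # no known ad profile for this device
    }

    # Pre-existing advertising profiles keyed by the same device identifier
    ad_profiles = {
        "device-001": {"interests": ["parenting", "baby products"]},
        "device-002": {"interests": ["home improvement", "outdoor activities"]},
    }

    def match_profiles(sightings, profiles):
        """Join caucus-site sightings to ad profiles on the device identifier."""
        matched = {}
        for device_id, site in sightings.items():
            if device_id in profiles:
                matched[device_id] = {"site": site, **profiles[device_id]}
        return matched

    print(match_profiles(devices_at_caucus_sites, ad_profiles))
    ```

    Note that the join only recovers devices that already have an advertising profile; everything rests on the same identifier being visible both in the ad-bidding stream and in the profile store.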

    Though not conducted with the scientific rigor of a public poll, the results captured voter characteristics not usually polled by campaigns. For example, according to an analysis of the results conducted by USA Today: voters expecting a newborn child tend to be Republican and had a greater concentration at Senator Marco Rubio’s caucus sites; voters in locations with strong support for Donald Trump had a penchant for outdoor activities and home improvement; and tech-industry workers and enthusiasts were more concentrated in Senator Bernie Sanders’ caucus sites than those of former Secretary of State Hillary Clinton.

    In an interview with news site Fusion, a representative from Dstillery explained, “One thing that isn’t in the data is personal identifiable information. The data and system are completely anonymous. We have no idea, for example, what your name is. All we see are behaviors and everything we do is based on analyzing those behaviors writ large.”

    So, while identifying individual voters and their preferences through this method of data collection might not be a present concern, the prospect of campaigns themselves utilizing these tactics might subject Iowans to an even greater saturation of political advertising in 2020, on top of the astounding $70 million spent in 2016. And, with many Iowans lamenting the constant barrage of TV and radio ads, campaigns might be eager to attempt this new, more subtle approach.

  • Privacy Blog

    By: Mary Churan Huang

    http://www.irishtimes.com/business/technology/google-eu-s-data-protection-authorities-have-absolute-focus-on-privacy-1.2556597

    This article discusses the positions taken by some of the world’s largest multinational companies in respect of the EU’s incoming General Data Protection Regulation (GDPR). The GDPR is set to replace the existing EU Data Protection Directive of 1995. The EU’s privacy laws need updating: technological change has accelerated, and the current framework does not properly address critical issues such as social networks and cloud computing. The GDPR is designed to be a more comprehensive and wide-ranging legal framework that will apply EU-wide, superseding local privacy laws. It is intended to strengthen and unify data protection within the EU so that there will be one single legal framework, allowing international businesses to comply more easily with the EU privacy framework while providing more comprehensive protection for EU residents.

    Both Google and Microsoft seem to regard the GDPR as a necessary step in data protection, but neither anticipates a major change in its policies, as both companies have already taken significant steps to enhance privacy over the years. Microsoft is concerned about the heavy penalties set out in the GDPR and has indicated that the new regime will be a great test of whether its privacy controls are working as they should. Google indicated that it will work closely with regulators to find a rational way of interpreting some of the ambiguities in the GDPR.

    Adobe is more critical of the GDPR, expressing disappointment that the framework is not as unifying as intended. In Adobe’s view, the new framework will still be fragmented and subject to interpretation by local authorities. These responses from major multinational companies are unsurprising, considering that any new regime brings uncertainty. All of them, however, have expressed a willingness to comply and to work with the relevant regulators to find a solution.

  • A Consumer Privacy Legislation? A Highly Debated Proposal since 2012

    By: Ida Faustine Jacotey

    http://www.nytimes.com/2016/02/29/technology/obamas-effort-on-consumer-privacy-falls-short-critics-say.html

    In February 2012, the White House released “A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy,” thereby presenting a new Consumer Privacy Bill of Rights described by the presidency as a “blueprint for privacy in the information age.”[1]

    President Obama’s proposal aimed to establish a consumer privacy framework that would give consumers clear guidance on what they can expect from companies[2] handling their personal information, as well as setting obligations for companies using personal data[3]. As stated in the referenced article, President Obama intended to see the Consumer Privacy Bill of Rights’ principles enacted into law through joint work between the Administration and Congress.

    Four years later, in February 2016, Natasha Singer, a journalist for The New York Times, raises the question of why the President’s proposed framework has not moved forward.

    This question seems particularly relevant today, as the US hosts a high-profile fight between Apple and the FBI, thereby bringing greater attention to consumer privacy issues.

    The proposal was born in a context where technology had become deeply embedded in the commercial world. As a result, in 2012 as today, consumer privacy was and remains a major concern in the US, particularly since the Snowden revelations brought alarming facts about privacy and data collection to the public eye.

    From a consumer’s perspective, because privacy policies often appear obscure and/or complicated, it seems easier and almost mandatory to rely on trust when one subscribes to a service. Therefore, when the FTC brings actions against companies, consumers’ concern about the collection and use of their personal information increases. This is particularly true when major companies such as Google or Facebook are involved, since they are an inherent part of most Americans’ lives. Such cases involving large companies are likely to be taken as proof of the industry’s tendency to abuse consumers’ trust.

    As Cameron F. Kerry, a former general counsel of the Department of Commerce, highlights in this article, a loss of trust is not desirable from the industry’s point of view either.

    In the US, companies enjoy considerable freedom to set their own privacy policies. Many companies watch the FTC’s actions closely, particularly because the FTC has been increasingly active and has strongly influenced the industry to adapt its privacy policies. Despite this evident freedom to collect and use consumer data, the industry seems aware that privacy practices require continuous adjustment to meet consumers’ reasonable expectations about the use and collection of data. Some companies have taken the relevant steps to strengthen consumer privacy protection; others have not.

    On the one hand, this context demonstrates the lack of any pragmatic instrument that would give consumers some guarantee that their privacy remains protected. On the other hand, this great flexibility is also the reason American companies continue to innovate so much.

    Consequently, opinions diverge significantly about whether legislation is desirable in the consumer privacy context. Because this ideological war is not new – and not over yet – Natasha Singer describes “a tale of clashing visions for American society and commerce.”

    Two conflicting concerns have frozen the proposal: companies’ free access to consumers’ personal information, which can harm consumers in many ways, versus the desire to allow American technology companies to continue innovating with data.

    In 2012, for the first time, the White House treated individual rights as a priority in the commercial privacy world[4]. This step fed the debate between consumer advocates (“pro-legislation”) and industry advocates (opposed to legislation).

    As this article reveals, some advocates argue that there are already enough federal laws imposing specific limitations on companies’ use of consumer records. Privacy advocates, however, point out that these existing acts apply only to certain categories of companies, and that the bill would instead apply to all companies.[5]

    While the debate over this legislation continues, the industry is nevertheless active, and the article points out some notable improvements made since 2012, such as the data disclosure charts devised as a result of numerous discussions on mobile app transparency[6].

     

    [1] www.whitehouse.gov/sites/default/files/privacy-final.pdf, Introductory Note, page 3

    [2] www.whitehouse.gov/sites/default/files/privacy-final.pdf, Executive Summary, listing the key principles of the Bill, or “Fair Information Practice Principles” (FIPPs)

    [3] www.whitehouse.gov/sites/default/files/privacy-final.pdf, Introduction, page 7

    [4] www.whitehouse.gov/sites/default/files/privacy-final.pdf, Introduction, page 5: “Strengthening consumer data privacy protections in the United States is an important Administration priority. Americans value privacy and expect protection from intrusions by both private and governmental actors.”

    [5] http://www.nytimes.com/2015/02/28/business/white-house-proposes-broad-consumer-data-privacy-bill.html, Natasha Singer for the New York Times, February 27, 2015, “There are already a number of federal laws, like the Fair Credit Reporting Act and the Video Privacy Protection Act, that limit how companies may use certain specific consumer records. The new proposed bill, the Consumer Privacy Bill of Rights Act, is intended to fill in the gaps between those statutes by issuing some baseline data-processing requirements for all types of companies” and Senator Edward J. Markey stating that “Instead of codes of conduct developed by industries that have historically been opposed to strong privacy measures, we need uniform and legally enforceable rules that companies must abide by and consumers can rely upon”.

    [6] “The yearlong discussions on mobile app transparency ultimately resulted in a subset of the participants getting together on their own time to devise data disclosure charts — akin to nutrition labels on food packages — that apps could display for consumers.”

  • Privacy and Reporting Child Abuse

    By: Charles Kopel

    Privacy law came to the fore last month in Ontario, when two provincial government officials undertook a new public awareness campaign. Privacy Commissioner Brian Beamish teamed up with Advocate for Children and Youth Irwin Elman to improve responses to situations of suspected child abuse by “dispelling myths surrounding information sharing with children’s aid societies.”

    As reported by the Toronto Star, the project’s immediate motivation was a coroner’s inquest into the tragic 2002 starvation death of five-year-old Jeffrey Baldwin. Baldwin died in the care of his maternal grandparents, who were subsequently convicted of second-degree murder and sentenced to life in prison. The inquest jury discovered that police officers and school board officials who had knowledge of Baldwin’s predicament did not know what private information they were legally permitted to share with children’s aid workers. In response to the jury’s call for clarification of this legal standard, Beamish and Elman have published a 15-page educational booklet, titled Yes, You Can, and distributed it to relevant public servants. A link to the booklet can be found in the Star article.

    The principle is very straightforward: Under Ontario law, any person who reasonably suspects that a child is in need of protection must report the situation to a children’s aid society. In the face of such situations, all privacy law restrictions fall away. If necessary, teachers must divulge schooling records, healthcare professionals must divulge medical records, police must divulge criminal records, and social services staff must tell investigators all pertinent information.

    I was drawn to this story by my interest in child welfare law, and I see it as a useful illustration of the practical relevance of privacy rights to diverse circumstances and concerns. The rules of privacy law impact and complicate legal determinations in many areas outside of the major bodies of jurisprudence of consumer privacy and law enforcement.

    This story also provides an interesting glimpse of societal attitudes towards privacy rights, if only anecdotally. In a humorous moment in the Star article, Beamish is quoted as saying, “As privacy commissioner, I’m glad people have (privacy) top of mind, but there are occasions (when) not only may information be disclosed, but it must be disclosed…” It seems that, despite the typical lack of zealousness for protecting personal data on the Internet, Western people do have an intuitive sense for the inviolability of the private information of others, even when the social utility of exposing that information is as clear as in this case. A hierarchy of values emerges from the legal conclusion here, subordinating the child’s interest in informational privacy—in being “let alone”—to his/her interest in freedom from bodily and severe emotional harm.

     

     

  • Privacy Blog: About Ned

    By: Alec Webley

    Modern privacy law in the United States is often traced to “The Right to Privacy,” a law review article written by Samuel Warren and Louis Brandeis in the late nineteenth century. Arguing for protection for invasions of privacy through the tort law, the article served as a seminal point of reference in ongoing debates about the right of the state and private individuals to enter the private lives of U.S. citizens (helped, no doubt, by the elevation of one of its authors to the United States Supreme Court).

    The origins of the article were conventionally thought to be Warren’s irritation about the media’s coverage of his daughter’s wedding, an anecdote that forms a treasured part of a law professor’s repartee when teaching the privacy torts. It turns out, however, that the truth is stranger and more interesting.

    In a new piece for the Harvard Law Review Forum, NYU’s Charles Colman recounts the story of Samuel Warren’s brother Ned, who was as close to being “openly gay” as it was possible to be in the late nineteenth century (as the first period of what we would today identify as anti-gay hysteria began to sweep the nation). Ned was, in more ways than one, Samuel’s skeleton in the closet—revelation of Ned’s same-sex attraction would have seriously damaged Samuel’s reputation. Samuel’s only protection was privacy, and Ned’s own decision to live largely in seclusion at the family manor.

    Colman is right, I think, to point to cases like Ned’s in linking privacy as it is commonly understood (our ability to keep parts of our lives away from others) to privacy in the constitutional, substantive due process sense (such as in Griswold v. Connecticut). After all, it is precisely those most intimate details about our lives—our sexuality, our families—where we are as intolerant to the gaze of the state as we are of the public.

    But Colman could afford to take this analysis a step further. Ned would not have needed to remain private about his same-sex attraction (his interest in Grecian urns is another matter) if he had lived in 2015. I think there is some cause to suspect that one of the reasons homosexuality has become more commonly accepted is because it is more difficult to keep it private; as we learn more about each other’s lives it becomes harder to vilify them. By designating certain parts of our lives as private, are we helping to create the conditions that make them necessarily so? Broader tolerance of non-conformity to familial and sexual conventions may well be as essential as good privacy law in smoothing our society’s passage into the Information Age.

  • Information Privacy Law – Kevin Kirby

    By: Kevin Kirby

    Former Mozilla CEO Brendan Eich has launched a new Internet browser called Brave that blocks advertisements by default, only to provide new ad space for Eich to sell.

    Advertising is sick, Eich says. It’s intrusive, tracking users with “cookies, tracker pixels, fingerprinting, everything.” His solution is to block such forms of “intrusive advertising” and instead use consumers’ local browsing history to target ads. Of the new advertising revenue, Brave will keep 15%, 55% will go to the content publisher, and 15% will go to the ad supplier. The final 15% will go to consumers, who can allocate those funds like credits to remove ads from their favorite sites.
    Ad-blocking technology has great potential for increasing consumer privacy protection and browsing speed, but it is unclear whether those benefits are retained in a browser that simply replaces old ads with new ones. Nevertheless, it is quite a novel development for a browser to sell ad space instead of the content providers and publishers. Eich likens his extrication of the adtech middlemen to “putting chlorine in the pool.”
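    The 15/55/15/15 revenue split described above is simple to express in code. The sketch below is a minimal illustration under the assumption that revenue is allocated per some unit in cents; the `allocate` function and party names are invented for this example, not drawn from Brave’s actual implementation.

    ```python
    # Illustrative allocation of ad revenue under the split reported for Brave:
    # 15% to Brave, 55% to the publisher, 15% to the ad supplier, 15% to consumers.
    # Party names and the cents-based accounting are assumptions for this sketch.

    SPLIT = {"brave": 0.15, "publisher": 0.55, "ad_supplier": 0.15, "consumer": 0.15}

    def allocate(revenue_cents):
        """Allocate ad revenue (in cents) according to the published percentages."""
        return {party: round(revenue_cents * pct) for party, pct in SPLIT.items()}

    print(allocate(10_000))  # a hypothetical $100.00 in ad revenue
    ```

    For a hypothetical $100 of revenue, the publisher’s 55% share dwarfs each 15% slice, which is consistent with Eich’s framing of Brave as cutting out the adtech middlemen rather than the publishers.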

    Eich’s team has raised $2.5M in investment so far and hopes to reach 7 million users this year. If the popularity of current ad-blocking plug-ins is any indicator, Brave might present a serious challenge to the current Internet advertising business model.

     

  • Consumer Privacy Dimensions of the FCC’s Cable Box Proposal

    By: Dan Davidson

    The Federal Communications Commission (FCC) recently proposed increasing competition in the market for cable boxes, devices that are necessary to view cable television programming, but cost on average $20 per month to rent.[1]  Consumers presently have little choice but to rent a cable box from their cable provider; the FCC hopes enabling other companies to enter the market will drive down the price.[2]  While the FCC’s proposal reads like an antitrust measure, a number of privacy issues lurk beneath the surface.

    To understand why opening up the cable box market implicates privacy, it is important to recognize that cable boxes can provide a wealth of information about their users.  Cable companies generate revenue in part through marketing deals rooted in their ability to control the interface—i.e., the cable box—a consumer uses to access content.[3] It is not surprising, then, that companies like Apple and Google are expected to take advantage of the FCC’s proposal, if it takes effect. These companies, if they entered the cable box market, “would gain an unprecedented, Netflix-like visibility into customers’ viewing habits that they could then attempt to use for advertising purposes.”[4] Put simply, cable boxes represent another means by which to collect valuable consumer data.

    It is easy to imagine how a cable box made by Apple or Google would be a product consumers were eager to purchase.  Imagine a single box next to your monitor that integrated live cable television, DVR capability, Netflix, and movie rentals from iTunes, and all at a lower price than you currently pay for your cable box.[5] Arguably, consumers will make a rational decision whether to stick with their cable company’s cable box, or furnish the Apples of the world with information about their cable watching habits, in exchange for a lower-cost, better-functioning cable box.

    Such a hypothetical scenario exposes, however, consumers’ lack of any real ability to make such rational choices regarding their data. Indeed, the notion of handing over information about cable-watching habits to a company like Google handily illustrates Katherine Strandburg’s argument that the whole notion of “paying” for services with data is a myth. The disutility for consumers in handing over this information is all but impossible to estimate ahead of time—consumers could not possibly predict how Google will make use of their cable-watching habits, and therefore are in no position to accurately judge the potential harms that might arise from, for example, aggregating this data with other data points Google already possesses.[6]

    The FCC’s proposal will “contain a set of privacy provisions aimed at making sure new cable-box manufacturers don’t abuse the data they collect on viewer behaviors.”[7]  But those provisions should not allay privacy advocates’ concerns about companies whose business models are driven by behavioral data getting into the cable box market.

    The privacy provisions would essentially bind new entrants into the cable box market to the current laws and regulations governing how cable companies use consumer data they collect via cable boxes.[8]  These rules are deeply rooted in the notion of consent—as a very general matter, cable companies are prohibited from collecting “personally identifiable information” or sharing that data with third parties without first obtaining consent from consumers, except when sharing is necessary for providing cable service.[9]

    But a paradigm relying on consumer consent is hardly a strong defense against privacy violations. Scholars have noted “the lack of either informed or voluntary consumer consent to the privacy practices of websites,” so it is not clear that a consent-based privacy protection regime does much work for consumers.[10]

    Opening up the cable box market will create opportunities for new players to access cable-watching consumer data. Those new players may aggregate or analyze that data in new and unanticipated ways. The FCC’s proposal thus creates the potential for a convergence of factors that would adversely affect consumer privacy. In light of these concerns, consumer advocates should question whether simply retaining the current consent-based privacy regime around cable boxes would sufficiently protect consumers’ privacy interests.

    [1] Brian Fung, This new government proposal aims to cut your cable costs, Washington Post (Jan. 27, 2016), https://www.washingtonpost.com/news/the-switch/wp/2016/01/27/how-a-new-government-proposal-aims-to-cut-your-cable-costs/.

    [2] Id.

    [3] Nilay Patel, Inside the FCC’s audacious plan to blow up the cable box, The Verge (Jan. 28, 2016), http://www.theverge.com/2016/1/28/10858658/fcc-unlock-the-box-open-cable-plan.

    [4] Brian Fung, Third-party cable boxes won’t be allowed to spy on you (too much), regulators vow, Washington Post (Feb. 10, 2016), https://www.washingtonpost.com/news/the-switch/wp/2016/02/10/third-party-cable-boxes-wont-be-allowed-to-spy-on-you-too-much-regulators-vow/.

    [5] Patel, supra note 3.

    [6] Cf. Katherine J. Strandburg, Free Fall: The Online Market’s Consumer Preference Disconnect, 2013 U. Chi. Legal F. 95, 132 (2013).

    [7] Fung, supra note 4.

    [8] Id.

    [9] Id.

    [10] Ira S. Rubinstein, Privacy and Regulatory Innovation: Moving Beyond Voluntary Codes, 6 I/S: J.L. & Pol’y Info. Soc’y 356, 363 (2011); see also, Strandburg, supra note 6, at 151 (discussing the lack of meaningful consent in the consumer data context).

  • The FTC and the new US-EU framework for data transfer

    By: Elsa Mandel

    http://europa.eu/rapid/press-release_IP-16-216_en.htm

    On February 2nd, the European Commission and the United States announced their agreement on the “Privacy Shield”, a new legal framework for data flows between the EU and the US that would replace the former “Safe Harbor” mechanism.

    The international Safe Harbor was an agreement between the EU and the US that laid down principles enabling US companies to comply with privacy laws protecting European Union citizens. In a nutshell, the “Safe Harbor” was a way for American companies to self-certify to the Commerce Department that they complied with European standards for data transfer and processing. Within this mechanism, the role of the FTC was to enforce those promises.

    In October 2015, after a user complained that his Facebook data were insufficiently protected, the European Court of Justice declared the Safe Harbor mechanism invalid.

    The new “Privacy Shield” framework that was announced at the beginning of February sets out stronger obligations for companies in the United States to protect the personal data of European citizens.

    Beyond material changes to the Safe Harbor’s substantive principles, the US government has also guaranteed that the Privacy Shield will bring stronger enforcement of those principles.

    Notably, European consumers will benefit from more direct access to US enforcement. They will be able to file complaints directly with the FTC, and referrals from European Data Protection Authorities to the FTC will be facilitated.

    In reaction to these announcements, however, Commissioner Brill of the FTC noted that European Union consumers would, in most cases, go first to their own DPAs before filing a complaint with the FTC.

    Otherwise, the FTC’s enforcement will remain the same: the FTC will continue to enforce companies’ promises to abide by the Safe Harbor, even though it has been invalidated, under Section 5 of the FTC Act.

    Most likely, the Privacy Shield will strengthen cooperation on cross-border data transfers between European consumers and government agencies such as the FTC, and clear mechanisms will have to be set up to ensure good communication between national DPAs and the FTC.

     

     

  • FCRA: No Harm, No Worries?

    By: Emma Bechara

    [Current events article: http://www.scotusblog.com/2015/11/argument-analysis-second-time-around-no-easier-for-justices-in-standing-case/]

    A pending case before the United States Supreme Court (SCOTUS), discussed on SCOTUSblog, brings to the forefront the question of whether a consumer plaintiff has legal standing to bring a class action lawsuit for a technical violation of the Fair Credit Reporting Act (FCRA). On 2 November 2015, SCOTUS heard oral arguments in Spokeo, Inc. v. Robins, where the petitioner, Robins, alleged that Spokeo – a consumer reporting agency – published inaccurate information about him, which adversely affected his ability to get a job, and that this publication violated the FCRA. No evidence was presented that Robins suffered actual “concrete harm”.

    The legal question before SCOTUS is:

    “Whether Congress may confer Article III standing upon a plaintiff who suffers no concrete harm, and who therefore could not otherwise invoke the jurisdiction of a federal court, by authorizing a private right of action based on a bare violation of a federal statute.”

    The answer to this question, if in the affirmative, will undoubtedly have important legal and commercial ramifications for data privacy law, and the future of consumer class actions in this sphere. An affirmative answer will inevitably open the floodgates for consumers across the United States to bring suits against consumer credit agencies and Internet companies for Federal privacy law violations. More alarmingly, as Robins could not show that he had suffered “actual harm” as a result of the alleged violations, a finding for him may also set a dangerous precedent that would allow consumers to have standing to sue for a violation of the FCRA, despite being unable to show that they had suffered harm.

    This case presents an interesting dichotomy between the growing interest in consumer privacy and the law’s traditional requirement of injury to a person. Currently, the FCRA provides consumers with up to $1,000 in damages for inaccurate published reports. Is this enough? Perhaps dicta from the Court will shed light on this. Nonetheless, Spokeo argued that allowing consumers to sue despite sustaining no concrete injuries would invite class action abuse. According to SCOTUSblog, this view is likely to be upheld by seven of the Justices, with only two – Justice Sonia Sotomayor and Justice Ruth Bader Ginsburg – “open to the possibility” of a plaintiff not being able to show a “concrete, ‘real world’ harm”. Sadly, the passing of Justice Scalia has brought an element of uncertainty to the decision, and as Scott Flaherty notes, Scalia “penned three majority opinions that left an indisputable mark on the class action landscape, but now the court must grapple with key issues that linger”.