Blog

  • Pierre-Paul’s Medical Disclosure Claim Against ESPN: Issues of Intersecting Privacy Torts

    Panel 5

Pierre-Paul’s Medical Disclosure Claim Against ESPN: Issues of Intersecting Privacy Torts[1]

    By: Eliza Marshall

The New York Giants’ Jason Pierre-Paul’s suit against ESPN, which pertains to ESPN’s publication[2] of medical records linked to Pierre-Paul’s index finger amputation last summer, provides fruitful grounds for exploring the territory covered by intersecting and perhaps under-inclusive legal regimes in the medical information context. Pierre-Paul’s claim appears to fall between two broad legal regimes: HIPAA and Florida’s state medical information statute.[3] HIPAA’s protection is broad in that it focuses on source rather than content or publication, avoiding questions of harm in the context of medical information, but narrow in that it covers only certain entities and their business associates. Florida’s law, in contrast, is more limited in terms of content, publication, and harm, but broader in that it applies beyond the covered entities listed in HIPAA.[4] Yet Pierre-Paul’s claim may lie in territory covered by neither law, demonstrating a gap between regimes that is arguably worth addressing by expanding one or both.

Under HIPAA, the content of the disclosure is clearly covered. But the statute does not regulate ESPN’s behavior. ESPN is not a “covered entity,” and it falls outside the more expansive definition of “business associate” because it does not (and did not) receive, maintain, or transmit personal health information for any of the functions or activities listed in the regulation.[5] HIPAA is premised on the notion that medical information is uniquely sensitive and that its disclosure inherently involves privacy harm; the statute therefore requires no inquiry into harm or publicity and covers the entire category of medical data as vulnerable to disclosure, even if no unauthorized access ever occurs. One can ask, therefore, why entities like ESPN should not be forced to treat this information with care. But the answer seems clearly to be that HIPAA does not cover them, so any claim under the statute would have to be brought against the health care provider who supplied the records to ESPN in the first place; and the provider’s are the only (presumably shallower) pockets that Pierre-Paul can tap.

State law picks up where HIPAA leaves off,[6] but, like the wider genre of Prosser’s privacy torts, it presents Pierre-Paul with its own set of obstacles. ESPN is covered under Florida’s statute, but it is not clear that the disclosure that occurred is actionable. First, it is not clear that a private right of action exists. Second, unlike HIPAA, Florida’s statute requires Pierre-Paul to prove concrete harm from the disclosure of his medical records. Especially in light of First Amendment limits on the publication of true facts, Pierre-Paul faces an uphill battle. It is not clear what information other than the amputation was included in the medical records. But arguing on the basis of the amputation alone, he will have to craft a convincing explanation of an injury suffered simply by the timing of the disclosure: as a professional athlete whose occupation is highly public, this information would not have remained secret for long.[7] His absence, or his finger’s, would surely lead to speculation and would be easy to detect with the naked eye even without detailed medical records. As for information beyond the fact of the amputation, Pierre-Paul may have a harder time describing how ESPN’s disclosure harmed him in any concrete way. Still, the Shulman[8] case supports a court finding offensiveness, and recognizing a special zone of privacy in the medical context and in the relationship between a medical provider and a patient that journalists must respect. This suggests Pierre-Paul has some hope. Intuitively, having a journalist publish one’s medical records is a highly offensive and unacceptable invasion of privacy; certainly, most people would object to having it happen to them. Yet the legal result is far less straightforward. This may suggest the need for new methods of protecting privacy that avoid the difficulties of proving harm.

    [1] Link to Article: https://www.law360.com/articles/764455/nfl-player-must-tackle-common-privacy-pratfall-in-espn-suit

[2] An ESPN reporter tweeted an image of Pierre-Paul’s medical records, reaching nearly 4 million Twitter followers.

    [3] Fla. Stat. § 456.057.

    [4] The Florida statute applies to any “records custodian” which is defined as any person or entity that “obtains medical records from a records owner,” which seems to include ESPN. § 456.057(3)-(4).

    [5] These include “claims processing or administration, data analysis, processing or administration, utilization review, quality assurance, patient safety activities listed at 42 CFR 3.20, billing, benefit management, practice management, and repricing.” 42 CFR § 160.103.

    [6] 42 CFR § 160.203(b).

    [7] The article references the Hulk Hogan case and its potential for revealing the promise of Pierre-Paul’s claim for harm in this case, but surely the expectation of privacy is far higher in intimate sexual activity than it is in the presence or absence of a publicly visible body part—regardless of celebrity status.

    [8] Shulman v. Grp. W Prods., Inc., 955 P.2d 469, 479 (1998), as modified on denial of reh’g (July 29, 1998).

  • Privacy Blog (1)

    Privacy Blog (1)

    By: Maggie Kornreich

    Professor Rubinstein

    March 24, 2014

    http://www.natlawreview.com/article/health-apps-and-hipaa-ocr-publishes-new-guidance-health-app-developers

This article addresses whether mobile device applications are subject to HIPAA regulations. In February, the Department of Health and Human Services’ Office for Civil Rights (OCR) released Health App Use Scenarios & HIPAA to examine whether HIPAA applies to apps that “collect, store, manage, organize, or transmit health information.”

The Health App Guidance provides six scenarios and determines whether HIPAA would apply to the app developer in each instance. The first scenario involves a consumer who downloads a health app and provides the app with her personal information in order to organize that information without involving her healthcare providers. Here, the consumer is not a covered entity or business associate, so the app developer is not subject to HIPAA. The second scenario involves a consumer who downloads a health app to manage a chronic condition; she retrieves data from her doctor’s electronic health record, as well as her own information, to put into the app. The consumer is not a covered entity or business associate, and the healthcare provider did not hire the app developer for the service, so the developer is not subject to HIPAA. The third scenario involves a consumer who downloads an app to track diet and exercise after her doctor recommends it, and who sends her doctor a report before the next appointment. Because the doctor did not hire the app developer, the developer is not subject to HIPAA.

The fourth scenario involves a consumer downloading an app to manage a chronic condition, where the app developer and the healthcare provider have an interoperability agreement, entered at the consumer’s request, to exchange consumer information. The consumer inputs her own information into the app. The developer is not subject to HIPAA because it is not creating, maintaining, or transmitting personal health information on behalf of a covered entity or business associate. In the fifth scenario, a healthcare provider contracts with the app developer for patient management services and instructs patients to use the app. Here, because the provider is a covered entity and the developer is considered a business associate, the developer is subject to HIPAA. The sixth scenario involves a health plan that offers a health app allowing members to store health records, check the status of claims, and track their wellness information; the health plan analyzes the information. The developer is considered a business associate and the health plan is a covered entity. Therefore, the developer is subject to HIPAA.
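The common thread across the six scenarios can be reduced to one question: is the developer handling health information on behalf of a covered entity (or another business associate)? The sketch below is a rough, non-authoritative distillation of that logic; the function and its parameters are invented for illustration and do not appear in the OCR guidance itself.

```python
# Rough sketch (not legal advice) of the rule the six OCR scenarios apply:
# an app developer is subject to HIPAA only when it creates, receives,
# maintains, or transmits health information ON BEHALF OF a covered
# entity or business associate (e.g., under a contract with a provider
# or health plan), not when a consumer alone chooses and feeds the app.

def hipaa_applies_to_developer(handles_phi_for_covered_entity: bool,
                               consumer_directed_only: bool) -> bool:
    """Return True if the developer acts as a 'business associate'.

    handles_phi_for_covered_entity: a covered entity engaged the developer
        to handle health data on its behalf (scenarios 5-6).
    consumer_directed_only: the consumer alone chose the app and supplies
        the data, even if a doctor recommended it (scenarios 1-4).
    """
    if consumer_directed_only and not handles_phi_for_covered_entity:
        return False  # scenarios 1-4: no HIPAA obligations for the developer
    return handles_phi_for_covered_entity  # scenarios 5-6: HIPAA applies

# Scenario 3: doctor merely recommends the app -> developer not subject
assert hipaa_applies_to_developer(False, True) is False
# Scenario 5: provider contracts with the developer -> developer subject
assert hipaa_applies_to_developer(True, False) is True
```

The point of the sketch is that a doctor’s mere recommendation (scenario 3) flips nothing; only an engagement on the covered entity’s behalf does.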

This article is interesting and informative because it outlines the instances in which a developer or company will be subject to HIPAA. This is increasingly important as people rely on their phones, and the apps on them, for most if not all of their personal affairs. It is also significant in that it brings to light instances where people share health information, which many deem to be extremely private, in electronic form.

  • Information Privacy: Blog Post (Panel 6)

    Information Privacy: Blog Post (Panel 6)

By: Harry Grabow

    http://www.marketplace.org/2016/02/10/business/new-frontier-voter-tracking

    http://fusion.net/story/268108/dstillery-clever-tracking-trick/

    http://www.usatoday.com/story/news/politics/onpolitics/2016/02/08/company-tracked-iowa-caucusgoers-phones/80005966/

In the first United States presidential campaign since the 2013 Snowden revelations, candidates have expressed varying opinions of both the man and the reality he exposed: some have mildly praised his impact on the privacy discourse while questioning the legality and intelligence of his conduct (see here and here), while others have gone as far as insisting that he is a traitor, or even an active Russian spy, prompting a response from the Kremlin itself.

However, while the candidates’ focus remains on the legitimate national security and civil liberties implications of government surveillance, another use of data collection is taking the campaign season by storm: the creation of voter profiles based on mobile device advertising profiles of individuals at polling places. In a domain traditionally left to phone-based public polling and on-site exit polls, digital advertising company Dstillery engineered a way, creative and creepy in equal measure, to track voter preferences by matching mobile device identifiers present at the geographical locations of caucus sites to the digital advertising profiles of the devices’ owners. Specifically, the company monitored the real-time ad bidding that occurs each time a mobile device opens an app or website, captured the identifier associated with the serving of an ad, and then researched additional characteristics associated with that mobile device.
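The matching step Dstillery describes can be sketched, in rough outline, as a join between bid-request logs and caucus-site geofences. Everything below is hypothetical: the record layout, device IDs, coordinates, and the 1.5 km radius are invented for illustration, as the company’s actual pipeline is not public.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical bid-request log: device identifiers and locations captured
# from real-time ad auctions during caucus hours.
bid_requests = [
    ("device-A", 41.59, -93.62),   # near a Des Moines caucus site
    ("device-B", 40.71, -74.01),   # New York: nowhere near a caucus
]
caucus_sites = {"Des Moines Precinct 1": (41.60, -93.61)}

# Join: a device seen within ~1.5 km of a site is treated as an attendee,
# whose advertising profile could then be analyzed for voter traits.
attendees = {
    dev: site
    for dev, lat, lon in bid_requests
    for site, (slat, slon) in caucus_sites.items()
    if haversine_km(lat, lon, slat, slon) < 1.5
}
print(attendees)  # only device-A falls inside a caucus geofence
```

The creepiness lies in the join itself: neither the ad auction nor the geofence alone reveals much, but linked together they turn an advertising identifier into an inferred political act.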

    Though not conducted with the scientific rigor of a public poll, the results captured voter characteristics not usually polled by campaigns. For example, according to an analysis of the results conducted by USA Today: voters expecting a newborn child tend to be Republican and had a greater concentration at Senator Marco Rubio’s caucus sites; voters in locations with strong support for Donald Trump had a penchant for outdoor activities and home improvement; and tech-industry workers and enthusiasts were more concentrated in Senator Bernie Sanders’ caucus sites than those of former Secretary of State Hillary Clinton.

    In an interview with news site Fusion, a representative from Dstillery explained, “One thing that isn’t in the data is personal identifiable information. The data and system are completely anonymous. We have no idea, for example, what your name is. All we see are behaviors and everything we do is based on analyzing those behaviors writ-large.”

So, while identifying individual voters and their preferences through this method of data collection might not be a present concern, the prospect of campaigns themselves utilizing these tactics might subject Iowans to an even greater saturation of political advertising in 2020, on top of the astounding $70 million spent in 2016. And, with many Iowans lamenting the constant barrage of TV and radio ads, campaigns might be eager to attempt this new, more subtle approach.

  • PRG News Roundup: March 23rd

    PRG long-time participant Luke Stark’s research on emotion has been published in Information Society: https://starkcontrastdotco.files.wordpress.com/2014/09/01972243-2015.pdf

    DOJ postpones hearing on Apple case: http://fortune.com/2016/03/21/doj-court-hearing-apple-postpone/

    Utah tests online voting in state caucus: http://www.npr.org/2016/03/22/471474745/utah-republicans-to-test-online-voting-in-states-caucus

     

  • Privacy Blog

    Privacy Blog

    By: Mary Churan Huang

    http://www.irishtimes.com/business/technology/google-eu-s-data-protection-authorities-have-absolute-focus-on-privacy-1.2556597

This article discusses the positions taken by some of the world’s largest multinational companies with respect to the EU’s incoming General Data Protection Regulation (GDPR). The GDPR is set to replace the existing EU Data Protection Directive of 1995. The EU’s privacy laws need updating because technological change has accelerated and the current framework does not properly cover critical issues such as social networks and cloud computing. The GDPR is designed to be a more comprehensive and wide-ranging legal framework that will apply EU-wide, superseding member states’ local privacy laws. It is intended to strengthen and unify data protection within the EU so that there will be one single legal framework. This would allow international businesses to comply more easily with the EU privacy framework while providing more comprehensive protection for EU residents.

Both Google and Microsoft seem to regard the GDPR as a necessary step in data protection, but neither anticipates a major change in its policies, as both have already undertaken significant steps to enhance privacy over the years. Microsoft is concerned about the heavy penalties set out in the GDPR and has indicated that the new regime will be a great test of whether its privacy controls are working the way they should. Google indicated that it will work closely with regulators to find a rational way of interpreting some of the ambiguities contained in the GDPR.

Adobe seems more critical of the GDPR, expressing disappointment that the framework is not as unifying as intended. Adobe is of the view that the new framework will still be fragmented and subject to interpretation by local authorities. These responses from major multinational companies are unsurprising, considering that any new regime brings much uncertainty. However, all of these companies have expressed a willingness to comply and to work with the relevant regulators to find a solution.

Consumer Privacy Legislation? A Highly Debated Proposal Since 2012

Consumer Privacy Legislation? A Highly Debated Proposal Since 2012

    By: Ida Faustine Jacotey

    http://www.nytimes.com/2016/02/29/technology/obamas-effort-on-consumer-privacy-falls-short-critics-say.html

In February 2012, the White House released “A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy,” thereby presenting a new Consumer Privacy Bill of Rights described by the presidency as a “blueprint for privacy in the information age”.[1]

President Obama’s proposal aimed to establish a consumer privacy framework that would provide consumers with clear guidance on what they can expect from companies[2] handling their personal information, as well as set obligations for companies using personal data.[3] As stated in the referenced article, President Obama intended to see the Consumer Privacy Bill of Rights’ principles put into law through joint work between the Administration and Congress.

    Four years later, in February 2016, Natasha Singer, journalist for The New York Times, raises the question as to why the President’s proposed framework has not moved forward.

This question seems particularly relevant today, as the US is witnessing a high-profile fight between Apple and the FBI, which has brought greater attention to consumer privacy issues.

The proposal was born in a context in which technology had strongly entered the commercial world. As a result, in 2012 as today, consumer privacy was and remains a major concern in the US, particularly since the Snowden revelations brought alarming facts about privacy and data collection to the public eye.

From a consumer’s perspective, because privacy policies often appear obscure and/or complicated, it seems easier, and almost mandatory, to rely on trust when one subscribes to a service. Therefore, when the FTC brings actions against companies, consumers’ concern about the collection and use of their personal information increases. This is particularly true when major companies such as Google or Facebook are involved, since they are an inherent part of most Americans’ lives. Such cases involving large companies are likely to be taken as proof of the industry’s tendency to abuse consumers’ trust.

As Cameron F. Kerry, a former Department of Commerce general counsel, highlights in this article, a loss of trust is not desirable from the industry’s point of view either.

In the US, companies enjoy considerable freedom to set their own privacy policies. We know that many companies look closely at the FTC’s actions, particularly because the FTC has been increasingly active and has had a strong influence on how the industry adapts its privacy policies. Despite this evident freedom to collect and use consumer data, the industry seems aware that privacy practices require continuous adjustment to meet consumers’ reasonable expectations about the use and collection of data. Some companies have taken the relevant steps to strengthen consumer privacy protection, but others have not.

On the one hand, this context demonstrates the lack of any pragmatic instrument that would provide consumers with some guarantee that their privacy remains protected. On the other hand, this great flexibility is also the reason American companies continue to innovate so much.

Consequently, there is a significant divergence of opinion about whether legislation is desirable in the consumer privacy context. Because this ideological war is not new, and not over yet, Natasha Singer speaks of “a tale of clashing visions for American society and commerce”.

Two conflicting interests have caused the proposal to freeze: protecting consumers from the many harms that companies’ free access to their personal information can cause, and the will to allow American technology companies to continue innovating with data.

In 2012, for the first time, the White House treated individual rights as a priority in the commercial privacy world.[4] This step fed the debate between consumer advocates (“pro-legislation”) and industry advocates (against legislation).

As this article reveals, some advocates argue that there is already a sufficient number of federal laws placing specific limitations on companies’ use of consumer records. Privacy advocates, however, point out that these existing acts apply only to certain categories of companies, whereas the bill would instead cover companies of all kinds.[5]

While the debate over this legislation continues, the industry is nevertheless active, and the article points out some notable improvements made since 2012, such as the data disclosure charts devised as a result of numerous discussions on mobile app transparency.[6]

     

    [1] www.whitehouse.gov/sites/default/files/privacy-final.pdf, Introductory Note, page 3

    [2] www.whitehouse.gov/sites/default/files/privacy-final.pdf, Executive Summary, listing the key principles of the Bill, or “Fair Information Practice Principles” (FIPPs)

[3] www.whitehouse.gov/sites/default/files/privacy-final.pdf, Introduction, page 7

[4] www.whitehouse.gov/sites/default/files/privacy-final.pdf, Introduction, page 5: “Strengthening consumer data privacy protections in the United States is an important Administration priority. Americans value privacy and expect protection from intrusions by both private and governmental actors.”

[5] http://www.nytimes.com/2015/02/28/business/white-house-proposes-broad-consumer-data-privacy-bill.html, Natasha Singer for The New York Times, February 27, 2015: “There are already a number of federal laws, like the Fair Credit Reporting Act and the Video Privacy Protection Act, that limit how companies may use certain specific consumer records. The new proposed bill, the Consumer Privacy Bill of Rights Act, is intended to fill in the gaps between those statutes by issuing some baseline data-processing requirements for all types of companies”; see also Senator Edward J. Markey stating that “Instead of codes of conduct developed by industries that have historically been opposed to strong privacy measures, we need uniform and legally enforceable rules that companies must abide by and consumers can rely upon”.

    [6] “The yearlong discussions on mobile app transparency ultimately resulted in a subset of the participants getting together on their own time to devise data disclosure charts — akin to nutrition labels on food packages — that apps could display for consumers.”

  • PRG News Roundup: March 2nd

    On Monday, the European Commission issued the legal texts that will put in place the EU-US Privacy Shield, and replace the former Safe Harbour framework. The Privacy Shield will not come into effect until the European Commission completes its findings to determine whether the new framework offers equivalent data protections to standards within the EU. [Ars Technica – “Privacy Shield” proposed to replace US-EU Safe Harbor, faces skepticism]

    The U.S. Supreme Court ruled against efforts to establish a public database to track the quality and cost of health care by private healthcare providers, arguing it is precluded by the 1974 Employee Retirement Income Security Act preventing states from imposing requirements on insurance companies to disclose data on self-funded plans. [NPR – Supreme Court Strikes at States’ Efforts on Health Care Transparency]

The director of the FBI, James Comey, admitted the agency tried to gain access to the iPhone of the San Bernardino shooters in the 24 hours after the attack, and only asked Apple to unlock the phone after an unsuccessful iCloud password reset locked officials out of the account. [NYTimes – FBI Error Locked San Bernardino Attacker’s iPhone]

In related news, a federal magistrate judge in New York has rejected the U.S. government’s request to gain access to data from a locked iPhone in a drug case. [The Intercept – Apple Wins Major Court Victory Against FBI in a Case Similar to San Bernardino]

    Governor Dennis Daugaard of South Dakota vetoed a bill that would have required transgender students in public schools to use bathrooms based on the gender they were assigned at birth. [NPR – South Dakota Governor Vetoes Bill Stipulating Transgender Students’ Bathroom Use]

     

  • Digital Ellis Islands

American tech companies, especially those running social networking sites, often pride themselves on giving voice and information to oppressed netizens around the world. Many commend Twitter’s role in facilitating coordination and information flow during the 2009 Iranian presidential election process. Even the U.S. State Department acknowledged this role when it asked the microblogging site to hold off on a software update so as not to interfere with use by Iranian protesters. Twitter is currently banned in Iran. Also in 2009, the Chinese government blocked access to Facebook in order to curtail communication between independence activists rioting in Xinjiang. Twitter, Google search, and YouTube are blocked behind the Great Firewall of China to this day.

Anonymous web browsing, such as onion routing via Tor or a comparable mechanism, provides a route around censorship and persecution. Individuals can breathe free on American-run sites; they just have to wear a hoodie.

Yet, according to a recent paper, some of the world’s largest sites are cutting off access to anonymous users. Site providers can easily detect when a user is accessing the site from an anonymous origin, and many now restrict certain uses or preclude access altogether. The authors describe this as “second-class treatment.”
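The detection itself is straightforward: Tor exit-node addresses are public, so a site can simply compare each visitor’s IP against that list. A minimal sketch of this idea, with placeholder addresses and invented function names (a real deployment would fetch the live exit list published by the Tor Project rather than hard-code it):

```python
# Minimal sketch of how a site might detect anonymous (Tor) visitors:
# compare the client IP against known Tor exit-node addresses. The
# addresses below are documentation placeholders, not real exit nodes.

TOR_EXIT_NODES = {"203.0.113.7", "198.51.100.21"}

def is_anonymous_visitor(client_ip: str) -> bool:
    """Return True if the request appears to come from a Tor exit node."""
    return client_ip in TOR_EXIT_NODES

def handle_request(client_ip: str) -> str:
    # "Second-class treatment": degrade or block service for anonymous users.
    if is_anonymous_visitor(client_ip):
        return "403 Forbidden (anonymous access restricted)"
    return "200 OK"

assert handle_request("203.0.113.7").startswith("403")
assert handle_request("192.0.2.10") == "200 OK"
```

Because the check is this cheap, the policy question, not the engineering, determines whether anonymous users are served.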

Google is one site that limits search functions for anonymous users. Some companies have done the opposite. In 2014, Facebook provided a “hidden service,” through which users can access the site anonymously without being curtailed by algorithms that might otherwise block them for fraudulent use. Mark Zuckerberg once said, “How can you connect the whole world if you leave out 1.6 billion people?”

This state of affairs is a common one in the privacy v. security debate. Blocking anonymous use is meant to curtail criminal use. But this comes at the cost of denying access to innocent users, such as those seeking a refuge of communication and connection to the world when oppressive regimes won’t allow it.

    It is up to American companies, not the American government, to decide whether to stamp the ticket.

    Paper: https://www.internetsociety.org/sites/default/files/blogs-media/do-you-see-what-i-see-differential-treatment-anonymous-users.pdf

  • PRG News Roundup: February 24th

    Supreme Court hears oral argument on key exclusionary rule case: http://www.scotusblog.com/2016/02/argument-analysis-court-closely-divided-on-the-exclusionary-rule/

    House Judiciary Committee meets on February 25 on International Conflicts of Law Concerning Cross Border Data Flow and Law Enforcement Requests: https://www.govtrack.us/congress/committees/calendar

    Georgetown Law Conference: The Colors of Surveillance, on racial bias of government monitoring. April 8, 2016: http://www.law.georgetown.edu/news/press-releases/the-color-of-surveillance-georgetown-law-conference-to-explore-racial-bias-of-government-monitoring.cfm

    New German law authorizing privacy “class actions”: http://www.dataprotectionreport.com/2016/02/2866/