Author: Ashley Jacques

  • Maryland Court says Use of IMSI Catchers Violates the Fourth Amendment

    Maryland Court says Use of IMSI Catchers Violates the Fourth Amendment

    By Nicole Kramer

    The following blog post was written in response to an article featured on Fortune’s online platform, which can be found here.

    On March 30, 2016, the Maryland Court of Special Appeals issued an opinion, written by Judge Leahy, finding that the Baltimore Police Department’s use of IMSI catchers to track suspects’ phones without search warrants violated the Fourth Amendment as an unreasonable search.

    IMSI catchers, such as the Hailstorm at issue in the Maryland case, are eavesdropping devices that intercept mobile phone calls and help determine a user’s precise location, thereby “transform[ing]” mobile phones into “real-time tracking device[s].”[1] Such devices are increasingly used by law enforcement agencies without a warrant, raising significant privacy concerns. In Baltimore alone, it is estimated that the technology has been used in at least 2,000 investigations.[2]

    The Baltimore Police Department had relied on an approved application for a pen register/trap & trace order on the suspect’s cell phone to locate and arrest the petitioner in this case. One argument raised in the lower court was that, unlike with GPS or cell site information, information gathered with IMSI catchers was not generated willingly by the phone; rather, the technology “forc[ed] the phone to emit information”[3] and identify itself. The information was not merely available to anyone who wanted to look for it; it was not “readily available and in the public view.”[4] This fact weighed heavily in the court’s opinion.

    In the opinion, Judge Leahy discussed Justice Douglas’s dissenting and concurring opinions in Osborn v. United States, Lewis v. United States and Hoffa v. United States which raised a fear of a society becoming more accustomed to “surveillance at all times.”[5] She sided with the court in Katz, especially Justice Harlan’s concurrence, which offered individuals strong protection against unreasonable searches and seizures in the face of advancing technology. The court ultimately found, in accordance with the Supreme Court rulings in Karo, Kyllo, and Jones, that Justice Harlan’s two-part test should be applied, and that “people have an [objectively] reasonable expectation that their cell phones will not be used as real-time tracking devices by law enforcement,”[6] and therefore that the use of such technology required a search warrant imposing reasonable limitations on the scope and manner of the search.[7] The court further added that the prior case law established that the “use of surveillance technology not in general public use to obtain information about the interior of a home, . . . is a search under the Fourth Amendment.”[8]

    However, as the court noted, there are some exceptions to this conclusion. The court first looked to the third-party doctrine and United States v. Miller and Smith v. Maryland, but ultimately rejected this exception. The doctrine “provid[es] that an individual forfeits his or her expectation of privacy in information that is turned over to a third party.”[9] The State argued that the petitioner forfeited his expectation of privacy by carrying a cell phone that he knew would be communicating with nearby cell towers. But in these cases and those that followed, including Graham, it remained necessary that the user voluntarily convey the information to a third party.[10] That did not happen in Andrews.

    The Maryland Court ruling was a success for privacy advocates. The state’s attorney general has not stated whether his office will challenge the ruling.

    [1] State v. Andrews, 2016 Md. Ct. Spec. App. LEXIS 33, *1 (March 30, 2016).

    [2] David Z. Morris, Maryland Court Says Phone Tracking Unconstitutional, Fortune (April 3, 2016, 4:22PM EDT), http://fortune.com/2016/04/03/maryland-court-phone-tracking/.

    [3] Andrews at *19.

    [4] Id. at *59.

    [5] Id. at *28.

    [6] Id. at *2.

    [7] Id. at *65.

    [8] Id. at *50.

    [9] Id. at *65.

    [10] Id. at *69-70.

  • Smith v. Maryland, Third Party Doctrine as Applied to Reddit Users

    Smith v. Maryland, Third Party Doctrine as Applied to Reddit Users

    Naadia Chowdhury

    This past Friday, Reddit users grew concerned about government internet surveillance and the privacy of their data. In its annual report, Reddit typically lists the kinds of requests it receives for user information and for the removal of content. Reddit’s latest annual report was missing the paragraph that typically states that Reddit has not received a national security letter to conduct electronic surveillance. This indicated to users that the government may have sent Reddit a national security letter allowing the FBI to conduct surveillance and access user information without a warrant or court order.1

    To become a user on Reddit, you do not need to disclose a lot of information. Potential users create a username and disclose their email addresses. Compared to other social platforms, there is not a lot of directly identifiable information. It can be argued, however, that an email address is enough information to track down an individual.

    Even if the FBI were not operating under the national security exception to the requirement of a warrant or court order for surveillance, it seems unlikely that Reddit users could successfully argue that they have a privacy right against the government’s accessing their information. Under Smith v. Maryland, Reddit users lack a legitimate expectation of privacy in their information because they disclose their email addresses and activities to Reddit’s employees and the company itself. Any information a Reddit user discloses is information he or she voluntarily conveys to the company; the user therefore assumes the risk of having that information revealed to a government official.

    In United States v. Forrester, the Ninth Circuit determined that internet users have no legitimate expectation of privacy in the IP addresses of the websites they visit. The information Reddit users disclose seems analogous, and so users have weak legal protections for their information.

    It is still not clear whether companies’ privacy policies will alter the analysis courts undertake to determine whether there is a legitimate expectation of privacy. Logically, the risk a user assumes when registering on a website whose privacy policy promises to keep the user’s information safe should be more limited. If Reddit provides such a privacy policy, perhaps that argument can be made, but as of 2008, protections against government electronic surveillance remain fairly weak under the Smith test.

  • From Apple to Lavabit: The ECPA and the Legal Struggles Surrounding Encryption

    From Apple to Lavabit: The ECPA and the Legal Struggles Surrounding Encryption

    By: Debra Slutsky

    Although the FBI dropped its effort to compel Apple to assist it in unlocking one of the San Bernardino shooter’s iPhones, the case provides insight into how the Justice Department grapples with modern digital communications using existing law. According to Kim Zetter’s article in Wired, Long Before the Apple-FBI Battle, Lavabit Sounded a Warning, such struggles between the Justice Department and tech companies, specifically those offering encrypted communication services, are not new. Zetter writes that Lavabit “made a surprising cameo this month in a brief filed by US attorneys in that case. The attorneys invoked the Lavabit case in a footnote as part of a threat to Apple…” As the Lavabit and Apple cases demonstrate, however, the channels available to the government for accessing such communications are largely untested legally and depend on law that contemplated a far more primitive technological landscape.

    In the Apple case, attention was drawn to the government’s use of the All Writs Act, a 227-year-old law that grants judges the authority to issue writs, or orders, compelling parties to perform acts within the bounds of law. Through the All Writs Act, the FBI sought to force Apple to build a backdoor into its iOS operating system in order to access encrypted iMessages. The case also put a spotlight on the Pen Register Act, a component of the Electronic Communications Privacy Act (ECPA) of 1986. While pen registers were originally intended to record outgoing telephone numbers dialed, the Patriot Act expanded the Pen Register Act to cover IP addresses and email headers. In 1979, the significant case Smith v. Maryland held that the use of a pen register did not violate the Fourth Amendment. The metadata thus permissibly collected, however, provides deeper insight than just the numbers dialed and the frequency and duration of calls. Justice Stewart noted this in his Smith v. Maryland dissent, writing that telephone metadata, “although certainly more prosaic than the conversation itself – [is] not without ‘content’…I doubt there are any who would be happy to have broadcast to the world a list of the local or long distance numbers they have dialed. This is not because such a list might in some sense be incriminating, but because it easily could reveal…the most intimate details of a person’s life.”

    Similarly, in 2013, the FBI obtained and served on Lavabit founder Ladar Levison an order issued pursuant to the Pen Register Act. The order permitted the FBI to obtain information sent to and from the email account of its target, Edward Snowden. Because Lavabit used SSL encryption to protect transmitted data, the order further required Levison to provide the SSL keys needed to access Snowden’s emails. Since the company used only five SSL key pairs, complying would have left the emails of Lavabit’s roughly 400,000 subscribers vulnerable. Levison planned to challenge the order, but lacking financial resources, he was forced to represent himself pro se during the initial proceedings. He was held in contempt of court after refusing to comply with the order to hand over Lavabit’s encryption keys. On appeal, the contempt order was affirmed; the appellate court never addressed Levison’s challenge to the underlying legality of the government’s use of the Pen Register Act to obtain SSL keys because he had not properly objected to that issue during the earlier proceedings.

    In its case against Apple, the Justice Department cited the Lavabit case for the proposition that its pen register order to produce SSL encryption keys was affirmed on appeal. Levison took to Facebook to publicly call the government’s language “incredibly misleading” as to the appellate court’s actual ruling, and his characterization is accurate. As the law stands, the government has wide-ranging authority to obtain pen register orders to access digital communications. The bounds of that authority, and the lengths to which the government may go in compelling third parties to assist, are nonetheless ripe for review. Both the Lavabit and Apple cases concluded before reaching a definitive ruling. With encryption becoming the standard for digital communications, a case is bound to arise in the near future that will resolve the issue definitively.


  • Redefining Fourth Amendment Law for The Digital Age

    Redefining Fourth Amendment Law for The Digital Age

    By Macarena Troncoso

    On March 11, Brian Farrell – accused of being a staff member of Silk Road 2.0 – pleaded guilty to conspiracy to distribute illegal drugs using the Tor network.

    Silk Road 2.0 was a hidden service operating on the Deep Web until the FBI shut it down in November 2014. Farrell’s plea agreement could be the final chapter of a case that raises relevant questions about the protection the Fourth Amendment affords internet users.

    Let’s begin with undisputed facts: Carnegie Mellon University was funded by the US Department of Defense to carry out research that made it possible to reveal the identities of several dark market users. The information obtained during the study was later accessed by the FBI through a subpoena and permitted the identification of Brian Farrell as a prominent user of Silk Road 2.0.

    To put it simply: Carnegie Mellon engaged in a prolonged and prospective surveillance of the Deep Web, used government funding, and obtained results that were used for law enforcement purposes.

    What differentiates this conduct from outsourcing police work to universities? Where is the line between private searches and government searches? Was Carnegie Mellon University acting as an agent of the government? Unfortunately, these essential questions will remain unanswered, since the court determined that the case did not involve a search, rendering irrelevant any discussion of state action as the necessary trigger for Fourth Amendment safeguards.

    Indeed, in denying the defendant’s motion to compel discovery, Judge Richard Jones ruled that Tor users have no reasonable expectation of privacy in their IP addresses when using the Tor network, even though the very purpose of Tor is to hide its users’ identities, enabling them to communicate privately and securely and to access the internet anonymously. Relying on Forrester,[1] the judge considered that in the course of using the Tor network, “an individual would necessarily be disclosing his identifying information to complete strangers”[2] and that this submission of information “is made despite the understanding communicated by the Tor project that the Tor network has vulnerabilities and that users might not remain anonymous.”

    Applying the third-party doctrine announced in Smith v. Maryland,[3] the judge presumed that individuals who convey information to third parties have assumed the risk of its eventual disclosure to the government. Is this notion workable in the digital era? Assuming this kind of risk appears to be an integral part of life in the twenty-first century. People daily turn over great amounts of information to private and public entities through computers, mobile apps, and other devices connected to the web. Does that mean we have surrendered our expectations of privacy?

    I believe we should not yield. In her concurrence in Jones,[4] Justice Sotomayor called for a reevaluation of the premise that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties, considering this approach “ill suited” to the digital age. It is imperative for the courts to rethink and reshape the third-party doctrine and other fundamental notions, such as state action and the reasonable expectation of privacy test, to attune them to the challenges posed by the internet era.

    [1] United States v. Forrester, 495 F.3d 1041 (9th Cir. 2007).

    [2] The mention of “complete strangers” points to the individuals that host the network nodes.

    [3] Smith v. Maryland, 442 U.S. 735 (1979).

    [4] United States v. Jones, 132 S. Ct. 945 (2012).

  • HIPAA, Gun Control, and Mental Health

    HIPAA, Gun Control, and Mental Health

    By: Erika Asgeirsson

    A new HIPAA rule issued in January allows certain health agencies and medical facilities (“covered entities” under HIPAA) to report the identities of individuals subject to mental health disqualifications to the federal database, to prevent them from purchasing firearms. 45 C.F.R. § 164.512(k)(7). Currently, those involuntarily committed to mental health institutions, those found incompetent to stand trial, and those deemed a danger to themselves or others are prohibited from shipping, transporting, possessing, or receiving a firearm. In the past, certain covered entities often did not disclose the identities of these individuals to the federal database for fear of violating HIPAA. The new rule, which the administration asserts merely clarifies rather than changes the situation, is part of President Obama’s broader action on gun control.

    As the Washington Post article notes, mental health advocates are split on this rule. While encouraged by the increased attention to the need to care effectively for those suffering from mental illness, some advocates argue the rule unfairly targets the mentally ill, stigmatizes those suffering from mental illness, and is not based on data about gun violence actually committed by this community.

    Analyzing this rule from a privacy perspective shows that this issue is much more complicated than it often appears. I agree that we need to take action to reduce gun violence. Attention to mental illness and its intersection with gun violence has recently become a common talking point. However, there are compelling interests on both sides that should make even those supportive of gun control think more critically about this rule. Such restrictions have the potential to reduce gun violence by ensuring that firearms do not fall into the hands of those who should not have them. On the other hand, the attitudes underlying this rule stigmatize those suffering from mental illness. The rule might discourage people from seeking needed treatment or unduly target those suffering from a mental illness without supporting evidence. Important privacy interests are at stake because HIPAA deals with very sensitive information that is often relayed through a health care provider, who has a protected relationship with the patient. (Note, the final rule does not apply to most health care providers but applies to entities the provider may report to.)

    Given this context, it is important to think deeply about this rule and its implementation. While the points below are preliminary suggestions, I hope they encourage further conversation on this important issue. None of these issues is easy to solve, and all will take a great deal of time and effort. But they are a starting point for ensuring that the competing interests, including privacy, are properly balanced. Some appear to be addressed in the administration’s rule, while others may require more action.

    1. Keep the circle close. Ensure the information is shared only with the database and not with other related agencies. The new rule explicitly addresses whom the covered entity discloses information to (the database or a designated entity per 45 C.F.R. § 164.512(k)(7)), but it is also important to address whom the federal database shares information with and what information is shared. Given the sensitive nature of the information, disclosures by the database may warrant more rigorous limits than those imposed on the disclosure of other information in the federal database.
    2. Disclose as little as possible. Under the rule, the entity only discloses certain demographic and other data, and does not include the specific diagnosis or other clinical information. The extent of information disclosed to the database should be consistently reassessed to ensure only the information necessary is disclosed.
    3. Use an evidence-based approach. Thresholds triggering the prohibition should be based on clear supporting data so that those suffering from mental illness are not unnecessarily targeted. In addition to ensuring fair treatment, this also protects against overbroad disclosure and other infringements on privacy.
    4. Right to appeal. Just as it is important that a consumer has the opportunity to correct data collected on her, the information and determination must be subject to appeal. This includes appealing misidentification or incorrect classification. Procedures for appeal need to respect the privacy and dignity of the individual contesting the identification or determination.
    5. Explore alternatives that are less intrusive to patient privacy. Other measures, such as increased funding for mental health treatment or gun training and licensing requirements, may be just as effective or more effective at reducing gun violence with a more limited intrusion into patient privacy. More research should be done to evaluate the efficacy of the various alternatives.
    6. Operate on principles that protect the dignity of those suffering from mental illness. Ensure that the rules and implementation do not stigmatize those suffering from mental illness. Privacy is often a central element to human dignity.


  • Germany Is Putting Facebook Through the Rounds

    Germany Is Putting Facebook Through the Rounds

    By Ryan P. K. Brown

    Things do not bode well for Facebook in Germany. The country’s government has been stepping up its enforcement of user-protective laws against Facebook’s data collection practices. EU privacy laws are already much more restrictive than those of the United States in how websites can collect and use user data. Yet within less than two weeks, Germany pushed back three separate times against Facebook’s data collection and use policies, each through a different means of restriction.

    In late February of this year, the social media giant was fined 100,000 euros for failing to amend its terms of service, as a 2012 order required, to comply with the European Union’s laws protecting user data. After the fine was issued, Facebook agreed to change its terms of service and said it would pay the fine. Obviously, this sum is not much of a blow to the company’s massive bank account, but it is only one of the ways the German government has warned Facebook about its data collection and use policies.

    The next warning came on March 2 of this year. The German Federal Cartel Office (FCO), Germany’s competition watchdog, issued a statement claiming that Facebook may be using its dominant market position to violate user data privacy laws, and announced an investigation into the terms of use of Facebook’s social media services. The agency is particularly concerned that Facebook is abusing its dominance in the social media market to conduct illegal and unethical data collection and use practices.

    Finally, media outlets reported that this past Wednesday, March 9, a German court ruled that Facebook’s “like” button may violate the law when placed on commercial websites. The court specifically pointed out that a violation occurred when users were not notified that their data may be shared if they clicked a “like” button on a commercial website. The court warned that it could fine commercial websites that host the “like” button without any notice of how a user’s data may be shared. This warning was not aimed directly at Facebook, but the implications for the social media company—i.e., increasing the friction of data collection and use—are clear.

    Obviously, these fines, rulings, and investigations are not, individually, much of a threat to a social media giant like Facebook, but the path Germany appears to be treading could lead to long-term difficulties and shifts in power. EU law is already much more user- and consumer-friendly than that of the United States. This tightening of the grip, so to speak, on Facebook signals further measures the German government is willing to take to protect users.

  • First Alexa, Now Fox! From valuable personal assistant to home and outdoor external spy…

    First Alexa, Now Fox!

    From valuable personal assistant to home and outdoor external spy…

    By: Annabelle Divoy

    Why open a dictionary when you can ask Alexa the height of Mount Everest? Why painfully reach for the timer on the kitchen shelf when Alexa can tell you when to turn off the oven? Why even bother telling jokes to your children when Alexa can do it for you? This is only a very small – and seductive – preview of the tasks that Alexa, the voice-controlled personal assistant built into Amazon’s Echo, can perform.[1] Launched for sale in November 2014, Alexa has been welcomed into many homes at the reasonable price of $199 ($99 if you are an Amazon Prime member) and has definitely stolen Siri’s[2] thunder.


    Although Amazon does not release its sales estimates, its artificially intelligent personal assistant appears well received by consumers, despite all the threats it poses to their privacy. Indeed, in order to hear and execute the commands that consumers direct to “her” by calling “her” name, Alexa is constantly recording everything that happens in the home. While this polished gadget will certainly help with many of your daily tasks and chores, entertain you, and stimulate your knowledge, it will also seriously intrude on your most private moments. Alexa will hear you narrating your full day of work to your husband, listen to your telephone call with your best friend Carrie, become your children’s newest companion, learn that you prefer pop music to jazz, know that you added chocolate and wine to your shopping list, and even that you let your vegetables burn for the third time this week.

    Alexa may thus quickly shift from valuable personal assistant to home-robot intruder and spy.[3] The level and amount of personal data it is able to collect, analyze, use, and/or disclose is high enough to be worrisome. And these privacy concerns grow considerably when you consider the risk of Alexa’s gigantic trove of data not only being used by Amazon and its commercial partners, but also potentially being pirated by outsiders. Nor does Amazon’s Echo Terms of Use offer much reassurance: “Alexa” does not even have its own privacy policy, referring only to Amazon’s general Privacy Policy.[4]

    Yet few seem truly concerned about Alexa’s dangers. For now, most consumers focus only on the attractive functions of this high-tech gadget and are filled with excitement that “Pringles-can-sized” Alexa will soon have a shorter, portable sibling.[5] As reported by the Wall Street Journal in January 2016, Amazon recently announced the upcoming launch of “Fox,” a voice-controlled personal assistant using Amazon Echo’s technology but fitting in the palm of your hand and requiring no power cord, allowing little Fox to be used outdoors rather than being placed under house arrest like tall Alexa.

    Amazon’s business strategy and technological innovation certainly deserve applause and pose serious competition to others in the field. But when trading Alexa for Fox, or, even more so, combining both, and adding them to the company of our already indiscreet iPhones, Androids, and computers, there may be very little room left for our still so valuable privacy.

    [1] “Introducing Amazon Echo,” Amazon’s official video, November 6, 2014. https://www.youtube.com/watch?v=KkOCeAtKHIc

    [2] Siri (Speech Interpretation and Recognition Interface) is Apple’s intelligent personal assistant.

    [3] “Goodbye Privacy, Hello Alexa: Amazon Echo, the Home Robot Who Hears It All,” The Guardian, November 21, 2015, http://www.theguardian.com/technology/2015/nov/21/amazon-echo-alexa-home-robot-privacy-cloud

    [4] https://www.amazon.com/gp/help/customer/display.html?nodeId=201625490, linking to https://www.amazon.com/gp/help/customer/display.html?nodeId=468496

    [5] http://www.wsj.com/articles/amazon-to-release-portable-version-of-echo-speaker-in-coming-weeks-1452532671

  • Privacy Blog Assignment – Panel 6

    Privacy Blog Assignment – Panel 6

    By: Ricardo Leite Ribeiro

    The article linked here was published in the Wall Street Journal on March 2, 2016. It reports that the German antitrust regulator, the Bundeskartellamt, has opened an investigation against Facebook for “abuse of a dominant position” in its harvesting of personal data from consumers. In the words of the head of the agency, “It needs to be clarified whether consumers are being sufficiently informed about the nature and scale of data collection.”

    The news is relevant because it points to the possibility of enforcing competition law as a vehicle for guaranteeing privacy protections for consumers. This is a path Europe may follow, especially with respect to abuse-of-dominance violations. Might antitrust be a new frontier for advancing privacy protection? Is there a role for it to play in this field? Are its instruments and tools suitable for the task? What remedies would apply in this case?

    From the article, it becomes clear that the accusation motivating the investigation is that Facebook is leveraging its monopolistic advantage as a social network to obtain advantages in the data market. As an antitrust problem, this might be classified as using market power acquired in one specific market to restrain competition in an upstream market. This is particularly interesting because in the U.S. such conduct is very unlikely to violate § 2 of the Sherman Act, particularly after Trinko.

    http://www.wsj.com/articles/facebook-faces-antitrust-investigation-in-germany-1456920796


  • Pierre-Paul’s Medical Disclosure Claim Against ESPN: Issues of Intersecting Privacy Torts

    Panel 5

    Pierre-Paul’s Medical Disclosure Claim Against ESPN: Issues of Intersecting Privacy Torts

    By: Eliza Marshall

    The New York Giants’ Jason Pierre-Paul’s suit against ESPN, which concerns ESPN’s publication[2] of medical records linked to Pierre-Paul’s index finger amputation last summer, provides fruitful ground for exploring the territory covered by intersecting, and perhaps under-inclusive, legal regimes in the medical information context. Pierre-Paul’s claim appears to fall between two broad legal regimes: HIPAA and Florida’s state medical information statute.[3] HIPAA’s protection is broad in that it focuses on source rather than content or publication, avoiding questions of harm in the context of medical information, but narrow in that it covers only certain entities and their business associates. Florida’s law, in contrast, is more limited in terms of content, publication, and harm, but broader in that it applies beyond the covered entities listed in HIPAA.[4] Yet Pierre-Paul’s claim may lie in territory covered by neither law, demonstrating a gap between regimes arguably worth addressing by expanding one or both.

    Under HIPAA, the content of the disclosure is clearly covered. But the statute does not regulate the behavior of ESPN. It is not a “covered entity,” and it falls outside the more expansive definition of “business associate” because it does not (and did not) receive, maintain, or transmit personal health information for any of the functions or activities listed in the regulation.[5] HIPAA is premised on the notion that medical information is uniquely sensitive and that its disclosure inherently involves privacy harm, so the statute requires no inquiry into harm or publicity and protects the entire category of medical data against disclosure even if no unauthorized access ever occurs. One can question, therefore, why entities like ESPN should not be forced to treat this information with care. But the answer seems clearly to be that HIPAA does not cover them, so any claim thereunder would have to be against the health care provider who gave the records to ESPN in the first place—and theirs are the only (presumably shallower) pockets that Pierre-Paul can tap.

    State law picks up where HIPAA leaves off,[6] but, like the wider genre of Prosser’s privacy torts, it presents Pierre-Paul with its own set of obstacles. ESPN is covered under Florida’s statute, but it is not clear that the disclosure that occurred is actionable. First, it is not clear that a private right of action exists. Second, unlike HIPAA, Florida’s statute requires Pierre-Paul to prove concrete harm from the disclosure of his medical records. Especially in light of First Amendment limits on the publication of true facts, Pierre-Paul faces an uphill battle. It is not clear what information other than the amputation was included in the medical records. But arguing on the basis of the amputation alone, he will have to craft a convincing explanation for an injury suffered simply from the timing of the disclosure—as a professional athlete whose occupation is highly public, this information would not have remained secret for long.[7] His absence, or his finger’s, would surely invite speculation and would be easy to detect with the naked eye even without detailed medical records. As for information beyond the fact of amputation, Pierre-Paul may have a hard time describing how ESPN’s disclosure harmed him in any concrete way. Still, the Shulman[8] case supports a court’s finding offensiveness, and the potential existence of a special zone of privacy in the medical context and in the provider-patient relationship that journalists must respect. This suggests Pierre-Paul has some hope. Intuitively, having a journalist publish one’s medical records is a highly offensive and unacceptable invasion of privacy; certainly, most people would object to having it happen to them. Yet the legal result is far less straightforward. This may suggest the need for new methods of protecting privacy that avoid the difficulties of proving harm.

    [1] Link to Article: https://www.law360.com/articles/764455/nfl-player-must-tackle-common-privacy-pratfall-in-espn-suit

    [2] An ESPN reporter tweeted an image of Pierre-Paul’s medical records, reaching nearly 4 million Twitter followers.

    [3] Fla. Stat. § 456.057.

    [4] The Florida statute applies to any “records custodian” which is defined as any person or entity that “obtains medical records from a records owner,” which seems to include ESPN. § 456.057(3)-(4).

    [5] These include “claims processing or administration, data analysis, processing or administration, utilization review, quality assurance, patient safety activities listed at 42 CFR 3.20, billing, benefit management, practice management, and repricing.” 45 CFR § 160.103.

    [6] 45 CFR § 160.203(b).

    [7] The article references the Hulk Hogan case and its potential for revealing the promise of Pierre-Paul’s claim for harm in this case, but surely the expectation of privacy is far higher in intimate sexual activity than it is in the presence or absence of a publicly visible body part—regardless of celebrity status.

    [8] Shulman v. Grp. W Prods., Inc., 955 P.2d 469, 479 (Cal. 1998), as modified on denial of reh’g (July 29, 1998).

  • Privacy Blog (1)

    Privacy Blog (1)

    By: Maggie Kornreich

    Professor Rubinstein

    March 24, 2014

    http://www.natlawreview.com/article/health-apps-and-hipaa-ocr-publishes-new-guidance-health-app-developers

    This article addresses whether mobile device applications are subject to HIPAA regulations. In February, the Department of Health and Human Services’ Office for Civil Rights (OCR) released Health App Use Scenarios & HIPAA to examine whether HIPAA applies to apps that “collect, store, manage, organize, or transmit health information.”

    The Health App Guidance presents six scenarios and determines whether HIPAA would apply to the app developer in each instance. In the first scenario, a consumer downloads a health app and provides it with her personal information in order to organize that information on her own, apart from her healthcare providers. Because the consumer is neither a covered entity nor a business associate, the app developer is not subject to HIPAA. In the second scenario, a consumer downloads a health app to manage a chronic condition, pulling data from her doctor’s electronic health record along with her own information. The consumer is not a covered entity or business associate, and the healthcare provider did not hire the app developer for the service, so the developer is not subject to HIPAA. In the third scenario, a consumer downloads an app to track diet and exercise after her doctor recommends it, and she sends her doctor a report before the next appointment. The doctor did not hire the app developer, so the developer is not subject to HIPAA.

    In the fourth scenario, a consumer downloads an app to manage a chronic condition, and, at the consumer’s request, the app developer and the healthcare provider enter an interoperability agreement to exchange the consumer’s information. The consumer inputs her own information into the app. The developer is not subject to HIPAA because it is not creating, receiving, maintaining, or transmitting personal health information on behalf of a covered entity or business associate. In the fifth scenario, a healthcare provider contracts with the app developer for patient management services and instructs patients to use the app. Here, because the provider is a covered entity and the developer is its business associate, the developer is subject to HIPAA. In the sixth scenario, a health plan offers a health app that lets members store health records, check the status of claims, and track their wellness information, and the health plan analyzes that information. The developer is a business associate and the health plan a covered entity, so the developer is subject to HIPAA.
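    The through-line of the six scenarios can be distilled into a simple decision rule, sketched below as a hypothetical illustration (the function name and boolean flags are my own shorthand, not OCR’s): the developer is subject to HIPAA only when it handles health information on behalf of a covered entity or business associate, rather than solely at the consumer’s direction.

    ```python
    def developer_subject_to_hipaa(handles_health_info: bool,
                                   on_behalf_of_covered_entity: bool) -> bool:
        """Rough rule distilled from OCR's six scenarios: a developer is
        subject to HIPAA only when it creates, receives, maintains, or
        transmits health information on behalf of a covered entity or
        business associate (e.g., hired or contracted by a provider or
        health plan), not when the consumer alone directs the data flow."""
        return handles_health_info and on_behalf_of_covered_entity

    # Scenarios 1-4: consumer-directed use; no covered entity engaged the developer.
    assert developer_subject_to_hipaa(True, False) is False
    # Scenarios 5-6: a provider or health plan contracts with the developer.
    assert developer_subject_to_hipaa(True, True) is True
    ```

    Of course, the real regulatory analysis turns on the detailed definitions in 45 CFR § 160.103; the sketch only captures the pattern the guidance repeats across its examples.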

    This article is interesting and informative because it outlines when a developer or company will be subject to HIPAA. This is increasingly important as people rely on their phones, and the apps on them, for most if not all of their personal affairs. It is also significant in that it highlights instances where people share health information, which many deem extremely private, in electronic form.