Blog

  • College Rape Case Shows A Key Limit To Medical Privacy Law

    April 23rd, 2015

    College Rape Case Shows A Key Limit To Medical Privacy Law

    By: Ryusuke Tanaka

    http://www.npr.org/blogs/health/2015/03/09/391876192/college-rape-case-shows-a-key-limit-to-medical-privacy-law

    A student allegedly raped by other students received therapy at her university’s clinic. After the student sued the university, the university accessed her medical records without notice or consent and sent them to its attorney in preparation for its defense. The university’s access raises privacy concerns and exposes uncertainty in the scope of the governing laws.

    What laws govern this issue? The Health Insurance Portability and Accountability Act (HIPAA) and the Health Information Technology for Economic and Clinical Health (HITECH) Act provide relatively strong protection for individuals’ health information held by health care providers. Yet HIPAA regulations apply only to “health plans, health care clearinghouses and health care providers” that transmit health information electronically in connection with certain health insurance related transactions[1]. If the university in this case processes and transmits, for example, health care claims submitted to a health plan, then it could be regarded as a “health care provider” or as a “hybrid entity” that employs health care providers.

    By contrast, the Family Educational Rights and Privacy Act (FERPA) prohibits educational institutions from disclosing “education records” without the authorization of the student (or parent). In general, “education records” are records that contain information directly related to a student and are maintained by an educational institution[2]. FERPA permits schools to disclose education records to the court, without consent, for the school’s defense if a parent or student initiates legal action[3]. The university in this case, when sued by the student, could plausibly rely on this provision to disclose her medical information to its attorney or the court.

    According to the United States Department of Health and Human Services, schools should comply with FERPA with respect to education records to which FERPA applies, and in that case they are not necessarily bound by HIPAA.[4]

    The point that should be emphasized in this case is that the information accessed and disclosed was the therapy record of a rape victim. In all likelihood, it contains sensitive information that a reasonable person would not wish to have disclosed. In addition, a victim like the student in this case would have visited a school therapist not to complain about the incident but simply to receive medical care. Given that school counselors owe duties of confidentiality and fiduciary duties under professional ethics codes, a reasonable student could reasonably expect that information given to a school counselor would be protected as a medical record rather than treated as an education record.

    This case seems likely to press the courts to clarify the exact scope of HIPAA and FERPA.

    [1] 45 C.F.R. §160.102

    [2] 20 U.S.C. §1232g(a)(4)(A)

    [3] 34 C.F.R. §99.31(a)(9)(iii)(B)

    [4] http://www.hhs.gov/ocr/privacy/hipaa/faq/ferpa_and_hipaa/513.html

     

  • Reflections on D.C. Administration’s Proposed Exemption of Police Body Camera Footage Disclosure

    April 23rd, 2015

    Reflections on D.C. Administration’s Proposed Exemption of Police Body Camera Footage Disclosure

    By Wei-Po Wang

    District of Columbia Mayor Muriel Bowser is seeking to enact legislation that would exempt footage from the Metropolitan Police Department (MPD)’s expanding body camera program from public records requests under the Freedom of Information Act (FOIA) or its local counterpart. (“D.C. wants to keep police body camera footage hidden from public eye.” http://www.washingtontimes.com/news/2015/apr/14/dc-wants-police-body-camera-footage-exempt-from-pu/?page=all). What makes the D.C. proposal significant, as distinct from similar proposals or enactments in other states, is that instead of trying to strike a balance between the public interest in holding police enforcement procedures accountable and the privacy concerns associated with making this footage public, it goes all out and requests a blank-check exemption of police body camera footage from FOIA’s disclosure regime.

    Should the proposed statute become law, the footage would fall within the exemption under § 552(b)(3), where disclosure can be avoided if specifically exempted by statute. However, one must pay special attention to the qualifiers of subsection (b)(3), under which the exempting statute “(A) requires that the matters be withheld from the public in such a manner as to leave no discretion on the issue, or (B) establishes particular criteria for withholding or refers to particular types of matters to be withheld.” Given that the proposed enactment affords a blanket exemption for all footage recorded by any police body camera, neither requirement (A) nor (B) seems to pose a particular obstacle: a blanket exemption leaves no room for discretionary decisions by the executive branch, and the criterion (i.e., all footage, without any qualifier) is indeed particular in kind.

    However, in view of the heightened public concern about and awareness of law enforcement accountability arising out of the recent series of police brutality incidents, from Ferguson through the death of Freddie Gray, it has become apparent that greater police accountability through the adoption of new technologies such as body cameras is an ever more important public interest. This invites doubt about whether Mayor Bowser’s proposed legislation makes a genuine effort to strike a subtle balance between accountability and privacy.

    In light of this development, it may be worth exploring how police body cameras and the footage they create would fit into the current FOIA exemption regime, especially under § 552(b)(7), which exempts from disclosure records or information compiled for law enforcement purposes, so long as they fall within certain enumerated categories.

    Where public disclosure of the footage may interfere with enforcement proceedings, subsection (b)(7)(A) warrants an exemption. Subsection (b)(7)(B) affords an exemption if disclosure would deprive a person of a right to a fair trial or an impartial adjudication; this particular exemption may matter where the images in a given recording would taint the perceptions of a potential jury pool in a foreseeable prosecution of the recording officer’s conduct. The traditional personal privacy concern is squarely addressed by subsection (b)(7)(C), for example where video footage captures the private activities of bystanders whose conduct has no relevance to public scrutiny. Preventing the endangerment of any individual’s life or physical safety is addressed by subsection (b)(7)(F).

    Apart from the categorical exemption for interference with enforcement proceedings, § 552(b)(7) also provides two specific exemptions that speak to some of the most contested considerations in police law enforcement. There is always the fear that such video footage would reveal customary law enforcement techniques adopted by police forces, informing future perpetrators how to circumvent those enforcement efforts; this concern is safeguarded against by subsection (b)(7)(E). Similarly, video recordings may often reveal the confidential sources of information that foster effective law enforcement, and disclosing these confidential sources would have a serious adverse effect on future enforcement, investigation and prosecution efforts. This, too, is covered by subsection (b)(7)(D).

    Based on the above analysis, it seems fair to conclude that the privacy and law enforcement concerns that come along with the development of body camera technology have already largely been addressed by the existing FOIA exemption regime. As a result, it may be more advisable for the D.C. administration to forgo the blank-check approach to exempting body camera footage and instead take up a more balanced, enumerative approach akin to FOIA’s own.

  • Fitness apps may pose legal problems for doctors

    April 23rd, 2015

    Fitness apps may pose legal problems for doctors

    By: Emma Trotter

    The February 2015 Associated Press article “Challenges for Doctors Using Fitness Trackers & Apps,” which can be found at http://www.theepochtimes.com/n3/1257858-challenges-for-doctors-using-fitness-trackers-apps/, raises several issues that relate to topics covered during this week’s class on health privacy. The article reads as a list of potential trouble spots for doctors and declines to offer many solutions.

    First, the article points out that, because HIPAA was written to apply only narrowly to entities that issue, accept, or otherwise deal in health insurance, the law’s privacy protections do not extend to the many new apps and devices that help users keep track of their health and fitness. As mentioned in class, this information might come as a shock to users, who tend to assume that HIPAA is much broader than it really is. This could lead to users over-sharing, thinking their information is protected because they are collecting and providing it in a health context, in Helen Nissenbaum’s sense of contextual integrity. If an app were to sell that normatively sensitive health information to third parties, it could theoretically be used, in secret, to deny a less in-shape person a job or to offer that person insurance only at a higher rate.

    The article also mentions that certain apps have one purpose but could be used for others. For example, if a person wearing a step counter that tracks location goes and meets up with another person wearing that same brand of step counter, the device manufacturer probably has the ability to determine that those two people are together. While this may not seem like a privacy harm in and of itself, we have learned over the course of the semester from several theorists, including Neil Richards, that surveillance can curtail intellectual freedom and exploration.

    Additionally, the article points out some reliability problems with certain types of data. For example, smart pillboxes that purport to track when patients take medication really only show when patients pick up the boxes. For now, doctors are still relying on patients to accurately self-report. That information could be supplemented by FICO’s new Medical Adherence Score, which we learned about from Parker-Pope’s NYT article, but since that score relies on information such as home ownership and job stability, not actual health data, it is fundamentally inference-based and reflects statistical averages better than the actual behavior of any individual patient.

    Another reliability issue the article brings up stems from the fact that many of these apps and devices aren’t regulated by the FDA. The article suggests that some of the claims made by these businesses might therefore not deserve doctors’ trust; for example, Fitbit sleep tracking might be oversensitive to movement and show a user as getting far less sleep than she really gets. This concern could be mitigated somewhat by the FTC’s ability to use its Section 5 jurisdiction to hold these companies accountable for deceptive or unfair business practices based on extremely overstated claims, which we studied earlier in the semester. But, as the article also points out, this limited recourse would only address data reliability and wouldn’t prevent the apps from selling data to third parties and violating contextual integrity, if their posted privacy policies allow them to do so.

    Yet another reliability issue raised by the article is that, for now, the data collected by these apps and devices skews toward younger people more likely to use or wear them. Since younger people are statistically healthier than older people, this could introduce bias into the data collected.

    Finally, the article touches on the issue of liability. Imagine that a fitness tracking app shows something worrisome – a spike in blood pressure, for instance – and a doctor fails to notice it. Is that doctor liable, under traditional tort theories of medical malpractice, for an injury that then befalls the patient? The article suggests developing technological systems to scan the data and automatically flag potential trouble spots – but that doesn’t completely eliminate the issue. What if the technology fails, or the doctor still fails to act? This issue is of course compounded by the possibility that the data may be unreliable, as discussed above.

  • Which Federal Agency Should Regulate Health Apps?

    April 21, 2015

    By: Rachel Wisotsky

    Which Federal Agency Should Regulate Health Apps?

    Mobile health applications are subject to the regulatory authority of several federal agencies. Due to the rapidly evolving nature of the industry, and the limits of each agency’s regulatory authority, it remains unclear which agency will offer the most comprehensive oversight over privacy and security risks. Three agencies that play a role in the regulation of health apps are The Department of Health and Human Services (HHS), The Food and Drug Administration (FDA), and The Federal Trade Commission (FTC).

    The HHS

    The HHS, which monitors HIPAA violations, will have a crucial role in regulating health apps used by health care providers. However, the HIPAA Privacy Rule applies only to “covered entities,” a category that does not include consumers who use private health apps outside of a healthcare setting. The HHS also lacks experience with the privacy and security risks of consumer-facing commercial technologies.

    The FDA

    The FDA’s authority to regulate apps is limited to apps that qualify as medical devices. The FDA announced it will focus its oversight on apps that are used as an accessory to a regulated medical device (for example, to diagnose, treat, or prevent a disease) and on apps that transform a mobile platform into a medical device (for example, an app that turns a smartphone into an ECG to detect heart conditions).

    Further, the FDA’s regulatory authority only focuses on security protections. The FDA indicated it will only use its authority to regulate health apps that pose a risk of harm to consumers if there is a malfunction or failure. The FDA also indicated that it will not enforce regulatory requirements for low-risk apps, such as those that track heart rates, sleep patterns, or steps.

    The FDA does not focus on privacy safeguards or oversee company policies about the collection, use, or disclosure of potentially sensitive health information.

    The FTC

    The FTC can use its authority over unfair and deceptive practices to enforce security and privacy protections. Regarding privacy, patients using apps must largely rely upon company policies on data use that are offered unilaterally: accept the terms or don’t use the app. These policies may be especially unfair in the case of medical apps, since patients often have no real choice about whether to use them. The FTC also has expertise in penalizing companies for unfair design, unfair default settings, and unfair data security practices. It has already brought successful enforcement proceedings against private health apps for misconduct including making scientifically dubious claims to treat medical conditions such as melanoma and acne, and causing consumers to unwittingly share personal health information with other people.

     

  • Data Privacy, the French Alps Crash, the Nazis and the TTIP

    April 20th, 2015

    Data Privacy, the French Alps Crash, the Nazis and the TTIP

    By: Geoffroy van de Walle

    On March 24, 2015, a Germanwings plane en route from Barcelona to Düsseldorf crashed in the French Alps, leaving 150 dead. The investigation soon revealed that the co-pilot, Andreas Lubitz, took control of the plane when the pilot temporarily stepped out of the cockpit. Mr. Lubitz locked himself in the cockpit and deliberately crashed the plane.

    It soon emerged that Mr. Lubitz had been treated for depression and suicidal tendencies. Upon these revelations, legitimate questions arose as to how a pilot in that condition could be allowed to operate a plane. Carsten Spohr, Chairman and CEO of Germanwings’ parent company Lufthansa, said in a press conference: “[i]n the event that there was a medical reason for the interruption of the training, medical confidentiality in Germany applies to that, even after death. The prosecution can look into the relevant documents, but we as a company cannot”.[1][2] These revelations attracted backlash in the press, with several headlines blaming privacy laws for the crash. For example, on March 31 the UK newspaper The Times ran the headline “German obsession with privacy let killer pilot fly”.

    In contrast, a more nuanced Washington Post article[3] reported reactions in Germany that called for more, not less, privacy. The article describes a sentiment in Germany that Mr. Lubitz and his family continue to deserve privacy even after the crash. Bild, a German tabloid, was criticized for aggressively reporting on the story; other outlets, like Die Welt, refrained from publishing pictures of Mr. Lubitz and continue to refer to him as Andreas L.

    The strong German stance on privacy, which some attribute to the country’s experiences with Nazism and East German Communism, highlights the cultural differences that shape how people see privacy. This issue arises not only in U.S.-EU relations[4], but also within Europe, where Member States are still struggling to find a compromise on a General Data Protection Regulation (GDPR), six years after the reform was initiated.

    While the GDPR continues on its uncertain path, the U.S. and the EU are negotiating the Transatlantic Trade and Investment Partnership (TTIP), a broad free trade agreement. In the wake of the Snowden revelations, the EU decided not to include data privacy issues in the TTIP in order not to derail the process, despite calls by tech giants to do so.[5] In March of this year, EU officials have shown some willingness to add data protection issues in the TTIP while quickly adding that “[u]ntil the EU’s data protection regulation has been agreed, we cannot introduce such concepts within the TTIP negotiations.”[6]

    But a few days later, a report by the European Parliament’s Civil Liberties, Justice and Home Affairs (LIBE) Committee torpedoed any efforts to open talks on privacy. The document, authored under the leadership of Jan Albrecht,[7] a member of the Green Party and privacy advocate,[8] expressly calls on the negotiators to include a clause exempting “the existing and future EU legal framework for the protection of personal data from the agreement, without any condition that it must be consistent with other parts of the TTIP”[9].

    Data protection remains the elephant in the room in the TTIP.[10] But it seems unwise for Europeans to include it in the TTIP at a stage where the future of the GDPR remains unclear. As the TTIP delegates pack for the next round of negotiation (April 20-24) in New York, data privacy issues are unlikely to make it into their suitcases.

    [1] http://time.com/3761895/germanwings-privacy-law/

    [2] Indeed, according to German privacy experts, only Mr. Lubitz could choose to reveal his condition to his employer. Doctors are allowed to break their professional secrecy only in the case of an epidemic illness or if the patient is suspected of planning to commit a serious crime. That Mr. Lubitz’s doctor did not report him suggests the doctor did not believe Mr. Lubitz was likely to do so.

    [3] http://www.washingtonpost.com/world/crash-challenges-german-identity-notions-of-privacy/2015/04/01/8a1cde9a-d7d6-11e4-bf0b-f648b95a6488_story.html

    [4]http://www.economist.com/news/europe/21647634-can-america-and-europe-ever-get-over-their-differences-data-protection-not-so-private-lives

    [5] Financial Times, Data protection ruled out of EU-US trade talks, 4 November 2013, http://www.ft.com/cms/s/0/92a14dd2-44b9-11e3-a751-00144feabdc0.html

    [6] http://www.euractiv.com/sections/trade-society/brussels-makes-overture-data-flow-agreement-ttip-313080

    [7] http://www.europarl.europa.eu/meps/en/96736/JAN+PHILIPP_ALBRECHT_home.html

    [8] http://www.janalbrecht.eu/fileadmin/material/Dokumente/Short_CV.pdf

    [9] Opinion of the Committee on Civil Liberties, Justice and Home Affairs for the Committee on International Trade on recommendations to the European Commission on the negotiations for the Transatlantic Trade and Investment Partnership (TTIP) (2014/2228(INI))

    [10] http://www.euractiv.com/specialreport-eu-us-trade-talks/ttip-data-elephant-room-news-530654

  • EU Council’s Agreement and the “One-Stop Shop”

    April 16th, 2015

    EU Council’s Agreement and the “One-Stop Shop”

    By: Kevin Gallagher

    http://www.dataprotectionreport.com/2015/04/eu-proposes-one-stop-shop-for-data-protection-supervision-and-enforcement/

    http://www.dataprotectionreport.com/2015/04/eus-one-stop-shop-proposal-focuses-on-main-establishment-as-nexus-of-dpa-enforcement-authority/

    http://www.privacyandsecuritymatters.com/2015/03/one-less-carrot-for-business-council-of-european-union-limits-the-one-stop-shop-mechanism-in-the-draft-data-protection-regulation/?utm_source=Mondaq&utm_medium=syndication&utm_campaign=View-Original

    In March 2015, the Council of the European Union published an agreement on the One Stop Shop mechanism of the proposed new European data protection regulation.

    Background

    In 1995, the EU passed a directive aimed at regulating the processing of personal data in the European Union. As with all EU directives, each member state was required to implement it in its own internal law. This approach can create problems. First, the cultural view of privacy protection is not the same in every country, so member states may create different levels of privacy protection while implementing laws fulfilling the same directive. Though this may not be a problem for corporations that operate within the borders of one EU Member State, jurisdictional problems can arise for trans-national companies within the EU.

    In an attempt to solve these and other issues, the European Commission proposed the General Data Protection Regulation (GDPR), a single law that attempts to “[harmonize] data protection legislation and enforcement.” [1] After passing through the European Parliament with several thousand amendments, [2] the proposed legislation is now being reviewed by the European Council. In March 2015, the Council published a partial general agreement on parts of this legislation. [3] Included in this agreement was its view of a “One Stop Shop” mechanism intended to make enforcement easier for trans-national companies within the EU, as well as for companies outside the EU that do business within, or collect data from, EU Member States.

    The Council’s One Stop Shop Mechanism

    In the European Council’s version of the One Stop Shop mechanism, a supervisory authority (SA) “assume[s] control of the controller’s or processor’s activities” of the companies within its EU Member State. For trans-national companies, however, it is not obvious which SA should assume control of the company’s activities. To resolve this, the concept of a “main establishment” of a business is used. In the European Commission’s proposal, the main establishment is defined as “the place of its establishment in the Union where the main decisions as to the purposes, conditions and means of the processing of personal data are taken; if no decisions as to the purposes, conditions and means of the processing of personal data are taken in the Union, the main establishment is the place where the main processing activities in the context of the activities of an establishment of a controller in the Union take place. As regards the processor, ‘main establishment’ means the place of its central administration in the Union.” [3] To simplify: for a data controller, the main establishment is the EU state in which decisions regarding the “purposes, conditions and means of processing the data are taken.” [4] If those decisions are not taken in the EU, the main establishment is where the main processing takes place. [4] For a data processor, the main establishment is the place of central administration within the EU. [4] In addition to these definitions, the European Council added that “[t]he main establishment of a controller in the Union should be the place of its central administration in the Union, unless the decisions on the purposes and means of the processing of personal data are taken in another establishment of the controller in the Union. In this case the latter should be considered as the main establishment.” [3] Companies that do business in the EU but have no EU establishment are “obliged to designate a representative in one of the EU Member States in which it offers goods and services or carries out monitoring activities.” [4]

    Though the purpose of the One Stop Shop was to simplify the enforcement process, critics have noted that the One Stop Shop method will be used only in “very limited circumstances” and that the lead SA “would have to act more as a coordinator than a sole decision maker.” [5] “Furthermore,” the critics add, “if the lead authority fails to reach agreement with other interested national authorities, the decision must be referred to a new supervisory board, the European Data Protection Board.” [5] For this reason, arguments can be made that this is not a “true One-Stop Shop.” [5]

    Implications

    Despite the criticism this agreement has received, it would still create a more harmonious enforcement regime for trans-national companies than exists under the current EU directive. It is worth noting, however, that “nothing is agreed until everything is agreed”: the European Council, European Parliament and European Commission still need to agree on a final text after the Council publishes the complete draft of its internal agreement, so this is not necessarily the final wording of the GDPR. One thing is certain, however: the EU is one step closer to beginning the “trialogue” required to pass an EU regulation.

    References

    [1] http://www.dataprotectionreport.com/2015/04/eu-proposes-one-stop-shop-for-data-protection-supervision-and-enforcement/

    [2] http://www.europarl.europa.eu/sides/getDoc.do?type=TA&reference=P7-TA-2014-0212&language=EN&ring=A7-2013-0402

    [3] http://register.consilium.europa.eu/doc/srv?l=EN&f=ST%206833%202015%20INIT

    [4] http://www.dataprotectionreport.com/2015/04/eus-one-stop-shop-proposal-focuses-on-main-establishment-as-nexus-of-dpa-enforcement-authority/

    [5] http://www.privacyandsecuritymatters.com/2015/03/one-less-carrot-for-business-council-of-european-union-limits-the-one-stop-shop-mechanism-in-the-draft-data-protection-regulation/?utm_source=Mondaq&utm_medium=syndication&utm_campaign=View-Original

     

     

  • Facebook in trouble with EU Privacy watchdogs again!

    April 16, 2015

    Panel 2

    Facebook in trouble with EU Privacy watchdogs again!

    http://www.theguardian.com/technology/2015/mar/31/facebook-tracks-all-visitors-breaching-eu-law-report

    By: Aishani Gupta

    Facebook and its privacy policies have been under scrutiny in the EU for some time now. Earlier this month, extensive research by the Belgian data protection agency revealed that Facebook tracks users and non-users alike. What this means is that once you visit Facebook, whether or not you sign up for an account, it starts tracking you to learn more about your lifestyle, personal preferences and so on. The purpose of this tracking is to serve users targeted advertisements.

    This raises the question of how such tracking violates EU law as it currently stands. EU laws on privacy and data protection are rather stringent: all users must be given a specific ability to opt out of being tracked online. If Facebook is tracking users (whether or not they are signed in) and non-users, then it is violating this requirement of giving consumers an opt-out mechanism. Naturally, Facebook’s rebuttal is that the report is full of inaccuracies, and it has contacted the Belgian authorities to clarify the alleged errors. In later statements, however, Facebook acknowledged that it does in fact track non-users, while claiming, quite predictably, that this was a bug and that it had no intention of tracking non-users.

    On April 29, the eyes of privacy advocates around the world will be on Belgium’s data protection agency, which will then decide whether to take any action against Facebook based on the report.

    Belgium is not the only country causing trouble for Facebook. In Austria as well, issues are being taken to court: the privacy campaign group “Europe v Facebook” has filed a class action suit (a different form of class action than exists in the US) in the Austrian courts.

    The investigation by the Belgian agency has also sparked investigations in Germany, France, Spain and Italy. This is demonstrative of the EU regime: targeted action against Facebook, taken collectively, seems to be the key. It will be most interesting to see how the courts determine these cases and what changes (if any) Facebook makes to its privacy policies as a result. In terms of costs and benefits, the social media giant might find it easier to change its tracking practices than to constantly pay fines in different countries. Let us hope!

     

  • PRG News Roundup: April 15th

    New York appellate court finds voyeuristic photographer protected under First Amendment:

    https://news.artnet.com/art-world/arne-svenson-neighbors-photographs-supreme-court-286916

    Google charged with antitrust violations by EU:

    http://www.nytimes.com/2015/04/16/business/international/european-union-google-antitrust-case.html

    EU Commissioner Věra Jourová announces she would submit a new proposal to revise the EU-US Safe Harbor Framework on May 28th:

    http://www.tagesspiegel.de/politik/datenschutz-bruessel-will-datenuebermittlung-nach-usa-neu-regeln/11629820.html (in German)

    New EU Competition Commissioner Margrethe Vestager will speak at Lipton Hall on April 20, 2015:

    https://its.law.nyu.edu/eventcalendar/index.cfm?fuseaction=main.detail&id=39123

    United States and EU release joint press statement for 2015 US-EU Information Society Dialogue:

    http://www.state.gov/r/pa/prs/ps/2015/04/240680.htm

    David Brooks and the cop-cam debate: Do police have right to privacy?

    http://www.nytimes.com/2015/04/14/opinion/david-brooks-the-lost-language-of-privacy.html

    Retailers want your refund: TurboTax offers bonus for customers who accept their refund through an Amazon gift card:

    https://ttlc.intuit.com/questions/1899434-what-is-the-refund-bonus-offer

    FCC to investigate Verizon’s use of ‘supercookies’:

    http://www.cnet.com/news/lawmakers-push-feds-to-investigate-verizons-use-of-supercookies/

    UN Human Rights Council appoints special rapporteur on right to privacy:

    https://www.eff.org/deeplinks/2015/03/un-human-rights-council-appoints-special-rapporteur-right-privacy


  • US Senators Propose New Privacy Bill to Regulate Data Brokers

    April 14th, 2015

    US Senators Propose New Privacy Bill to Regulate Data Brokers

    By Luis Camargo

    Link: http://www.pcworld.com/article/2893672/lawmakers-target-data-brokers-in-privacy-bill.html

    It has been a long time since companies adopted targeted marketing as one of their most important commercial strategies. The idea is simple: the more you know about your client (or prospective client), the better you can market your products and services.

    Personal, individualized information has suddenly become a very valuable asset. Naturally, it became clear that gathering and selling this personal information could be a very profitable business. In this context, the so-called data brokers were born.

    It is important to note that data brokers act very differently from credit reporting companies. The latter routinely receive data from banks, credit card companies and other sources and, under the rules of the Fair Credit Reporting Act (FCRA), are responsible for the confidentiality and accuracy of the information; they sell credit reports for specific uses allowed by law, such as applications for credit, insurance, employment, or renting a home[1].

    Data brokers, on the other hand, are companies that operate by “collecting, analyzing and packaging some of our most sensitive personal information and selling it as a commodity…to each other, to advertisers, even the government, often without our direct knowledge[2]”.

    Even though both credit reporting companies and data brokers essentially gather and sell personal information, the difference is evident: while credit reporting companies are regulated and obligated to grant consumers access and an opportunity to dispute inaccurate information[3], data brokers operate almost entirely in obscurity. There is no regulation, and many consumers have absolutely no idea of their existence. There is no clear information about how those companies collect data, what information is collected, and, more importantly, to whom this data is sold.

    As already mentioned, the collection of consumer data is not new. Consumers are used to giving their names, telephone numbers, and other personal information to brick-and-mortar stores.

    However, with the advent of the Internet this scenario became much more critical. It is not only easier to store, organize and classify personal information contained in electronic files, but it is also easy to collect it from all of our online activities.

    The more we use the Internet, the more likely it is that we are giving away a surprising amount of information about ourselves. Today it is not only the information we voluntarily provide to the websites we use that is shared; more importantly, while we use the countless applications on our cellphones to avoid traffic, order our favorite meal, or even buy a ticket to the next Knicks game, valuable information (and probably the most desirable information for targeted marketing) is also being collected and gathered by data brokers.

    On March 26, 2012, the Federal Trade Commission (“FTC”) issued its Final Commission Report on Protecting Consumer Privacy[4], “setting forth best practices for businesses to protect the privacy of American consumers and give them greater control over the collection and use of their personal data[5]”.

    In an important attempt to draw attention to the need for data broker regulation, the FTC included a recommendation that Congress “consider enacting targeted legislation to provide greater transparency for, and control over, the practices of information brokers[6]”. “The proposed framework recommended that companies provide consumers with reasonable access to the data the companies maintain about them”, in a way that would give consumers more control over what information about them is used, and how[7]. In addition, the FTC called on data brokers to make their operations more transparent by creating a centralized website to identify themselves[8].

    The FTC’s actions did not stop with the issuance of the Final Report on Protecting Consumer Privacy. In December 2014, the FTC filed a complaint against data broker LeapLab for selling “sensitive personal information of … consumers – including Social Security and bank account numbers – to scammers who allegedly debited millions from their accounts[9]”. Even though this was an important way to signal to data brokers that the FTC is aware of their practices, it is clear that the FTC does not have sufficient authority to properly and timely enjoin data brokers’ abusive practices.

    Therefore, in response to the FTC’s efforts and to concerns about consumer privacy, “[f]our U.S. senators have resurrected legislation that would allow consumers to see and correct personal information held by data brokers and tell those businesses to stop sharing or selling it for marketing purposes”[10].

    Reviving a similar bill that failed to pass the Senate in 2014, the Data Broker Accountability and Transparency Act[11] was introduced last March as a necessary step toward the regulation of data brokers.

    The bill has important provisions that address several of the problems raised by the FTC, aiming to ensure the accuracy of the data collected and, more importantly, giving consumers the right to access the data collected about them and to stop data brokers from sharing their personal information for marketing purposes. Moreover, the bill grants the FTC authority to “craft rules for a centralized website for consumers to view a list of data brokers covered by the bill[12]”.

    Even though a similar bill failed in the past, it seems that it is time for Congress to impose regulation on data brokers to advance consumers’ information and privacy protection.

    The article under discussion also includes negative comments about the bill, especially from the Direct Marketing Association (“DMA”), which represents the data broker industry.

    The DMA claims that “[t]he legislation isn’t needed”, especially because data brokers “are continually taking steps on their own to improve transparency to consumers”, arguing that this “kind of transparency is happening every day, in terms of self-regulation in the marketplace”[13].

    Even though self-regulation could be argued to be a solution for this industry, it is evidently what the FTC has been promoting, without success, since the Final Commission Report on Protecting Consumer Privacy. Experience has already shown that consumers are not protected without a law addressing the problems of transparency and consumer access to information.

    Although data brokers have a real incentive to provide the most accurate information possible to their customers, voluntarily implementing a system that would allow any person to consult, revise, or even block the use of her information could be so costly that a data broker that legitimately cares about consumer privacy would never be able to compete with careless companies that give no weight to consumer privacy rights.

    In conclusion, a new data broker law would be a fundamental instrument not only to protect consumer privacy, but also to level the playing field, requiring all data brokers to provide transparency and to allow consumers to validate the information the companies are selling.

     

    [1] Disputing Errors on Credit Reports [https://www.consumer.ftc.gov/articles/0151-disputing-errors-credit-reports]

    [2] The Data brokers: Selling your personal information. [http://www.cbsnews.com/news/data-brokers-selling-personal-information-60-minutes/]

    [3] Disputing Errors on Credit Reports. Id.

    [4] FTC Issues Final Commission Report on Protecting Consumer Privacy [https://www.ftc.gov/news-events/press-releases/2012/03/ftc-issues-final-commission-report-protecting-consumer-privacy]

    [5] Id.

    [6] Id.

    [7] Protecting Consumer Privacy in an Era of Rapid Change [https://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-report-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf]

    [8] FTC Issues Final Commission Report on Protecting Consumer Privacy. Id.

    [9] FTC Charges Data Broker with Facilitating the Theft of Millions of Dollars from Consumers’ Accounts [https://www.ftc.gov/news-events/press-releases/2014/12/ftc-charges-data-broker-facilitating-theft-millions-dollars]

    [10] Lawmakers target data brokers in privacy bill [http://www.pcworld.com/article/2893672/lawmakers-target-data-brokers-in-privacy-bill.html]

    [11] http://www.markey.senate.gov/imo/media/doc/2015-03-04-Data-Brokers-Bill-Text-Markey%20.pdf

    [12] Lawmakers target data brokers in privacy bill. Id.

    [13] Lawmakers target data brokers in privacy bill. Id.

  • Talking Barbie

    April 9th, 2015

    Talking Barbie

    By: Rugeradh Tungsupakul

    At a recent toy fair in New York City, Mattel, the manufacturer of Barbie dolls, introduced “Hello Barbie”, a new version of its famous Barbie doll that can listen and talk back to children.

    Basically, Hello Barbie works through speech recognition and a Wi-Fi connection. Whatever your children say to Hello Barbie will be recorded and saved in the cloud. In this way, Barbie collects a lot of information about your children and responds to them based on that saved information.

    Please follow this link for more information: http://money.cnn.com/2015/03/11/news/companies/creepy-hello-barbie/

    In my opinion, Hello Barbie may, at least, encounter the following controversies:

    • (i) Parents cannot control what will be recorded and transmitted to the cloud. For instance, children may intentionally or accidentally push the record button at any time. This means any voices or conversations within the house could leak out to the outside world.
    • (ii) Barbie’s responses are outside the parents’ control. Even though it is claimed that Barbie’s responses will be based on information recorded and saved in the cloud, there is no guarantee that its responses to children will be relevant, appropriate and harmless to either children or parents.

    With regard to the second issue, though Mattel may claim First Amendment protection, parents should have the right to select what kind of information is allowed in their house as well as what kind of messages their children consume. Assuming that children play with their Barbie at home, parents should have the right of a householder to bar any unwanted message sent into their house. (Rowan v. United States Post Office Department)

    Another possible argument from Mattel may be that Barbie’s responses are non-commercial speech, and therefore not subject to the lesser protection that commercial speech receives. On that view, Hello Barbie’s function should be fine as long as it complies with the Children’s Online Privacy Protection Act[1].

    In my view, messages from Barbie may be either ‘commercial’ or ‘non-commercial’. Given the lack of detailed information about Hello Barbie, I would like to compare the following situations:

    Scenario 1[2]:

    Child: “What should I be when I grow up?”

    Barbie: “Well, you told me you like being on stage, so maybe a dancer?”

    Scenario 2:

    Child: “I feel so lonely, what should I do?”

    Barbie: “You are not alone. At least, you have me or you may ask your parents to buy more talking friends!”

    Obviously, Barbie’s answer in Scenario 2 should be considered commercial speech, because it proposes a commercial transaction and relates solely to the economic interests of the speaker and its audience. It may be a big task for Mattel to escape the lesser constitutional protection afforded to commercial speech.

    Further, it is interesting to consider whether the government would be authorized to regulate the use of Hello Barbie beyond the Children’s Online Privacy Protection Act. Under the Central Hudson test, assuming that the commercial speech is not misleading and relates to lawful activity, it is highly likely that the government can assert the protection of both parents and children as a substantial interest to be achieved by the regulation. Children, by nature, are easily persuaded and may be targeted by aggressive marketing tactics. Parents, if they cannot control the content of messages sent to their children, may suffer financially from demand their children have been deceived into creating.

    In addition, to survive Central Hudson, the regulation must directly advance the government’s interest and be narrowly tailored so as not to restrict more speech than necessary. However, these two prongs are better discussed once more details about Hello Barbie are available in the marketplace.

    Since there is a high possibility that more controversies will arise after Hello Barbie hits stores this fall, it will be very interesting to keep an eye on how the government and society react to this new doll.

    ****************

    [1] This is a claim from Mattel’s spokeswoman.

    [2] This is a real example from the toy fair.