Gabriel Gutiérrez
Documents Say NSA Pretends to be Facebook in Surveillance, from the Wall Street Journal’s Big Data Blog, written by Reed Albergotti and Danny Yadron
The article “reveals” that the NSA has disguised itself as Facebook to gain access to the computers of targets of investigations. Information on the technique is based on documents leaked by Snowden. The NSA says the accusations are false and Facebook representatives say the technique wouldn’t work anymore because of new security measures implemented by the company.
I thought the article was amusing because it depicts a company whose own privacy policies often spark criticism being used by the government to spy. Furthermore, the company’s own security measures seem to actually be protecting the privacy of targets. If true, the situation described illustrates that there is always a “bigger bully” and that privacy concerns – especially in the online setting – are closely interconnected. The article also notes that the tactic isn’t directed at mass data-gathering and instead targets specific individuals, presumably already under the NSA’s scrutiny for some suspicious activity.
Monica Perrigino
On March 20, 2014, the Federal Trade Commission issued a 49-page report entitled “Self-Regulation in the Alcohol Industry” in which it expressed its support for the continued self-regulation over alcohol marketing in the country, deeming it “more prompt and flexible than government regulation.” This report provides an excellent, current example of industry self-regulation – illuminating the topic we have been studying in class this week by setting it in a real-life context.
This study is the FTC’s fourth major study on alcohol industry compliance with self-regulatory marketing guidelines, and it found that 93.1% of all measured media ad placements met the industry’s self-regulatory standard (the standard being that 70% or more of the measured audience must be at least 21 years old).
With respect to privacy interests, the report yielded generally positive results, finding that alcohol industry members “appear[ed] to have considered privacy impacts in the marketing of their products.” While the largest share of measured media consists of broadcast and print (nearly one-third of drinks companies’ marketing budgets is spent on traditional media, whereas only 8% is dedicated to digital and online advertising), alcohol companies for the most part nevertheless advise consumers how their information will be used when they register online. They also require consumers to opt in to receive marketing information, and consumers can readily opt out when they want to stop receiving it. Furthermore, the use of cookies and tracking tools on brand websites is limited to those needed to ensure that only consumers who have stated that they are 21 years old or older can re-enter the site.
Distilled Spirits Council president Peter Cressy spoke about the report with pride. He asserted: “The FTC report clearly shows that the spirits industry directs its advertising to adults and is a leader in self-regulation” – further reflecting optimism about the success of self-regulation in this area.
Despite this positive feedback, the FTC has nevertheless made a series of recommendations for improving the system. For online marketing, it recommends requiring consumers to enter their dates of birth rather than simply confirm that they are at least 21 years old, and it encourages advertisers in any medium where compliance falls below 90% to buy placements with a higher proportion of viewers 21 and over, so that ads meet the standard when they actually appear. Cressy insisted that “DISCUS will give careful consideration to the recommendations in the report.”
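To make the date-of-birth recommendation concrete, here is a minimal Python sketch of the kind of check a brand website might run behind an age gate. The function name, the 21-year threshold constant, and the sample dates are my own illustrative assumptions, not anything prescribed in the FTC report.

from datetime import date

LEGAL_PURCHASE_AGE = 21  # the 21-plus standard referenced in the FTC report

def is_of_age(dob, today=None):
    """Return True if a visitor born on `dob` is at least 21 on `today`.

    Unlike a bare "I am over 21" checkbox, a date-of-birth gate gives the
    site a value it can actually compute against.
    """
    today = today or date.today()
    # Subtract one year if this year's birthday has not yet occurred.
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= LEGAL_PURCHASE_AGE

# Illustrative checks (dates are invented):
print(is_of_age(date(1990, 6, 15), today=date(2014, 3, 24)))  # True (23 years old)
print(is_of_age(date(1996, 6, 15), today=date(2014, 3, 24)))  # False (17 years old)

The point of the recommendation, as the report frames it, is simply that a computed check of this sort is harder to click past than a self-certification box.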
The full text of the report can be found here.
William Brewer
Privacy Group Calls for Federal Investigation of Facebook’s $19 Billion WhatsApp Deal
A DC information privacy think tank, the Electronic Privacy Information Center (EPIC), has filed a complaint asking the FTC to investigate the recent $19 billion acquisition of the cell phone app “WhatsApp” by Facebook. The crux of the investigation would be whether WhatsApp has made privacy policy promises to consumers that it will be unable to keep under new ownership. Given Facebook’s history of collecting data from acquired companies, EPIC asserts that there is a legitimate fear that it will do so again. The worry, then, is that Facebook, upon acquisition, will extract user data gathered by WhatsApp before the acquisition, while the previous privacy policies were in place. It may be a separate (and additional) question whether there are sufficient safeguards against future privacy policy violations (post-merger) for WhatsApp users (e.g., WhatsApp users with previously held expectations of privacy being unable to opt out of new Facebook practices). The contrast between the two companies’ privacy policies could not be more pronounced: while Facebook is known for using user data for advertisements and the like, WhatsApp’s policy promises that the “contents of any delivered messages are not kept or retained by WhatsApp,” though it does keep some metadata (phone numbers and timestamps).
The author notes that acquisitions like this are rarely halted on privacy grounds; the FTC more often relies on competition-based effects as grounds for disapproval.
Ian Ratner
http://mashable.com/2014/03/21/microsoft-privacy-hotmail/
In March of 2014, Microsoft came under significant scrutiny after using a loophole in its privacy policy to read through a user’s Hotmail emails and instant messages. In conducting this search, Microsoft was seeking information regarding a former employee’s alleged misappropriation of trade secrets. The search itself was lawful: Microsoft owns Hotmail and the trade secrets related to Microsoft software, so the search was conducted to protect Microsoft’s own property, which is permissible under the Electronic Communications Privacy Act.
Despite its legality, the search obviously drew a lot of negative attention. Indeed, a separate article in the New York Times pointed out that many users felt hesitant to continue using Microsoft’s services given the loophole. As a result, Microsoft decided to publicly tweak its privacy policies to mitigate these concerns. This is particularly important with regard to information privacy law because the FTC not only concerns itself with a company’s privacy policy, but also with a company’s public statements and notice.
Microsoft’s new privacy policy relating to searches of its own users’ email and instant messages is complex. First, Microsoft will employ a legal team separate from its investigation team to assess the risk to Microsoft’s property. Second, if the legal team finds that there is sufficient evidence to warrant the search, Microsoft will relay the information to a former judge to receive his or her opinion on the matter. These steps are intended to replicate the steps Microsoft would need to take if the warrant process actually applied. In the same vein, Microsoft proclaims that its legal team will also take steps to make sure that the search is confined to the original risk to its property – i.e., that the search does not reach more of the user’s account than necessary. The last part of Microsoft’s new policy involves transparency: the company will include in its biannual reports data regarding the number of these searches it conducts.
This new policy is important in the context of the FTC because the new policy would certainly be material to new users, which affects whether the FTC could find deceptive practices. In other words, this new policy will assuredly affect whether users continue to use Microsoft’s products, so it is important that Microsoft adheres to this policy going forward.
Sharon Steinerman
Wearable technology has become increasingly popular over this past year, as technology companies have looked to market a new type of device to tech-savvy users who already own smart phones and tablet devices. Wearable tech has particularly taken off in the areas of health and fitness, as companies like Fitbit and Nike have begun successfully marketing smart watch-like devices that can serve as pedometers, calorie counters, sleep monitors, and general fitness trackers. Users can even sync up these devices with various apps on their phones and computers to better keep track of their fitness plans.
However, according to Mother Jones, the FTC has become increasingly concerned about the volume of data that the makers of these devices are collecting and, potentially, selling. In addition to tracking your location, these devices offer the option for users to input sensitive and ostensibly private medical information, such as blood pressure and glucose levels. Most devices also encourage users to input gender, weight, height, age, and other sensitive personal information. Although these companies have privacy policies that outline individual user identity protection, the information may still be collected in the aggregate and potentially sold to advertisers.
Other concerns stem from the interactions between these devices and other fitness applications. Fitbit, for example, a company that makes a range of fitness trackers that can monitor activity and sleep levels as well as nutritional information, allows and even encourages users to set their devices to interact with third-party applications for calorie counting and weight-loss monitoring. These third-party applications have their own privacy policies that may offer extremely limited privacy protection, yet their makers are likewise provided with sensitive health information by users of the wearable fitness technology. This information may then be sold to advertisers, all without users ever being aware of this gaping privacy breach.
Julie Ann Rosenberg
Facebook and the Federal Trade Commission (Hereinafter, “FTC”) currently disagree about the interpretation of a children’s privacy law. The FTC recently filed a brief in the current case, Batman v. Facebook. If adopted, the FTC’s position would hurt Facebook’s argument in this ongoing district court case in California.
The disputed issue between the FTC and Facebook is whether states can enforce their own laws governing teen privacy. Currently, the Children’s Online Privacy Protection Act (COPPA) applies to and protects only children under the age of 13. Facebook contends that states therefore may not enforce their own laws regulating teenagers’ privacy (children 13 and older).
The case arose from a 2012 settlement regarding Facebook’s “sponsored stories,” or advertisements that used users’ information. The users who are challenging the settlement argue that it violates state privacy laws because it doesn’t require teens to receive permission from their parents before appearing in Facebook advertisements. Facebook contends that since COPPA’s federal protections apply only to children under 13, older teens’ Internet activities cannot be subject to restrictions, even under state law. In its filing, the FTC directly disagreed with Facebook, declaring Facebook’s position outright wrong and unsupported by the statute’s language, structure, and legislative history.
Kate Englander
“Pot shops wary of privacy concerns in handling customer information”
Colorado Amendment 64, which went into effect on January 1, 2014, legalized the sale and personal consumption of marijuana through an amendment to the state’s constitution. This article addresses the way in which Colorado’s marijuana dispensaries are addressing their customers’ privacy concerns after the passage of Amendment 64. Because it is still illegal to sell and use marijuana under federal law, and because marijuana use is still largely taboo, many users are concerned about maintaining their privacy.
While consumers might freely give personal information, such as their name, phone number, and address, at many retail stores, marijuana retailers in Colorado are wary of the fact that their customers may not wish to have their name or personal information associated with marijuana use in any sort of collected database. On the other hand, marijuana dispensaries must weigh the privacy concerns of their customers against their own objectives. First, dispensaries have an interest in tracking their customers’ preferences and purchasing habits in order to target advertising and promotions to them. Furthermore, some dispensary owners are concerned about verifying customers’ identity to protect against credit card fraud.
The amendment itself does not require dispensaries to collect personal data about customers – they need only verify that the customer is 21 or older under the law. This requirement stands in contrast to the medical marijuana laws in California, where dispensaries are required to track patients’ personal information.
Often when we have considered the collection and dissemination of identities aggregated with commercial data, it has been difficult to identify the harm. Are there real, quantifiable damages in the dissemination of consumer preferences when they indicate that a certain customer prefers a certain brand of makeup or frequently purchases high-end jewelry? Courts have often regarded the potential damages as relatively minimal. The collection of personal information in connection with marijuana purchases, however, provides an example of how associating personal information with purchasing data can lead to definite harm to a person’s reputation, or perhaps even to criminal liability.
Abigail Everdell
“Give Me Back My Online Privacy: Internet Users Tap Tech Tools That Protect Them From Prying Eyes” – Wall Street Journal
This article outlines a number of programs that have emerged as popular tools for limiting the collection of data on the internet. The article acknowledges that only 8% of internet users make use of such programs, a number the author seems to consider large, but which still strikes me as small in light of the high number of Americans who are concerned about data collection. Nevertheless, the article has a hopeful tone, suggesting that emerging programs are more successful at helping users find a “middle ground” of data collection – one that doesn’t block all collection, but does allow a certain measure of awareness or control over when and how data is being collected.
I thought this article was particularly relevant to our readings this week as it suggests that market self-regulation, while not a complete solution, may be making strides toward addressing the problem of indiscriminate commercial data collection on the internet. Professor Rubinstein, in the article excerpted in our readings this week, might refer to these kinds of programs as “privacy-friendly PETs [Privacy Enhancing Technologies],” an aspect of “Privacy by Design.” The underlying assumption of the materials we read, however, seems to be that data collection companies must implement PETs on their own, and the financial incentives to do so are not compelling. The proliferation and growing popularity of the third-party PETs described in this article suggest that there may nevertheless be hope for the market to better address consumer preferences in some regard.
Ann Lucas
Recent FTC Ruling Could Cloud Data Security Enforcement by John Moore, iHealthBeat Contributing Reporter
The FTC filed an administrative complaint in August of 2013 under Section 5(a)(1) of the FTC Act’s ban on “unfair … acts or practices” against LabMD, a medical testing lab, for data security breaches involving consumer health data. More specifically, the complaint alleges that a LabMD spreadsheet containing the names, Social Security numbers, dates of birth, and medical treatment codes of more than 9,000 consumers was found on a peer-to-peer network in 2008. On January 16, 2014, the FTC denied LabMD’s motion to dismiss by a unanimous 4-0 vote. Last week, LabMD filed suit in federal district court in the Northern District of Georgia, claiming that the FTC’s August 2013 administrative complaint against the firm “is arbitrary, capricious, an abuse of discretion and power, in excess of statutory authority and short of statutory right, and contrary to law and constitutional right.” LabMD alleges that the FTC lacks jurisdiction under Section 5 of the Federal Trade Commission Act to regulate personal health information security practices. Moreover, the firm claims that HIPAA, enforced by HHS’s Office for Civil Rights, takes precedence over the FTC in the realm of health care data security.
This article highlights the steep costs of an FTC enforcement action. LabMD has ceased operations due to the high costs of its legal battle with the FTC. Additionally, although FTC fines amount to only $16,000 per violation and are lower than HIPAA’s maximum fines, which are capped at $1.5 million, the 20-year privacy audits add to the high cost of such actions. Mac McMillan, the CEO of an IT consulting firm, estimates that the cost of conducting periodic audits could prove more expensive in the long run than a HIPAA fine. “You’ve got the cost of an external monitor for 20 years,” McMillan said, noting that the audits are conducted by a third party. He said, “It’s not just the cost, but being under the microscope for 20 years,” adding, “That is an awfully long time to have the government … reviewing what you are doing.”
Ilana Broad
In recent years, the United States government has struggled to maintain openness under President Obama. New statistics regarding the amount of time it takes the federal government to respond to a FOIA request and the frequency with which agencies deny FOIA requests show an increase both in response times and in the number of rejections. [1] The study, based on government-released statistics from almost 100 federal agencies over six years, shows a major setback in the government’s response to citizens’ desires for government openness and accountability.
While FOIA requests were up approximately 8% in the last year, government responses to FOIA requests went up only 2%, and the documents released were censored more often than ever before. White House spokesman Eric Schultz believes that these statistics are good – they show that the government is responding to FOIA requests more often and more quickly than ever. The problem with his perspective on these statistics, frankly, is that it’s wrong – federal agencies, on average, took longer to respond to FOIA requests than in previous years. Perhaps some of the issue stems from a lack of inter-agency communication in an era when information crosses agency borders very often. In fact, there have been instances where a FOIA request to one agency was answered with heavily censored documents even though the same documents, requested from another agency or representative, had come back entirely open. [2]
Most importantly, 36% of all FOIA requests (including requests that never receive a response) are rejected or censored. The reasons cited for refusing to grant a FOIA request speak volumes about this troubling trend. Reliance on the national security exception to FOIA openness has doubled since Obama’s first year in office. The NSA saw a 138% increase in the number of FOIA requests – which may account for some of the increase in reliance on the national security exception – but the NSA denied full access to the information requested 98% of the time.
Reporters have noted how “abysmal” federal openness has been, and even our Congresspeople are on notice as to how dissatisfied FOIA applicants have been. Some people blame it on bureaucracy, and some point to grimmer conspiracies. Regardless of the reasons behind this increase in government secrecy, it’s important to remember how necessary government openness and accountability are for a democratic society. The Electronic Frontier Foundation has been at the forefront of keeping the government, specifically the NSA, honest. [3] In the last five years, EFF litigation has been responsible for exposing numerous domestic investigations conducted without Congressional or court approval, as well as questionable attempts at maintaining secrecy and undisclosed information practices. [4]
[1] Open Government Study: Secrecy Up, Politico, http://www.politico.com/story/2014/03/open-government-study-secrecy-up-104715.html.
[2] FBI Redacts Letter About Drone Usage That Was Already Published in Full by Sen. Rand Paul, Global Research News, http://www.globalresearch.ca/fbi-redacts-letter-about-drone-usage-that-was-already-published-in-full-by-sen-rand-paul/5371368.
[3] How EFF’s FOIA Litigation Helped Expose the NSA’s Domestic Spying Program, Electronic Frontier Foundation; Deeplinks Blog, https://www.eff.org/deeplinks/2014/03/sunshine-week-recap-how-effs-foia-litigation-helped-expose-nsas-domestic-spying.
[4] EFF Victories in 2 FOIA Cases: Government Arguments ‘Clearly Inadequate’ to Support Claims, Personal Liberty Digest, http://personalliberty.com/2014/03/19/eff-victories-in-2-foia-cases-court-rules-governments-arguments-clearly-inadequate-to-support-claims/.