Blog

  • Natural Language Versus the Fourth Amendment on Search

    (x-posted from Coffee House Talks)

In doing the initial framing for an article on how to apply Helen Nissenbaum’s theory of Contextual Integrity to the Fourth Amendment, it has become apparent that natural language and the law classify the same action differently when deciding whether something is a search, a reasonable search, or an excused or unexcused reasonable search. Now this is not a mind-blowing observation; it has been understood for some time that classifying some things as “not a search” for Fourth Amendment purposes is just kind of weird. However, I believe the differing categorizations of the two areas have implications when asking what an ideal Fourth Amendment doctrine would look like, so I’ll explore that here.

    (more…)

  • Orin Kerr on United States v. Jones

    Orin Kerr ponders oral arguments in United States v. Jones (reposted from The Volokh Conspiracy):

I was at the Supreme Court this morning for the oral argument in United States v. Jones, the GPS case. In this post, I want to blog my reactions to the argument: I’m going to update the post as I go, so general readers can get the important stuff first at the top and then the rest down the page:

    (more…)

  • FTC Finally Goes After Flash Cookies

    The FTC and ScanScout came to a settlement over ScanScout’s deceptive use of Flash cookies.  ScanScout used Flash cookies to track users, but its privacy policy merely stated a user could “opt out of receiving a cookie by changing your browser settings to prevent the receipt of cookies.” Since Flash cookies could not actually be blocked through browser controls during the relevant time period the FTC investigated, the privacy policy statement was found to be deceptive.  Read more here.

  • EPIC files FTC complaint against Verizon

    As a follow-up to Helen’s post about Verizon’s new privacy practices, EPIC has filed an FTC complaint alleging that the move amounts to an unlawful trade practice.

  • Android orphans and the update problem for smartphone security and privacy

    Michael Degusta has a wonderful blog post up about the history of missing software updates for Android smartphones, compared to Apple’s iPhone. A sample:

    Android Orphans

    In this chart, green blocks represent periods when a phone ran the most up-to-date major version of its operating system, while yellow, orange, and red blocks represent periods where a phone could only run increasingly out-of-date major versions. See Michael’s post for the full chart and some great analysis.

    Two factors combine to make the lack of updates a significant problem. First, in the United States at least, most phones are sold on two-year contracts, so a lack of updates means they will almost certainly be used well after their OS is no longer the current version. Second, since smartphones are constantly connected to the cell-phone network and the Internet, they present an attractive and vulnerable target for malware authors when security vulnerabilities are discovered. If updates can’t be applied to many of the smartphones in use, then the potential harm from a security problem expands greatly. Indeed, the many Android privacy and security problems show the potential severity of the issue.

So what is to be done? It’s understandable why, in the fast-moving and competitive market for Android smartphones, makers don’t want to spend money supporting devices they’re no longer selling. Yet if two-year contracts are the standard, it may not be unreasonable for users to expect makers to support a device for at least two years after they stop selling it. With the FTC’s recent reemphasis on trade practices that are “unfair” but not necessarily “deceptive” (a subject worthy of a post of its own), it will be interesting to see if the agency has anything to say about the Android orphan problem.

  • Mastercard, Visa to help Target Ads

A story similar to the Verizon one has come up.

    (Taken from Slashdot. Source: Mastercard, Visa to help Target Ads)

“The two largest credit-card networks, Visa Inc. and MasterCard Inc., are pushing into a new business: using what they know about people’s credit-card purchases for targeting them with ads online. ‘A MasterCard document obtained by the Journal outlines some of the company’s plans, which included linking Web users with purchases. According to the document, the credit card provider said it believes “you are what you buy.” … Visa is planning a similar service, which would aggregate its customers’ purchase history into segments, including location, to make ads more effective at appealing to people in a respective area.’”

    Eleni Gessiou

  • TPM – Feds To Monitor Google’s Privacy Practices For Next 20 Years

    From TalkingPointsMemo:

    “Feds To Monitor Google’s Privacy Practices For Next 20 Years

Sarah Lai Stirland, October 24, 2011, 4:10 PM

    The U.S. Federal Trade Commission on Monday finalized a landmark settlement with Google in which the company has agreed to be audited for its privacy practices for the next 20 years.

    The commission has said that this is the first time that it has required any company to formally implement a comprehensive privacy program to protect individuals’ personal information.

    The FTC commissioners voted to approve the settlement 4-0, after the period for public comment ended. The proposed settlement was announced in March.

    The FTC case was prompted by the now-defunct Google Buzz social networking service. Google tried to tack Buzz onto Gmail users’ e-mail accounts, enabling them to provide status updates and to share photos and videos, but it created an uproar when it made users’ Gmail contacts public by default.

    The commission charged that Google engaged in unfair and deceptive practices in 2010 when it launched Google Buzz by leading users of its Gmail system to believe that they could easily opt-out of the social network. The controls that would enable them to do that were ineffective, the FTC charged at the time.

    Also the tools that Google created to enable users to limit the sharing of users’ personal information were confusing and difficult to find, the agency alleged.

    In its complaint, the FTC said that Google had enrolled some Gmail users in Google Buzz even after the users had clicked on a tab to decline to use the service, and that the identities of people that Gmail account holders most frequently communicated with were made public by default. Worse, when users tried to get out of the service, they weren’t fully removed.

    In a press statement on the settlement, the FTC noted, “In response to the Buzz launch, Google received thousands of complaints from consumers who were concerned about public disclosure of their email contacts which included, in some cases, ex-spouses, patients, students, employers, or competitors.”

Google made changes to respond to those complaints, but the FTC went after the company anyway because Google had violated its own privacy policy: it used its users’ personal information in a way they had not consented to, even though Google had said it would ask for permission first.

    The commission had also charged that the way that Google had gone about representing the way its users’ personal information would be displayed was deceptive. Users didn’t know, for example, that their most frequently e-mailed contacts would be made public by default.

    The FTC’s settlement with Google requires the company to inform and obtain its users’ consent before it shares any of their information with third parties, and subjects the company to 20 years of privacy audits every two years by an independent third party monitoring service. The audits are meant to ensure that Google is living up to its promises about what it is doing with its users’ personal information. The company is also required to implement a comprehensive “privacy program.”

Google recently killed its disastrous Google Buzz project, which had been long abandoned in favor of its Google+ social network, which has met with general praise for the way it enables users to control how they share information on a fine-grain level.

    In an e-mail to TPM, Google’s Senior Manager of Global Communications Chris Gaither said that Google has completely revamped the way it approaches privacy.

    Instead of being an afterthought, privacy is a concept that’s considered during the design of new products.

    “We’ve strengthened many of our internal privacy and security controls over the past year,” he said. “For example, in October we appointed longtime Google engineer Alma Whitten to director of privacy across product management and engineering.”

    In addition, Gaither says, “We’ve increased privacy training for all our employees. We’ve tightened our compliance controls for those who deal with sensitive data. And last fall, we added a new process to our existing privacy review system requiring every engineering project leader to maintain a Privacy Design Document for each initiative they are working on. This document records how user data is handled and is subject to regular review.”

Like other technology companies, Google had come under increasing fire both here in the United States and especially in Europe over privacy issues.

    Last May, Google inadvertently collected data from private WiFi networks when its Street View cars drove by. Google has since been investigated by the regulatory authorities in Europe over the incident.”

  • New privacy study shows top-ranked sites selling user information

The WSJ just blogged about a recent internet privacy study implicating several high-traffic sites in selling user information to third-party SEO companies.  Sites include OKCupid!, RottenTomatoes, and yes, the Wall Street Journal herself.

    Nothing new here, but note WSJ’s clever loophole: they don’t sell users’ email addresses; instead, they sell email addresses used in failed login attempts, meaning that potential privacy issues are squelched because the addresses they sell are technically not attached to any users.

  • CSCW Workshop: Reconciling Privacy with Social Media

    CSCW Workshop: Reconciling Privacy with Social Media

    February 12, 2012

    Full Details: http://phitlab.host22.com/cscw2012privacyworkshop.html

    Call for Participation

    Much research on privacy in social media has focused on limiting personal information disclosure, increasing control, and perpetuating social withdrawal. Therefore, privacy goals are often characterized as diametrically opposed to goals of sharing and connecting via social media. However, privacy can also be characterized as a broader process where individuals and groups coordinate social interaction with others. In this broader conceptualization, privacy behavior moves beyond binary decisions to withhold or disclose and becomes an interactional process that involves the cooperation of others in the relationship. The goal of this workshop is to explore privacy in broader contexts and to understand its relationship to the benefits of social media and the support of online cooperative relationships.

    The workshop will focus on two main themes: Focusing on the benefits and outcomes of interactional privacy behaviors in social media environments, and emphasizing design and evaluation solutions for bringing such benefits to fruition.

    We invite potential workshop participants to submit 2-4 page position papers that describe research related to the workshop themes. The deadline for submission is November 25.

    Please see the workshop website at http://phitlab.host22.com/cscw2012privacyworkshop.html for more information.

    Workshop Co-Organizers:

    Heather Richter Lipford, University of North Carolina at Charlotte

    Pamela Wisniewski, University of North Carolina at Charlotte

    Cliff Lampe, University of Michigan

    Lorraine Kisselburgh, Purdue University

    Kelly Caine, Indiana University Bloomington

    Program Committee:

    Coye Cheshire, University of California Berkeley

    Catherine Dwyer, Pace University

    Woodrow Hartzog, Samford University

    Adam Joinson, University of Bath

    Jen King, University of California Berkeley

    Airi Lampinen, Helsinki Institute for Information Technology HIIT & University of Helsinki

    Deirdre Mulligan, University of California Berkeley

    Fred Stutzman, Carnegie Mellon University

    Janice Tsai, Microsoft

    Michael Zimmer, University of Wisconsin-Milwaukee