Category: Government, Technology, and Policy

  • New York City Council Hearing on Algorithmic Transparency Bill

    On Monday, October 16, the New York City Council’s Committee on Technology held a hearing on the proposed bill INT 1696-2017, which “would require agencies that use algorithms or other automated processing methods that target services, impose penalties, or police persons to publish the source code used for such processing. It would also require agencies to accept user-submitted data sets that can be processed by the agencies’ algorithms and provide the outputs to the user.”

    Multiple PRG members and ILI fellows were in attendance and testified. James Vacca, the bill’s prime sponsor, chaired the hearing. Find the video of the meeting, as well as minutes and other materials, here.

  • PRG – Overview of Legal Implications of NSA Spying

    Post-9/11 laws and FISA court developments. PRG Discussion on 9/18/13

    The Foreign Intelligence Surveillance Act (FISA), set up in the wake of Watergate, governs the gathering of data about foreign actors. It created a framework for data collection and court review by FISA courts. With the Patriot Act came a push to expand the powers and reach of a number of laws. The Patriot Act expanded the FBI’s ability to send out administrative letters to collect information without a court order and created roving wiretaps. It legalized “sneak and peek” searches without immediate notification to the target.

    Section 215 of the Patriot Act lowered the threshold for search to any situation where collecting foreign intelligence is “a purpose” rather than the sole purpose. Sixteen provisions were set to sunset in 2005, but 14 were made permanent and two were extended to 2015.

    Another key development was the Bush Administration setting up a wide-ranging wiretapping program, along with Section 702 of FISA creating official rules for targeting persons outside the United States. These will be coming up for renewal in the coming years.

    The FISA court was created under the 1978 Act; its 11 district court judges are appointed by the Chief Justice of the US Supreme Court. Most of its opinions have been secret. As requests have expanded to become more programmatic, the court has been issuing long but secret opinions creating precedents for its own operation. Existing Supreme Court precedent has been read to mean that metadata given to a third party is not subject to Constitutional protection. There have been some 34,000 surveillance requests since FISA was created; 11 have been rejected.

    The proceedings are not adversarial: no actor represents the persons or groups whose data is to be accessed. In many cases, information collected via FISA is then tracked down through other sources by the FBI to “cover the tracks,” so that the fact that FISA was used does not have to be presented in later public court proceedings.

    FISA does not cover anyone on US soil, but non-citizens not on US soil have no protections under the law.

    A question was raised about whether the revelations about the NSA were shocking because they revealed the extent of surveillance allowed by the law, or because they revealed real violations of US law. A related question is whether the surveillance violates international law.

    Section 215 now allows collection of “any tangible thing,” which has been interpreted to mean whole telecommunications databases. There is a restriction on collection where a search is “solely based on First Amendment activities,” which is not very restrictive if the FBI can find any other reason to justify the search. What the old law restricted to specific information about a suspect has become access to any data “relevant” to an authorized investigation. Minimization procedures are limited by the fact that data retention is allowed in order to “understand foreign intelligence” or where the data relates to a crime.

    Section 702 allows the AG and the Director of National Intelligence to set up a surveillance program with no court oversight once it is established. Collection of data on US persons is allowed as long as it does not intentionally target US persons. The statute says the government does not have to specify whom it wants to target or where it wants to look in any specific surveillance operation approved by a FISA proceeding.

     

  • Federal Court Strikes Down National Security Letter Statute

    By: Matt Zimmerman

    On Friday, the federal district court for the Northern District of California released a 24-page opinion in which it struck down a national security letter (NSL) statute — 18 U.S.C. § 2709 — that authorizes the FBI to obtain customer records from telecommunications companies and to gag those recipients from publicly disclosing that an NSL had been received.  In 2011, the Electronic Frontier Foundation (full disclosure:  I’m lead counsel on the case at EFF) filed a petition on behalf of an unnamed telecommunications provider to set aside both the NSL it received as well as the statute itself.  In our petition, EFF challenged both provisions of the NSL statute on First Amendment and separation of powers grounds.  The court granted our petition, agreeing that the statute amounted to a prior restraint without the necessary procedural safeguards required by the First Amendment.  Moreover, because it found that the statute was not severable, the court ordered that the entire statute must be struck down and that the FBI issue no further NSLs.

    This is a big deal.  While NSL statutes were first created in the mid-80s as a counter-intelligence tool to help ferret out spies, their scope was dramatically expanded by the PATRIOT Act to allow the FBI to obtain subscriber information about anyone so long as a field-level Special Agent in Charge certified that the information sought was “relevant” to a national security investigation.  NSL use has skyrocketed since the PATRIOT Act was passed, with the FBI issuing nearly 300,000 NSLs.

    While EFF’s petition challenged both NSL powers, the court’s order fundamentally rests on the procedural problems with the gag provision.  As written, the statute authorizes the FBI to gag an NSL recipient, indefinitely and without the need for any court oversight.  As the court found, this violates the procedural requirements the Supreme Court’s First Amendment jurisprudence demands where the government seeks to impose a prior restraint.  Under the Supreme Court’s 1965 Freedman v. Maryland decision, a case evaluating a Maryland licensing scheme that required films to be evaluated by a government ratings board prior to public showings, a statute must be designed to ensure that any person who is gagged gets a quick, fair opportunity to challenge that decision, specifically:

    1. the burden must fall on the government to go to court to obtain approval for any gag

    2. the pre-review gag must be strictly limited in time, and

    3. the time in which a reviewing court must make its determination must be limited to a “short fixed period compatible with sound judicial resolution.”

    The court found that the NSL statute plainly fails the Freedman test: the FBI can gag an NSL recipient on its own and without any judicial review, the statute does not force the government to initiate the review in the event that a recipient objects, and there are no requirements that a challenge be promptly heard or evaluated.  Just as in the Freedman case, the court here noted that the FBI was institutionally inclined to gag NSL recipients, and the statute improperly stacked the deck against NSL recipients if they chose to challenge the gag.

    The unconstitutionality of the nondisclosure provision proved fatal to the statute:  the court further determined that, as the statute was not severable (i.e., Congress did not intend that either provision could survive independently), the entire statute must be struck down, including the FBI’s ability to demand customer records.  Statistics cited by the court backed its severability conclusion:  97% of all NSLs are delivered with a gag provision.

    While the court’s order was sweeping, little will change for the moment.  The court stayed its order for a 90-day period in which the government will likely file an appeal and seek a further stay until the Court of Appeals issues its own ruling.  For the moment, however, Judge Illston has given enormous support to critics of all stripes who have long argued that such an invasive, unchecked grant of power to the FBI was not justified and had to go.

  • Do Fair Information Practices (FIPs) really create better outcomes?

    http://www.markleweeklydigest.org/2012/09/eu-and-us-eye-privacy-in-parallel.html

    The article linked above adds to the comparative discussion between EU and US privacy regimes. In every conversation I can recall, Americans consider both the European and Canadian models of FIPs and Privacy by Design to be much superior to the sectoral approach here in the US. And on their face, I think this makes a lot of sense: keep the focus on the use, collection, and storage of *all forms* of personal data, rather than trying to chase down, and apply rules governing, singular instances of data abuse (e.g. mobile device IDs, GPS location, drone surveillance, etc.).

    During a conversation the other day with a colleague, we wondered: is there actually any evidence to support the claim that the European model leads to better consumer or industry outcomes? This isn’t meant to be a normative question, but an empirical one. Are there fewer cases of medical identity theft in Europe? Are there fewer privacy intrusions? Is there less cyberstalking, tax or government fraud, or forged identification documents? Is there any evidence at all that FIPs create better outcomes?

    One paper I can think of compares the effect of US and EU privacy regimes on consumer credit and debt (http://userpage.fu-berlin.de/~jentzsch/eu-vs-us.pdf). The paper finds that the EU has stronger data protection and credit reporting laws (i.e. allows less information exchange), but also less consumer debt than the US. The US, having a weaker privacy regime (i.e. allowing more information flow), has fewer national credit bureaus, and US consumers enjoy broad access to credit (which we may feel is good) — but they also suffer from more consumer debt (which we may feel is bad). While it would be unfair to characterize this as a causal model, one possible explanation is that weaker privacy regimes lead to cheaper credit but induce more consumer debt. So on net, is this good or bad?

    Of course, this is just one paper. I’d love to hear of any other empirical work on this topic.

  • Scary Description of New Hungarian Secret Police

    In Paul Krugman’s blog at the NYT, his colleague Kim Lane Scheppele has been periodically writing about the recent radical changes in Hungarian government. Today she wrote about TEK, the new Hungarian secret police. The description is everything you would expect from a new secret police, and worth reading in its entirety, but here’s a (somewhat long) excerpt:

    TEK can engage in secret surveillance without having to give reasons or having to get permission from anyone outside the cabinet. In an amendment to the police law passed in December 2010, TEK was made an official police agency and was given this jurisdiction to spy on anyone. TEK now has the legal power to secretly enter and search homes, engage in secret wiretapping, make audio and video recordings of people without their knowledge, secretly search mail and packages, and surreptitiously confiscate electronic data (for example, the content of computers and email). The searches never have to be disclosed to the person who is the target of the search – or to anyone else for that matter. In fact, as national security information, it may not be disclosed to anyone. There are no legal limits on how long this data can be kept.

    [R]equests for secret surveillance are never reviewed by an independent branch of government. The justice minister approves the requests made by a secret police unit operated by the interior minister. Since both are in the same cabinet of the same government, they are both on the same political team.

    TEK now has the legal authority to collect personal data about anyone by making requests to financial companies (like banks and brokerage firms), insurance companies, communications companies (like cell phone and internet service providers) – as well as state agencies. Data held by state agencies include not only criminal and tax records but also educational and medical records – and much more. Once asked, no private company or state agency may refuse to provide data to TEK….[TEK’s] data requests no longer [have] to be tied to criminal investigations…. In fact, they have virtually no limits on what data they can collect and require no permission from anyone.

    If an organization (like an internet service provider, a bank or state agency) is asked to turn over personally identifiable information, the organization may not tell anyone about the request. People whose data have been turned over to TEK are deliberately kept in the dark.

    These powers are shocking, not just because of their scope, but also because most Hungarians knowledgeable about constitutional law would probably have thought they were illegal. After the changes of 1989, the new Hungarian Constitutional Court was quick to dismantle the old system in which the state could compile in one place huge amounts of personal information about individuals. In its “PIN number” decision of 1991, the Constitutional Court ruled that the state had to get rid of the single “personal identifier number” (PIN) so that personally identifiable data could no longer be linked across state agencies. The Court found that “everyone has the right to decide about the disclosure and use of his/her personal data” and that approval by the person concerned is generally required before personal data can be collected. It was the essence of totalitarianism, the Court found, for personal information about someone to be collected and amassed into a personal profile without the person’s knowledge.

    Does this not also violate the EU Data Protection Directive, or does that only apply to private companies rather than the member governments? It seems clear from the other posts about this that the Fidesz government isn’t particularly concerned about that sort of thing, but at what point does the EU just kick them out?

  • Google consent decree

    This is what the Google-FTC consent decree says about changing its sharing practices:

    II.
    IT IS FURTHER ORDERED that respondent, prior to any new or additional sharing by
    respondent of the Google user’s identified information with any third party, that: 1) is a change
    from stated sharing practices in effect at the time respondent collected such information, and 2)
    results from any change, addition, or enhancement to a product or service by respondent, in or
    affecting commerce, shall:

    A. Separate and apart from any final “end user license agreement,” “privacy policy,”
    “terms of use” page, or similar document, clearly and prominently disclose: (1)
    that the Google user’s information will be disclosed to one or more third parties,
    (2) the identity or specific categories of such third parties, and (3) the purpose(s)
    for respondent’s sharing; and

    B. Obtain express affirmative consent from the Google user to such sharing.

    Here is the relevant definition:

    “Third party” shall mean any individual or entity other than: (1) respondent; (2) a service
    provider of respondent that: (i) uses or receives covered information collected by or on
    behalf of respondent for and at the direction of the respondent and no other individual or
    entity, (ii) does not disclose the data, or any individually identifiable information derived
    from such data, to any individual or entity other than respondent, and (iii) does not use
    the data for any other purpose; or (3) any entity that uses covered information only as
    reasonably necessary: (i) to comply with applicable law, regulation, or legal process, (ii)
    to enforce respondent’s terms of use, or (iii) to detect, prevent, or mitigate fraud or
    security vulnerabilities.

    Interestingly, the Facebook consent decree has similar, but less restrictive, language:

    II.
    IT IS FURTHER ORDERED that Respondent and its representatives, in connection
    with any product or service, in or affecting commerce, prior to any sharing of a user’s
    nonpublic user information by Respondent with any third party, which materially exceeds the
    restrictions imposed by a user’s privacy setting(s), shall:

    A. clearly and prominently disclose to the user, separate and apart from any “privacy
    policy,” “data use policy,” “statement of rights and responsibilities” page, or other
    similar document: (1) the categories of nonpublic user information that will be
    disclosed to such third parties, (2) the identity or specific categories of such third
    parties, and (3) that such sharing exceeds the restrictions imposed by the privacy
    setting(s) in effect for the user; and

    B. obtain the user’s affirmative express consent.

    Nothing in Part II will (1) limit the applicability of Part I of this order; or (2) require Respondent
    to obtain affirmative express consent for sharing of a user’s nonpublic user information initiated
    by another user authorized to access such information, provided that such sharing does not
    materially exceed the restrictions imposed by a user’s privacy setting(s). Respondent may seek
    modification of this Part pursuant to 15 U.S.C. §45(b) and 16 C.F.R. 2.51(b) to address relevant
    developments that affect compliance with this Part, including, but not limited to, technological
    changes and changes in methods of obtaining affirmative express consent.
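    The contrast the post draws can be paraphrased as two predicates (a rough sketch of the triggering conditions only, not a legal reading; all function and parameter names here are invented for illustration):

```python
# Rough paraphrase of each consent decree's Part II trigger.
# Names are invented for illustration; this is not a legal reading.

def google_consent_required(changes_stated_practices: bool,
                            results_from_product_change: bool) -> bool:
    # Google decree: notice and express consent are required only when the
    # new sharing BOTH departs from the practices stated when the data was
    # collected AND results from a change to a product or service.
    return changes_stated_practices and results_from_product_change

def facebook_consent_required(materially_exceeds_privacy_settings: bool) -> bool:
    # Facebook decree: notice and express consent are required whenever
    # sharing of nonpublic user information materially exceeds the user's
    # privacy settings, regardless of why the sharing occurs.
    return materially_exceeds_privacy_settings
```

    On this paraphrase, each obligation turns on a different axis: Google’s on departures from its stated practices that result from product changes, Facebook’s on departures from the user’s own privacy settings.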

  • Natural Language Versus the Fourth Amendment on Search

    (x-posted from Coffee House Talks)

    In doing the initial framing for an article on how to apply Helen Nissenbaum’s theory of Contextual Integrity to the 4th Amendment, it has become apparent that there are differences between how natural language would classify whether something is a search, a reasonable search, or an excused or unexcused reasonable search, and how the law would classify the same action. This is not a mind-blowing observation; it has been understood for some time that classifying some things as “not a search” for Fourth Amendment purposes is just kind of weird. However, I believe the differing categorizations of the two areas have implications when asking what an ideal Fourth Amendment doctrine would look like, so I’ll explore that here.

  • Homeland Security moves forward with ‘pre-crime’ detection

    Documents obtained by the Electronic Privacy Information Center through a Freedom of Information Act request show that the Department of Homeland Security is moving forward with a program called Future Attribute Screening Technologies (FAST).  The basic idea is to use various technologies to non-intrusively measure heart rate, eye movement, and voice pitch, among other things, to detect individuals who have “mal-intent.”

    Read more here.

  • Washington Post on Domestic Spying

    An in-depth report Monday in the Washington Post describes the expanding apparatus of US domestic intelligence since the September 11th terrorist attacks, including fusion centers, the new Suspicious Activity Reporting Initiative, and the FBI’s Guardian Database. The article is well worth reading, but it is missing a bit of legal context that is important to understanding the government policy driving the change.

    US domestic intelligence is being expanded under the authority of the Intelligence Reform and Terrorism Prevention Act (IRTPA) of 2004. This was the first and most comprehensive legal response to the recommendations of the 9/11 Commission. It outlined a wholesale rewiring of the domestic intelligence apparatus and the establishment of an Information Sharing Environment (ISE). The Nationwide Suspicious Activity Reporting Initiative (NSI), which journalists Dana Priest and William M. Arkin mention briefly, is the primary focus of the ISE today. It includes its own federal data standard. The “See something, say something” campaign, which has been getting so much press recently, is simply one facet of the NSI, the focus of which until recently has been training local and state police to be intelligence agents. For a wide range of public documents covering the NSI and ISE, see post-doc Kenneth Farrall’s isesar.us web site, developed with the support of NYU’s Department of Media, Culture and Communication and a grant from the Department of Defense.

  • Ultimate Privacy: How to Disappear, Erase Digital Footprints & Vanish Without a Trace

    Interesting Network World article posted in mid September: “As privacy seems harder to hold onto in this digital age, privacy expert Frank Ahearn can help you legally poof and fall off the grid….”