Category: Uncategorized

  • Laura Poitras at the Whitney

    By: Kayla Wieche

    http://whitney.org/Exhibitions/LauraPoitras

    http://www.nytimes.com/2016/02/05/arts/design/laura-poitras-astro-noise-examines-surveillance-and-the-new-normal.html?_r=0

    http://www.newyorker.com/podcast/political-scene/laura-poitras-and-david-remnick-visit-the-whitney-museum

    Until May 1, visitors to the Whitney Museum’s eighth floor will encounter ‘Astro Noise,’ the multi-sensory exhibit by artist and journalist Laura Poitras. Poitras is best known for her involvement with the Snowden revelations and her documentary Citizenfour, which features NSA whistleblower Edward Snowden describing classified documents on government surveillance. ‘Astro Noise,’ named after an encrypted file that Snowden gave to Poitras in their initial communication over two years ago, continues to probe the tension between privacy rights and government surveillance.

    The exhibit features visual presentations of various components of the government surveillance program – detention, torture, drones, data mining – and the legal reasoning that enables and supports it. After exiting the elevator, visitors are greeted by large prints depicting images of an American and British intelligence hack of Israeli drone feeds. The first room houses a screen with one side streaming video footage of passersby’s faces reacting to the site where the Twin Towers had stood in the days after the Sept. 11 attacks, and the opposite side projecting video of prisoner interrogations in Afghanistan. Following this striking display is an interactive video and sound exhibit relating to drone surveillance. Next, the visitor is guided through a dark hallway perforated with brightly lit peepholes through which intelligence documents legally justifying these programs are displayed. The exhibit ends by revealing that all visitors have themselves been surveilled throughout.

    The sense of unease generated by visiting ‘Astro Noise’ is purposeful and powerful; it is intended to make the visitor critically question the validity of, and take action against, privacy violations committed in the name of national security. Poitras told The New Yorker, “we create the political landscape in which we live and we can change that landscape.” The gift shop sells US Constitutions, perhaps suggesting that visitors use them as a tool to begin enacting that change.

  • Your Next Ride Might Be Used by The Government and Third Parties to Track Your Steps

    By: Felipe Palhares

    April 21, 2016

    Link: https://www.theguardian.com/technology/2016/apr/12/uber-us-regulators-data-passengers-report

    Taking a ride with Uber might reveal more than you think about your whereabouts, especially to the government and regulatory agencies. Uber recently disclosed that state and local transport agencies requested data on more than 11 million rider accounts and roughly half a million drivers between July and December. That data includes GPS coordinates, route maps, and addresses.

    Although this data is supposedly anonymized, and thus does not directly reveal users’ names, it is not clear exactly what Uber is providing to the authorities beyond the categories identified above, and that uncertainty raises serious privacy concerns for Uber’s users. Even if users’ names are not disclosed, it should not be difficult to recover them from the other kinds of data being handed to regulators. If Uber is being forced to reveal the model and color of the car, plate numbers, and a unique ID number for each user, it would take only a little research and surveillance to discover a rider’s real identity.

    Furthermore, considering that you can set home and work addresses on your Uber account, that data could also be used to easily match an ID number to a person’s identity. The implications of providing this type of data to third parties are fairly dangerous. For one, according to the article some of the data is available to the public through records requests, which means that anyone could discover where you live, where you work, the places you frequent, how often you frequent them, what time of day you usually leave home and what time you come back, along with a lot of other information you might not want disclosed to the world.
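    To make the re-identification risk concrete, here is a minimal, purely hypothetical sketch. All names, rider IDs, and coordinates below are invented and nothing here reflects Uber’s actual data format; the point is only that if “anonymized” trip records expose each rider’s habitual pickup and dropoff points, joining them against any public dataset that lists home and work addresses can unmask riders.

```python
# Hypothetical illustration of re-identification from "anonymized" trip data.
# All IDs, names, and coordinates are invented for illustration only.

def round_coord(lat, lon, places=3):
    # Snap coordinates to a ~100 m grid so nearby points compare equal.
    return (round(lat, places), round(lon, places))

# "Anonymized" regulator data: rider ID plus most frequent pickup/dropoff.
anonymized_trips = {
    "rider-8841": {"home": (40.7312, -73.9971), "work": (40.7527, -73.9772)},
    "rider-2205": {"home": (40.6892, -74.0446), "work": (40.7484, -73.9857)},
}

# Auxiliary data an adversary might obtain (voter rolls, social profiles, leaks).
auxiliary = [
    {"name": "Alice Example", "home": (40.73119, -73.99708), "work": (40.75273, -73.97723)},
    {"name": "Bob Example",   "home": (40.68925, -74.04463), "work": (40.74841, -73.98574)},
]

def reidentify(trips, aux):
    # Match each rider's (home, work) pair against the auxiliary dataset.
    matches = {}
    for rider_id, locs in trips.items():
        key = (round_coord(*locs["home"]), round_coord(*locs["work"]))
        for person in aux:
            if (round_coord(*person["home"]), round_coord(*person["work"])) == key:
                matches[rider_id] = person["name"]
    return matches

print(reidentify(anonymized_trips, auxiliary))
```

    Because a (home, work) location pair is nearly unique for most people, a handful of coarse data points suffices; no names ever need to appear in the released records for the linkage to succeed.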

    After all, the places you frequent can reveal a lot about you, such as your political, religious, and sexual preferences, aspects of your life that you would not expect to reveal merely by choosing to take a ride with Uber. This could also endanger your safety. According to a study conducted by the CDC (National Intimate Partner and Sexual Violence Survey: 2010 Summary Report), one in six women (16.2%) and one in nineteen men (5.2%) in the United States have experienced stalking victimization at some point in their lifetime. Hence, revealing your whereabouts to the public could allow stalkers to track you more easily and add unnecessary risks to your personal safety.

    Moreover, if this data is readily available to everyone, or at least to the authorities, it could also be used by the government or the police to track your movements and investigate your life without applying for or being granted a search warrant. Therefore, collecting all this information and providing it to transport regulators upon blanket requests, without any explanation of why it is needed, raises serious concerns about users’ privacy. These practices should be clearly and expressly communicated to users, allowing them to make an informed decision before calling their next Uber ride.

  • “Microsoft Sues Justice Department to Protest Electronic Gag Order”

    By: Yilu Zhang

    April 20th,2016

    http://www.nytimes.com/2016/04/15/technology/microsoft-sues-us-over-orders-barring-it-from-revealing-surveillance.html?_r=0

    Last week, Microsoft went on the offensive, filing suit against the US government over its use of the Electronic Communications Privacy Act to request consumer information under the cloak of gag orders. In a public move that seems to parallel Apple’s recent opposition to the FBI’s request to code backdoor access into its iPhone devices, Microsoft may also be leveraging the court of public opinion by taking a stand for its customers’ privacy rights against ever more furtive government intrusions.

    Microsoft is not claiming that government orders should never proceed secretly; rather, the company cites the thousands of secrecy orders it has received over the last 18 months, raising doubts that the government is, in good faith, employing these orders only when there is a real risk of harm to others or to the evidence sought. Furthermore, the statute does not specify with any particularity the standard for establishing “reason to believe” that disclosure would hinder an investigation, and Microsoft is never privy to those rationales anyway, as it sees only the warrant that comes out the other end. Microsoft also points out that the majority of these secrecy orders contain no specified end date. These gag orders under ECPA are arguably unconstitutional on two fronts. First, forbidding Microsoft from alerting customers that their information has been disclosed to government agents violates the customers’ 4th Amendment rights against unreasonable search and seizure. Second, Microsoft contends its compelled silence violates its First Amendment speech rights.

    Microsoft’s suit also highlights the growing obsolescence of ECPA, which was passed in 1986. In the current technological era, cloud computing has emerged as a significant means of data transmission and storage. ECPA, however, fails to protect cloud data from government access in the same manner it protects physical information (e.g., documents in a drawer) or email. The government is therefore able to take advantage of this growing loophole (as Microsoft would see it) to demand customer data without any corresponding notification to the targeted customers. This discriminatory treatment of cloud computing is indeed questionable as the technology becomes increasingly prevalent and individuals store ever greater volumes of data in the cloud. Keeping an outdated ECPA provision alive in the cloud computing era permits the government to access these large stores of individuals’ data directly through a third party without ever leaving a trace of such access.

    As an aside to the constitutional challenges, Law Professor Michael Froomkin of the University of Miami makes an interesting note that “Most people do think of their email as their personal property, wherever it happens to reside… But there is a disconnect between behavior and expectations and the statute. And Microsoft is inviting a court to bring the law in line with people’s expectations.” 4th Amendment jurisprudence, which has evolved to focus heavily on reasonable expectations of privacy, sets up a debate as to how society’s expectations of privacy are to be measured—whether from a descriptive stance (e.g., by conducting surveys of actual social expectations) or from a normative stance (which may acknowledge the possible circularity that emerges from legal norms shaping social expectations). As a policy matter, to the extent that we care to match expectations with legal reality under either approach, this Microsoft suit shines a light on the existing mismatch between consumer beliefs and the wider latitude that ECPA actually affords the government.

  • New surveillance program in the NJ transit system sparks privacy concerns

    By: Rodrigo Moncho Stefani

    April 20th, 2016

    Panel 1

    Video surveillance seems relatively normal in modern society. Not every system is as pervasive as the one London has in place across the city, but it has become normal to see signs warning “You are being videotaped.” Nowadays one expects to be under video surveillance in pretty much every business or space open to the general public, especially if that place is a public institution.

    In terms of privacy protections and regulations, this reality translates into little to no expectation of privacy when we are in a place where we know (because we have been warned by one of the abovementioned signs) or should know (because we are in a bank, a transit terminal, or a similar place) that we are under video surveillance. That said, in those situations people expect to be videotaped, meaning that a camera is capturing their image and the footage is possibly being stored for a certain amount of time. But those cameras usually capture only images, and sometimes not particularly good or well-defined ones, as the video from the recent scandal surrounding Trump’s campaign manager showed.

    Therefore, it could be argued that in those places one cannot expect privacy in one’s image, actions, and physical interactions, but that the expectation survives for the contents of one’s private conversations. Cameras can see you, how you are dressed, what you are doing, and maybe even who you are talking to, but there is no way of knowing what you are saying. A similar distinction has been drawn between the metadata of an email, or the address on a letter, and their contents, the latter receiving stronger protection than the former. The feeling of intrusion is different when an observer can see a person or an interaction than when that observer can also listen to the conversation.

    That distinction is at stake in the recent announcement that the New Jersey transit authority will begin recording audio in some of the trains it operates in the state, on top of the video surveillance it already conducts (http://www.nytimes.com/aponline/2016/04/12/us/ap-us-nj-transit-surveillance-systems.html?_r=0).

    The recordings are limited to the light rail trains, and the change has not taken place across the entire system, but the announcement still drew reactions from privacy advocates. There is a feeling that from now on, riding those trains will be like being in a place where the walls have ears. The questions generally turn on whether the privacy invasions the new system implies are justified by the law enforcement and crime prevention benefits it can bring.

    It seems clear that the benefits of a measure like this will hardly outweigh the privacy invasion that some riders may feel. Any benefit the audio of an event could bring would seem to be the same as what the video could provide (excluding, of course, sounds in the driver’s cab). Moreover, if the recordings are to be used in a targeted investigation, a specific warrant should be required.

    That being said, these types of systems are very hard to monitor constantly, even when they are video-only; constant monitoring of an audio surveillance system would require nearly an army of officers listening to every conversation, which means the actual harm could be limited.

  • PRG News Roundup: April 20th

    Today’s news roundup:

    • Google continues to run afoul of European antitrust regulators.
    • A newly-declassified FISA Court judgment from November ruled that “backdoor” warrantless email searches are legal under the Constitution.
    • Microsoft has sued the US Department of Justice over ECPA gag orders.
    • The 6th Circuit Court of Appeals ruled that cell-site location information is not protected under the 4th Amendment.
    • The Supreme Court heard oral arguments regarding whether applicants for a driver’s license can be compelled to agree to warrantless breathalyzer testing under the 4th Amendment.
    • Shortened URLs can be used to spy on people.
    • The 7th Circuit makes it easier for individuals to sue for prospective future harm resulting from data breaches.
    • And the New York Times gets a little muddled on the parameters of Google’s responsibilities regarding the “right to be forgotten.”

    As per today’s conversation, Ed Amoroso’s keynote introduction to network security from the Princeton 5G Summit is viewable here.

  • Giant Leak of Offshore Financial Records Exposes Global Array of Crime and Corruption

    PANAMA PAPERS: A RESULT OF SELECTIVE GOVERNMENT SURVEILLANCE

    Topic: Government Surveillance

    By: Aluizio Porcaro Rausch (Panel 2)

    Post: Giant Leak of Offshore Financial Records Exposes Global Array of Crime and Corruption

    Link:  https://panamapapers.icij.org/20160403-panama-papers-global-overview.html

    The International Consortium of Investigative Journalists (ICIJ), a team of more than 370 journalists from 76 different countries, together with other news organizations around the world, recently exposed a large number of politicians, businessmen, celebrities, and criminals as having hidden funds in tax havens. By leaking around 11.5 million records of secret financial deals performed with the assistance of Mossack Fonseca, a Panamanian law firm, and several well-known banks, these journalists revealed to the public a global network of money laundering and tax evasion spanning 1977 to 2015.

    Among the many individuals and entities implicated in this long-standing underworld industry are Russian President Vladimir Putin, the prime ministers of Iceland and Pakistan, Chinese President Xi Jinping, British Prime Minister David Cameron, soccer player Lionel Messi, UBS, and HSBC. Although not directly touching US jurisdiction, the leak also includes 33 people and companies blacklisted by the US government – such as drug lords and terrorists – and a US businessman who signed documents for an offshore creation while serving a prison sentence in New Jersey.

    At a time of Base Erosion and Profit Shifting countermeasures promoted by the Organization for Economic Co-operation and Development (OECD) and by the most developed countries in the world, this leak points to ever more complex tax evasion schemes and to the ineffectiveness of governments’ access to fiscal information. The involvement of several world leaders also raises doubts about the seriousness of formal commitments to more transparent tax systems.

    Regarding the U.S. specifically, it is important to mention that the Foreign Account Tax Compliance Act (FATCA), enacted in 2010, set higher standards for fiscal data disclosure worldwide, as many countries followed the American example. Nevertheless, this effort does not appear sufficient to eradicate abusive tax planning.

    Selectivity in law enforcement and government surveillance is an old issue. In US history, its roots lie in the abusive procedures the British colonizers directed at the colonies, as Justice Stewart summarizes in Stanford v. Texas. Unsurprisingly, it remains a live issue in many other jurisdictions as well. Despite all of governments’ resources, the wealthy and powerful are shielded from official surveillance. If not for non-governmental entities such as the ICIJ, the general public would remain in the dark.

  • Cell Site Location Information and United States v. Jones

    April 14th, 2016

    By: William Simoneaux

    This post by the Electronic Frontier Foundation discusses a recent decision upholding the lawfulness of the FBI’s warrantless request for cell site location information (CSLI), used to help convict two defendants by linking them to the locations of various robberies. In United States v. Carpenter, the Sixth Circuit reasoned that the location information was conveyance information necessary to make the call, as distinct from the content of the call itself.

    The EFF filed an amicus brief in the case arguing for the opposite result. It pointed out that, despite the Sixth Circuit’s point that the information was not as precise as GPS data, it was precise enough “to place one of the defendants at church every Sunday.” Additionally, the EFF argued that the volume of information collected, three to four months’ worth, was problematic, especially when compared to the 28 days of monitoring that took place in United States v. Jones.

    Based on the reasoning of both the Sixth Circuit’s opinion and the EFF’s response, any resolution of the relationship between the government’s use of CSLI and the Fourth Amendment should involve Jones. It is the Supreme Court case that most directly touches the question of the privacy interests involved in the government’s tracking of individuals’ locations over time. The difficulty in looking to Jones for guidance, however, is that the case involved the physical placement of a GPS device on the defendant’s car, the deciding factor in Justice Scalia’s plurality opinion. With CSLI, no physical trespass need ever occur.

    Justice Alito’s concurrence in Jones relied not on the physical trespass argument but on the duration and precision of the monitoring of the individual’s location. What, then, does it mean that CSLI is possibly less precise than the GPS data in Jones, yet still precise enough to glean information that touches on the privacy interests of the Fourth Amendment? Ultimately, clarification from the Supreme Court may be required on just how great the privacy interest in one’s location over an extended period of time really is.

    https://www.eff.org/deeplinks/2016/04/sixth-circuit-disregards-privacy-new-cell-site-location-information-decision

  • Two Consequences of a New Encryption Bill

    By: Sawyer Williams

    April 14th, 2016

    The Senate Intelligence Committee released a draft of its hotly anticipated encryption bill on Wednesday.  The legislation, authored by Chairman Richard Burr (R-NC) and ranking member Dianne Feinstein (D-CA), would force companies that offer encrypted services to their customers to also provide technical assistance to federal investigators who legally request such data.  According to The Hill, “[t]he move is a response to concerns that criminals are increasingly using encrypted technology to hide from authorities.”

    This of course follows on the heels of the very public debate about encryption between Apple and the Department of Justice, where Apple refused to provide the FBI with assistance in attempting to unlock an iPhone used by one of the suspects in the San Bernardino shooting.

    One interesting ramification of increasing encryption on cell phones, i.e. encryption that can prevent a search even when it is conducted pursuant to a warrant, is that it could very well cause the courts to expand the federal government’s ability to search unlocked cell phones under the search incident to a lawful arrest doctrine.  This ability was severely limited in Riley v. California because the Court found that the two-prong Chimel rationale of protecting the officer and preventing the destruction of evidence was absent once an arrestee’s cell phone is seized.  But where the locking of a cell phone has the potential to make evidence disappear forever, suddenly the second prong of the Chimel rationale – the one about preventing the destruction of evidence – is squarely on point.  Thus, if police officers apprehend a suspect and his or her phone happens to be unlocked, they may at least tamper with the phone’s settings to prevent it from locking.  Can they search the entire phone?  Maybe not.  Still, in my mind it’s a dangerous expansion (in privacy terms) of a doctrine that allows warrantless searches.

    The Senate bill is designed to combat this kind of encryption by increasing the costs of its implementation.  For example, were a company to encrypt its services such that the government is completely foreclosed from any of its customer data, the bill still requires the company to provide “technical assistance” to the government after a legal request.  That could translate to valuable engineering time or even access to the company’s entire source code.  Furthermore, whatever goodwill the company hoped to accrue by providing the encryption to its customers is tarnished as a result of this mandatory association with the government.  How highly will customers value encryption where every company is in cahoots with the government?

    I already explained one less obvious benefit of the bill: it avoids or disincentivizes recourse to less savory but still legal means of searching cell phones, such as the search incident to a lawful arrest doctrine.  It actually incentivizes the police to get a warrant (or a subpoena at the very least).  If they get a warrant, they are rewarded with the company’s help.

    However, there is a costly downside to the proposed legislation.  Encryption is not only (or even mostly) about keeping our government out; it is also about keeping malicious hackers at bay and preventing snooping by foreign governments.  And it is about the furtherance of valuable technology with multifaceted benefits; consider Bitcoin and its possibilities, all built on cryptography.  The legislation from the Senate Intelligence Committee will add enormous costs to the research and development of encryption methods, because it effectively places a ceiling above which encrypted services cannot be implemented unless providers are willing to bear the direct and indirect costs of government cooperation.  Encryption is much too valuable to society to impose this kind of blunt prophylactic.

  • Privacy, Public Safety, and the Fourth Amendment

    Topic: Privacy, Public Safety, and the Fourth Amendment

    By: Andrew Tepper

    April 14th, 2016

    Source: http://arstechnica.com/tech-policy/2016/04/first-came-the-breathalyzer-now-meet-the-roadside-police-textalyzer/

    Texting while driving is a public safety concern, and many states have imposed laws cracking down on offenders. Recently, New York lawmakers proposed legislation that would allow police to use a device to determine whether a driver involved in a motor vehicle accident was texting while driving. Officially, the bill, Senate Bill S6325A, “[p]rovides for the field testing for use of mobile telephones and portable electronic devices while driving after an accident or collision.” SB S6325A. The device has been dubbed a “textalyzer,” and Cellebrite, the Israeli firm that many believe may have helped the FBI crack Apple’s encryption, is one company developing the technology.

    The bill calls for a driver involved in an accident to give his or her phone to the authorities for testing. If the driver does not comply, the driver’s license would be suspended immediately. Furthermore, if the law passes, it would change the current law so that drivers give “implied consent” for the authorities to test their phones “or portable electronic device…at or near the time of the accident or collision.” SB S6325A. While this law may be heralded as a public safety triumph, it raises major concerns under the Fourth Amendment’s right to privacy.

    It is clear that this was on the mind of lawmakers, as “the textalyzer allegedly would keep conversations, contacts, numbers, photos, and application data private.”[1] Instead, the “textalyzer,” perhaps by using metadata, will show only whether the phone was in use prior to an accident.
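    Neither the bill nor the article specifies the mechanism, but a metadata-only check of this kind is easy to picture. The sketch below is purely hypothetical: the event format, field names, and 120-second window are invented for illustration and reflect nothing about Cellebrite’s actual technology. The idea is that only timestamps of interaction events are examined, never the contents of any message.

```python
# Hypothetical sketch of a metadata-only "was the phone in use?" check.
# Event format, field names, and the 120-second window are invented;
# this is an illustration, not Cellebrite's or SB S6325A's actual method.

def phone_in_use_before(events, crash_time, window_secs=120):
    """Return True if any interaction event falls within the window
    [crash_time - window_secs, crash_time].

    `events` is a list of (timestamp, kind) pairs, e.g. (1090, "keypress").
    Only *when* something happened is inspected, never *what* was typed.
    """
    return any(
        crash_time - window_secs <= ts <= crash_time
        for ts, kind in events
    )

# Invented example log: timestamps of screen/keyboard events, no content.
events = [(1000, "unlock"), (1090, "keypress"), (1150, "send")]

print(phone_in_use_before(events, crash_time=1200))  # events at 1090 and 1150 fall in the window
print(phone_in_use_before(events, crash_time=2000))  # no events in the final 120 seconds
```

    Even a check this narrow illustrates the privacy trade-off the post describes: the timestamps alone reveal a pattern of device usage, and the open question is whether the device performing the check could reach anything more.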

    Despite these assurances, questions about the law’s efficacy and Fourth Amendment privacy issues remain. First, if the “textalyzer” does test for metadata, a person could potentially find ways to encrypt the device’s metadata, leading to issues involving warrants and hacking. More important, though, are the Fourth Amendment concerns.

    As the law was proposed only this week and the technology is still being developed, many of these questions and uncertainties have no answers yet. In the age of information hacks, it is unclear how anonymous this type of testing would be, which leads to a whole host of other concerns. Cellebrite currently has technology that can check a phone’s activity. Would the “textalyzer” provide access to or capture a mobile device’s content even if it does not intend to do so? What is clear, however, is that this proposed law furthers the complicated debate between increased safety and invasion of privacy.

    Additional sources:

    http://www.theverge.com/2016/4/12/11412314/textalyzer-distracted-driving-new-york-legislation

    https://www.nysenate.gov/legislation/bills/2015/s6325/amendment/a

    [1] David Kravets, First came the Breathalyzer, now meet the roadside police “textalyzer”, Ars Technica, Apr. 11, 2016, http://arstechnica.com/tech-policy/2016/04/first-came-the-breathalyzer-now-meet-the-roadside-police-textalyzer/.

  • Cell Phone Anti-Encryption Measures: the Key, But to What?

    By:  Jackson Yates

    April 14th, 2016

    Earlier this year, California Democratic state legislator Jim Cooper introduced Assembly Bill 1681, which would require every cell phone produced and sold in California to have an “unlocking” capability to better assist law enforcement in investigating crimes, particularly human trafficking. Unsurprisingly, even with a court-order requirement, the introduction came to the horror of technology and privacy advocates. Aside from the potential technical shortcomings—it is unclear, for example, what would stop people from engaging third parties to circumvent the anti-encryption measures—the proposed legislation fairly clearly runs afoul of established principles of American privacy law.

    As courts have repeatedly acknowledged, perhaps most notably in Riley v. California, a cell phone is not like an ordinary “container” that law enforcement might find on a suspect’s person. Rather than a vessel with physically limited storage capability, a modern cell phone is essentially a portal into its user’s private life. Furthermore, it is nearly impossible to imagine a scenario in which officers unlock a subject’s phone to find, say, a folder labeled “Human Trafficking,” allowing them to properly constrain their search.

    Resistance from tech companies truly comes as no surprise. In the recent dispute between Apple and the FBI following the San Bernardino shootings, Apple staunchly refused to unlock the shooter’s phone in a paradigmatic display of privacy prioritization. The tech giant cited consumer privacy as an all-important concern and noted that, if word got out that Apple had the ability to “unlock” phones in this way, numerous privacy-evading entities would surely seek to engage the service to the detriment of individuals’ expectation of privacy, just as criminals could abuse the technology behind Cooper’s bill. The FBI ultimately found another, decidedly smaller company to unlock the San Bernardino shooter’s phone, but Cooper’s AB 1681 elicits the exact same concerns.

    In support of AB 1681, Cooper urges, “It’s not NSA. It’s not Edward Snowden.” That is true; on its face, the proposed bill does not seem to directly subject average citizens to government surveillance. However, due to the vast potential ramifications of AB 1681’s carte blanche approach, Cooper’s statement that, “It’s not the boogeyman,” is not terribly convincing to privacy advocates.

    Underlying article: http://www.latimes.com/politics/la-pol-sac-smartphone-encryption-bill-20160122-story.html

    Link to bill on Jim Cooper’s website: http://asmdc.org/members/a09/page-2

    AB 1681: http://www.leginfo.ca.gov/pub/15-16/bill/asm/ab_1651-1700/ab_1681_bill_20160120_introduced.html