Blog

  • PRG News Roundup Nov. 13, 2019


    According to The Intercept, the NYPD until very recently maintained a database of fingerprints collected from people charged as juveniles, in violation of state law.


    Microsoft has announced that it will extend California’s privacy protections under the new California Consumer Privacy Act to the rest of the United States.


    Following a Wall Street Journal article that revealed “Project Nightingale,” Alphabet (Google’s parent company) announced that it has partnered with Ascension, the second-largest healthcare system in the United States. The partnership gives the company access to the health information of millions of patients across 21 states. In response, the Office for Civil Rights in the Department of Health and Human Services initiated an investigation to ensure that the partnership complies with HIPAA. There is also a question as to the implications the partnership would have for Google’s plans to acquire Fitbit.


    A Federal District Court in Massachusetts ruled earlier this week that searches of electronic devices at the border without reasonable suspicion violate the Fourth Amendment.


    A recent New York Times opinion piece proposed a “public option” in AI to facilitate competitive entry into the market.


    The New York Times reported on the Trump administration’s push for new rules that would limit the scientific research the EPA can use to determine public health regulations. In particular, the article details “a new draft of the Environmental Protection Agency proposal, titled Strengthening Transparency in Regulatory Science, [which] would require that scientists disclose all of their raw data, including confidential medical records, before the agency could consider an academic study’s conclusions.” This measure would significantly impact regulations based on research results gathered from, for example, health records subject to confidentiality agreements. The draft, if adopted, could also apply retroactively to regulations currently in place.

    Manual Override, an exhibit currently showing at the Shed at Hudson Yards, showcases works by artists engaging with and critiquing emerging technology.

    (Compiled by student fellow Stav Zeitouni)

  • PRG News Roundup Nov. 6, 2019

    Ellen L. Weintraub, Chair of the Federal Election Commission, published an opinion piece in The Washington Post in response to Twitter’s ban on political ads. Instead of a total ban, she argues for a strategy focused on preventing microtargeting.


    PRG member Albert Cahn published two new op-eds. In an NBC News piece, he examines the privacy impact of electronic monitoring, a class of body-worn devices increasingly used to replace prison confinement as well as to accompany bail and probation releases. His recent Daily News article discusses a recent move by the NYPD to give the commissioner discretion over releasing body camera footage to the public.


    IBM has put out a proposal for federal regulation of facial recognition technology, echoing similar calls from Microsoft, Amazon, and the U.S. Chamber of Commerce. The document distinguishes among Detection, Authentication, and Matching as types of facial recognition tasks. The proposal advocates against blanket bans on the technology, citing possible benefits.


    Democratic Reps. Anna Eshoo and Zoe Lofgren introduced a new Online Privacy Act that would establish an agency to enforce user privacy laws.


    The New York Times addresses the legal implications of a sweeping warrant granted to a police officer to search the full GEDmatch database of over a million users.


    An agency hired by Google reportedly sent its contractors to target homeless people in Atlanta as part of an effort to collect more racially diverse facial scans.

    (Compiled by Student Fellow Margarita Boyarskaya)

  • PRG News Roundup Oct. 30, 2019

    On Oct. 24, two U.S. senators reintroduced legislation to stop U.S. Customs and Border Protection’s growing practice of searching or seizing travelers’ electronic devices at U.S. ports of entry, including land borders and airports. (Law360)

    Germany’s government commission published a set of recommendations for how data and algorithmic development should happen in the age of artificial intelligence. (U.S. News)

    After suing NSO Group over the exploitation of 1,400 WhatsApp users, Facebook deleted the accounts of NSO Group workers. (Wall Street Journal) (Ars Technica)

    As a government agency seeks approval of a facial recognition system, it says one use for it could be verifying the age of people who want to view pornography online. (NY Times)

    Seventy percent of campaign websites reviewed in an audit failed to meet security and privacy best practices, according to the 2020 U.S. Presidential Campaign Audit by the Online Trust Alliance (OTA). (Security Magazine)

    (Compiled by Student Fellow Grace Huang)

  • PRG News Roundup Oct. 23, 2019

    Google published a paper in Nature this morning announcing that it has demonstrated quantum supremacy. Some are hailing it as an enormous technical breakthrough.

    NordVPN’s server was compromised and an encryption key was stolen. The breach occurred in March 2018, but the company did not disclose it to customers until the news started leaking.

    A bipartisan group of senators introduced the ACCESS Act, which aims to promote competition among social media firms by mandating data portability. The bill seems fairly weak, but at least it’s something.

    The FTC banned a Florida company from promoting and distributing its children- and employee-monitoring apps, which the FTC referred to as “stalking” apps.

    Georgia Supreme Court: Police can’t download data from a car’s black box without a warrant.

    (compiled by Tom McBrien)

  • PRG News Roundup Oct. 16, 2019

    As noted last week, the US/UK CLOUD Act Agreement, which enables law enforcement agencies in both countries to access digital data and evidence stored within each other’s borders, was released earlier this month. For an in-depth explanation of its provisions, see Theodore Christakis’s paper.

    Professor Philip Alston, the UN Special Rapporteur on Extreme Poverty and Human Rights, published a new report concerning the digital welfare state. Among other things, the report considers the consequences of state agencies using algorithms for resource allocation.

    In response to American, Australian and British officials’ open letter to Facebook vis-a-vis end-to-end messaging encryption, more than 100 civil society organizations have signed onto a separate letter in support of Facebook’s planned encryption. Similarly, Edward Snowden made his support of end-to-end encryption known in a Guardian article.

    An official Chinese governmental app called “Study the Nation” allows for spying via a backdoor. According to the BBC, “use of the app is mandatory among party officials and civil servants” and is tied to the acquisition of press cards for journalists.

    Berkeley’s city council unanimously voted to ban government use of facial recognition technology. 

    The Washington, DC police department is using GPS data culled from the ankle monitors of people on parole or probation to conduct investigations and generate leads. Additionally, the ankle monitors appear to have two-way audio capabilities, which the police department says are not currently in use, although there have been allegations to the contrary.

    (compiled by Stav Zeitouni)

  • PRG News Roundup: Oct. 9, 2019

    The U.S. Commerce Department placed 28 Chinese business entities on a trade blacklist in connection with the mass detention and abuse of the Muslim Uighur population in Xinjiang. Among the companies hit by the new sanctions are eight of China’s biggest private AI companies, specializing in facial recognition, algorithmic surveillance, and autonomous vehicles. Expanded sanctions also include visa bans on Chinese officials linked to the mass detention of Uighurs and ethnic Kazakhs.

    The European Court of Justice ruled that individual countries can order Facebook to remove defamatory posts, photographs, and videos from display not only in the country of litigation but globally. This ruling, which cannot be appealed, extends the reach of the region’s internet-related laws beyond its own borders, and comes a week after the same court ruled in favor of Google in the landmark “right to be forgotten” case.
    The U.S. and U.K. governments have reached a data-sharing agreement, the first of the executive agreements envisioned by the CLOUD Act, that enables law enforcement agencies in both countries to access digital data and evidence stored within each other’s borders. Shortly after, the United States announced negotiations with Australia on a similar CLOUD Act agreement.

    California Gov. Gavin Newsom (D) signed bill AB 1215 blocking law enforcement from using facial recognition technology in body cameras, effective through Jan. 1, 2023. California is now the largest state to take steps to limit police use of the technology, following New Hampshire and Oregon.

    Over 30 civil rights groups signed a letter urging Amazon to cease partnerships that give local police departments access to Amazon Ring smart doorbell data.
    U.S. Attorney General William Barr, along with counterparts in the U.K. and Australia, requested that Facebook delay plans to deploy end-to-end messaging encryption.
    EU Internal Market commissioner-designate Sylvie Goulard announced upcoming updates to the Digital Services Act, which must include proposals concerning AI. The updates are to be released within 100 days of the new commission’s tenure. Ms. Goulard also stressed the need to update the eCommerce laws.
    A recent disclosure of a July 2019 ruling by the Foreign Intelligence Surveillance Court shows that the FBI was found to have violated the Constitution’s Fourth Amendment protections against unreasonable searches by using a warrantless internet-surveillance program, intended to target foreign suspects, in 2017 and 2018.
    The Justice Department is planning to require the collection of DNA from immigrants crossing the U.S.-Mexico border for use in a national criminal database. The practice would apply to immigrants who enter the country at legal ports of entry to ask for asylum.

    (Compiled by Margarita Boyarskaya)

  • PRG News Roundup Oct. 2, 2019

    Apply to We Robot 2020 by October 7!

    Andrew Yang suggested treating data as a property right, including rights such as the right to be forgotten (Forbes).

    Facebook has affirmed that it will not remove political speech even if it breaks the platform’s community rules (The Verge).

    Representative Takano introduced the Justice in Forensic Algorithms Act, which would help ensure the ability of defendants to access algorithmic evidence in criminal trials (TechDirt).

    Princess Awesome, a girls’ clothing brand, is struggling to get its products advertised on Amazon because of the limited keywords Amazon provides (there are keywords for petticoats, for example, but not for leggings). Limitations like this can particularly affect small businesses.

    The European Court of Justice issued a recent judgment on cookie practices: sites must not merely inform users but must give them a genuine choice. No pre-selected checkmarks for “I’m fine with all cookies.” (TechCrunch)

    Compiled by Cassi Carley and Tom McBrien.

  • PRG News Roundup: Sept. 25, 2019

    A recent BuzzFeed News story demonstrated the relative ease with which people can be tracked around New York City by cross-referencing public cameras, accessed via EarthCam, with Instagram stories geotagged to Times Square.


    The European Court of Justice ruled in Google v. CNIL that Google is required to carry out a delisting (or “dereferencing”, as the Court put it) of search results on all versions of the search engine which correspond to EU member states when ordered to delist by one member state. It is important to note that the Court seemed to invite member states to consider even broader extraterritorial regulation in this vein.

    Rep. Mark Takano (CA) introduced a bill that would help defendants in federal criminal trials access forensic algorithms used in probabilistic genotyping software.


    ImageNet, an image database organized by nouns (modeled on the WordNet hierarchy), announced that it would remove roughly half of the images in its “person” category in order to combat some of the biases that have become embedded in the dataset. In response, ImageNet Roulette announced that its website would be taken offline on September 27, 2019.

    The Department of Housing and Urban Development has announced its intention to re-work its disparate impact policies, which would likely make it much more difficult for potential complainants to prove that they’ve been victims of discrimination in housing. The proposed rule also contains a section on the use of algorithms in housing decisions, which reads, “Paragraph (c)(2) provides that, where a plaintiff identifies an offending policy or practice that relies on an algorithmic model, a defending party may defeat the claim by: (i) Identifying the inputs used in the model and showing that these inputs are not substitutes for a protected characteristic and that the model is predictive of risk or other valid objective; (ii) showing that a recognized third party, not the defendant, is responsible for creating or maintaining the model; or (iii) showing that a neutral third party has analyzed the model in question and determined it was empirically derived, its inputs are not substitutes for a protected characteristic, the model is predictive of risk or other valid objective, and is a demonstrably and statistically sound algorithm.” To submit a comment on the proposed rule, go here.

    The use of facial recognition technology in public housing has received some backlash.

    The FTC is still requesting comments on the effectiveness of the amendments to the Children’s Online Privacy Protection Act (COPPA) Rule.

    Compiled by Stav Zeitouni and Tom McBrien.

  • PRG News Roundup: Sept. 18, 2019

    Cell-site location information (CSLI), which is often used as evidence in court to connect suspects to crime, has come under more scrutiny after it was revealed that incorrect data may have played a part in more than 10,000 criminal cases in Denmark. Thirty-two individuals have already been released from prison upon showing that their convictions rested on inaccurate CSLI data.

    BBC recently introduced a child wellbeing app named “Own It,” which aims to help children develop healthy online habits by monitoring their screen time, discouraging them from sharing sensitive information, and monitoring the tone of their messages. While the app has some privacy advantages over its peers, such as its lack of reporting features to parents, it joins a growing list of “sentiment analysis” technologies that analyze and categorize users’ moods via their online activity.

    A bill introduced in the New York City Council would bar the City from adding financial services chips to municipal identification cards. While the chip might have increased access to financial services for low-income New Yorkers, many feared the potential for exposing undocumented New Yorkers to immigration agencies.

    A neighborhood in Loomis, California decided to install license plate surveillance technology to help the area fight crime. The technology takes a picture of the license plate of every car that enters and leaves the neighborhood, after which the pictures are stored for thirty days.

    Written by Tom McBrien

  • PRG News Roundup: Oct. 24

    Ann Cavoukian, the former privacy commissioner of Ontario and key developer of the Privacy by Design movement, resigned from Sidewalk Labs. Cavoukian said that her resignation was intended as a “strong statement” about the treatment of data privacy in Sidewalk Labs’ plans for a smart city neighborhood in Toronto. Sidewalk Labs had recently shared that, while it was committed to de-identifying all the data it collects, it would not require, and therefore could not guarantee, that other groups participating in the project would do the same. 
    Subscribers have reported that Netflix shows different images to represent titles depending on a user’s race or ethnicity. Netflix has denied such racial targeting and has insisted that it uses only a user’s past viewing history to inform promotional images.
    According to a recent study in Science, it is becoming increasingly easy to identify someone in the United States based on existing ancestry data voluntarily given to companies such as 23andMe. Police have been able to use long-range familial searches to identify and arrest suspects in cases. This investigatory technique is particularly successful in identifying individuals of certain ethnic backgrounds: the study found that approximately 60% of searches for individuals of European descent will result in a match with a third cousin or closer.
    This news roundup was compiled by Alexia Ramirez