A recent BuzzFeed News story demonstrated how easily people can be tracked around New York City by cross-referencing public cameras accessed via EarthCam with Instagram stories geotagged to Times Square.


The European Court of Justice ruled in Google v. CNIL that, when a supervisory authority in one EU member state orders a delisting (or “de-referencing,” as the Court put it), Google must carry it out on the versions of its search engine corresponding to all EU member states, but EU law does not require delisting worldwide. Notably, the Court left open the possibility that a member state authority could still order broader extraterritorial delisting under its own law.

Rep. Mark Takano (D-CA) introduced a bill that would help defendants in federal criminal trials access the forensic algorithms, such as probabilistic genotyping software, used in evidence against them.


ImageNet, an image database organized by nouns (modeled on the WordNet hierarchy), announced that it will remove roughly half of the images in its “person” category in order to combat some of the biases that have become embedded in the dataset. In response, the creators of ImageNet Roulette, the art project that drew attention to those biases, announced that its website would be taken offline on September 27, 2019.

The Department of Housing and Urban Development has announced its intention to rework its disparate impact rules, a change that would likely make it much more difficult for complainants to prove that they have been victims of housing discrimination. The proposed rule also contains a section on the use of algorithms in housing decisions, which reads: “Paragraph (c)(2) provides that, where a plaintiff identifies an offending policy or practice that relies on an algorithmic model, a defending party may defeat the claim by: (i) Identifying the inputs used in the model and showing that these inputs are not substitutes for a protected characteristic and that the model is predictive of risk or other valid objective; (ii) showing that a recognized third party, not the defendant, is responsible for creating or maintaining the model; or (iii) showing that a neutral third party has analyzed the model in question and determined it was empirically derived, its inputs are not substitutes for a protected characteristic, the model is predictive of risk or other valid objective, and is a demonstrably and statistically sound algorithm.” To submit a comment on the proposed rule, go here.

The use of facial recognition technology in public housing has drawn backlash from tenants and lawmakers.

The FTC is still requesting comments on the effectiveness of the 2013 amendments to the Children’s Online Privacy Protection Act (COPPA) Rule.

Compiled by Stav Zeitouni and Tom McBrien.