Wednesday, May 23, 2018

“Go ahead and lie. Who will they believe, us or a bunch of techies?”
FBI inflated encrypted device figures, misleading public
Contrary to what the FBI told the public, we now know that instead of 7,775 encrypted smartphones proving to be stumbling blocks to FBI criminal investigations, there are no more than 2,000.
… Wray called this a "major public safety issue", and used it to push a "responsible encryption" mantra – in other words, encryption backdoors.
The FBI denied ZDNet's request for information on these phones. The bureau said the information was exempt from disclosure, as the records "could reasonably be expected to interfere with enforcement proceedings."
Internally, though, the FBI knew it had miscounted the devices as of a month ago. The bureau still doesn't have an accurate count of how many encrypted phones it has from last year.




I guess we don’t want to “fall behind” China.
Amazon is selling police departments a real-time facial recognition system
The Verge: “Documents obtained by the ACLU of Northern California have shed new light on Rekognition, Amazon’s little-known facial recognition project. Rekognition is currently used by police in Orlando and Oregon’s Washington County, with nondisclosure agreements often used to keep the deployments out of public view. The result is a powerful real-time facial recognition system that can tap into police body cameras and municipal surveillance systems. According to further reporting by The Washington Post, the Washington County Sheriff pays between $6 and $12 a month for access to Rekognition, which allows the department to scan mug shot photos against real-time footage. The most significant concerns are raised by the Orlando project, which is capable of running real-time facial recognition on a network of cameras throughout the city. The project was described by Rekognition project director Ranju Das at a recent AWS conference in Seoul…”
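To make the “scan mug shot photos against real-time footage” workflow concrete, here is a minimal sketch of how such a match could be wired up against Amazon’s Rekognition API with boto3. The collection name, file names, region, and 80% similarity threshold are illustrative assumptions of mine, not details from the ACLU documents or the deployments described above; a live deployment would pull frames from a video stream rather than a saved still.

# Minimal sketch of a mug-shot-vs-footage match using Amazon Rekognition (boto3).
# Collection name, file paths, region, and threshold are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# One-time setup: build a face collection indexed from mug shot photos.
rekognition.create_collection(CollectionId="mugshots-demo")
with open("mugshot_12345.jpg", "rb") as f:
    rekognition.index_faces(
        CollectionId="mugshots-demo",
        Image={"Bytes": f.read()},
        ExternalImageId="booking-12345",
    )

# Per-frame query: search a still pulled from camera footage against the collection.
with open("camera_frame.jpg", "rb") as f:
    response = rekognition.search_faces_by_image(
        CollectionId="mugshots-demo",
        Image={"Bytes": f.read()},
        FaceMatchThreshold=80,
        MaxFaces=5,
    )

for match in response["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], match["Similarity"])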




There are probably many, many “special circumstances.” No doubt some future AI will deal with them.
Google Under Fire For Revealing Rape Victims' Names
The company's been accused of displaying the names of rape victims through its Autocomplete and Related Search functions – even when the victims have been granted anonymity by the courts.
The problem is that both features use data gathered from previous searches to predict what information the user is looking for and make suggestions. If enough people know a victim's name and use it as one of their search terms, Google's algorithm will provide a helpful prompt to those who don't.
In the US, there's no legal prohibition on publishing the names of rape victims, although the media tend to avoid doing so. In many countries, however, it's against the law. And the UK's Times newspaper has uncovered several cases in which Autocomplete and Related Search have revealed the names of rape victims and others who have official anonymity.
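The mechanism described above – suggestions drawn from the frequency of prior searches – is easy to illustrate with a toy sketch. This is not Google’s code; it just shows why any term enough people search for will surface as a completion, regardless of court-ordered anonymity.

# Toy illustration of frequency-based query suggestion: completions are simply
# the most common past queries sharing the typed prefix. All queries are invented.
from collections import Counter

query_log = Counter()

def record_query(query: str) -> None:
    """Log a completed search, as an autocomplete backend would."""
    query_log[query.lower()] += 1

def suggest(prefix: str, k: int = 3) -> list[str]:
    """Return the k most frequent past queries starting with the prefix."""
    prefix = prefix.lower()
    matches = {q: n for q, n in query_log.items() if q.startswith(prefix)}
    return [q for q, _ in Counter(matches).most_common(k)]

# If enough users search a name alongside a case, the name becomes a top
# completion for anyone typing the case-related prefix.
for _ in range(50):
    record_query("example trial verdict")
for _ in range(200):
    record_query("example trial victim name")

print(suggest("example trial"))  # ['example trial victim name', 'example trial verdict']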


(Related) Somehow, “send us your private porn so we can block your private porn” does not seem to be entirely satisfactory. Imagine the lawsuits if this database leaks!
Facebook Safety
People shouldn’t be able to share intimate images to hurt others
By Antigone Davis, Global Head of Safety
It’s demeaning and devastating when someone’s intimate images are shared without their permission, and we want to do everything we can to help victims of this abuse. We’re now partnering with safety organizations on a way for people to securely submit photos they fear will be shared without their consent, so we can block them from being uploaded to Facebook, Instagram and Messenger. This pilot program, starting in Australia, Canada, the UK and US, expands on existing tools for people to report this content to us if it’s already been shared.
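Facebook has said the pilot works by keeping a “digital fingerprint” (hash) of each submitted image rather than the image itself, and blocking later uploads that match. Here is a minimal sketch of that idea, assuming a simple exact-hash blocklist; the real matching is reportedly more robust to re-encoding and cropping than this illustration, and the function names here are mine.

# Minimal sketch of hash-based upload blocking (exact-match only, for illustration).
import hashlib

blocklist: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Hash the raw image bytes; only this hash needs to be retained."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_reported_image(image_bytes: bytes) -> None:
    """Add a victim-submitted image's hash to the blocklist, then discard the image."""
    blocklist.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """Reject any upload whose hash matches a reported image."""
    return fingerprint(image_bytes) not in blocklist

Even in this reduced form, the leak concern stands: a leaked hash list can't reconstruct the photos, but it would confirm to anyone who already holds a copy of an image that its owner reported it.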

