How your digital trails end up in the hands of the police

Michael Williams' every move was being tracked without his knowledge, even before the fire. In August, Williams, an associate of the R&B star and alleged rapist R. Kelly, allegedly used explosives to destroy the car of a potential witness. When police arrested Williams, the evidence cited in a Justice Department statement came largely from his smartphone and online activity: text messages to the victim, cell phone records, and his search history.

Investigators sent Google a "keyword warrant," asking the company to provide information on any user who had searched for the victim's address around the time of the arson. Police narrowed the results, identified Williams, and filed another search warrant for two Google accounts linked to him. There they found other searches: the "detonation properties" of diesel fuel, a list of countries that do not have extradition agreements with the United States, and YouTube videos of R. Kelly's alleged victims speaking to the press. Williams has pleaded not guilty.

Data collected for one purpose can always be used for another. Search history, for example, is collected to refine recommendation algorithms or build online profiles, not to catch criminals. Smart devices such as speakers, televisions, and wearables preserve such precise details of our lives that they have served as both incriminating and exonerating evidence in murder cases. Speakers don't have to overhear crimes or confessions to be useful to investigators: they keep time-stamped logs of every request, along with details about their location and identity. Investigators can access those logs and use them to verify a suspect's whereabouts or even catch them in a lie.

And it isn't just speakers and wearables. In a year when some Big Tech companies pledged support for activists demanding police reform, those same companies continued to sell devices and furnish apps that give the government access to far more intimate data, from far more people, than traditional warrants and police methods would allow.

A November report in Vice found that users of the popular Muslim Pro app may have had their location data sold to government agencies. Any number of apps ask for location data, to report the weather, say, or to track your exercise habits. The Vice report found that X-Mode, a data broker, collected location data from Muslim Pro users through the app's prayer-reminder feature, then sold it to others, including federal agencies. Apple and Google have since banned developers from transferring data to X-Mode, but it has already collected data from millions of users.

The problem is not any individual app, but an overly complicated and loosely regulated system of data collection. In December, Apple began requiring developers to disclose key details of their privacy practices in a "nutrition label" for apps. Users "consent" to most forms of data collection when they click "Agree" after downloading an app, but privacy policies are notoriously incomprehensible, and people often have no idea what they are agreeing to.

An easy-to-read summary like Apple's nutrition label is useful, but not even developers know where the data their apps collect will end up. (Many developers contacted by Vice acknowledged that they didn't even know X-Mode had access to their users' data.)

The pipeline between commercial and state surveillance is widening as we adopt more always-on devices and dismiss serious privacy concerns with a click of "I Agree." This summer's national debate over policing and racial equity brought that quiet cooperation into stark relief. Despite lagging diversity, indifference to white nationalism, and mistreatment of non-white employees, several tech companies rushed to offer public support for Black Lives Matter and to reconsider their ties to law enforcement.

Amazon, which committed millions of dollars to racial equity groups this summer, vowed to pause (but not end) sales of facial recognition technology to police after defending the practice for years. Yet the company has also noted an increase in police requests for user data, including the internal logs kept by its smart speakers.
