September 12, 2021 by archyde
Andrew Hatley had it all planned. Or almost. To avoid being recognized on camera when he stormed the Capitol with a Trumpist crowd on January 6, he wore a cowboy hat and a gas mask of sorts that covered part of his face. No luck: the FBI found him anyway. This South Carolina truck driver took a selfie in front of a statue and sent it to his pals. One of them forwarded the photo to the police. Hatley started out claiming it was a doppelganger, but the FBI verified the location of his cell phone through the Life360 app, which he was logged into. Surveillance videos confirmed his presence.
William Vogel was caught because of his Snapchat videos and his car’s license plate, photographed eight times by highway cameras between New York State and Washington. Another man was identified via facial recognition, thanks to a database that included his driver’s license photo.
Growing use of artificial intelligence
After September 11, the United States frantically strengthened its surveillance network. It created the Department of Homeland Security and passed the Patriot Act, which expanded the powers of federal agencies over wiretapping, data collection and more. Twenty years later, while the country has suffered no new attack from outside, the security apparatus has grown ever stronger, thanks to all kinds of cheap, interconnected new technologies, like those the FBI used to track down the Capitol Hill rioters. And Americans, so attached to their freedoms, seem resigned to this erosion of privacy.
“The big news in recent years is the use of artificial intelligence,” notes Charles Rollet, an analyst at IPVM, a research firm specializing in video surveillance. Cameras have gotten smarter: they no longer just capture images but can spot a protester carrying a weapon or a fight breaking out, even from a drone. Instead of watching hours of video to find the suspect in a bombing, investigators can simply ask the software to select the blue Fords or the people in red shirts.
Although not as widely used as in China, facial recognition – which analyzes faces captured by a camera and compares them to a set of photos of wanted people – is gaining ground. Today, police use platforms like Clearview AI, a controversial start-up: upload a snapshot of a Trumpist insurgent into the app and all of his public photos on the Internet appear. Clearview boasts a database of more than 3 billion photos scraped from Facebook, YouTube or payment sites. And it’s not just the face. In the future, a person could be recognized by the way they walk, the pattern of veins on their wrist, their heart rate and even the shape of their posterior!
“Blurred” border between private and state surveillance
According to a recent report, around 20 federal agencies have adopted this tool. “Who would have imagined that the Fish and Wildlife Service or the Post Office needed facial recognition?” exclaims Albert Fox Cahn, director of the Surveillance Technology Oversight Project, which advocates for the right to privacy. “It’s worrying; it shows a normalization of mass surveillance. China is often held up as an example of the worst security abuses, but the United States has long been at the same level.” In 2019, there was one camera for every 4.6 people in the United States, compared to one for every 4.1 in China, according to IHS Markit.
Apps and phone operators also accumulate an impressive amount of information about users’ locations and activities. The police regularly tap Google’s massive database, which records an individual’s whereabouts when they log into Google Maps or other services. During the Black Lives Matter rallies, Dataminr, a start-up specializing in social media analysis, helped law enforcement track the movements of protesters, according to the investigative site The Intercept. For Charles Rollet, “this proves the increasingly blurred separation between private and state surveillance. The irony is that the FBI probably wouldn’t have identified so many people on Capitol Hill if they hadn’t posted selfies.”
Enough to mobilize data-privacy advocates, who denounce these Big Brother methods, practiced with little oversight – and with not always reliable results. Facial recognition systems, for example, are less accurate on darker skin. Half a dozen states and nearly 25 cities, including San Francisco, have restricted or banned the use of the technology by police and public services. Amazon, IBM and Microsoft, for their part, have announced a moratorium on sales of this type of equipment to law enforcement agencies until lawmakers vote on a regulatory framework. But Congress is dragging its feet. In the meantime, the trials of some 500 people indicted over the January 6 uprising have had to be delayed: their lawyers are drowning in the thousands of hours of video and millions of photos of the attack that they must scrutinize.