How a new type of AI is helping police skirt facial recognition bans

A new artificial intelligence technology known as Track is stirring significant debate, particularly regarding its implications for privacy and civil liberties. Developed by Veritone, a tech firm specializing in AI solutions, Track aims to assist law enforcement agencies in identifying potential criminals and understanding patterns of behavior without relying on facial recognition systems. This is crucial in areas where such biometrics are restricted by law. The tool allows for tracking individuals even when their faces are obscured or otherwise not visible, raising a host of ethical questions.

Veritone CEO Ryan Steelberg emphasizes the tool’s intent: “If we’re not allowed to track people’s faces, how do we assist in trying to potentially identify criminals or malicious behavior or activity?” This ambition, however, has drawn criticism from organizations such as the American Civil Liberties Union (ACLU). The ACLU expressed its concerns after learning about Track through the MIT Technology Review, noting that the tool is the first nonbiometric tracking system used at scale in the United States and warning that it poses many of the same privacy concerns as facial recognition technology while also creating new ones.

In practice, Track can analyze video footage from a broad range of environments, from public gatherings such as the January 6 riots to everyday scenarios like subway stations. Users of the system can identify individuals based on numerous attributes, including body size, gender, hair color and style, clothing, and accessories. These features allow law enforcement to assemble comprehensive timelines, effectively allowing them to track a subject’s movements across various video feeds.

The technology operates on Amazon and Microsoft cloud platforms, which makes it readily accessible and easy to integrate into existing systems. In an interview, Steelberg noted that Track’s capabilities would continue to evolve, hinting at future developments that may include the ability to analyze live video feeds rather than only recorded footage. While the tool currently draws on numerous characteristics for identification, a company spokesperson confirmed that users cannot search for individuals by skin color, although skin color is one of the parameters the algorithm considers.

Track can utilize footage from various sources to conduct its analysis, including police body cameras, drone videos, and even citizen-uploaded content from devices like Ring cameras and smartphones. Steelberg describes the technology as akin to a “Jason Bourne app,” suggesting it could potentially play pivotal roles in high-stakes investigations.

While Track has only a small footprint in the public sector, which accounts for around 6% of Veritone’s overall business (the bulk of which comes from media and entertainment clients), the company reports that this segment is growing rapidly. With clients in states such as California, New Jersey, and Illinois, the demand for such technology points to an emerging trend in law enforcement toward merging advanced AI capabilities with public safety initiatives.

Nevertheless, this rapid expansion has alarmed privacy advocates. ACLU senior policy analyst Jay Stanley has previously warned that AI tools could eliminate the tedious manual review that once limited how much surveillance footage agencies could analyze, and do so without any accompanying oversight or due process. He remarked that Track appears to be the first tool to make widespread tracking of specific individuals technologically feasible. This raises essential questions about where to draw the line between crime prevention and individual privacy rights.

As AI technologies like Track continue to grow, the balance between ensuring public safety and protecting civil liberties remains a contentious issue. Questions linger about how deeply intertwined these systems will become in everyday law enforcement practices and what regulations, if any, will be placed around their use. The tool’s development exemplifies the challenges tech firms and lawmakers face as they navigate an era where surveillance technology has become increasingly sophisticated yet can easily infringe on personal freedoms.

The looming fear surrounding tools like Track is not just about the action of monitoring but also about the potential for abuse. As law enforcement agencies gain access to more advanced AI tools, the concern extends to how data is collected, who gets to decide when and how it is used, and the consequences of misidentification or wrongful arrest based on algorithmic outputs.

Steelberg acknowledges that the technology could face scrutiny in court cases, suggesting an awareness of the delicate ethical landscape the company is navigating. “I hope we’re exonerating people as much as we’re helping police find the bad guys,” he added, reflecting the double-edged nature of deploying such technologies in law enforcement. The potential for misuse, mistakes, or invasive surveillance raises valid apprehensions among civil rights organizations, many of which advocate for greater transparency and accountability.

As discussions about the viability and ethical implications of AI-driven policing continue, communities must grapple with the question: Is the trade-off worth it? The pursuit of enhanced security must not come at the expense of individual freedoms and privacy rights. Voices from the ACLU and similar organizations will likely continue to challenge the deployment of AI in public spaces, pressing for regulations that prioritize constitutional protections.

In conclusion, Track represents a significant evolution in AI technology designed for law enforcement. While it has the potential to reshape how agencies approach criminal investigation and public safety, its deployment prompts essential questions about privacy, oversight, and civil liberties. As this technology takes root in various jurisdictions, it becomes increasingly critical for society as a whole to engage in discussions about its role, aiming for a balance where safety measures do not encroach on personal freedoms. It’s a balance that requires vigilant oversight and thoughtful discourse to navigate an increasingly surveilled society responsibly.
