AI is Enhancing Social Network Analysis

Surveillance technology powered by artificial intelligence has been advancing rapidly, and it is now being used not only to identify individuals but also to determine who their friends and associates are. This technology, called “co-appearance” or “correlation” analysis, can identify individuals who have appeared in surveillance footage near a specific person within a certain time frame, and it can mark potential interactions between them on a searchable calendar.
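To make the idea concrete, the core of a co-appearance search can be sketched in a few lines: given records of who was seen at which camera and when, pair up people observed at the same camera within a time window. This is a minimal illustration, not a description of any vendor's actual system; the record format and the 30-minute window are assumptions.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical sighting records: (person_id, camera_id, timestamp in seconds).
sightings = [
    ("target", "cam_1", 1000),
    ("alice",  "cam_1", 1200),   # 200 s after the target at the same camera
    ("bob",    "cam_1", 5000),   # too far outside the window
    ("alice",  "cam_2", 9000),
    ("target", "cam_2", 9300),
]

WINDOW = 1800  # 30 minutes, an illustrative threshold

def co_appearances(sightings, window=WINDOW):
    """Count how often each pair of people appears at the same
    camera within `window` seconds of each other."""
    by_camera = defaultdict(list)
    for person, camera, ts in sightings:
        by_camera[camera].append((person, ts))

    counts = defaultdict(int)
    for records in by_camera.values():
        for (p1, t1), (p2, t2) in combinations(records, 2):
            if p1 != p2 and abs(t1 - t2) <= window:
                counts[frozenset((p1, p2))] += 1
    return dict(counts)

print(co_appearances(sightings))
# "target" and "alice" co-appear twice; "bob" never falls inside the window
```

Even this toy version shows why the technique raises privacy concerns: repeated co-appearances turn anonymous footage into a map of a person's social network.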

Vintra, a San Jose-based company, sells co-appearance software as part of an array of video analysis tools, and it has relationships with the San Francisco 49ers and a Florida police department. The Internal Revenue Service and other police departments across the country have also paid for Vintra’s services, according to a government contracting database. While co-appearance technology is already used by authoritarian regimes like China’s, Vintra appears to be the first company to market it in the West.

Little Public Scrutiny and Few Formal Safeguards

Unfortunately, companies like Vintra are testing new AI and surveillance applications with little public scrutiny and few formal safeguards against invasions of privacy. For example, New York state officials recently criticized the company that owns Madison Square Garden for using facial recognition technology to ban employees of law firms that have sued the company from attending events at the arena.


While some state and local governments in the US restrict the use of facial recognition technology, especially in policing, no federal law applies, and few states place any restrictions on how private entities use it. Co-appearance searches like Vintra’s are not expressly prohibited by law, but whether using such technology would violate constitutionally protected rights of free assembly and protections against unauthorized searches remains an open question, according to Clare Garvie, a specialist in surveillance technology with the National Association of Criminal Defense Lawyers.

The PredPol Example

The Los Angeles Police Department ended a predictive policing program, known as PredPol, in 2020 amid criticism that it was not stopping crime and led to heavier policing of Black and Latino neighborhoods. The program used AI to analyze vast troves of data, including suspected gang affiliations, in an effort to predict in real time where property crimes might happen.

The Balance of Security and Privacy

In the absence of national laws, many police departments and private companies must weigh security against privacy on their own. Senator Edward J. Markey, a Massachusetts Democrat, plans to reintroduce a bill that would halt the use of facial recognition and biometric technologies by federal law enforcement and require local and state governments to ban them as a condition of winning federal grants.


Vintra executives did not return multiple calls and emails from The Times. However, the company’s chief executive, Brent Boekestein, was expansive about the potential uses of the technology during a video presentation with IPVM, a surveillance industry trade publication. He said the technology could be used to create a network of associations between people, noting that “96% of the time, there’s no event that security’s interested in but there’s always information that the system is generating.”

Conclusion

Surveillance technology, powered by artificial intelligence, is advancing at a rapid pace, and co-appearance analysis is just one of many new applications being developed. Companies like Vintra are at the forefront of this development, but little is known about how their technology is being used, and there are few formal safeguards in place to protect individual privacy. Policymakers and civil society groups will need to carefully consider the trade-offs between security and privacy as they work to develop appropriate legal and regulatory frameworks.