When your tech is used to imprison and torture people

Leading U.S. facial recognition expert Anil K. Jain, the head of Michigan State University’s Biometrics Research Group, is facing major backlash for his involvement with Chinese academics, especially since his research is used by the Chinese government for their facial recognition tech to track Uyghurs and send them to internment camps.

In 2018, Jain traveled to Xinjiang’s capital Urumqi to give a speech about facial recognition at the Chinese Conference on Biometrics Recognition (CCBR). Jain was also on the CCBR’s advisory board and was pictured receiving an honorary certificate.

Jain is regarded as one of the world’s most influential computer scientists and a pioneer in pattern recognition and biometric recognition systems. He has won countless awards and honors and is often quoted on U.S. facial recognition issues in publications like Wired and Slate. In the same month that Jain presented a paper titled “From the Edge of Biometrics: What’s Next?” at the CCBR conference in Urumqi, a United Nations human rights panel described Xinjiang as resembling a “massive internment camp that is shrouded in secrecy.”

Biometrics have played a major role in China’s “anti-terror” crackdown on its Muslim Uyghur minority. Thousands have been detained in so-called “re-education camps,” aided by facial recognition, DNA collection, iris scans, and other surveillance tools.

We’re talking here about a new internet, one where the internet and tech (such as in smart cities) are tools that help us move forward. But what if the research isn’t used to help us move forward, and is instead used against us?

David Tobin, a lecturer at the University of Glasgow who studies security in China, said researchers in technical fields often ignore the real-world applications of their research. “It is imperative that natural scientists be trained in social sciences to understand these effects and the world they make things for and in ethics to be able to ask these questions when they construct, conduct, and disseminate their research,” he said. “However, such training and knowledge is sadly lacking in these fields and public debates rely on false dichotomies between natural and social worlds and between facts and values.”


(from a 2012 interview for Science by Mara Hvistendahl)

I know this is a massive question, and I keep coming back to it, because I really want us to always ask it: for whatever solution we come up with (decentralized messaging apps, how the internet is funded, social media moderation, smart cities), how could that solution be turned against us by “evil forces”?


This is super scary, and no one wants to touch the subject.

It is not even a new subject. Northern Italy is home to a healthy industry around firearms and assorted weaponry. Everything is aboveboard. Workers are well treated and unionized. Companies pay their taxes and sponsor cultural events. They say they “supply police forces” and “national defense systems”, and they do. But they also make dark, cruel products.

One such company, called Valsella, was destroyed by a massive scandal in the 1990s, when it became known that it was selling anti-personnel landmines to Iraq, later deployed in the Gulf War and against the Kurdish population. The political pressure became too strong, and the company had to fold.

In internet tech, too, Italians have their own dark history. If you have been paying attention, you will remember a company called Hacking Team (Wikipedia), which sold offensive intrusion and surveillance systems to governments like Sudan, Venezuela, and Saudi Arabia. This became known in 2015, when the company was itself hacked, and internal documents were put on BitTorrent and picked up by WikiLeaks. The scandal led to the Italian government revoking the company’s license to sell spyware outside of Europe.

So what to do? Not sure. Double edges are an integral part of innovation. The printing press was developed to print indulgence certificates, and its “venture investor” was the archbishop of Mainz; but it was immediately appropriated by Martin Luther’s reformers to condemn the practice of selling indulgences. It was also used to print bibles, pornographic literature, advertisements, and so on. It comes down to whether you believe in liberal democracy. If you do, you build an elaborate system of checks and balances, and hope for the best.

It would also help to break the alignment between profit and obvious evils like imprisonment and torture. Alexandria Ocasio-Cortez says that prisons should never, ever be run by private companies, and I can see where she is coming from. Maybe you could consider taking biometrics out of the private sector?


@pbihr in his newsletter links to this Carnegie Endowment report about the expansion of global AI surveillance:

https://carnegieendowment.org/2019/09/17/global-expansion-of-ai-surveillance-pub-79847


"The fact that so many democracies—as well as autocracies—are taking up this technology means that regime type is a poor predictor for determining which countries will adopt AI surveillance.

A better predictor for whether a government will procure this technology is related to its military spending. A breakdown of military expenditures in 2018 shows that forty of the top fifty military spending countries also have AI surveillance technology. These countries span from full democracies to dictatorial regimes (and everything in between). They comprise leading economies like France, Germany, Japan, and South Korea, and poorer states like Pakistan and Oman. This finding is not altogether unexpected; countries with substantial investments in their militaries tend to have higher economic and technological capacities as well as specific threats of concern. If a country takes its security seriously and is willing to invest considerable resources in maintaining robust military-security capabilities, then it should come as little surprise that the country will seek the latest AI tools. The motivations for why European democracies acquire AI surveillance (controlling migration, tracking terrorist threats) may differ from Egypt or Kazakhstan’s interests (keeping a lid on internal dissent, cracking down on activist movements before they reach critical mass), but the instruments are remarkably similar."
