Microsoft funds facial recognition technology secretly tested on Palestinians throughout the Occupied Territories
Comment: AnyVision uses facial recognition to surveil Palestinians in direct violation of their basic democratic freedoms, writes Granate Kim.
12 Dec, 2019
Because it is so highly exploitable, some US cities have banned facial recognition tech [Getty]
If you've been paying attention, it should come as no surprise that the latest in facial recognition technology is already being weaponised by governments and corporations. 

Most recently, AnyVision, an Israeli facial recognition tech company funded by Microsoft, has been wielding its software to help enforce Israel's military occupation, using the occupied West Bank to field-test technology it plans to export around the world.

Banned in America, promoted in Palestine

As soon as Microsoft announced its funding for AnyVision in June, the decision was met with scrutiny by journalists and activists alike. It was a shockingly unethical decision by a company attempting to establish itself as a "moral leader" in the tech industry.

Executives at Microsoft should not have been surprised by the outcry. Headed by former Israel Defense Forces security officials and advised by CIA-equivalent Israeli Mossad officers, AnyVision uses facial recognition to surveil Palestinians in direct violation of their basic democratic freedoms.

Because it is so highly exploitable, some cities in the US ban the use of facial recognition technology entirely. Activists in the US, organising as the Ban Facial Recognition campaign*, argue that facial recognition technology is faulty, and in the wrong hands can lead to harm and false arrests. 

Because the technology is prone to misidentifying women and people of color, activists worry that its use will only exacerbate inequalities in a criminal legal system that already arrests and incarcerates Black and Brown people at a disproportionate rate.
Microsoft and 'lawful surveillance'

In 2018, amid concerns around artificial intelligence, and in an attempt to set itself apart as an "ethical" tech corporation, Microsoft publicly committed to six principles to guide its facial recognition work. One of these principles is a commitment to "lawful surveillance": Microsoft will "advocate for safeguards for people's democratic freedoms in law enforcement surveillance scenarios and will not deploy facial recognition technology in scenarios that we believe will put these freedoms at risk."

At the same time Microsoft was unveiling its principles on facial recognition tech in 2018, AnyVision was winning Israel's top defense prize.

According to an NBC investigation, Israel's defense minister lauded the company - without naming it during the prize presentation, to avoid a paper trail - for its usefulness to the Israeli military through its use of "large amounts of data." AnyVision gleaned this data through the surveillance of Palestinians living under military occupation in the West Bank, without freedom of movement or democratic freedoms such as national voting rights.

In other words, the company carried out its operations in a part of the world where democratic freedoms are not only "at risk," but nonexistent - and, in doing so, it directly violated Microsoft's principle of "lawful surveillance."

AnyVision: eyes everywhere

NBC's October investigation of AnyVision and Microsoft's funding of the company found five sources confirming that, "AnyVision's technology powers a secret military surveillance project throughout the West Bank. One source said the project is nicknamed 'Google Ayosh,' where 'Ayosh' refers to the occupied Palestinian territories and 'Google' denotes the technology's ability to search for people."

Following the social media and petition campaign and the NBC report, Microsoft announced that it would end its relationship with AnyVision if an independent audit found the company had violated any of Microsoft's principles.

On November 15, Microsoft announced that former US Attorney General Eric Holder would lead the independent audit of AnyVision. The scope of the audit, and what it will uncover, remains to be seen.

Field-testing, or a violation of rights?

But a larger question still remains: Is tech developed in an illegally occupied territory, and on people denied democratic freedoms, ethical to use?

Most would argue no. But too many corporations willfully ignore the occupation.

They choose to believe Israel's narrative, ignoring damning reports by Amnesty International, Human Rights Watch and the United Nations documenting human rights violations against Palestinians; and disregarding a Palestinian-led call for an international boycott, backed by a robust Palestinian rights movement here and around the world.

If Microsoft chooses to cease investment in AnyVision, it will set a precedent for the facial recognition industry and the tech world at large.

This summer, my organisation, Jewish Voice for Peace, launched a campaign calling on Microsoft to stop funding AnyVision and to end the relationship between the companies.

We believe - along with the combined 75,000 signers on our petition and that of partners MPower Change and SumOfUs - that Palestinians deserve freedom and democracy, and that technology developed secretly and at the expense of those freedoms violates any ethical principles.

Given the evidence, Microsoft should drop AnyVision now.

*Disclosure: The author's organisation, Jewish Voice for Peace, is a member of this campaign.

Granate Kim is the communications director of Jewish Voice for Peace.

Follow her on Twitter: @granate

This article was reprinted with kind permission from Truthout, where it was originally published. Copyright Truthout; reprinted with permission.

Opinions expressed in this article remain those of the author and do not necessarily represent those of The New Arab, its editorial board or staff.