Tech companies' use of AI risks erasing evidence of potential war crimes

The use of artificial intelligence (AI) by social media platforms like Facebook, Youtube and Instagram to remove graphic videos is jeopardising the preservation of evidence for potential war crimes, a report has shown.
01 June, 2023
The filming and uploading of videos of atrocities carried out in warzones has been vital in conflicts such as that in Syria [Getty]

Evidence of potential war crimes might be irrecoverably lost after being deleted by tech companies, according to a report by the BBC.

Platforms such as Facebook, Instagram and YouTube use AI to remove graphic videos. However, this process can take down footage that could help legally prove war crimes without it ever being archived.

The AI is designed to remove content that may be harmful or inappropriate for viewers, but it lacks the nuance to distinguish violence posted for entertainment from videos documenting possible war crimes.

This has already led to problems for citizen journalists documenting Russian war crimes in Ukraine. Ihor Zakharenko filmed the bodies of 17 people murdered by Russian invasion forces in a suburb of Kyiv, but when he uploaded the footage to Facebook and Instagram it was taken down, he told the BBC.


In an experiment, the BBC attempted to upload Zakharenko's videos to YouTube and Instagram using dummy accounts. Instagram took down three of the four videos within a minute, while YouTube removed them within ten minutes.

Over the past decade or so, citizen journalism, social media and video-sharing platforms have played a vital role in documenting war crimes and attacks in war zones such as Syria, Yemen and Sudan.

The deletion of these videos by social media firms takes away one of the main weapons available to the victims of war criminals. Imad from Aleppo, who owned a pharmacy in the city until it was hit by one of Assad's barrel bombs in 2013, told the BBC how the removal of such footage almost cost him asylum.

When applying for asylum in the EU years after the attack, he was asked to provide evidence of it. He turned to social media and YouTube to find the many videos of the attack filmed by both amateur and professional journalists, but they had all been deleted.

Mnemonic, a Berlin-based organisation, has built a tool to automatically save videos of war crimes before they can be lost to deletion. So far, Mnemonic has preserved 700,000 videos that were deleted from social media, including the one Imad needed.

However, such is the scale of social media that tools like Mnemonic cannot fill the void left when overcautious AI programmes delete vital videos documenting atrocities.

For its part, Meta has said it will try to "develop the machinery, whether that's human or AI, to then make more reasonable decisions" when it comes to distinguishing potential recordings of war crimes from other forms of graphic content, as quoted by the BBC.