What would the internet be like without visuals? It would look like a throwback to its early days, when connection speeds were still too slow for the high volume of images and videos we now consume globally. It is also impossible to imagine public political life today without social media and its visual accompaniments. Memes, sharepics and quote tiles from parties, politicians, the media and civil society organisations are a standard feature of any political debate. Images and videos are key to any digital election campaign around the world, be it in the USA, Brazil, Ghana or elsewhere. They are an easy way to put a message out and communicate with the public – at times they can be effective even without any accompanying text. It should therefore come as no surprise that a large number of fake news stories circulate on the internet with the help of images and videos.


It is important to point out the historical context of fakes and modern propaganda. After all, distorted facts, conspiracy narratives and lies have played a major role in history. Photographs were used as early as the nineteenth century to give an impression of a reality that did not exist in that form.[1] One notable example is the iconic photo of the Soviet flag on the roof of the Reichstag after the building was captured in 1945. When it was actually stormed, there were no photographers present. The picture taken by Yevgeny Khaldei is not an original in two respects: it was taken more than a day later, on Stalin’s orders, and it was subsequently edited. The photographer added in the smoke clouds and retouched out the watch on the soldier’s wrist – which may have been stolen.[2] The Nazis had murdered several members of Khaldei’s family, and he saw himself as a propagandist in a good cause.


Yevgeny Khaldei, Hoisting of the Soviet flag above the Reichstag, Berlin 2 May 1945 © Sammlung Ernst Volland/Heinz Krimmer/Jewgeni Chaldej; source: https://blogs.taz.de/vollandsblog/2008/07/23/die_flagge_auf_dem_reichstag_teil_3_das_manipulierte_foto/


Yevgeny Khaldei, Hoisting of the Soviet flag above the Reichstag, Berlin 2 May 1945 (edited version) © Sammlung Ernst Volland/Heinz Krimmer/Jewgeni Chaldej; source: https://blogs.taz.de/vollandsblog/2008/07/23/die_flagge_auf_dem_reichstag_teil_3_das_manipulierte_foto/

Even images that have been altered for the purposes of satire cannot always be identified as such by recipients. A picture taken in a Texas school in 2002 showing former US President George W. Bush holding a book upside down was used to lampoon Bush long after the event. This kind of image manipulation is a minor intervention that has a major effect: it distorts the patriotic staging of the image and the message it conveys, while at the same time mocking Bush.[3]


Manipulated image of George W. Bush in a school in Houston in 2002; photo: https://www.snopes.com/fact-check/bush-upside-book/; original: Associated Press.

One important difference is that nowadays social media potentially gives anyone access to a huge audience in a short time. Fake news can circulate at lightning speed across (almost) any national border. Producing fakes has also become easier: these days you don’t need a photo lab, a TV studio or expensive photo-editing software – it is enough to have a smartphone or a laptop with internet access.


We are living in an age of user-generated content presented online. Fake images and fake news cannot always be traced back to their source. They can originate, for instance, with users of WhatsApp or Facebook Messenger. These private chats cannot be viewed by outsiders, so it is not always possible to tell whether a fake has already been doing the rounds in a closed Facebook Group or chat group before being posted on Twitter or Facebook. Viral fakes are disseminated on every possible platform and in all kinds of social groups. As a result, a fake that is created, for example, by a Facebook user and only made visible to their friends can later show up on a politician’s social media channel – the transitions are fluid.

The authors of fake content and factual distortions may be motivated by different factors, which can also operate in combination. The motive might be political – serving the interests of self-presentation or as a way to do damage to political opponents. There may also be financial incentives. Fake content and fake news are deployed, for example, in the health arena by modern-day snake-oil salesmen. They may be peddling nutritional supplements or even ineffective remedies marketed as medicine.

Fakes are also circulated by trolls. Usually, the idea here is to create confusion or to do harm to someone, much like a hate comment. Trolls often exploit moments that are in any case confusing, such as acts of violence or natural disasters. When a terror attack or shooting spree takes place, they repeatedly publish photos of alleged perpetrators who have not actually done anything. After the Islamist attack on an Ariana Grande concert in Manchester in May 2017, trolls put out pictures on Twitter of young people who were supposedly missing but who had not disappeared at all. Some better-known trolls and troll groups are part of the right-wing scene and are explicitly active in propagating a far-right ideological agenda.


Warnings about the danger of ‘deepfakes’ have been around for some time. Deepfakes are software-generated images, video or even sound. It is possible, for example, to generate faces that are completely artificial.[4] Videos can also be created in which people have words put into their mouths or are shown in compromising situations. There are multiple ways in which this technology can be used: for instance, to create videos showing people at a demonstration they did not attend. Or to attribute to politicians or activists statements they did not make or to fabricate the context in which their words were spoken.


This deepfake of the Queen is based on her annual Christmas message. Here, the fake queen gives an explicit ironic twist to the statements attributed to her. A TikTok dance and glitches at the end of the film reveal the trick. Video still: Channel 4, Deepfake Queen: 2020 Alternative Christmas Message, YouTube, 25 December 2020; source: https://www.youtube.com/watch?v=IvY-Abd2FfM

Deepfakes are getting better and better, and in future we will need tools to help us detect them. Most of the deepfakes made today can still be identified because the people in the videos look strange when they turn their heads to the side, or they blink less often than a real person would – though details of this kind may well be ironed out in later versions of the technology. As yet, however, deepfakes cannot reconstruct the characteristic gestures, facial expressions and tics that each person has developed and that act like a kind of fingerprint.

Deepfakes are today already being used in disruptive ways with problematic results. In 2019, a deepfake caller managed to scam a company out of $243,000,[5] after using the technology to impersonate the firm’s CEO. Deepfake tools are also used to create pornographic material – for the most part depicting women in images and videos made without their consent. In autumn 2020, it came to light that a Telegram bot had been used to fake nude images generated from hundreds of thousands of photos of women in Russia and Eastern Europe, without the more than 680,000 victims being aware of what had happened.[6] On top of being a personal violation, this can have a range of unpleasant consequences for those affected, from harassment and unwanted attempts at contact to problems with family and difficulties in school or college or at work. These kinds of deepfakes are also being used for the purposes of blackmail.

It is clearly important for us to recognise the dangers posed by deepfakes and anticipate the impact they might have in future. But the fact is that many of today’s visual fakes are not created with image-editing programs at all: manipulations span a whole spectrum, from deepfakes at one end to cheap fakes at the other. This is because it’s much simpler just to take a real picture or video, strip away the context and circulate it together with a made-up story. Cheap fake techniques have been very effective thus far and, unlike deepfakes, can be implemented with minimal resources. They were used in 2020, for example, to spread fake news about the coronavirus pandemic.[7] This is a widespread phenomenon: witness the photos that did the rounds after the election in Uganda in January 2021. The pictures, as posted by Facebook users, had been taken out of context and supposedly showed rioting in Uganda. However, they were actually pictures of people from Honduras trying to cross the border into Guatemala.[8]


Photos from the Guatemalan border, used here in a Facebook Group in a falsified context to represent Uganda; names anonymised by the author; original: Agencia EFE

The following example from Thailand also demonstrates how quickly manipulative material can circulate. Photos of students taken back in January 2020 were subsequently recontextualised and recast as evidence of an alleged opposition to pro-democracy demonstrations. In fact, the pictures were taken several months before the first protests staged by the movement, which was started by young people. Nevertheless, the photos got hundreds of shares and were widely viewed as a result.[9]


Misleading Facebook post with photos of students from Nonthaburi (Thailand), taken out of context; image: AFP Check; original: Debsirin Nonthaburi School Student Committee


You can check many manipulated or decontextualised photos yourself in just a few steps. In fact, the tools for doing this are freely available. One option is to enter a description of the elements you see in the picture and use a search engine to track them down. In the photo below you can see coffins. The accompanying text establishes a connection with the coronavirus pandemic and deaths in Italy caused by COVID-19. If you want to find out the possible background to the picture, you can enter the keywords ‘coffins’, ‘Italy’ and either ‘fake’ or ‘fact check’ in a search engine. If a fact check has already been done on the picture, you may well find it this way. Sometimes it is worth repeating the search with the same keywords in a different language. There is also Google’s ‘Fact Check Explorer’,[10] which can be used to search fact checks from different, though not all, fact-check sites in various countries. In the case of this picture, you would find out that the photo was taken in Italy, but not in 2020. It is true that in that year the country registered the tragic figure of more than ten thousand deaths,[11] but the picture shown is from 2013, when a refugee boat capsized off the Italian island of Lampedusa and more than three hundred people lost their lives.[12]
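The keyword search described above can also be automated. The Fact Check Explorer is backed by Google’s public Fact Check Tools API; the sketch below builds a request URL for its claims:search endpoint. It assumes you have registered for an API key (the placeholder `YOUR_KEY` is hypothetical), and the parameter names follow the published v1alpha1 endpoint.

```python
# Sketch: building a request for Google's Fact Check Tools API,
# the service behind the Fact Check Explorer. An API key is required;
# "YOUR_KEY" below is a placeholder, not a working credential.
from urllib.parse import urlencode

FACT_CHECK_ENDPOINT = "https://factchecktools.googleapis.com/v1alpha1/claims:search"

def build_fact_check_url(keywords, api_key, language="en"):
    """Combine keywords into a single claims:search request URL."""
    params = {
        "query": " ".join(keywords),   # e.g. "coffins Italy fact check"
        "languageCode": language,      # repeat the search in other languages too
        "key": api_key,
    }
    return f"{FACT_CHECK_ENDPOINT}?{urlencode(params)}"

url = build_fact_check_url(["coffins", "Italy", "fact check"], api_key="YOUR_KEY")
print(url)
```

Fetching that URL (for instance with `urllib.request`) returns a JSON list of matching fact checks; changing `languageCode` reproduces the tip about repeating the search in a different language.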

Another way to check photos is to do a ‘reverse image search’. This can be used to find other websites where the same picture has been uploaded or to locate images that resemble the one you are searching for. For example, you can check the photo below, which apparently shows ‘Fridays for Future’ initiator Greta Thunberg with George Soros. The Hungarian American billionaire and philanthropist is one of the bogeymen of the New Right and figures in countless conspiracy narratives, many of them marked by anti-Semitism. Soros is a thorn in the side of these actors, because of his track record as a backer and supporter of various progressive organisations and politicians and his involvement in different media projects. A picture of him with Greta Thunberg is thus meant to suggest that Soros is behind her climate activism. The picture can be checked as follows:

  1. Right-click on the image and copy the address (not the image itself) or download it.
  2. Open one of the following search engines: Google, Yandex or Bing.
  3. On the website of the search engine in question click on ‘images’.
  4. Click on the camera icon (on Bing the icon looks like a camera viewfinder).
  5. Enter the address of the image or upload it.
  6. Browse through the results.
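For an image that is already online, the manual steps above can be collapsed into a set of links. The sketch below builds reverse-image-search URLs for the three engines mentioned; the URL patterns are assumptions based on how the engines currently accept an image address and may change without notice, and the example image address is hypothetical.

```python
# Sketch: building reverse-image-search links for the three engines.
# The URL patterns are assumptions about current engine behaviour
# and may change; the example image URL is made up.
from urllib.parse import quote

def reverse_search_urls(image_url):
    """Return one reverse-image-search URL per engine for a hosted image."""
    q = quote(image_url, safe="")  # percent-encode the whole image address
    return {
        "google": f"https://lens.google.com/uploadbyurl?url={q}",
        "yandex": f"https://yandex.com/images/search?rpt=imageview&url={q}",
        "bing": f"https://www.bing.com/images/search?view=detailv2&iss=sbi&q=imgurl:{q}",
    }

urls = reverse_search_urls("https://example.org/photo.jpg")
for engine, url in urls.items():
    print(engine, url)
```

Opening each link in a browser corresponds to steps 2–6 above; comparing results across engines reflects the point made below about their differing image-recognition software.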

You may have to click through several search results before you find any viable information. A search conducted for the Thunberg photo, for example, would reveal that the photo has been edited.[13] In the original,[14] she can be seen with former US Vice President Al Gore, an advocate for climate protection. In some cases, it is also helpful to try a different search engine: results vary because each engine’s image-recognition software works differently. Face recognition, for example, often works better on Bing and Yandex than on Google.


Manipulated photo supposedly showing Greta Thunberg and George Soros


The original picture of Al Gore and Greta Thunberg. Image (right): Greta Thunberg’s Instagram account; source: https://www.instagram.com/p/BsBZea6hebZ/

For some time now, facial recognition,[15] as operated by search engines like Yandex, has not just been used by verification specialists but has also spawned TikTok videos that recommend reverse-image searches as a way for viewers to find their ‘twin’ – someone, in other words, who looks just like them.

Sometimes fake screenshots of media articles or posts on other social media sites are disseminated online. In such cases, it’s helpful to type out the headline or the text on the picture and use a search engine to track it down. Be careful to put the text in inverted commas to ensure that the words you enter are searched for as an exact phrase. If you don’t use inverted commas, the words are searched for individually, no matter where they are positioned in a text.

The most important thing is to pause briefly before sharing an emotionally provocative photo or video and quickly check whether it is plausible. If you take a closer look at a fake, you will often notice some kind of inconsistency. Many fakes are effective simply because they elicit an emotional response and are then circulated on the strength of the anger, fear or grief they stir up.