
20 June 2022

On 25 February 2022, the day after Russia invaded Ukraine, video footage of a Ukrainian pilot, known only as the ‘Ghost of Kyiv’, single-handedly shooting down six Russian planes went viral on social media. The video garnered two million views in less than three weeks after it was posted.

However, this spectacular and heroic feat isn’t real. It did not actually happen.

PolitiFact.com, a non-profit fact-checking project by the Poynter Institute, revealed that the clip was actually taken from a video game called Digital Combat Simulator. Reuters Fact Check confirmed this in an article published on 26 February, which states:

A clip from the videogame Digital Combat Simulator has been miscaptioned online, with social media users claiming it shows a Ukrainian fighter jet shooting down a Russian plane.

The war in Ukraine has put the spotlight on the other war – the war against fake news and online falsehoods. The insidious nature of misinformation has to do not only with its potential to mislead millions across the globe, including Singaporeans, but also with its ability to cause polarisation and division in society.

Misinformation and opposing narratives about the conflict have also led many to become suspicious of mainstream media.

For example, speaking to Channel NewsAsia (CNA), interviewee Jack Wong stated that he believed that many official sources of information focus on the Western perspective. ‘I see a lot of things reported on Western media that are basically, to me, a reproduction of US foreign policy agenda,’ he said.

That is why he shares online videos that present a different narrative, ‘to balance the issue,’ and ‘to let people see both sides of the road before they cross, don’t just look at one side.’

The conflicting narratives and the debates they provoke have also posed immense problems for governments. As Carol Soon, senior research fellow at the Institute of Policy Studies, explains: ‘The debates that the Ukraine-Russian war have sparked demonstrate how challenging it is for any government to craft and maintain a specific narrative.’


Fake news and misinformation have existed for a long time, and state and independent actors alike have used them to achieve their objectives.

For example, in the years leading up to World War II, countries used propaganda to advance their causes. This led George Gallup to comment in 1939 that:

Propaganda has grown to be one of the most powerful weapons of modern warfare – useful both in demoralising enemy forces and in influencing the opinion of neutrals. How aware is the American public that propaganda is used in the present war, and how effective has that propaganda been so far?

What has changed since these words were penned is not so much the basic strategy, but the speed and extent to which misinformation can be circulated – thanks to the internet, and especially social media.

In addition, as Carol Soon has pointed out, because of the transboundary nature of cyberspace, an international audience can now access (dis)information about local and regional conflicts quite easily.

Social media is fast becoming the main avenue by which many access news, gradually replacing mainstream media. In 2020, the Reuters Institute for the Study of Journalism at Oxford University published a fascinating report on the state of digital news around the world.

The report highlights some helpful statistics which indicate the percentage of social media users who go to the various platforms for news:

  • Facebook – 36 percent
  • YouTube – 21 percent
  • WhatsApp – 16 percent
  • Twitter – 12 percent

All these social media platforms have been and continue to be conduits for fake news.

According to an article by Mark Travers, published on the Forbes website in 2020, Facebook is ‘by far the worst perpetrator when it comes to spreading fake news. Worse than Google. Worse than Twitter. And worse than webmail providers such as AOL, Yahoo!, and Gmail.’

Another major source of disinformation is TikTok – a video-sharing app that allows users to create and share short videos on practically any topic.

In relation to the war in Ukraine, TikTok is rapidly emerging as the number one source of misinformation. Experts believe that this is due to the huge number of users and the minimal effort put in by the platform to filter content.

In its report on TikTok, NewsGuard, which tracks online misinformation, states that:

NewsGuard’s findings add to the body of evidence that TikTok’s lack of effective content-labelling and moderation, coupled with its skill at pushing users to content that keeps them on the app, have made the platform fertile ground for the spread of disinformation.

This is deeply concerning because a third of TikTok users (in the United States) are 19 or younger.

Social media is the most common way in which fake news and disinformation are disseminated because it allows practically anyone to communicate anything to a huge audience.

In addition, social media sites encourage sharing and re-posting, which makes it difficult to track down the original source of a post. In a 2016 survey, the Pew Research Center found that 23 percent of adults in the United States had knowingly or unknowingly shared fake news.

As Richard Lachman, in an article in The Conversation puts it: ‘Opening extra tabs to cross-check information is just not part of the experience, which helps false information to spread.’


The spread of misinformation and fake news can also be attributed to other factors, such as the evolution of our culture and the collective sensibilities it creates.

Many commentators have highlighted the fact that in the postmodern culture, which jettisons all forms of metanarrative, truth often becomes an easy casualty. This has led many to believe that we now live in a ‘post-truth’ society, where objective truth is eclipsed and set aside, leaving only perspectives, biases and preferences.

This phenomenon even led Oxford Dictionaries to declare ‘post-truth’ its Word of the Year in 2016. This adjective, it explains, relates to ‘circumstances in which objective facts are less influential in shaping public opinion than emotional appeals.’

That this declaration was made in 2016 is not at all surprising.

Both the political campaign leading to the ‘Brexit’ vote in the UK and the US presidential election, which took place in that year, showed just how little objective facts and truth matter. As Bonnie Kristian, commenting on the US presidential election, put it sharply: ‘The last five years of American politics have been a time of “alternative facts” and “truth [that] isn’t truth”’.

This brings us back to social media and the communicative praxis it shapes – which fits hand-in-glove with the postmodern sensibility.

The jettisoning of objective truth and the privileging of ‘alternative facts’ have worked in tandem with the key way in which social media platforms encourage users to share content, namely, by social proof.

Put simply, the more ‘likes’ or ‘shares’ a post has, the more it is likely to be liked and shared. In this way, the post spreads in ever-widening circles, and this soon develops into what some commentators have called an ‘information cascade’ that is quite unstoppable.

Added to this is the psychology of the social media user. Posts are shared simply because they arouse certain emotions – pleasure, sadness, anger. As Tony Watkins explains:

We may share something just because the headline or image has stimulated the pleasure centre in the brain – even though we have not engaged with the actual content. If we later see something revealing that what we have shared is false, that affects us less. A rebuttal does not stimulate the brain’s pleasure centres; so we do not bother sharing it. In other words, our response to much of what we see within social media is primal, not rational.

Christians, who are supposed to be concerned about the truth, are unfortunately no less gullible than the general population when it comes to online misinformation.

According to a survey conducted by Lifeway, half of Protestant pastors in America say that they ‘frequently hear members of [their] congregation[s] repeating conspiracy theories they have heard about why something is happening in our country.’

Scott McConnell, executive director of Lifeway Research, said that this trend appears to be most prominent ‘in politically conservative circles, which corresponds to the higher percentages in the churches led by white Protestant pastors.’

According to Daniel Darling, the senior vice president of the National Religious Broadcasters, many pastors are concerned about ‘how captive many [Christians] are to their preferred media outlets, which are growing more and more extreme, and how seemingly resistant many are to hearing reasonable rebuttals.’

Too ‘many evangelicals’, he adds, ‘are catechized more by their favourite niche political podcast and pundits and politicians’ than by the Bible.


Just as in any military conflict, the war on misinformation must be fought on various fronts.

Users of social media must be alert to the fact that the posts they read may contain misinformation, especially if they come from alternative or avant-garde sources. This requires social media users to exercise responsibility and discernment over their own media consumption.

As Richard Lachman puts it, ‘Responsible social media users are supposed to check sources, search for corroborations from trusted parties, check time-stamps and assess whether the content is too good – or bad – to be true.’

Media literacy programmes should be introduced to help the public, especially youths, to discern the content that they consume. As professor of politics Chong Ja Ian has stressed, there is a need for readers to develop ‘healthy scepticism’ when engaging with media content.

Such laborious fact-checking, however, is not always possible – even for the responsible social media user. And ‘healthy scepticism’ can sometimes result in paranoia, where the reader doubts everything he reads or views, thinking it is all biased reporting or propaganda.

In 2018, the National Council of Churches of Singapore (NCCS) participated in an extensive study conducted by the Singapore Government on deliberate online falsehoods. In its paper to the Select Committee, NCCS made several recommendations on how the problem of fake news and online misinformation could be addressed.

The Council recognises that combating ‘fake news requires a multi-pronged approach’ that goes beyond the responsibility of the social media user. It emphasises that social media platforms must also bear the responsibility of flagging and reporting online falsehood.

The Council recognises that research has shown that policing on the part of social media platforms has not always been very successful in weeding out misinformation. But it states that it is ‘preferable to have these measures in place than not to have any gatekeeping mechanisms at all.’

More controversially – especially for Western readers who champion freedom of the press – the Council recommends that the ‘government could … enact new laws to counter fake news.’ Government intervention through the introduction of new legislation is necessary because misinformation can compromise national security. ‘There is clearly a very delicate balance between freedom of information and national security,’ the Council states.

In 2019, the Singapore government introduced the Protection from Online Falsehoods and Manipulation Act (POFMA), known colloquially as the Fake News Law. This statute of the Singapore Parliament empowers the authorities to tackle the spread of online falsehoods and misinformation.

The Council also emphasises the importance of non-legislative measures to combat fake news, such as public education.

The war against misinformation will be an ongoing war. Although there are many strategies to fight this war, perhaps the most important defence is trust, especially that of the citizens in the government. The Council ends its paper with these words:

Finally, NCCS believes that building trust and social resilience is the most important and effective long-term strategy against fake news. By improving transparency and communication the Government can fight against scepticism, misperception and the populist narratives found in social media – and build trust between the state and society. This will go a long way in cultivating social resilience so that the views and emotions of the population are not easily swayed by falsehoods.

Dr Roland Chia is Chew Hock Hin Professor at Trinity Theological College (Singapore) and Theological and Research Advisor of the Ethos Institute for Public Christianity.