2024 Elections Report: Fake Polls, Cheap Voice Clones, Communal Claims Go Viral

By Nidhi Jacob

India conducted the world's largest democratic exercise in seven phases between April 19 and June 1, 2024. Over the last three months (March 1 to May 31), BOOM analysed 258 election-related fact-checks in English, Hindi and Bangla.

Of the 258 fact-checks, 43% (111) involved old and unrelated videos and images being falsely presented as recent election footage. Additionally, 10 fact-checks addressed claims of rigged electronic voting machines (EVMs), 12 involved AI-generated fabricated information, and 10 contained communal content.

Eighteen of the fact-checks involved false information aimed at Congress leader Rahul Gandhi, making him the top target of mis/disinformation during the election cycle. Of these 18, 14 were smear campaigns.

Prime Minister Narendra Modi was the focus of 17 fact-checks. However, Modi was both a target and a source of misinformation, spreading false information in at least three instances, all with communal undertones. These include misquoting former PM Manmohan Singh, spreading misleading information about the Congress manifesto, and denying that he said 'Muslims' while referring to them as 'infiltrators' and 'those with more children' in an election speech.

Furthermore, 51.1% of the 258 fact-checks involved claims shared by verified social media accounts. Among these, verified BJP X accounts were responsible for spreading misinformation in at least 11 instances, Congress accounts in at least 6 instances, and Trinamool Congress accounts in 4 instances.

Additionally, X accounts such as ‘Megh Updates’ and ‘Political Kida’, and individuals such as Bharatiya Janata Party IT cell convener Amit Malviya, Rishi Bagree and KRK, were identified as repeatedly spreading false information on the platform.

Dominant themes of misinformation during 2024 LS elections

AI Deepfakes and Voice clones

12 out of the 258 fact-checks contained AI-generated claims: 3 were deepfakes and 9 were voice clones. Voice clone videos, ranging from West Bengal chief minister Mamata Banerjee singing a ‘Jai Shri Ram’ song to Rahul Gandhi allegedly resigning from the Congress party, went viral on the internet.

In addition, AI-generated false information targeted PM Narendra Modi, All India Majlis-e-Ittehadul Muslimeen president Asaduddin Owaisi, Congress MP Kamal Nath, and Aam Aadmi Party MP Swati Maliwal. Celebrities like Aamir Khan and Ranveer Singh were also among the targets of such misinformation campaigns.

Just days before India's capital went to the polls, at least three AI voice clones were used to spread disinformation on social media. Two videos made with fake graphics and AI voice clones of Hindi news anchors, purporting to show AAP’s West Delhi candidate Mahabal Mishra leading in opinion polls, were shared on Facebook and X.

An AI-generated phone conversation between Swati Maliwal and YouTuber Dhruv Rathee was also shared on social media as a real conversation between the two. In the viral clip, Maliwal was heard explaining to Rathee how she was assaulted in front of Delhi chief minister Arvind Kejriwal and his wife Sunita Kejriwal, and also requesting Rathee not to make a video on it. Maliwal has accused Bibhav Kumar, a close aide of Kejriwal, of assaulting her at the chief minister's residence earlier in May. The AAP has denied Maliwal's allegations.

BOOM consulted two different AI detection tools, Itisaar and Contrails, both of which confirmed that the viral audio had been generated using AI.

In another fact-check, a video of Asaduddin Owaisi's speech was shared with a false claim that he was reciting a hymn known as ‘Shiv Tandav Stotram’. However, forensic analysis conducted through a detection tool developed by the Indian Institute of Technology Jodhpur (IIT Jodhpur) confirmed that the video had been manipulated using deep learning technology, making it a deepfake. In his original speech, Owaisi was speaking about beef exports from India.

BOOM previously reported that cheap AI voice clones were the preferred method for spreading disinformation during the elections. However, experts informed BOOM that this election cycle did not witness as many deepfakes as expected. Professor Mayank Vatsa, from the Indian Institute of Technology Jodhpur, whose department developed a deepfake detection tool called Itisaar, stated, "We anticipated being extremely busy, processing around 50,000 to 100,000 deepfakes. However, over the entire three-month period, we managed to process only about 800 to 1,000 deepfakes, which was significantly below our expectations," as reported by Decode.

He explained that most of the existing deepfake research, which is predominantly in the West, focused on the English language. Creating deepfakes in languages like Hindi or other local languages requires significant resources, investment and dedicated teams, which not everyone can afford or accomplish. This could have impacted the overall generation of sophisticated deepfakes.

Communal claims

We published 10 fact-checks on claims containing communal undertones. Nine of these directly targeted the Muslim community or accused the Congress party of appeasing Muslims.

These included multiple false claims accusing the Congress party of disrespecting the national tricolour, such as allegations of Pakistan flags at Congress rallies and of Congress hoardings displaying an inverted tricolour. Further, BJP X accounts targeted the Muslim community by selectively choosing data to exaggerate the growth of the Muslim population.

In his speeches, PM Modi also criticised the Congress party's manifesto and made communal remarks, saying that if Congress assumed power, it would carry out a comprehensive survey of all assets in the country, including the gold owned by women, with the intention of redistributing those assets to Muslims. BOOM found his claims misleading: the manifesto makes no explicit mention of Muslims, and the term "wealth redistribution" appears nowhere in it. Instead, the manifesto calls for a policy assessment rather than outlining specific wealth redistribution plans.

Old and unrelated videos/images

BOOM found at least 7 instances of old claims about EVMs being rigged, which were presented as if they related to the recent Lok Sabha elections.

An old video showing a man removing voting slips from a VVPAT machine and placing them in a black envelope was viral with a false claim that it depicted BJP workers engaging in booth capturing in Manipur. The video was falsely linked to the elections and was shared with the caption, "Booth capturing in Manipur by BJP. Please share all over the world to see the reality in India that no mainstream media shows you."

BOOM had debunked the same video in December 2022 when it went viral with a similar false claim that it showed BJP workers engaging in EVM fraud in Bhavnagar, Gujarat. At that time, BOOM had reached out to SN Katara, the Deputy Election Officer in Bhavnagar, who clarified that the video showed no wrongdoing. "After the counting is over, the slips are transferred to the black cover. The leftover roll is then put aside. The EVMs go their own way, and this is how the slips are taken out of the VVPAT. The procedure is being followed properly," Katara had told BOOM.

Another old video, showing a man slamming an EVM control unit on the floor during the Karnataka Assembly elections, was revived with the misleading claim that the incident was recent and took place during the 2024 Lok Sabha elections.

BOOM found that the incident in the viral video happened in Mysuru during the Karnataka Assembly election in May 2023.

Morphed, doctored videos/images

Of the 258 fact-checks, 53 involved manipulated content such as cropped or morphed videos and images.

From an edited video of Modi asking voters to elect Congress and Samajwadi Party to a manipulated video of Rahul Gandhi saying Modi will return to power, these claims targeted politicians across various parties.

A cartoon of Rahul Gandhi went viral on social media, falsely presented as the cover page of Time magazine. In the cartoon, Gandhi is seen breastfeeding a baby symbolising Pakistan, and it was claimed that this reflects Congress's image in international media. The text on the alleged cover page stated, "Congressmen of India (especially Hindus) should die by drowning. This photo published in the world's famous magazine New York Time Magazine has revealed the reality of Congress to the whole world. Even after this, if someone supports Congress, then he should get his DNA test done. Undoubtedly, there is a virus of betrayal to the country and Sanatan in his veins."

We found that the viral cartoon was photoshopped. The original cartoon, dating back to 2012, depicted a woman breastfeeding a baby and was a critique of the Republican Party's reliance on big business money during the 2012 US election.

Similarly, an old video of Union Home Minister Amit Shah speaking about ending Muslim reservation in Telangana was doctored to show him promising to end reservations for the Scheduled Castes (SCs), Scheduled Tribes (STs) and Other Backward Classes (OBCs) communities if the BJP was voted back to power.

BOOM found that the original video showed Amit Shah promising to scrap Muslim reservation in Telangana during the 2023 Assembly election campaign.

Other manipulated claims included a cropped video falsely presented as Kerala CM Pinarayi Vijayan appealing for votes for the Congress, an edited video claiming that PM Modi called himself the son of a Pathan, an edited screenshot of the India Today channel predicting 17 Lok Sabha seats for the Samajwadi Party in UP, a false claim that PM Modi said 'BJP can never make a strong India', and a cropped video of Kharge purportedly admitting to a 'wealth distribution plan', among others.

© BOOM Live