AI and Its Potential Use in Disinformation
Artificial intelligence (AI) has the potential to be used in the production of disinformation at scale, according to recent reports. US secretaries of state are preparing for the use of AI to push narratives and disinformation in an attempt to influence or disrupt upcoming elections. State-level officials, such as Michigan Secretary of State Jocelyn Benson, have recognized the challenges posed by AI and the need for bipartisan solutions to tackle this issue.
The Wall Street Journal has highlighted the flood of AI-generated content on the internet, much of it junk news. NewsGuard, a company that rates the reliability of news sites, identified 49 fake news websites using AI to generate content in May; by the end of June, the count had risen to 277. This influx of AI-produced content burdens the editors responsible for curating and selecting quality contributions. At the same time, there is a concern that as AI-produced content becomes more prevalent, it will grow increasingly derivative and stereotyped, undermining its own effectiveness. This suggests that human-generated content continues to play a crucial role in keeping AI output in check.
Detecting AI-generated content is not an easy task. While there are systems in place to screen for AI-generated text, they are relatively easy to evade. A study conducted by several universities concluded that the available AI detection tools are neither accurate nor reliable, often mistaking AI-generated content for human-written text. This poses a significant challenge in combating the spread of disinformation.
Meta’s Approach to Disinformation in Threads Platform
Meta, the parent company of Facebook, plans to apply a similar approach to combating disinformation on its new Threads platform. On Facebook, Meta has focused on exposing and blocking “coordinated inauthenticity,” and that strategy appears set to carry over to Threads. One element of the approach is identifying state-sponsored media; another is fact-checking, which will help counter disinformation.
Telegram’s Role in Russia’s War
Telegram, a social media platform with a large user base in Russia and Ukraine, has played a significant role in the sharing of war news. With its small staff and permissive moderation practices, Telegram has become a venue for free speech, information, and a wide range of narratives, including disinformation and conspiracy theories. Russian authorities have largely left Telegram untouched, in part because they believe they may be able to break its anonymity and track its users. As a result, Telegram has flourished as a platform for differing perspectives and the sharing of information during times of conflict.
Mr. Prigozhin’s Mansion and President Putin’s Presentation of Self
Yevgeny Prigozhin, the leader of the Wagner Group and the instigator of the mutinous march on Moscow, has come under scrutiny. Russian state television has aired video footage of raids on his property, revealing stacks of cash, expensive possessions, and an array of weapons. This portrayal of Prigozhin as a criminal contrasts with his previous denunciations of Ministry of Defense corruption. Analysts now view him as a crook who should be investigated and prosecuted.
In contrast, President Putin has been working to project strength and raise his public stature, undertaking prominent public engagements and appearing frequently in media outlets. Notably, the “collective West” has also highlighted the luxury and opulence associated with Putin himself, such as his armored train complete with a spa, cosmetological treatment suite, and gym.
In conclusion, AI’s potential use in disinformation poses significant challenges to combating the spread of false narratives. The flood of AI-generated content online has made it increasingly difficult for editors to curate quality contributions, and the systems meant to detect such content are easily evaded. Meta’s approach to disinformation on Threads involves identifying state-sponsored media and offering fact-checking features. Telegram has become a platform for war news and competing narratives, though Russian authorities may be seeking to gain control over it. Finally, the contrasting portrayals of Prigozhin and Putin highlight the complexities of power and self-presentation in Russian society. Together, these developments underscore the need for continued efforts to counter disinformation and protect the integrity of information in the digital age.

