A fake image of a Pentagon explosion went viral on social media after a fake account posted it on Twitter yesterday. The photograph shows a large plume of gray smoke next to the Pentagon building, but the US Department of Defense confirmed that no explosion had occurred.
The image circulated quickly across social platforms and even made it onto Indian news networks. The incident highlights the dangers of AI-generated deepfakes and their ability to spread misinformation.
The Fake Pentagon Explosion
The Guardian reported that the image appeared to show an explosion next to the Pentagon, the US Department of Defense headquarters in Washington DC. The picture spread quickly on Twitter, but its origin remains unknown.
According to NPR, the US Department of Defense responded quickly, saying in a joint Twitter statement with the Arlington County Fire Department:
“There is NO explosion or incident taking place at or near the Pentagon reservation, and there is no immediate danger or hazards to the public.”
Although the image was fake, it caused a brief 0.26% drop in the stock market.
According to CNN, a fake Twitter account claiming an association with Bloomberg News posted the image. The photo carried the caption, “Large explosion near the Pentagon complex in Washington DC. – initial report.”
Twitter suspended the account, but not before the post went viral across the platform and was shared by many verified accounts.
One of the first verified accounts to share the image was OSINTdefender, a news page covering international military conflicts with over 336,000 followers.
Indian TV networks reported the explosion and showed the image, basing their coverage on a Twitter post from RT, a Russian news network with over 3 million followers. RT has since deleted the original post.
Twitter verification has been mired in controversy as the new CEO, Elon Musk, tries to revamp the system. The fake Pentagon picture, however, highlights that even verified accounts can spread false information.
AI and Misinformation
Many experts say the image is obviously AI-generated: the building looks different from the actual Pentagon, and some features, such as the fence and a lamppost, are blurred and distorted.
But most people don’t analyze images before sharing them, which lets misinformation spread rapidly.
According to Forbes, the Pentagon picture is one of several AI-generated images to go viral recently; one even won a prize at the Sony World Photography Awards.
The BBC reported that German photographer Boris Eldagsen submitted the AI-generated photograph to the competition to test the judges. After winning, he announced the picture was fake and declined the award.
In March, a fake image of Pope Francis wearing a trendy puffer jacket and several AI-generated pictures dramatizing Donald Trump’s arrest also went viral.
AI is advancing rapidly, and some industry leaders have even called for a pause in its development. The fake Pentagon explosion image had real effects on the stock market, calling attention to the threat of AI-driven misinformation.