Despite speculation that it was generated with artificial intelligence, an official photo of the Princess of Wales, Kate Middleton, and her three children released Sunday was likely just manipulated with a program like Photoshop, experts told NBC News.
Though there was no visible indication of generative AI in the photo, the speculation shows how quickly AI has captured the public’s imagination and become entangled with online conspiracy theories.
In tabloid headlines, viral tweets and widely read Reddit posts, online watchers of the unfolding tumult over Middleton’s public appearances quipped that the image “looks like AI.” A post on X that’s been viewed more than 4 million times called the photo an “AI doctored image.”
But that’s most likely not correct.
“I think it is unlikely that this is anything more than a relatively minor photo manipulation,” Hany Farid, a University of California, Berkeley, professor who investigates digital manipulation and misinformation, told NBC News. “There is no evidence that this image is entirely AI-generated,” he said.
The circumstances around the photo’s release were ripe for conspiratorial thinking.
The photo was the first of Middleton released to the public since January, when she underwent abdominal surgery. Middleton’s absence from the public eye during her recovery has become the subject of persistent conspiracy theories and speculation.
It’s common for professionally staged photographs to undergo light editing, such as tweaks to color or contrast. But the royal photo had clearly been manipulated in several places, a violation of most news wire services’ policies. The Associated Press, Getty Images and Reuters all issued “kill notices” for the photo, advising news agencies to remove it from their archives or not use it.
In a public explanation of the kill notice, the AP noted that a “close study of the image revealed inconsistencies that suggested it had been altered, for instance in the alignment of Princess Charlotte’s left hand with the sleeve of her sweater.”
There’s no indication that the image was a deepfake, meaning something a computer program created from scratch to realistically depict a person. Audio, video and still-image deepfakes have rapidly become more convincing and common in social media, and political operatives have used them in 2024 to try to mislead and sway voters.
But the manipulations in the Middleton photo appear to be the work of someone fiddling with Photoshop or other basic photo editing software, Farid said.
“I think most likely it is either some bad photoshop to, for example, remove a stain on the sweater, or is the result of in-camera photo compositing that combines multiple photos together to get a photo where everyone is smiling,” he said.
Middleton apologized Monday, saying, “Like many amateur photographers, I do occasionally experiment with editing,” but did not release the unedited version of the photo or more details.
Maura Grossman, a research professor at the University of Waterloo’s school of computer science and an expert in image manipulation, agreed the photo was not the result of generative AI, but cautioned that the wide range of tools for manipulating media leads to a more complicated view of what counts as real.
“It’s not black and white. People want to see it as ‘this is a fake image, this is not.’ There are gradations,” Grossman said.
“The line is gonna get blurrier and blurrier,” she said.