The one I was thinking of is this one from a Facebook page, but looking around a bit more, there’s also this one from someone’s Instagram. The Instagram one is mainly notable because it dates the image back to at least 2021, making it even less likely to be AI-generated.
The common attribution appears to be this Instagram account, but Google Images didn’t show me one from that account when looking for other versions of the photo, and I’m not about to make an Instagram account just to scroll through years of photos looking for the potential original.
You might be right. But my theory is that the “watermark” is typical almost-legible AI gibberish text (it almost looks like it says “Photography,” but does it really?) and that the model is pulling from similar-looking images in the training data, like when it tries to slap a Getty Images watermark on an output image.
The watermark is noticeably more readable in the Facebook image I linked, though, and it does say “Photography.” Even there it’s somewhat blurred, so assuming it was actually clear in the original source, that copy is a few recompressions along the chain.
The dates of the other sources, however, are what really convince me it’s not AI. After all, who was doing good-quality photorealistic AI image generation in 2021?
The watermark is the photographer’s name; here’s the high-res picture and some other angles taken from his Facebook page.
Okay, maybe they’re real. We may never know for sure! ;)