Comment by jjeaff

2 years ago

In fairness to the AI, I have often been confused by stock images or old images on news articles that are not from the event in question.

Photo attribution is a bit of a problem. For a tornado in Kansas they may use an image from another year’s tornado in Mississippi. For the war in Azerbaijan they might use an image from Chechnya, etc.

  • That is precisely the type of editorial affordance I would expect the AI to strip. This is just another way for media organizations to distort the news. I look forward to those enhancements.

    • False metadata for rich media is a damned tough problem to tackle.

      Putting aside any actually truthful captions, how do I know that "image of X" is actually an image of X?

      Reading some of the Bellingcat investigations, and seeing how much time they spend on this, doesn't bode well.

      I guess you could TinEye and index/hash the entire web's worth of rich media, then spot discrepancies (listed as X here, but Y there), but that seems horrendous in compute/bandwidth/storage terms.
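
      FWIW, the core of that approach fits in a few lines; the hard part is the scale. A rough Python sketch (the file paths, captions, and distance threshold are hypothetical) that perceptual-hashes images and flags near-duplicates whose captions disagree:

      ```python
      # Sketch of the "hash everything, spot discrepancies" idea:
      # compute a perceptual (average) hash per image, group near-duplicates,
      # and flag pairs whose claimed captions conflict.

      from PIL import Image

      def average_hash(path, size=8):
          """8x8 average hash: 64 bits, roughly robust to resizing/recompression."""
          img = Image.open(path).convert("L").resize((size, size))
          pixels = list(img.getdata())
          mean = sum(pixels) / len(pixels)
          bits = 0
          for p in pixels:
              bits = (bits << 1) | (1 if p > mean else 0)
          return bits

      def hamming(a, b):
          """Number of differing bits between two hashes."""
          return bin(a ^ b).count("1")

      def find_conflicts(items, max_distance=5):
          """items: list of (image_path, claimed_caption).
          Returns pairs of near-identical images carrying different captions."""
          hashed = [(average_hash(path), path, caption) for path, caption in items]
          conflicts = []
          for i, (h1, p1, c1) in enumerate(hashed):
              for h2, p2, c2 in hashed[i + 1:]:
                  if hamming(h1, h2) <= max_distance and c1 != c2:
                      conflicts.append(((p1, c1), (p2, c2)))
          return conflicts

      if __name__ == "__main__":
          # Hypothetical corpus: the same wire photo reused under two captions.
          corpus = [
              ("photos/tornado_2019.jpg", "Tornado damage, Mississippi, 2019"),
              ("photos/article_hero.jpg", "Tornado strikes Kansas town"),
          ]
          for (p1, c1), (p2, c2) in find_conflicts(corpus):
              print(f"Same image, different stories: {p1!r} ({c1}) vs {p2!r} ({c2})")
      ```

      The quadratic comparison and the single-machine index are exactly where the compute/bandwidth/storage pain shows up once you aim it at the whole web.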

      3 replies →

    • > This is just another way for media organizations to distort the news

      No, it's not. This is done because stories with images perform better, and obtaining (and licensing) photos of every specific event is not always possible.

      3 replies →

  • True, I have a photograph taken in Kenya that has been variously described as in California, Guatemala, Colombia, Australia and South Africa.