Artificial intelligence has been blurring the fine line between truth and fiction. If you have read our coverage of the elections, you will have noticed how easily AI-generated images were used to divide Americans, many of whom struggled to separate fact from fantasy. To combat such misinformation, especially in journalistic coverage, many news agencies have partnered with the Content Authenticity Initiative, which The Phoblographer is also a part of. And now, a new report reveals how Agence France-Presse (AFP), along with Nikon, has managed to certify that its images are real.
As reported by AFP and Nikon Rumors, the news agency has tested a way to prove its photos are real using a prototype Nikon camera and the C2PA standard. The images were captured during the US elections and then examined for authenticity. Nikon Rumors adds that C2PA Content Credentials support will arrive via a firmware update, though only the Z9 and the Z6 II have been mentioned so far. Perhaps the Z8 will join the lineup soon.
According to AFP, the “proof of concept” is ready, and images can be verified wherever they appear on the web. The agency now intends to collaborate with the industry, including camera manufacturers, editing software developers, and news distributors, to advance this technology, safeguard photojournalism and its integrity, and protect its readers and photographers.
The experiment showcased that when the C2PA certification standard is used, it allows one to see the “authenticity of a news photograph throughout the entire distribution chain.” This is possible through four steps:
- With the Nikon prototype camera, the agency established a secure C2PA certificate that embeds a digital signature at the moment the image is shot. This feature is expected to launch in major cameras this year.
- The image is then uploaded to a secure server, where the file is duplicated. The copy of the original is edited and captioned by AFP per its editorial guidelines.
- Before the picture is distributed, AFP also adds a “unique, invisible, and encrypted watermark developed by IMATAG.” This watermark functions as a “digital fingerprint” that links the edited pictures to the original image on the server.
- With the “WeVerify” plugin, one can decode the watermark to access the original image. It is similar to checking a film negative, thus completing the transparency process for journalists and the public. This plugin has been co-developed by the agency and is used by fact-checkers and open-source intelligence (OSINT) researchers.
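For the technically curious, the four steps above can be sketched in code. The snippet below is a minimal, hypothetical illustration only: it uses Python's standard-library HMAC as a stand-in for the camera's C2PA signature (real C2PA uses X.509 certificates and a signed manifest embedded in the file) and a plain SHA-256 hash as a stand-in for IMATAG's proprietary invisible watermark.

```python
import hashlib
import hmac
import secrets

# Stand-in for the signing key provisioned inside the prototype camera.
CAMERA_KEY = secrets.token_bytes(32)

def sign_at_capture(image: bytes) -> bytes:
    """Step 1: the camera signs the image the moment it is shot."""
    return hmac.new(CAMERA_KEY, image, hashlib.sha256).digest()

def fingerprint(image: bytes) -> str:
    """Steps 2-3: derive a 'digital fingerprint' that links an edited
    copy back to the original stored on the agency's server."""
    return hashlib.sha256(image).hexdigest()

def verify_capture(image: bytes, signature: bytes) -> bool:
    """Step 4: a verifier confirms the image matches what was signed
    at capture; any pixel-level change breaks the check."""
    return hmac.compare_digest(sign_at_capture(image), signature)

original = b"raw sensor data"
sig = sign_at_capture(original)

# The server keeps the original; edits happen on a copy.
edited = original + b" [cropped, toned]"

assert verify_capture(original, sig)    # the untouched original verifies
assert not verify_capture(edited, sig)  # any alteration fails verification

# The fingerprint lets a verifier (like the WeVerify plugin) look up the
# original on the server, much like checking a film negative.
registry = {fingerprint(original): original}
assert registry[fingerprint(original)] == original
```

Note that the real standard relies on public-key signatures, so anyone can verify without holding the camera's secret key; the symmetric HMAC here is purely for brevity.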
Phil Chetwynd, AFP’s Global News Director, said that he hopes “other industry players will join us in adopting and further developing this initiative.” Similarly, Eric Baradat, AFP Global News Deputy for Photo and Archive, added that this test was critical for the agency. “The idea was to go beyond theory to prove to our peers, to the camera makers, to the television makers, that this is adaptable to video, that there is a solution to keep the trust our subscribers, our viewers have in our content, to make sure that trust is embedded all along its lifespan, everywhere it’s displayed with the image.”
This is a step in the right direction, but the platform and the C2PA initiative can only be perfected with time. As Baradat said, it will require immense effort from everyone, not just photojournalists but also those who build the technology. That hard work, however, will certainly pay off.
