Adobe will release a Content Authenticity web app beta in 2025, helping artists protect their work from deepfakes and data misuse.

Adobe plans to launch a beta version of its Content Authenticity web app in early 2025, allowing creators to attach digital certificates to their work, including images, videos, and audio. The technology uses digital fingerprints, invisible watermarks, and digitally signed metadata to protect intellectual property.
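To illustrate the signed-metadata piece, here is a minimal Python sketch using the cryptography package. This is not Adobe's actual Content Credentials format, which follows the C2PA specification; the manifest fields, key handling, and names below are purely illustrative.

```python
import hashlib
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Hypothetical manifest; real Content Credentials follow the C2PA spec.
asset_bytes = b"...raw image bytes..."
manifest = json.dumps({
    "creator": "Jane Doe",                      # illustrative field names
    "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
    "ai_used": False,
}, sort_keys=True).encode()

# The creator (or a trusted issuer) signs the manifest...
private_key = ed25519.Ed25519PrivateKey.generate()
signature = private_key.sign(manifest)

# ...and anyone holding the public key can verify it later.
public_key = private_key.public_key()
try:
    public_key.verify(signature, manifest)  # raises if the manifest was altered
    print("metadata verified")
except InvalidSignature:
    print("metadata was tampered with")
```

Because the manifest includes a hash of the asset itself, any edit to either the pixels or the metadata invalidates the signature, which is what makes such a certificate tamper-evident.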

Digital fingerprinting assigns the file an ID that can be recovered from the content itself, so even if the attached authenticity information is stripped away, the file can still be matched back to its original creator. Invisible watermarking, meanwhile, hides information by making extremely small changes to pixels, undetectable to the naked eye.
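Adobe has not published the algorithms behind these features, but both ideas can be sketched. Below is a toy Python example (using NumPy) of each: an average-hash fingerprint, computed from pixel content alone so it survives metadata stripping, and a least-significant-bit watermark, which changes each affected pixel value by at most 1. Adobe's production techniques are certainly more robust; this is only an illustration.

```python
import numpy as np

def fingerprint(pixels: np.ndarray) -> int:
    """Toy perceptual fingerprint: a 64-bit 'average hash' of an 8x8 thumbnail.
    Derived from pixel content alone, so stripping metadata doesn't remove it."""
    t = pixels[: pixels.shape[0] // 8 * 8, : pixels.shape[1] // 8 * 8]
    bh, bw = t.shape[0] // 8, t.shape[1] // 8
    thumb = t.reshape(8, bh, 8, bw).mean(axis=(1, 3))  # 8x8 grid of block means
    bits = (thumb > thumb.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

def embed_watermark(pixels: np.ndarray, bits: list[int]) -> np.ndarray:
    """Hide bits in the least-significant bit of the first len(bits) pixels;
    each altered pixel changes by at most 1, invisible to the naked eye."""
    out = pixels.copy()
    flat = out.reshape(-1)                    # view into the copy
    for i, b in enumerate(bits):
        flat[i] = (int(flat[i]) & 0xFE) | b   # overwrite the lowest bit
    return out

def read_watermark(pixels: np.ndarray, n: int) -> list[int]:
    return [int(v) & 1 for v in pixels.reshape(-1)[:n]]

img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in image
creator_id = [1, 0, 1, 1, 0, 0, 1, 0]         # hypothetical 8-bit creator ID
marked = embed_watermark(img, creator_id)

assert read_watermark(marked, len(creator_id)) == creator_id
assert np.abs(marked.astype(int) - img.astype(int)).max() <= 1  # changes are tiny
print(f"fingerprint: {fingerprint(marked):016x}")
```

In a system like Adobe's, the recovered watermark or fingerprint would presumably be matched against a registry of certified works to re-associate a stripped file with its creator's credentials.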

Together, these techniques ensure that a work's digital certificate stays attached no matter where it travels on the web or on mobile devices, according to Andy Parsons, Senior Director of Content Authenticity at Adobe.

Adobe, with more than 33 million paid software subscribers, has the potential to attract a large audience for this tool. Even artists who don't use Adobe software can use the web app to certify their work.

Adobe has also partnered with major industry players to promote online content authenticity. The company co-founded two industry groups, the Content Authenticity Initiative and the Coalition for Content Provenance and Authenticity (C2PA), whose members include camera manufacturers, tech companies like Microsoft and OpenAI, and major platforms like TikTok, Google, and Facebook. While not all of these companies have integrated Adobe's content certification, they have shown interest.

Because content certification is not yet displayed on every platform, Adobe is launching a Chrome browser extension and an Inspect tool on its website that let users discover and view authenticity information for works across the internet.

Users can view their creations directly on the web. Source: Adobe

Adobe isn't against the use of AI; rather, it is trying to make clear when AI has been used in a work and to prevent artists' work from being pulled into training datasets without consent.

The company even has its own generative AI tool, Firefly, which is trained only on licensed content such as Adobe Stock images, making it safe for commercial use.

Adobe has also partnered with Spawning, a platform that helps artists monitor how their work is used in AI datasets: its “Have I Been Trained?” tool lets artists check whether their work has been included in training sets without permission.

Adobe will launch the beta version of its Content Authenticity Chrome extension next week and open sign-ups for creators who want to use the web app when the full version arrives next year.