Adobe Proposes AI Image Opt-Out

24 April 2025

Adobe is championing a 'do-not-train' mechanism for images, drawing inspiration from the robots.txt file that dictates crawler access on websites. This initiative seeks to provide content creators with greater control over whether their images are used in AI model training. The proposal involves embedding metadata within images, signalling to AI systems whether the owner consents to the use of their content for machine learning purposes. This would establish a clear framework for respecting copyright and usage preferences in the age of generative AI.
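To make the idea concrete, here is a minimal sketch of how an embedded opt-out flag could be written and checked, using Python and Pillow with PNG text chunks. The key name 'do-not-train' and its values are illustrative assumptions for this example only, not Adobe's actual specification or any agreed standard.

```python
# Sketch of embedding and reading a hypothetical "do-not-train" flag
# as PNG text metadata using Pillow. The key name and values are
# illustrative assumptions, not Adobe's specification.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

FLAG_KEY = "do-not-train"  # hypothetical metadata key


def save_with_opt_out(img: Image.Image, path: str, opt_out: bool = True) -> None:
    """Save the image with an embedded opt-out flag in its text chunks."""
    meta = PngInfo()
    meta.add_text(FLAG_KEY, "true" if opt_out else "false")
    img.save(path, pnginfo=meta)


def is_training_allowed(path: str) -> bool:
    """Return False if the image carries an explicit do-not-train flag."""
    with Image.open(path) as img:
        text_chunks = getattr(img, "text", {})  # PNG text chunks, if present
    return text_chunks.get(FLAG_KEY, "false") != "true"


if __name__ == "__main__":
    sample = Image.new("RGB", (64, 64), color="gray")
    save_with_opt_out(sample, "sample.png", opt_out=True)
    print(is_training_allowed("sample.png"))  # False: owner opted out
```

In practice a standard of this kind would more likely rely on formats such as XMP or Content Credentials metadata rather than PNG text chunks, but the principle is the same: the preference travels inside the file, and a compliant crawler checks it before adding the image to a training set.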

This move comes as AI developers face increasing scrutiny regarding the sourcing of training data and potential copyright infringements. By creating a standardised method for indicating image usage rights, Adobe aims to foster a more transparent and ethical AI ecosystem. The adoption of such a system could significantly impact the development of AI models, potentially limiting the datasets available for training and encouraging AI companies to seek explicit consent from content creators. It remains to be seen how the industry will respond to Adobe's proposal and whether a consensus can be reached on a universal 'do-not-train' standard.

The proposal could alleviate concerns among artists and photographers who worry about their work being used without permission to train AI models that could then compete with their creations. If widely adopted, this mechanism could become a crucial tool for protecting intellectual property rights in the rapidly evolving landscape of AI and digital content creation.
