Text-to-image generative artificial intelligence (AI) has hit the photography community especially hard. From images fabricated to pass as records of real events (as seen with the ongoing genocide in Gaza) to models plagiarising the works of renowned photographers, generative AI has opened a Pandora’s box. Much of the backlash over such troublesome events is aimed at companies such as Meta and Adobe, which are widely seen as having abdicated their responsibility to build AI ethically and to avoid pilfering artists’ livelihoods. Although tech circles remain uncertain about the future, Adobe Stock now seems to be pushing back against this dystopian narrative.
The company, which various photographers have criticized over the past few years, has released new guidelines for its Adobe Stock contributors. As first reported by Photo Archive News, Adobe Stock now prohibits using the names of real artists (and even fictional individuals) in generative AI prompts. Phrases such as “in the style of”, “inspired by”, “influenced by”, “in the tradition of”, or “drawing on” also cannot be used in association with an artist’s name.

Such a change may seem performative, which is why its timing matters. Last month, AI-generated imitations of legendary photographer Ansel Adams’ photographs appeared on the platform, drawing a monumental backlash from his estate and the photographer’s admirers. Although Adobe withdrew the images, the incident left a sour taste in everyone’s mouth. The new policy aims to restrain the pictures that might otherwise slip through Adobe Stock’s grasp.

How Do You Safeguard Your Photographs Against AI?
Here’s a checklist of what you can do in light of the Adobe Stock AI image policy.
Watermark, Metadata, And Digital Signatures
The easiest way to avoid AI data theft is simply not to publish your work online. However, in our visually cluttered world, being out of sight means you and your work are out of the audience’s mind. A more practical route, one that doesn’t cost you contracts and assignments, is to bring back watermarks and digital signatures. A digital signature is difficult to strip from a file and lets you verify ownership when needed. Similarly, embedding metadata helps prove that the photographs belong to you alone. If your pictures are ever used for machine learning, these measures can strengthen an infringement case.
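As a minimal sketch of the “verifiable ownership” idea, you can record a cryptographic fingerprint of each file before publishing it. The function below is an illustrative example, not a standard tool: it hashes the image bytes and pairs the digest with an author name and a timestamp, giving you a record you can later compare against a suspect copy.

```python
import hashlib
import json
from datetime import datetime, timezone

def ownership_record(image_bytes: bytes, author: str) -> dict:
    """Build a simple provenance record for an image file's contents.

    The SHA-256 digest changes if even one byte is altered, so a
    matching digest later is strong evidence the file is your original.
    """
    return {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "author": author,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

# Example: fingerprint some stand-in bytes (use your real file's bytes).
sample = b"\xff\xd8\xff\xe0 fake JPEG bytes for demonstration"
record = ownership_record(sample, "Your Name")
print(json.dumps(record, indent=2))
```

Keeping such records alongside your originals (ideally timestamped by a third-party service as well) gives you independent evidence of when a file existed in your hands.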
Opting Out
A few portals allow you to opt out of having your data used to train AI models; if third parties never get access to your photographs, AI cannot learn to copy your style from them. There are caveats, however: the rules are ever-evolving, and the number of sites that offer an opt-out is limited.
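If you host your own portfolio site, one common opt-out signal is a `robots.txt` file that asks known AI-training crawlers to stay away. This is a sketch, not a guarantee: the user-agent names below are the publicly documented ones for OpenAI, Google’s AI training, and Common Crawl, and honoring them is voluntary on the crawler’s side.

```
# robots.txt — request that AI-training crawlers not fetch this site.
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```

Place the file at the root of your domain (e.g. `example.com/robots.txt`); well-behaved crawlers check it before fetching anything else.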
Cloak Your Photographs
This one may sound confusing, but it is an effective way to combat AI training. Tools such as Glaze, Mist, and Nightshade use “adversarial cloaking”: after analyzing your photographs, the software subtly alters their pixels. The changes are so minuscule that human perception is unfazed, but when an AI ‘reads’ the photographs, its algorithms are confused and reproduce messy images. Think of it as camouflage for your pictures. This, too, has a caveat: sufficiently advanced AI can break through the shield.
Copyright Laws
Legal action may sound cumbersome, but it is one of the strongest ways to safeguard your work against AI. Copyright infringement occurs when someone uses your entire photograph, or a considerable part of it, without permission, or produces a close replica of it — for instance, a painting that copies your picture exactly, or an ‘inspired’ work that looks strikingly similar. Likewise, if a corporation trains its AI on your pictures without authorization, that may constitute copyright infringement.
In such instances, gather as much evidence as possible before you file a case: screenshots, records, or anything else showing that your photos were used without approval. You can then send a legal notice to the infringer, whether a company or an individual deploying AI, threatening to sue if they do not stop; if the violations continue, you can take them to court, and your lawyer can help you seek compensation for the damages. Bear in mind that a lawsuit is the last resort for defending the copyright of your images.
Pivot to Websites Like Cara
While portals like Cara are still few, they are the beacons of hope we need when social media platforms descend into AI purgatory. Founded by photographer Jingna Zhang, the website protects human-created work and ensures that only such work is promoted on the platform. Moreover, it uses Glaze and Nightshade so that AI models cannot replicate your style.
More Ways to Help Against AI:
- Run reverse image searches for your photographs on major search engines.
- Set up Google Alerts for your name, image titles, or key phrases to help spot uses of your work.
- Employ digital tracking tools built specifically for intellectual property rights management; they can determine whether your work is being used without consent, a license, or attribution.
- Ask trusted individuals, such as family members or friends, to watch for infringements of your work.
- Upload only low-resolution copies for sharing online.
- Use Facebook’s ‘Off-Facebook Activity’ tool to limit the personal information Meta can collect to train its AI.
- Read the Terms & Conditions before you upload your photographs.
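The low-resolution tip above is easy to automate. The helper below is a small illustrative sketch (the 1024-pixel cap is an arbitrary choice, not a standard): it computes web-safe dimensions that preserve aspect ratio, which you would then feed to whatever image editor or resizing library you use.

```python
def downscale_size(width: int, height: int, max_edge: int = 1024) -> tuple[int, int]:
    """Return dimensions capped at max_edge on the longest side,
    preserving aspect ratio. Images already small enough pass through."""
    longest = max(width, height)
    if longest <= max_edge:
        return width, height
    scale = max_edge / longest
    return round(width * scale), round(height * scale)

# A 4000x3000 original becomes a 1024x768 web copy.
print(downscale_size(4000, 3000))
```

Sharing only the downscaled copy keeps the full-resolution original, the one with real commercial and training value, off the open web.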
While these measures cannot absolutely guarantee the safety of your pictures, they can certainly help you retain your copyright. Keeping AI from bleeding into our creative lives is demanding, but it can be accomplished if we use every safeguard at our disposal. It may sound arduous, but it is the only way to ensure you don’t end up with the short end of the stick in this AI-driven world.
