Apple finally jumped on the artificial intelligence bandwagon at WWDC 2024, later than its peers. One of the company's most crucial events, this year's keynote was met with mixed reactions. While Apple wears the proverbial crown for impeccable privacy and data protection, its new partnership with OpenAI has raised some eyebrows. Some would even say Apple is bending the knee to market trends when, at one point, it was a trendsetter. But, letting the past be where it belongs, one crucial question worries Apple users (and rightly so): is their data safe, especially photographers'?
The lead image is taken from Apple’s Blog.
Per the company, Apple Intelligence (its new spin on the "AI" initials) will process data privately on the iPhone, iPad, or Mac itself. When a task demands more processing power than the device can supply, it is handed off to Apple's cloud-based systems. Notably, these AI services will be limited to newer models, whose hardware is designed to handle artificial intelligence workloads.

Primarily, iOS 18 packs major updates that reshape the experience of owning an Apple device. Some "improvements" suggest Apple is revamping its features to match those already seen on Android. For instance, its photo clean-up tool is akin to Google's Magic Eraser, and its natural-language search for photos and videos mimics Google Photos. Interestingly, the company's AI can not only choose the best images but also turn "them into a movie with its narrative arc."
But that's not all. Apple Intelligence can generate images inside Messages that cater to your conversations. So, if you plan to throw a bachelorette party, its AI has you covered with references. Your device can also create illustrations from photos of the people you are talking to. Similarly, Apple's headset gets a new photo feature: all you and your friend have to do is tap the images you want to see together, and Vision Pro will blow them up for you in 3D. Filmmakers, whether commercial or indie, also gain Apple's Immersive Video tool. Again, the company reiterates that the photographs and clips stored on your device will not be shared with third parties.

But with OpenAI lending its services, any photographer's concern about data privacy is rational. After all, your pictures could be used in varied ways without you receiving credit or pay. Moreover, OpenAI's ChatGPT isn't the best role model for ethical, licensed data training: OpenAI, along with Google and Microsoft, faces lawsuits over AI scraping and copyright violations.
Generally speaking, too, generating illustrations or AI photos to cater to your needs is problematic in itself. If I need a mood board, I can create one. If I need an illustration, I can ask an artist to make one (and even pay them in the process). These changes are frightening because they mean AI studies your every move, behaviour, like, and dislike to tailor the experience. Do I want my device to know me that well? Maybe not. That kind of scrutiny seems better suited to a government.

To answer the calls of distress, Apple's blog and some X users have further explained that "Private Cloud Compute" will ensure your data doesn't fall into the wrong hands. They state that user data is not retained and that there is no privileged access, even for Apple's site reliability engineers (SRE). Apple also maintains that the hardware and OS behind the service are protected against attacks and hacks.
It seems too early to say how Apple Intelligence will unfold for users, but one thing remains clear: Apple can't risk tainting its brand image. Photographers already have to battle for their livelihoods, and Apple joining the AI fray is yet another punch in the gut. But keep this in mind: whatever Apple does next with iOS 18, it has to calculate the risks and challenges. After all, it takes decades to build a reputation and seconds to ruin it entirely.
