Is Facial Manipulation Detection Software an Attack on Free Speech?

Large corporations, publications, and photographers have long used Photoshop to manipulate the human form – and now people are fighting back.

First of all, Photoshop isn’t inherently a bad piece of software; it has been the foundation of photo editing for decades. Used well, it helps creators blend imagery and imagination, producing some interesting work in the process. But it’s also used in ways I believe are unhealthy – especially when it comes to body image.

The Photoshop Era

“Can you use Photoshop to change how I look?” It’s a question most photographers have heard, either directly or indirectly. Once word got out that software like Photoshop existed, people were happy to share their insecurities and ask photographers to fix them in post.

Before reaching the masses, the idea of changing a person’s body came from large brands and publications. Skin softening, waist tightening, and hip expanding became so normal that they were widely accepted. A micro extension of that is the social media filter. Companies like Instagram and Snapchat realized people wanted to change how they looked, so they created a simple, one-click solution that let people manipulate their appearance – without complex tools like Photoshop.

Of course, we know the consequences of this: low self-esteem, jealousy, and comparing oneself to fake body standards. All of this can lead to anxiety and depression – it’s a sad state for people to be in.

Computer Scientists Fight Back

So when a team of computer scientists came together, they had one mission: to call out image manipulation. In a paper hosted on Cornell University’s arXiv, they wrote:

“Most malicious photo manipulations are created using standard image editing tools, such as Adobe Photoshop. We present a method for detecting one very popular Photoshop manipulation — image warping applied to human faces — using a model trained entirely using fake images that were automatically generated by scripting Photoshop itself.”

“We show that our model outperforms humans at the task of recognizing manipulated images, can predict the specific location of edits, and in some cases can be used to “undo” a manipulation to reconstruct the original, unedited image. We demonstrate that the system can be successfully applied to real, artist-created image manipulations.”

Ironically, Adobe – the company behind Photoshop – teamed up with the researchers to help launch the detection tool on a mainstream scale. Adobe announced the collaboration back in 2019, but the tool has yet to fully materialize – likely because these things take time and, of course, because of the pandemic.

I was recently reminded of this software, and it got me thinking: how would it be used on a grand scale?

Freedom to create

I’ve long been on the side of freedom of speech, and with that should come the freedom to create art and content. But the older I get, and the more I see the mess society can get itself into, the more I think certain things should be controlled.

For example, a profitable business that uses models should not be able to manipulate an individual’s body shape. And tabloids shouldn’t be allowed to edit candid photos at all. Celebrating a fake body is harmful, and I believe we should try to eradicate it from society.

Tools like the one mentioned above can help hold corporations and publications accountable. The “Can you use Photoshop to change how I look?” era needs to die – quickly. As do the silly filters people put on their photos before publishing them to dating apps and other social platforms.

Humans are beautiful, and we should celebrate that. If stronger regulations and image manipulation detection can help us do that, then I’m all for it!

Photoshop wasn’t used for the editing of the lead image.

Dan Ginn

Dan Ginn is a content writer and journalist. He brings with him five years' experience writing in the photographic niche. During that time he has worked with a range of leading brands, as well as a host of professional photographers within the industry.