A new style of photography is suddenly popping up all over the web. It features very exotic (mostly female) human constructs in equally exotic and near perfect backgrounds. The images are mostly convincing as photographs but they are constructed by computer programs using artificial intelligence. I also see many portraits on Instagram of (mostly female) models that look almost perfect but in many cases there is something just "off" enough to cue one to dig deeper to see just how the images were made. Eyes and skin too perfect, the proportions just a bit off, etc. (A reminder that most modern entertainment technology evolves first in the pornography spaces...).
Programs such as DALL-E, Midjourney and Stable Diffusion work by translating textual descriptions (shit people write....) into illustrations which directly mimic photographs. People are essentially describing what they would like to see, entering the description into one of the popular programs, looking at the many iterations the programs quickly generate, choosing one of the images and then tweaking it in post production.
How did the programs become "smart enough" about photography to get to the point where they can do this? Easy: they stole your photographs off the internet, along with the photographs created by hundreds of millions of other photographers, then analyzed them endlessly and used the analysis to create content fabricated from bits and pieces; constructs based on similarities and bits of direct appropriation. Classic machine learning, I think. But the companies making this sort of AI software were totally dependent on gaps in current intellectual property and copyright laws to be able to steal our work and use it to train these "weapons" which will, almost surely, devastate the commercial markets for photography going forward. You can try to explain it all away or protest that I am being an alarmist but I think, as photographers, we're facing an existential inflection point that will make the market disruption caused by "penny stock" photography back in the 1990s look like a very minor blip.
Should you care? Not if you don't care about original human art, the theft of private property, or the appropriation of human work. Not if you don't care about the ability of advertisers and corporations to create alternate realities with which to more intrusively manipulate your reactions to their products, to "appropriate" copyrighted materials in order to strip you of wealth, security and stability, and to showcase damagingly unreal body and facial images to your children and grandchildren, with devastating psychological results. If you don't care you can just go along for the ride.
Being able to create images that look like real photographs just from written descriptions creates new weapons for bad actors looking to produce convincing deep fakes, near endless political misinformation, destructive propaganda and even worse things. And make no mistake, the same technology is coming for video. Soon bad actors/terrorists/governments will be able to "create" news events that never happened and speeches from trusted leaders that were never actually delivered, and all of it will be used in the service of stripping away your money and your rights.
But the first victims will be creative artists. Creative visual artists.
Popular photo websites backed by international corporations will jump in soon to "make it all okay." They'll extol how much fun you too could be having by using the programs to "create new art." But the sad coda to that campaign of getting people to love their own creative destruction will be the demise of the jobs of the very people writing about how great generative AI is right now. Once they convince enough of the population that we shouldn't care about the bad effects of unrestricted generative AI they'll be as disposable as the rest, and a new generation of highly refined ChatGPT and other AI applications will take their places. Their jobs. Their pulpits. And why not? If you were a leader in a mega-corporation wouldn't you love to replace a gaggle of writers and editors with robot writers that never get tired? That never push back when you ask them to work in the absence of morals and ethics? Whose primary mission is to extend the power and reach of their owners by manipulating content?
At the point where we lose control of the creative process and abdicate our rights to own and control our personal creative content it sure won't matter whether Sony cameras autofocus the quickest or Fuji has the nicest color science, because our robot overlords will no longer need us to use actual cameras and lenses to make more material; more data points for study. And that hobby/profession/fun pastime will disappear. And then we can skulk back to our homes and watch more TV. Or continue to cruise the web. The programming for which will also be generated by artificial intelligence with the sole purpose of controlling human thought, individual action and ultimately channeling cultural momentum. A dream scenario for authoritarians.
Fun times ahead. Of course this is just my take, pre-coffee. Let the apologists for misguided technology push back in the comments. I'd be interested to see how deeply the robots and their masters have implanted their rationalizations into the general population...
Gotta stop watching Transformer movies...