A.I. Can Make Art That Feels Human. Whose Fault Is That?


Critic’s Notebook

A fake Drake/Weeknd mash-up is not a threat to our species’s culture. It’s a warning: We can’t let our imaginations shrink to machine size.


By Jason Farago

This was the year — ask your stockbroker, or the disgraced management of Sports Illustrated — that artificial intelligence went from a dreamy projection to an ambient menace and perpetual sales pitch. Does it feel like the future to you, or has A.I. already taken on the staleness and scamminess of the now-worthless nonfungible token?

Artists have been deploying A.I. technologies for a while, after all: Ed Atkins, Martine Syms, Ian Cheng and Agnieszka Kurant have made use of neural networks and large language models for years, and orchestras were playing A.I.-produced Bach variations back in the 1990s. I suppose there was something nifty the first time I tried ChatGPT — a slightly more sophisticated grandchild of Eliza, the ’60s therapist chatbot — though I’ve barely used it since then; the hallucinatory falsehoods of ChatGPT make it worthless for journalists, and even its tone seems an insult to my humanity. (I asked: “Who was the better painter, Manet or Degas?” Response: “It is not appropriate to compare artists in terms of ‘better’ or ‘worse,’ as art is a highly subjective field.”)

Still, the explosive growth of text-to-image generators such as Midjourney, Stable Diffusion and Dall-E (the last is named after the corniest artist of the 20th century; that should have been a clue) provoked anxieties that A.I. was coming for culture — that certain capabilities once understood as uniquely human now faced computational rivals. Is this really the case?

Without specific prompting, these A.I. images default to some common aesthetic characteristics: highly symmetrical composition, extreme depth of field, and sparkly and radiant edges that pop on a backlit smartphone screen. Figures have the waxed-fruit skin and deeply set eyes of video game characters; they also often have more than 10 fingers, though let’s hold out for a software update. There is little I’d call human here, and any one of these A.I. pictures, on its own, is an aesthetic irrelevance. But collectively they do signal a hazard we are already facing: the devaluation and trivialization of culture into just one more flavor of data.


A.I. cannot innovate. All it can produce are prompt-driven approximations and reconstitutions of preexisting materials. If you believe that culture is an imaginative human endeavor, then there should be nothing to fear, except that — what do you know? — a lot of humans have not been imagining anything more substantial. When a TikTok user in April posted an A.I.-generated song in the style (and voices) of Drake and the Weeknd, critics and copyright lawyers bayed that nothing less than our species’s self-definition was under threat, and a simpler sort of listener was left to wonder: Was this a “real” song? (A soulless engine that strings together a bunch of random formulas can pass as Drake — hard to believe, I know….)

An apter question is: Why is the music of these two cocksure Canadians so algorithmic to begin with? And another: What can we learn about human art, human music, human writing, now that the good-enough approximations of A.I. have put their bareness and thinness on full display?

As early as 1738, as the musicologist Deirdre Loughridge writes in her engaging new book “Sounding Human: Music and Machines, 1740/2020,” Parisian crowds were marveling at a musical automaton equipped with bellows and pipes, capable of playing the flute. They loved the robot, and happily accepted that the sounds it produced were “real” music. An android flutist was, on its own, no threat to human creativity — but impelled philosophers to understand humans and machines as perpetually entangled, and artists to raise their game. To do the same in the 21st century will require us to take seriously not only what capabilities we share with machines, but also what differentiates us, or should.


I remain profoundly relaxed about machines passing themselves off as humans; they are terrible at it. Humans acting like machines — that is a much likelier peril, and one that culture, as the supposed guardian of (human?) virtues and values, has failed to combat these last few years.

Every year, our art and entertainment have resigned themselves further to recommendation engines and ratings structures. Every year our museums and theaters and studios have further internalized the tech industry’s reduction of human consciousness into simple sequences of numbers. A score out of 100 for joy or fear. Love or pain, surprise or rage — all just so much metadata. Insofar as A.I. threatens culture, it’s not in the form of some cheesy HAL-meets-Robocop fantasy of out-of-control software and killer lasers. The threat is that we shrink ourselves to the scale of our machines’ limited capabilities; the threat is the sanding down of human thought and life to fit into ever more standardized data sets.

It sure seems that A.I. will accelerate or even automate the composition of elevator music, the production of color-popping, celebratory portraiture, the screenwriting of multiverse coming-of-age quests. If so, well, as Cher Horowitz’s father says in “Clueless,” I doubt anybody would miss you. These were already the outputs of “artificial” intelligences in every way that matters — and if what you write or paint has no more profundity or humanity than a server farm’s creations, then surely you deserve your obsolescence.


Rather than worry about whether bots can do what humans do, we would do much better to raise our cultural expectations of humans: to expect and demand that art — even and especially art made with the help of new technologies — testify to the full extent of human powers and human aspirations. The Ukrainian composer Heinali, whose album “Kyiv Eternal” I’ve held close to me throughout 2023, reconstructed the wartime capital through beautiful reconciliations of medieval plainsong and contemporary synthesizers. The sculptures of Nairy Baghramian, which I chased down this year in Mexico City, in Aspen, in the garden at MoMA and on the facade of the Met, deploy the most contemporary methods of fabrication for the most fragile and tender of forms. These artists are not afraid of technology. They are not replaceable by technology, either. Technologies are tools for human flourishing.

I spent a lot of this year thinking about stylistic exhaustion, and the pervading sense that, in digital times, culture is going nowhere fast. The worries that accompanied artificial intelligence in 2023 reaffirmed this fear: that we’ve lost something vital between our screens and our databases, that content has conquered form and novelty has had its day. If our culture has grown static, then might we call our dissembling chatbots and insta-kitsch image engines what they are: mirrors of our diminished expectations?

Seen that way, I might even allow myself to wonder if A.I. might be the best thing to happen to culture in years — that is, if these perpetual mediocrity machines, these supercharged engines of cliché, end up pressing us to revalue the things humans alone can do. Leaving behind “a narrow fixation on how humanly machines can perform,” as Loughridge writes, now is the time to figure out “what it means to work with and exist in relation to them.”

To make something count, you are going to have to do more than just rearrange precedent images and words, like any old robot. You are going to have to put your back into it, your back and maybe also your soul.

Jason Farago, a critic at large for The Times, writes about art and culture in the U.S. and abroad.

A version of this article appears in print in Section AR, Page 16 of the New York edition with the headline: “A.I.’s Biggest Threat: Shrunken Ambitions.”


