What Happens to Illustrators When Robots Can Draw Robots?


The first time Michael Whelan was warned that robots were coming for his job, it was the 1980s. He had just finished painting the cover for a mass-market paperback edition of Stephen King’s “The Dark Tower: The Gunslinger,” a gritty portrait of the title character with the outline of a tower glimpsed through the haze behind him.

The art director for the project told him to enjoy these cover-art gigs while he could, because soon they would all be done by computers. Whelan dismissed him at the time. “When you can get a good digital file or photograph of a dragon, let me know,” he recalled saying.

For the next three decades, Whelan kept painting covers the old way — on canvas, conjuring dragons, spaceships and, of course, robots for science fiction and fantasy giants including Arthur C. Clarke, Isaac Asimov, Ray Bradbury and Brandon Sanderson.

Over time, Whelan forgot the art director’s name, but not his words. Now, he said, the day he was warned about is here. Robots have already started taking book illustration jobs from artists — and yes, they can paint dragons.

Over the past few months, users working with A.I. art generators have created hundreds of images in Whelan’s style that were slightly altered knockoffs of his work, he said, forcing him to devote considerable time and resources to getting the images removed from the web.

“As someone who’s been in this genre for a long time, it doesn’t threaten me like it does younger artists who are starting out, who I have a lot of concern for,” Whelan said. “I think it’s going to be really tough for them.”

While much of the discussion in publishing around generative artificial intelligence tools such as ChatGPT has focused on A.I.’s unauthorized use of texts for training purposes, and its potential to one day replace human authors, most writers have yet to be directly financially affected by A.I. This is not true for the commercial artists who create their book covers.

Even though the ownership of A.I.-generated art is currently a legal hornet’s nest, with a judge ruling in August that such work cannot be copyrighted, this hasn’t stopped A.I. generators from using work by human artists for training. Nor has it prevented A.I.-generated art from being used on major book covers.

Late last year, Tor Books, a major publisher of science fiction and fantasy, said it had used an image on the cover of the novel “Fractal Noise,” by Christopher Paolini, that it later learned had likely been created in part by A.I. In May, Bloomsbury Publishing admitted that the British edition of Sarah J. Maas’ “House of Earth and Blood” inadvertently used an A.I.-generated image on the cover. In both cases, the publishers said that the art came from stock image sites and that the design teams were unaware they were likely generated by A.I. until fans reacted with an uproar.

The explanation did little to reassure proponents of human-generated art. After all, if books by best-selling authors like Maas and Paolini aren’t safe from the robots, whose are?

“It seems a bit like sci-fi come into real life,” said the illustrator Chris Sickels, who created the cover for the recent robot fairy tale “In the Lives of Puppets.”

But the battles between humans and machines haven’t played out in exactly the manner predicted by science fiction. Asimov’s famous concept, the Three Laws of Robotics, forbids artificial intelligence from harming human beings — but does not mention potential copyright infringements or replacing human jobs.

“The funny thing is, in so much science fiction it was theorized that robots, A.I., all this stuff was going to take over the drudgery, the hard labor, and free up humans to do creative work,” Paolini said. “Instead the A.I. is taking over the creative work, and we’re all stuck doing the hard labor.”

For Kelly McKernan, a Nashville-based artist and illustrator, this strange new reality has meant fewer projects from self-published authors, bands, and small presses — the types of illustration gigs that used to be a significant source of income.

“I’m finding that a lot of people who would have hired me before are now moving to A.I.,” McKernan said. For the first time, McKernan, who uses they/them pronouns, has had to take on work beyond illustration to make ends meet. That’s part of the reason that in January, along with the artists Karla Ortiz and Sarah Andersen, McKernan filed a class-action lawsuit against the makers of several popular A.I. art generators.

McKernan regularly sees echoes of their work in A.I. creations and knows that at least 50 of their images have been collected in data sets used to train A.I. models. By the end of last year, their name had been used more than 12,000 times as a style prompt by users of A.I. art generators.

McKernan’s suit is facing an uphill battle. On Oct. 30, U.S. District Judge William Orrick dismissed all but one of the claims, and gave McKernan, Ortiz and Andersen 30 days to amend their complaint.

Even before the ruling, Pamela Samuelson, a law professor at the University of California, Berkeley, was not optimistic about the suit’s chances. “At some level of abstraction, style is not protectable by copyright law at all,” Samuelson said. Nor is the loss of wages for human artists a compelling legal argument, she added: “Copyright law is not a jobs program.”

However, human-created art did score a major victory in August when a federal court in Washington, D.C., ruled that only works with human authors can receive copyright protection. This makes using A.I. to create commercial work unattractive in many instances, Samuelson said, because any images created would be in the public domain.

Even so, some artists are going to great lengths to protect their work from being used to train A.I. art generators without their consent. Researchers at the University of Chicago released a tool called “Nightshade” that aims to “poison” A.I. models by allowing artists to upload their images with code intended to mislead A.I. art generators.

While the courts and lawmakers work out the response to the emerging technology, professionals in the book illustration world have been left to grapple with A.I.’s growing influence, even as it produces art that most agree is not up to the standards of human artists — at least not yet.

Sickels, who grew up on a small family farm, sees parallels between A.I.’s rise in illustration and the industrialization that occurred long ago in other fields. “Small family farms can’t compete with the larger tens-of-thousands-of-acre farms that exist now,” he said. He believes artists may be forced to adjust much like farmers once did — by “doing something really well on a small scale.”

Though Paolini was dismayed that an A.I.-generated image had appeared on the cover of one of his books, he believes the technology can eventually be used responsibly. “Just because something has a lot of potential concerns doesn’t mean that it isn’t also incredibly useful,” he said. “But I do think that we need strong protections put in place for creatives, for artists, for writers.”

Whelan, too, sees the possibilities inherent to the new technology and uses A.I. art generators regularly to brainstorm.

“I’ve got mixed feelings about A.I.,” he said. “Enabling people to do rip-offs of my work is not something that I like, but personally speaking, as a tool, it’s very helpful.”

In one of Whelan’s works, the Asimov character Hari Seldon stares out from the surface of Trantor, the capital of the destined-to-crumble Galactic Empire. In another, a masked character from Joan D. Vinge’s novel “The Snow Queen” poses against a star-filled sky.

And in some of these visions, the robots are our friends: On the cover of Asimov’s “Robots and Empire,” two robots shake hands as they work to help humanity.
