"I write better poetry than you do," said EPICAC, coming back to ground his magnetic tape-recorder memory was sure of.
--Kurt Vonnegut, from his short story "EPICAC" (1950)
Why should I weigh in on AI? Well, I've written a little about it before. Maybe I just have to occasionally, if only to make sure the voices in my head are still primarily OI (Organic Intelligence).
I read some of the experts, of course, but no one knows everything, including AI itself. Not yet. Still, I'm curious about what's here already and what may be coming, especially in my world, the world of books. So I scan the splashy headlines and read the big stories, but I'm more interested in peeking into the virtual corners for signs. And, yes, I often go back to Vonnegut for fictional/historical precedent/prescience.
I'm just paying attention to the little things, the ones that matter to book people like me. Here are a few I found recently:
"The most popular way publishers were considering leveraging AI was for marketing materials (49%)," BookNet Canada noted in its just released eighth edition of The State of Publishing in Canada 2023. The report added that while there was not much training provided on AI to publishing employees last year, publishers were thinking about how best to leverage the technology, with the primary areas being in marketing materials, including summary blurbs, promotion plans, etc. (49%) and metadata creation and management (19%).
In addition, small Canadian publishers were thinking of ways to implement AI across more areas than mid-sized and large publishers, though over half of small publishers were not considering leveraging AI at all (55%).
Before that survey was released, the Bookseller had reported that the largest book publisher in the Netherlands confirmed plans to use AI to translate some of its books into English. A spokesperson for Veen Bosch & Keuning said, "We are working on a limited experiment with some Dutch authors, for their books to be translated into English language using AI. There will be one editing phase, and authors have been asked to give permission for this."
In the U.K., AI-generated images were being used by a creative agency to promote Jodi Picoult's new novel on posters and social media, without the publisher's knowledge. A distorted poster for the U.K. edition of By Any Other Name, featuring a woman writing, demonstrated some of the typical signs of AI-generated images, the Bookseller reported. "AI was used without our knowledge," said Laura Gross, Picoult's agent. "Obviously, Jodi and I are extremely dismayed to discover this fact."
Last month, the Authors Guild announced it would offer its 15,000 members a new "Human Authored" sticker to place directly on their book covers. "It isn't just to prevent fraud and deception," said Douglas Preston, bestselling writer and member of the Authors Guild Council. "It's also a declaration of how important storytelling is to who we are as a species. And we're not going to let machines elbow us aside and pretend to be telling us stories, when it's just regurgitating literary vomitus."
Wherever you turn...
Earlier this year, Connor Osborn of the Bookworm, Omaha, Neb., told KETV that the shop keeps behind the counter an example of what you might get if you fall for an AI-created book scam. Osborn read from the book: "he go directly to room and saw Jacob romancing a lady."
Osborn said he shows that book to customers who ask to special order a title that raises red flags, including low page count and a lack of author information online, adding that mass-produced AI titles are flooding online marketplaces. He also noted that some books in the store have AI-generated cover art. "AI can only go off what already exists," Osborn said. "The whole part of art... is to create new things."
I was also intrigued by a recent Fast Company piece that noted: "As a society, we face a conundrum: It seems that no one, or no one thing, is morally responsible for the AI's actions--what philosophers call a responsibility gap. Present-day theories of moral responsibility simply do not seem appropriate for understanding situations involving autonomous or semiautonomous AI systems. If current theories will not work, then perhaps we should look to the past--to centuries-old ideas with surprising resonance today."
Fast Company time-traveled to the 13th and 14th centuries, when a similar question perplexed Christian theologians: How can people be responsible for their actions, and the results, if an omniscient God designed them--and presumably knew what they would do?
"Clearly, the relationship between AI developers and their creations is not exactly the same as between God and humans," Fast Company noted. "But as professors of philosophy and computing, we see intriguing parallels. These older ideas might help us today think through how an AI system and its designers might share moral responsibility."
The more I read about AI, the more it feels like I'm on a starship entering warp speed, with everything rushing past instantaneously--time, distance, information, etc.--and I'm sort of "keeping up," whatever that means now.
I don't think I'm ready for a medieval, theological perspective on AI, but I do seem to increasingly find a little consolation in the mindset of another literary tech icon. In his novel Slapstick, Vonnegut writes: "History is merely a list of surprises.... It can only prepare us to be surprised yet again."