Is Our Poetic Soul Safe?

By Runcong Liu


Whether or not you keep abreast of recent developments in artificial intelligence, you have probably come across news about Generative Pre-trained Transformer 3 (commonly known as GPT-3), a language model released by the AI laboratory OpenAI this summer.

“Feed me a prompt, and I will repay you with whatever you want.” If GPT-3 could speak, that is probably how it would sell itself. The MIT Technology Review hails GPT-3 as “the most powerful language model ever,” a recognition that seems well deserved. Compared to its predecessor, GPT-2, which “was already able to spit out convincing streams of text in a range of different styles when prompted with an opening sentence,” “GPT-3 is a big leap forward”: it “has 175 billion parameters (the values that a neural network tries to optimize during training), compared with GPT-2’s already vast 1.5 billion. And with language models, size really does matter.”

What kind of texts can GPT-3 generate? Well, basically anything. A news article, an advertisement, a short story, guitar tabs, computer code – you name it. GPT-3 even wrote an op-ed for the Guardian titled “A robot wrote this entire article. Are you scared yet, human?”, which, according to the editor, “took less time to edit than many human op-eds.” “Artificial intelligence will not destroy humans. Believe me,” promises GPT-3 in that article. As a non-native English speaker who desperately wants a marketing communications job that would require my (English) writing skills, I read through the article hoping to spot a small error, in order to (how pathetically!) convince myself that I write better than an AI does.

“But all I hear is doom and gloom. And all is darkness in my room.” 

My (future) job is at stake; so are others’. GPT-3 can produce any kind of text, including, of course, poetry, which demands a masterful command of language and is considered by many to be the top of the literary pyramid. As the British writer Arnold Bennett puts it, “[i]maginative poetry […] is the highest form of literature. It yields the highest form of pleasure, and teaches the highest form of wisdom. In a word, there is nothing to compare with it.” While we may have already grown accustomed to living with the widespread fear of AI taking over blue-collar jobs, the possibility that AI can produce the highest form of literature triggers an anxiety that is hard to shake.


While you are reading this article, somewhere in the world there is probably an unknown poet tossing and turning for fear of GPT-3 taking over their career path. It is not the first time that AI has wanted a slice of the action in the poetry market. Poetry, a literary genre that features extensive use of metaphor and creative use of words, gives AI a huge space in which to perform. In 2012, the American artist Ranjit Bhatnagar created Pentametron, a program that produced a collection of sonnets titled “I got an alligator for a pet!”; in 2017, a poetry collection generated by Xiaoice, an AI system developed by Microsoft in 2014, was published by the Beijing-based Cheers Publishing. The year 2019 saw the release of Transformer Poetry, a chapbook of 26 poems, each a hybrid of the original opening stanza of a well-known poem and additional stanzas produced by GPT-2.


Even though AI can write poetry, our poetic soul is safe for the time being, Dennis Tang reassures concerned poets. In January this year, he published an article on Lit Hub titled “The Machines Are Coming, and They Write Really Bad Poetry,” in which he presents poems generated by GPT-2 in the styles of Emily Dickinson, William Shakespeare, Robert Frost, Maya Angelou and Sylvia Plath. Tang points out that although these poems are grammatically correct and capture the style specific to each poet, they exemplify a common problem in GPT-2’s writing:

There’s a common thread here. GPT-2’s writing is grammatically correct. It all more or less sounds true to its source, if all you heard was the tone. But what those sequences mean, therein lies the rub. GPT-2’s poetry prizes style over substance. Which is understandable, because it doesn’t know what substance is. 

Being a pure language model, GPT-2 has no knowledge of what words actually refer to, only the probability of a word appearing next to others. To it, a word like “chair” is just a string of characters, not a cluster of images or objects, let alone some more nebulous conceptual grouping of things that humans sit on… 
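
For readers curious about what “the probability of a word appearing next to others” looks like in practice, the short Python sketch below asks the publicly available GPT-2 model for its most likely next words after a prompt. It is only an illustration, not part of Tang’s or OpenAI’s work, and it assumes the open-source Hugging Face transformers library and PyTorch are installed; the exact words and numbers it prints depend on the downloaded model.

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Load the small public GPT-2 model and its tokenizer (downloaded on first run).
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

# The model never sees "Death" the concept, only token IDs for these characters.
prompt = "Because I could not stop for"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # one score per vocabulary entry, per position

# Turn the scores at the final position into probabilities for the next token.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top_probs, top_ids = torch.topk(next_token_probs, 5)

for prob, token_id in zip(top_probs, top_ids):
    print(f"{tokenizer.decode([token_id.item()])!r}: {prob.item():.3f}")

Whatever words come out on top, the model has chosen them purely from statistical patterns in its training text, which is exactly Tang’s point.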


Since GPT-2 lacks “a knowledge of reality, the thing for which man made language to describe,” the poems it generates, no matter how stylistically similar they are to the poets’ works, are just a “pretty turn of phrase mean[ing] nothing.” Tang’s argument that AI-generated poems lack meaning resonates with the tech writer Bennat Berger’s article “What Does It Matter If AI Writes Poetry?” According to Berger,

The AI-provided poem sounds pretty, but is at best vague, and at worst devoid of meaning altogether. It’s a notable lapse, because poetry, at its heart, is about creating meaning and crafting implication through artful word selection. The turn-of-phrase beauty is essential – but it’s in no way the most important part of writing verse. In this context, AI-provided poetry seems hollow, shallow, and without the depth or meaning that drives literary tradition. 

It is true that AI has no idea what a word refers to in the way we humans do. From a linguistic perspective, what AI plays with is always the signifier, never the signified. But does that mean poetry generated by AI has no meaning? This leads back to the age-old questions of what the meaning of a literary work is and where it comes from. As Stein Haugom Olsen argues,

[I]t has become a critical commonplace that the literary work is a verbal expression, a verbal construct or an utterance, and that as such its peculiar nature is defined through the special way in which it means […] There are three main types of theory about literary meaning: autonomy theories, developed on the back of the practice of the New Criticism; semiotic theories, developed under the influence of the rising discipline of theoretical linguistics; and intentionalist theories, inspired by speech-act theory and Gricean analyses of meaning. 


No matter which of these theories you support, the meaning of a literary work involves a complex trifecta: the work itself, the author, and the reader. Disbelievers in the existence of meaning in AI-generated poems, like Tang, hold the view that “in the hands of a mature, human user, all these patterns [idioms and syntaxes, bits of aesthetically pleasing strings] are ultimately anchored to meaning – the goal of conveying a feeling or thought.” In other words, only human poets, not AI, can create a poem embedded with meaning.


However, the meaning of a poem involves not only the author but also the reader. Getting the meaning of a poem across to its reader is not like the author handing the reader a Euro banknote directly; it is more like exchanging it for another currency, in the course of which the meaning changes. As Susan Sontag notes, “[i]t doesn’t matter whether artists intend, or don’t intend, for their works to be interpreted,” and interpretation is a kind of translation:

Directed to art, interpretation means plucking a set of elements (the X, the Y, the Z, and so forth) from the whole work. The task of interpretation is virtually one of translation. The interpreter says, Look, don’t you see that X is really—or, really means—A? That Y is really B? That Z is really C?

When readers read a poem, they interpret it as they like, and new meanings, which the author may never have intended to convey, emerge from that interpretation. The claim that AI-generated poems have no meaning is therefore a biased view that neglects the agency of readers.

Apart from the supposed lack of meaning, human poets might also take comfort in the seeming absence of emotion in AI-generated poetry. “The history of intelligent machines is one of moving goalposts: Sure, a machine can do this, but can it do that? The ‘that’ is often an achievement that strikes us as strongly connected to emotion—that seems especially human.” Poetry is often considered a literary genre rich in emotion. According to Eran Hadas, “[w]e believe that poetry transmits an emotional message from one person to another which bears a distinctive token of humanity, something that only a human soul can convey and share with another human soul.” It seems, then, that AI-generated poems do not carry or transmit emotions because they are not written by humans.


The ways different AI programs generate poems may vary, but they are all likely to incorporate words and imagery that inherently register a certain emotion. In his article, Tang presents a stanza generated by GPT-2 from the opening stanza of one of Emily Dickinson’s poems:

Because I could not stop for Death

Because I could not stop for Death —
He kindly stopped for me —
The Carriage held but just Ourselves —
And Immortality.

That was all, a very solemn moment with a very brave mind.
The sun fell on us, we were left behind,
the world went dark —
Our only light was the moon —
And we were still there.
(generated by AI)

The AI uses words such as “solemn” and “dark” to convey a quiet, calm, solemn mood. Since a poem’s mood or atmosphere is evoked by words like these, the argument that AI-generated poems do not transmit emotional messages seems shaky.

Wait! Do not jump to the conclusion that our poetic soul is at stake. Even though AI-written poems have meanings that may rely on readers’ interpretations to surface, and emotions conveyed through certain words, we still have every reason to hold on to the belief that, at least for now, no natural or artificial creature can write good poetry the way we humans do. When you compare the AI-generated stanza above with Dickinson’s original poem, you will be relieved to find that the former is overshadowed by the latter. AI can write mediocre poems, but not excellent ones, for the time being at least. In this regard, we should thank Dickinson, and every great poet we can name, for creating poetry that helps sustain our belief in humans’ incomparable capacity for artistic writing. Moreover, as humans we are endowed with qualities that AI poets do not possess – care for reality, insight into the world, the ability to make judgements; these are the foundations of our poetic soul. We must take AI’s ability to write poetry seriously, but we should still have confidence in ourselves.


Runcong Liu is a Master’s student in Comparative Literary Studies at Utrecht University. Her research interests span the fields of life writing and posthumanism, especially A.I.-related topics.


References

Bennett, Arnold. “Arnold Bennett Premium Collection.” Google Books, 2019, books.google.nl/books?id=5LboDwAAQBAJ.

Berger, Bennat. “What Does It Matter If AI Writes Poetry?” Medium, Becoming Human: Artificial Intelligence Magazine, 12 June 2020, becominghuman.ai/what-does-it-matter-if-ai-writes-poetry-54665a950a37.  

GPT-3. “A Robot Wrote This Entire Article. Are You Scared Yet, Human?” The Guardian, 8 Sept. 2020, www.theguardian.com/commentisfree/2020/sep/08/robot-wrote-this-article-gpt-3.

Hadas, Eran. “Poetry and Artificial Intelligence Are Tricky to Define.” Poetry International, 2018, www.poetryinternational.org/pi/article/29448/Poetry-and-Artificial-Intelligence-are-tricky-to-define/nl/tile.  

Heaven, Will Douglas. “OpenAI’s New Language Generator GPT-3 Is Shockingly Good-and Completely Mindless.” MIT Technology Review, 20 July 2020, www.technologyreview.com/2020/07/20/1005454/openai-machine-learning-language-generator-gpt-3-nlp/.  

Olsen, Stein Haugom. “The ‘Meaning’ of a Literary Work.” New Literary History, vol. 14, no. 1, 1982, pp. 13–32. JSTOR, www.jstor.org/stable/468955.  

Rockmore, Dan, and John Seabrook. “What Happens When Machines Learn to Write Poetry.” The New Yorker, 2020, www.newyorker.com/culture/annals-of-inquiry/the-mechanical-muse.  

Sontag, Susan. Against Interpretation and Other Essays. Penguin, 2013. 

Tang, Dennis. “The Machines Are Coming, and They Write Really Bad Poetry.” Literary Hub, 21 Jan. 2020, lithub.com/the-machines-are-coming-and-they-write-really-bad-poetry/.

The Rolling Stones. Lyrics to “Doom and Gloom.” YouTube, 2012, https://www.youtube.com/watch?v=1DWiB7ZuLvI.
