ChatGPT vs Professor: Struggling with Fiction and Poetry
Although some people are using ChatGPT to write children's books for sale, most people are looking for information or entertainment when they type in a prompt. In this study, I became more interested in what type of narrative it might create with a specific or vague set of instructions. For the most part, its stories were trite and hackneyed generalizations befitting a mere sentence generator, but when it strayed from the norm, it sometimes indulged in the wondrous and the odd.
Such moments were rare, however, so I focused instead on how the machine made its choices, or rather how it rationalized them. I was interested in how it decided who would be hero and who would be foe, what the gender of a character might mean, and, curiously, what stories it chose not to tell. I used the tool to generate a list of story requirements, then showed that it followed none of its own dictates. Instead, it generated fairy tales with flippant ease, as though developing a narrative were merely a matter of following a formula and jamming together words that statistically cohere.
Although on the whole the machine didn't impress me, producing clichéd stories about shallow characters caught in typical circumstances, some elements of its stories rose above the rest. When the machine didn't have a good handle on what a prompt meant, or didn't know how to manage its characters, it encouraged warfare and incest, lapsed into nonsense and incoherence, and in other ways showed that it still had much to learn about both humanity and our stories.