Why A.I. Can't Create Unique Fiction
Why the fiction created by AI is only superficially good.
AI is progressing fast. Many AI experts believe AGI won't be possible in the coming few years because the 'wall' is near, but I am a bit sceptical, because the improvements in AI have been very rapid in the past few years (or even months). Even if we don't get closer to AGI any time soon, development in other fields of AI will continue to happen - and so will the competition surrounding AI.
Before November of last year, when ChatGPT launched, we saw AI in limited forms and with limited capabilities. Most of these AI tools were used in recommendation systems or other special-purpose tasks, and interactivity was low. Nonetheless, tools like Grammarly for proofreading, along with many paraphrasing tools, had given us a taste of AI we could interact with directly.
While these were decent, if not the best, the whole generative AI landscape transformed after the release of ChatGPT. ChatGPT, in a way, raised the standards of generative AI and its capability to achieve things that are on par with human beings.
But critics can argue that this is not quite right - that generative AI is not really as capable as humans. Hallucinations and misinformation are hard problems to remove from the AI. Instances of making up facts, citing wrong sources and (more worryingly) showing bias in the way it responds creep up many times - and sometimes they are hard to detect.
The Superficial Creativity
And the fault is not in ChatGPT alone; it seems to lie in the underlying AI model it runs on. The Generative Pre-trained Transformer, or GPT, one of the strongest generative NLP models out there, still produces fallacies and misinformation. Sure, GPT-4 made major improvements compared to its predecessor.
Source: https://openai.com/research/gpt-4
Well, okay, if ChatGPT can't be perfect at producing technical and information-heavy topics, then what about tasks that don't need one hundred per cent accurate facts?
What I mean is: what if I want ChatGPT to create a fictional story, write a simple poem, make an interesting comical speech, or essentially anything where the accuracy of the content does not matter much?
Well, ChatGPT does quite well in this respect (it can, after all, make a poem about just about anything).
But if you look carefully, there is a bit of a problem when we talk about generating fictional pieces. It may only look good superficially.
First of all, a piece of literature stands for one of two reasons - either it presents an idea that is unique (or content unknown to many), or the writing has a distinct personality and tone that readers resonate with.
Secondly, any story is just a draft if it lacks structure - what writers call 'unpolished' writing. Joining random phrases and stacking them on top of one another without any context is useless if you want to create a fictional story in the true sense.
The Adventures of Sherlock Holmes or the Goosebumps series are fictional - but if you have read them, you realize they are more than fictional facts stacked together. They have a form, an outline and a flow, and they resonate with "human" readers.
Unique Trends can’t give Unique Predictions
So where does AI fall short? Well, AI is trained on data, and all it really does is try to find a trend.
For example, if I prompt it to create a poem about ABC, it first draws on the patterns in all the poems it has seen (for instance, that most poems follow a rhyming scheme), and then it pulls up information about ABC - interesting adjectives and notable qualities to use in the poem. When it's done, it has created a poem whose style 'resembles' other poems written about the same topic. The toy sketch below illustrates the idea.
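To make the "finding a trend in training data" point concrete, here is a minimal sketch of the idea in Python. This is not how GPT actually works - GPT is a transformer trained to predict tokens, not a bigram chain - but the spirit is similar: learn which continuations tend to follow which words in the training text, then sample from them. The tiny corpus and the function names are my own illustration.

```python
import random
from collections import defaultdict

# Toy training corpus; a real model learns from billions of documents.
corpus = [
    "roses are red violets are blue",
    "the sky is blue and the sea is deep",
    "red roses bloom when skies are blue",
]

# Build a bigram table: for each word, which words followed it in training?
follows = defaultdict(list)
for line in corpus:
    words = line.split()
    for current, nxt in zip(words, words[1:]):
        follows[current].append(nxt)

def generate(seed: str, length: int = 8) -> str:
    """Generate text by repeatedly sampling a continuation seen in training."""
    word, output = seed, [seed]
    for _ in range(length):
        candidates = follows.get(word)
        if not candidates:
            break
        word = random.choice(candidates)  # pick one of the observed next words
        output.append(word)
    return " ".join(output)

print(generate("roses"))  # e.g. "roses are blue and the sea is deep"
```

Everything the toy model emits is a recombination of phrases it has already seen - which is, in a much cruder form, the point being made about generated fiction.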
You see, the caveat is that your generated story is just a clever mixture of already-used data, blended in a way that seems unique - while in actuality it is not. Noam Chomsky, the American philosopher and cognitive scientist, describes this as "high-tech plagiarism".
So is that creativity? Maybe not - I would call it the 'wannabe creative' ability of AI. If you are not a fan of satirical labels, what I essentially mean is that these models find common trends in actual creative works and combine them cleverly so that the result seems 'creative enough'.
Much great literature doesn't follow a rule book (or training data). Walt Whitman's poems, for example, are mostly free verse without any rhyming scheme, yet they are still considered poetry. Moreover, many book titles and covers don't even match the content of the book itself - and the books were still successful.
They were all great successes. Why? Because they didn't follow a trend the way an AI does; they simply resonated with readers. Sure, fiction authors have the creative liberty to do whatever they want and present it to their readers. But would Sherlock Holmes have been a success if Holmes had solved his cases out of thin air with a wave of abracadabra? It wouldn't.
Copying a natural landscape exactly as it is (to take one example) does not make you creative. And what AI does amounts to copying an already creative work - then copying many such creative works, combining them all and passing the result off as its own. While this may seem trivial, it is the same reason many artists sued Midjourney - an AI image-generation tool - over copyright.
Writers and AI
Writers CAN use AI. Many of my fellow writers use ChatGPT for various reasons - some to deal with writer's block, others to brainstorm new ideas or even create outlines for their articles. I am not very good at prompt engineering yet, so I stay away from ChatGPT while writing posts for this newsletter.
I find pleasure in presenting my views and opinions on my own - polished and delivered directly to readers (part of the reason I started Creative Block was to share my views and insights without any filter). For me, that is better, because if my articles fail tomorrow, or cause any trouble, I know it's because of my craft. I don't want to land in a controversy just because an AI messed up while helping me write an article.
However, there are writers out there who know how to use ChatGPT quite well in their writing endeavours. And if it works for them - then good for them.
I asked a fellow copywriter on LinkedIn, and they said that while ChatGPT is great at creating written material, your role as a human writer isn't finished. Their time spent writing and ideating topics dropped heavily, but that led to more time spent proofreading and fact-checking. Remember what I mentioned before: generative AI carries misinformation and bias that creep in out of nowhere.
So, interestingly, you end up spending more time fact-checking and proofreading. In some cases, especially for amateur writers, this paradoxically increases the total time for content creation: you use ChatGPT to cut down the drafting time, then spend even more time fact-checking what it produced.
This is, of course, not true for all writers - especially those who have learned how to integrate ChatGPT into their writing.
Conclusion
This is quite interesting. When I started writing articles covering ChatGPT, I thought it would do exceptionally well at writing fictional content, even if it struggled with informative, fact-heavy writing.
But after experimenting, I found that much of the fiction it created was arguably good yet seemed dry. Sometimes it has humour, but the humour doesn't fit the content.
Can AI take the place of writers? I think the answer is both yes and no. It can take the place of amateur writers and copywriters, but writers who are really good at their craft can't be replaced by AI.
The reason is that proficient writers have a distinct voice and style that sets them apart from the 'trend'.
What do you think? Let me know in the comments.
See you in the next post.