ChatGpt, if it writes bullshit it's because it resembles us

In the deep waters of the Gulf of Mexico, a young woman named Rachel clings to the edge of an oil rig. Despite the wind that ruffles her auburn hair and the ocean spray that soaks her jeans, she keeps climbing, determined to unearth evidence of illegal drilling. Once aboard, however, she will discover something much more sinister.

The paragraph you have just read is a synopsis of Oil and Darkness, a horror film set on an oil rig. The film features Rachel, an environmental activist; Jack, a guilt-ridden foreman; and Ryan, a shady company executive conducting dangerous research into a "highly flammable new type of oil". It's the kind of movie you might stumble upon while channel surfing late at night, or might have watched with one eye closed on a long flight. Too bad this movie doesn't really exist.

The potential of ChatGpt

Oil and Darkness was developed and written by ChatGpt, an artificial intelligence (Ai) chatbot. Ai enthusiast and marketer Guy Parsons fed the film format into the system, then asked it to come up with a title, tagline, key characters, and plot details; as a prompt he suggested "a horror film set on an oil rig". The results from OpenAi's new software are astounding: the synopsis has tension, well-delineated characters, and hints at a dark secret. It promises jaw-dropping action scenes and maybe even a dash of political commentary. It is one of many examples that have been making the rounds on social media and in WhatsApp chats in recent days, showcasing the seemingly magical powers of ChatGpt.

The new Ai chatbot is trained on texts taken from books, articles and websites that have been "cleaned" and structured through a process called supervised learning. ChatGpt can write code, invent songs, and compose poetry or haiku. It remembers what users have written and can make precise changes on demand. It accepts even the most casual suggestions, using them to compose stories that tie together even the most disparate strands in an orderly way, where details that seem irrelevant in the first paragraph prove useful in the last. The chatbot can also tell jokes and explain why they are funny. It can write punchy, intriguing magazine-style headlines, complete with believable but entirely fabricated quotes.
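The underlying idea can be sketched in a few lines of code. This is a toy illustration, not OpenAi's actual method: a model counts which words tend to follow which in its training text, then always offers the statistically most plausible continuation. ChatGpt does this at vastly larger scale with neural networks, but the principle is the same, and so is the limitation: the model predicts what sounds right, with no grounding in what is true.

```python
from collections import defaultdict, Counter

def train_bigrams(corpus):
    """Count, for each word, which words follow it in the training text."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for current, nxt in zip(words, words[1:]):
            counts[current][nxt] += 1
    return counts

def most_likely_next(counts, word):
    """Return the most frequent continuation seen in training: fluent, not true."""
    followers = counts.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# A tiny invented "training set" in the spirit of Oil and Darkness.
corpus = [
    "the rig hides a dark secret",
    "the rig burns at night",
    "the rig hides dangerous research",
]
model = train_bigrams(corpus)
print(most_likely_next(model, "rig"))  # "hides", seen twice versus once for "burns"
```

The toy model will happily complete "the rig..." forever, without knowing whether oil rigs exist, which is exactly the fluency-without-substance the rest of this article describes.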

The shallowness beneath the surface

ChatGpt's strength is also its biggest flaw. The new chatbot has conquered the internet and demonstrated how engaging conversational Ai can be, even when it invents facts. All this makes playing with ChatGpt a fun, fascinating and engaging experience, but also, for professional writers, quite a worrying one. Before long, though, you begin to sense a lack of depth behind the system's effective prose. The chatbot makes factual errors, confusing events and people; it relies heavily on tropes and clichés, reproducing the worst stereotypes of our society; and even if they seem remarkable on the surface, the words the system chooses are mostly without substance. In most cases ChatGpt produces what The Verge describes as "fluent bullshit".

In one sense this shouldn't come as a surprise. ChatGpt was trained on text taken from the real world, and the real world essentially runs on fluent bullshit. Perhaps the plausibility of an invented film like Oil and Darkness is due not so much to the skill of the AI as to the fact that the film industry fails to come up with original ideas. From a certain point of view, when asked to create the subject of a film, the artificial intelligence does nothing but imitate the pre-packaged process by which many Hollywood blockbusters are made: it looks around, sees what has been successful, and takes up those elements (actors, directors, plot structures), combining them in a form that seems new but really isn't.

The same thing happens in publishing, where rather narrow trends can overwhelm the industry and dominate for years, filling bookstore shelves with covers that all look the same or titles that follow the same tired pattern.

Too faithful an imitation

The problem is not limited to the creative industries. Bullshit is everywhere: in viral LinkedIn posts and podcasts propounding the rules we should live by, in fundraisers and academic journals, and even in the article you're reading. The worlds of politics and business are full of people who have reached the top because they have been able to speak credibly at length without actually saying anything. The most prestigious schools and universities structure their teaching to instill a single skill in students: absorb information very quickly, regurgitate it confidently in a predetermined format, then immediately forget it and move on to the next thing. These successful people then pour into government, the consultancy sector and, yes, even journalism.

The debate around ChatGpt has highlighted the detrimental effects the system could have on society, for example promoting torture, reproducing sexism, or letting kids cheat on their homework. There is also concern about the impact AI-generated responses could have if they end up in the data used to train future chatbots, creating a Ready Player One-style indistinct mush of references: a dull hodgepodge that is then shredded and served back to us, a virus that infects any innovation.

Yet, to be honest, the old human-made fluent bullshit weaponized by social media has already had disastrous effects. In the post-truth era smoothness is everything and bullshit is everywhere: it stands to reason that ChatGpt's smooth bullshit seems plausible. It couldn't have been otherwise, since people were used to train the system.

Ultimately, ChatGpt's bullshit reminds us that language is a bad substitute for thinking and understanding. No matter how fluid and coherent a sentence may seem, it will always be subject to interpretation and misunderstanding. And in a world where everything is smooth bullshit, ChatGpt is just another entry in the pile (and yes, even the paragraph you just read, in its original version, was written by the chatbot).

This article originally appeared on UK.
