Or: How I Learned Where To Draw A Line In The Sand.
Earlier this month I published a list of my favourite video games of 2025, and at the very top of that list (or more accurately the bottom, because it was a countdown) was Clair Obscur: Expedition 33.
Clair Obscur has been picking up accolades left and right this year, including a record-breaking nine wins at The Game Awards. The reason for this is simple: Clair Obscur is, by any metric, a fantastic game. It features brilliant turn-based combat; the art design and music are gorgeous; the story is superbly written; and it has some of the best acting I’ve ever seen in the medium. But I can no longer in good conscience keep it on my list of the best games of the year.
In the last couple of days, The Indie Game Awards stripped Clair Obscur of its Game of the Year gong after it emerged that the developer, Sandfall Interactive, had used generative AI to create some placeholder assets during development.
There are doubtless people out there who would view all of this as a pretty massive overreaction. But those people are wrong, and the organisers of the Indie Game Awards were completely right to take it back.
Generative AI works by taking a prompt from a user and feeding it into a model such as a large language model (LLM), which has been trained on a massive corpus of text and images; the model creates new text or images by predicting, one small piece at a time, what should come next based on the statistical patterns it absorbed from that corpus. Boiled all the way down, it’s the same principle as your phone’s keyboard suggesting what the next word in a sentence should be based on what you’ve typed before. The difference is that these models are trained on colossal amounts of data, including copyrighted work that has been scraped from the internet without the permission of the people who hold the copyright. Generating an image or a piece of text this way is, almost by definition, an act of plagiarism.
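For anyone curious what that “autocomplete” principle actually looks like, here’s a deliberately crude sketch in Python. To be clear, this is my own toy illustration, not how any real system is built: an actual LLM swaps this frequency table for a neural network with billions of parameters trained on trillions of words. But the core loop of “predict the next word from what came before, append it, repeat” is the same.

```python
# Toy next-word predictor: count which word tends to follow which in
# some training text, then "generate" by repeatedly picking the most
# likely next word. Real LLMs learn far richer patterns, but the
# predict-append-repeat loop is the same basic idea.
from collections import Counter, defaultdict

training_text = (
    "the expedition paints the monolith and the expedition fights "
    "the paintress and the expedition remembers those who come after"
)

# Build a table of next-word frequencies for every word we've seen.
next_words = defaultdict(Counter)
words = training_text.split()
for current, following in zip(words, words[1:]):
    next_words[current][following] += 1

def predict_next(word: str) -> str:
    """Return the most frequent follower of `word` in the training text."""
    followers = next_words.get(word)
    if not followers:
        return "<unknown>"
    return followers.most_common(1)[0][0]

# "Generate" a short continuation, one predicted word at a time.
word = "the"
for _ in range(5):
    print(word, end=" ")
    word = predict_next(word)
print(word)
```

Run it and you get “the expedition paints the expedition paints…” on a loop, which is about as much soul as you’d expect from frequency counting.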
Even ignoring the ethical problems with generative AI (which, to be clear, we absolutely should not), it’s also an ecological disaster that’s actively undoing much of the progress we’ve made in reducing carbon emissions over the last two decades. A single ChatGPT prompt is estimated to use around ten times the energy of a single Google search. By some estimates, the total energy it consumes in a day could power Belgium. And that’s before we even get to the frankly obscene amount of water needed to cool the massive data centres handling all of these prompts.
To get back to the Clair Obscur of it all, Sandfall’s defenders would argue that this was a pretty reasonable use of generative AI. After all, we’re talking about a handful of placeholder assets that were never meant to appear in the finished version of the game. In fact, the offending images aren’t even in the game anymore; Sandfall patched them out within a week of the initial release back in April. Surely, at a time when developers are facing layoffs and punishing crunch to meet tight deadlines, a tool that saves them time should be applauded? But there are some pretty glaring holes in this argument.
Firstly, why bother using AI to generate a placeholder for a background texture in the first place? Why not buy an image from a stock library, or just write “ADD IMAGE HERE LATER” in neon green Comic Sans? Something like that would be much harder to miss during quality control before the game ships. Besides, there’s something sort of charming about those kinds of errors. Like the thumbprints that appear in a frame of stop-motion animation, they’re evidence of the human effort that went into crafting the finished piece of art.
Secondly, this is the games industry we’re talking about, where the greatest priority is always the bottom line. Even developers with a proven track record of making games that are critical and commercial successes are at risk of being shut down for the sake of making a number on a line graph continue to go up. The executive class is already collectively salivating at the idea of replacing entire teams of people with generative AI; after all, an LLM doesn’t need to be paid a salary. If we normalise using generative AI for supposedly mundane tasks like creating placeholders, it won’t be long before it infects every step of the creative process.
What makes this sting all the more is that Clair Obscur is a game whose story is a love letter to the very human process of creating art. It even goes so far as to argue that every time a person creates something, they imbue it with a portion of their own soul. Knowing that at least some of the people at Sandfall were willing to use a soulless machine to churn out plagiarised slop makes the whole thing feel bittersweet, as if they fundamentally misunderstood the very thing they were trying to make. It’s an indelible stain on everything else that made Clair Obscur a modern masterpiece. I’m so glad that I experienced it, and I’m so deeply sad that I now feel like I can never touch it again.
Spend any time listening to those who proselytise the use of generative AI and they’ll eventually preach that its widespread adoption is an inevitability. “It’s here to stay whether you like it or not,” they say, “so you might as well get on board.” But it’s important to remember that this isn’t a statement of fact but one of intent. If generative AI really were the future, they wouldn’t have to work so hard to cram it down our throats in the present. That’s why we have to draw a line in the sand now, and declare that the only acceptable amount of AI-generated content in a piece of art is none at all.
