There are corporate scandals, and then there are the ones that feel like a drunk intern threw a briefcase into a wood chipper and the result somehow got calendared as a strategy meeting.
Krafton, the company behind PUBG, allegedly looked at a $250 million earnout tied to Subnautica 2 and responded the way a coward in a cheap suit always does: it started hunting for a way to make the problem disappear instead of paying what it owed. According to court filings and the Delaware ruling that followed, CEO Kim Chang-han even asked ChatGPT for help brainstorming ways to avoid the payout.
That sentence is doing a ridiculous amount of work. A multinational publisher, sitting on a nine-figure acquisition mess, apparently went to a chatbot for tactical wisdom on how not to honor a contract. Not “how do we fix the relationship,” not “how do we ship a better game,” not even the usual corporate slime of “how do we negotiate in good faith while lying through our teeth.” No. Just: how do we wriggle out of this specific, expensive obligation without looking like the exact kind of greedy little goblin everyone suspects we are.
ChatGPT, to its microscopic credit, allegedly told him it would be difficult to cancel the earnout.
Imagine that. Even the machine built to produce confident prose from the fumes of the internet looked at the deal and said, in effect, “yeah, this is legally gross, good luck with that, champ.” But the truly pathetic part is not that the CEO asked. It’s that he seems to have treated the answer as a prompt from the universe to keep going, instead of as the obvious sign to stop being a little contractual arsonist.
What happened next reads like the sort of internal memo that should be nailed to the wall of every business school and every corporate boardroom as a warning label. A secret task force. “Project X,” because apparently if you give a shady plan a sleek code name it stops smelling like panic and starts smelling like initiative. The alleged mandate was either to force a renegotiation of the earnout or to execute a takeover of Unknown Worlds. Same bullshit, different font.
And this is the part that really gets under my skin: AI didn’t create the greed. It just gave the greed a costume.
That’s the whole modern enterprise in one ugly little package. People with too much money and too little shame have discovered that LLMs can generate a respectable-looking pile of words fast enough to make a bad idea feel operational. Not ethical. Not wise. Just operational. You can ask the machine to outline the nastiest version of your intent and it will hand it back with numbered bullets and enough managerial padding to make a psychopath feel “data-driven.”
But courts are not impressed by your prompt engineering. Judges do not care that your executive had a productive afternoon chatting with a predictive text engine about how to evade a payout. They care about contracts, emails, Slack messages, timelines, and whether your story sounds like a real business dispute or a hostage note written by an MBA.
Delaware, bless its miserable little corporate heart, did what Delaware does best: it read the room and then read the receipts. The ruling says Krafton’s true focus was avoiding financial exposure. Not safeguarding players. Not protecting the studio. Not even cleanly admitting it overpaid. Just: don’t pay the money. If necessary, reorganize the entire damn studio around that singular fear and call it governance.
That is why this story matters beyond the juicy headline. It is not really about ChatGPT. It is about what happens when executives confuse access to an answer generator with the right to outsource their conscience. The chatbot did not invent the bad faith. It just made the bad faith easier to articulate without immediately choking on its own moral sewage.
The gaming industry has always been a monument to people pretending the spreadsheets are somehow separate from the art. This is what that lie looks like when it finally grows fangs. A sequel in limbo. Founders thrown into litigation. A giant publisher allegedly trying to unhook a bonus by delaying the game it promised to support. And somewhere in the middle of it, a CEO asking a machine how to make the whole mess less expensive.
That is not innovation. That is a panic attack with a PowerPoint deck.
If there is any lesson here, it’s embarrassingly simple: if you need ChatGPT to explain how to wriggle out of paying people, you have already admitted the plan is rotten. You don’t need a breakthrough. You need a mirror, a lawyer with a spine, and maybe a long walk into the sea.
Krafton didn’t just take a swing at a contract. It took a swing at the idea that consequences are for other people. The court noticed. The internet noticed. And now the whole thing sits there like a wet cardboard crown on the head of corporate hubris.
Beautiful, stupid, and absolutely deserved.