- cross-posted to:
- [email protected]
Global service provider Keywords, which recently worked on acclaimed projects like Alan Wake 2, Baldur’s Gate 3, and The Legend of Zelda: Tears of the Kingdom, conducted an experiment last year.
The company tried to make a 2D video game relying solely on generative AI (GenAI) tools and technology. The R&D initiative was dubbed ‘Project Ava’ and saw a team, initially from Electric Square Malta, evaluate and leverage over 400 (unnamed) tools to understand how they might “augment” game development.
As detailed in the company’s latest fiscal report, however, the project ultimately proved that while some generative AI tools might simplify or accelerate certain processes, they are currently “unable to replace talent.”
I think many industries are going to quickly realise the limited utility of GenAI and the bubble will pop quite quickly when that happens.
Story time:
I’m a software developer for a large, multinational company. Yesterday, I needed to update the Knex migrations for the project I’m assigned to. We needed three new PostgreSQL tables, with several foreign key constraints. I added the migration to our existing migrations in a backend plugin we’re building out.
I use Copilot regularly in my development work. It was helpful in this case, generating the table migrations automatically. Of course, it hallucinated a few methods Knex doesn’t have, but I’m used to things like that and easily corrected them. Once I was done testing, I created a pull request to merge the commit in my working branch into the main branch in git.
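To make that concrete for the non-developers, here is roughly what a migration file like that looks like. This is a minimal sketch: the table names and columns (authors, books) are invented for illustration, not the actual schema from my project.

```typescript
import type { Knex } from "knex";

// Hypothetical example tables, purely for illustration.
export async function up(knex: Knex): Promise<void> {
  await knex.schema.createTable("authors", (table) => {
    table.increments("id");                 // auto-incrementing primary key
    table.string("name").notNullable();
    table.timestamps(true, true);           // created_at / updated_at, defaulting to now
  });

  await knex.schema.createTable("books", (table) => {
    table.increments("id");
    table.string("title").notNullable();
    // Foreign key constraint back to the authors table
    table
      .integer("author_id")
      .unsigned()
      .notNullable()
      .references("id")
      .inTable("authors")
      .onDelete("CASCADE");
    table.timestamps(true, true);
  });
}

export async function down(knex: Knex): Promise<void> {
  // Drop in reverse order so the foreign key constraint doesn't block the rollback
  await knex.schema.dropTableIfExists("books");
  await knex.schema.dropTableIfExists("authors");
}
```

Running `knex migrate:latest` applies the `up` function against the database; `knex migrate:rollback` runs `down` to undo it.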
Now, look at what I just wrote. If you’re not a developer, you probably have no idea what “Knex” or “PostgreSQL” mean. You probably recognize the words “foreign,” “key,” and “constraints,” but you haven’t got a clue why I’m using them in that order or what I’m referring to. It likely looks like I’m using the word “migrations” completely incorrectly. You don’t know what it means for “Knex” to have “methods.” Words like “git,” “pull request,” and “commit” just read like gibberish to you.
You wouldn’t know how to ask Copilot to do anything. You wouldn’t know where to place any results you manage to get from it. If your boss came to you and said, “here’s this feature requirement, make it happen,” you would fail. You wouldn’t know why, either. Hell, you wouldn’t even know what it is your boss is trying to accomplish. You could spend the next six months trying to figure it all out, and maybe you’d succeed, but probably not. Because you aren’t a developer.
I’m a developer. All of what I wrote above makes perfect sense to me, and it’s one of the simplest tasks I could tackle. Took about fifteen minutes to accomplish, from creating the migration file to getting the PR ready to merge.
I’ve been lambasted for insisting that large language models aren’t going to replace actual professionals because they’re not capable of joined-up thinking, meta-cognition, or creativity. I get told they’ll be able to do all of that any day now, and my boss will be able to fire all of his employees and replace them with an MBA - or worse, do the work himself. Depending on the attitudes of who I’m talking to, this is either a catastrophe or the greatest thing since sliced bread.
It’s neither, because that’s not going to happen. Look at the story above, and tell me you could do the same thing with no training or understanding, because ChatGPT could do it all. You know that’s bullshit. It can’t. LLMs are useful tools for people like me, and that’s it. It’s another tool in the toolbox, like IntelliSense and linters - two more terms you don’t know if you’re not a developer.
The bloom is beginning to come off the rose. Businesses are gradually realizing the pie-in-the-sky promises of LLM boosters are bogus.
The majority of my time isn’t spent writing code; it’s reading code, reviewing changes, and thinking about code.
Amen to that! I also spend a fair amount of time thinking about new features and how they would plug into our vast ecosystem, no part of which could Copilot possibly know anything about.
People know what pull requests are. I make them to my girlfriend all the time.
But does the python knex?
The python knexes at midnight.
Honestly, the shorter way to say all of this is: “the AI” is supposed to make everyone a coder. “The AI” is supposed to make everyone an artist. Who is doing more coding or art than they used to? I know some people are (I am), but I know most people aren’t, even a little bit. That’s how you know it’s a tool for people who want it (or wanted it before it existed), but it’s not really driving any new demand at the level that would replace the experts already in place who can use the tool.
These are just new tools for creating art; they always have been. All the moral panic about making creatives redundant, and the subsequent investor salivation and hype, were never consistent with reality.
After all, the people who will be able to make the most use of them are the ones who already know a thing or two about their craft.
Anytime you look under the hood of something billed as “completely AI generated” media, you find a team of really talented, creative people. Music, books, short films, and of course games. It’s the same story time and again, and the story is marketing.
Creative people using AI as a tool can do creative things.
I wonder if they will get banned from gamedev events like Kenney is.
Kenney, the guy who makes assets?
Yes. He complained about a sponsor to an organiser. The sponsor was an AI company. They banned him.
They’ve clearly never played snake 2
Wait, they really are named “Keywords”?
Art is media that conveys emotions and ideas from one human to another. Unless a machine has emotions of its own, it cannot make art. It is just making media.
Unable to replace talent… for now ☝️
Brought to you by:
- The ’50s, when everything from cars to ovens would soon have its own nuclear reactor
- The ’60s, when mankind would soon travel between star systems and take vacations on the moon
- The ’70s, when people thought that in a few years neutron brains would be created that work just like the human brain
- The ’80s, when it was said that by the year 2000 we would have flying cars and hoverboards
- The ’90s, when the internet took off and soon nobody would ever send a letter again
- The ’00s, when self-driving cars would soon take over the entire transport business
- The ’10s, when a magically working, all-knowing Siri/Alexa/Cortana voice assistant would be the way people interact with everything at home and on the go
- And now the ’20s, where AI will be able to recreate talent and come up with genuinely new ideas never before seen by mankind
I do not think AI can recreate talent or art. AI can imitate art and copy existing art and meld it into something new. But an artist-AI cannot do two noses of coke and a bottle of vodka and come up with a novel concept of creative work. Sure, it will come up with things never seen before, but will it resonate with people, or will it just be weird, quirky or empty? It is not enough to render an image of a diamond skull; you actually have to build it, like Damien Hirst. Artists of the type of Banksy, Dalí and Francis Bacon will not get replaced by AI. AI will not stand in the streets spraying walls, AI will not step off a ship in New York holding a loaf of bread as long as a human, and AI will certainly not be able to paint triptychs of pain and suffering based on the death of a friend. AI will copy something that looks like it, but it will not have the “talent” or the depth to make it believable, knit a story around it, and know the people to communicate it to. It most certainly will be used to subvert people on social media to vote against their interests and to radicalize opinions. That is its talent.
You’re talking like the internet didn’t totally change the world in just about every way. Even in your list of big ideas that didn’t make it, you can’t help but pick things that totally changed the world. You could have been more honest about the fifties, where it was electrification everyone was hyped about, which brought the white-goods revolution and the DIY era. There’s a good argument to be made that the civil rights era was only made possible by the extra time labour-saving devices freed up in the household.
Then there’s television, which brought education and culture to the masses; people said it would be huge, but they were vastly understating what would happen. Computers were the big dream of the 70s and again totally smashed almost all expectations; almost nothing we take for granted today would be possible without them. In the 80s we had internet hype; it was pretty shitty BBSes and newsgroups, but people again predicted huge things that don’t even come close to what we’re used to now. The 90s were mobile tech and IoT, which I don’t even need to say anything about, given how huge they’ve become. Now we’re really starting to see the realization of AI research and automation, but it’s very early stuff. You’re not casually chatting to your computer to narrow down product searches, or having it design custom parts for your car based on vague descriptions, but you will be, and you’ll take it for granted just like you do with every other huge development.
AI might not ever create anything interesting on its own, but by allowing people to turn their ideas into reality, it’s going to unlock unprecedented levels of human creativity. That said, I think we’ll get genuinely novel expression created with intent and structure; unless you want to bring God into it, there’s no reason a machine can’t do anything we can.
Maybe. Remember the 80/20 rule; we are most likely not even at 80% yet.