AI has joined the game: How artificial intelligence is changing the video game industry

In a San Francisco convention hall that used to be packed with big booths from brands like Sony PlayStation and Epic Games, the 2025 Game Developers Conference had a decidedly different vibe: rows and rows of tiny stalls where indie developers showed off their latest games, controllers, and technological innovations, and crowds of people gathered around to watch.

But the indie takeover of the Moscone Center’s South Hall wasn’t the only change at this year’s GDC, held annually (except during COVID) since 1988. Though recent years have seen a focus on virtual reality and other tech promising to give players a portal into a fully immersive metaverse, this year’s “it” technology was clear, and it was everywhere: artificial intelligence.

Nowhere was that more apparent than in one of those tiny stalls, where a startup called Ovomind was offering a demo of a spooky horror game. There was no VR headset, and no players swinging imaginary weapons wildly through the air — just a laptop, an Xbox controller, and a little black wristband that looked like a Fitbit.

“We’re developing a technology that can measure emotion,” explained Ovomind CEO Yann Frachi, pointing out the little sensors on the wristband. “We’re measuring skin temperature, micro-sweating, heart rate behavior.”

All of those individual measurements are fed into a generative AI model in the cloud, Frachi explained, which uses that data to determine how the player might be feeling and reports it back to the game using adjectives like “bored,” “excited,” or “alarmed.” It also sends back other data, like the player’s heart rate and breathing cadence. Those factors, in turn, can influence elements of the game like a character’s endurance, shooting accuracy, and peripheral vision — or even when and where enemies appear.
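Ovomind hasn’t published its API, so the field names, values, and tuning thresholds below are all hypothetical. But as a minimal Python sketch, a game loop consuming that kind of emotion feed might look something like this:

```python
import random

def read_emotion_feed():
    """Stand-in for a call to the wristband's cloud service (hypothetical)."""
    return {
        "emotion": random.choice(["bored", "excited", "alarmed"]),
        "heart_rate_bpm": random.randint(60, 140),
        "breathing_cadence": random.uniform(0.2, 0.6),  # breaths per second
    }

def adapt_gameplay(state, reading):
    """Map the emotion label onto the game parameters Frachi described."""
    if reading["emotion"] == "bored":
        state["enemy_spawn_rate"] *= 1.5      # ramp up the scares
    elif reading["emotion"] == "alarmed":
        state["enemy_spawn_rate"] *= 0.7      # ease off before panic sets in
        state["peripheral_vision_deg"] -= 10  # simulate tunnel vision
    # An elevated heart rate drains endurance and shakes the player's aim.
    if reading["heart_rate_bpm"] > 120:
        state["endurance"] *= 0.9
        state["shooting_accuracy"] *= 0.85
    return state

state = {"enemy_spawn_rate": 1.0, "endurance": 1.0,
         "shooting_accuracy": 1.0, "peripheral_vision_deg": 90}
print(adapt_gameplay(state, read_emotion_feed()))
```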

“It’s going to be a different experience every time,” Frachi said. “You cannot be bored, because it can adapt and change to get you excited. … That’s the game changer.”

Frachi said the ever-growing set of data used to train Ovomind’s AI began with a decade of academic research. The company is selling a developer kit that includes the wristband, but it eventually hopes to rely on the sensors already built into ordinary smartwatches, like those from Apple or Samsung, to broaden the technology’s reach.

AI that can read your feelings may be a game changer for players, but the game is changing in other ways for the tens of thousands who attend the Game Developers Conference — including many with hopes of landing their next job at a game studio.

“Mass layoffs, sort of industry-wide,” explained Ryan Dunagan, a professor who teaches game programming at the University of Denver. “There’s way more talent to go around than there is positions.”

Indeed, first-time attendees arrived at GDC with high hopes but tempered expectations. They knew they were playing this quest on hard mode.

“My primary interest is to get a job,” said Robert Breglio, whose all-access conference badge listed his company as N/A. “Anyone listening, I would love a job. I’m a real hard worker.”

It’s a problem we first began hearing about a year ago, when we walked the conference floor with legendary game designer Louis Castle.

“Our industry’s in a bit of a crisis, because the cost of developing content has become so high,” he told us at the time.

But for a brief moment, the industry found a magic healing potion.

“The games industry did see quite a lift during COVID,” said Chris Hewish, chief strategy officer at game commerce company Xsolla.

Stuck inside with nothing to do, he said, people turned to games — and many of them are still playing today. But now, the world outside is open again.

“The hyper growth has slowed a little,” he said. “We’re still growing, but we’re not seeing that high, double-digit growth.”

And that’s the other place where AI comes in. When we spoke last year, Castle expressed hope that AI could help studios stay profitable by speeding up the process of building games and getting them to market. For example, when he designed the classic ’90s game Blade Runner, optical motion capture was just emerging as a way to animate game characters. His team set up a full motion capture studio in their offices, with specialized equipment from Vicon.

“Every single person moves slightly differently,” explained Vicon VFX product manager David Edwards. “And you can also communicate a lot through how you move.”

Motion capture became the de facto way to create expressive animations for games, because using an actor’s performance as a starting point was dramatically faster than creating every animation from scratch. Now, Vicon wants to speed things up even more through what it calls “markerless” motion capture. Instead of requiring actors to wear special black suits dotted with dozens of gray plastic balls, or “markers,” on each joint and fingertip, the new system uses an array of video cameras and an AI model to extract animation data from the movements of people in their regular clothes.
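Vicon hasn’t detailed how its system works internally, but markerless pipelines generally pair per-camera 2D pose detection with multi-view triangulation. Here’s a minimal numpy sketch of just the triangulation step, assuming the 2D joint detections and camera projection matrices already exist:

```python
import numpy as np

def triangulate_joint(projections, points_2d):
    """Recover a 3D joint position from its 2D detections in several
    cameras via the direct linear transform (DLT). `projections` holds
    3x4 camera matrices; `points_2d` the matching (x, y) detections
    from a per-camera pose-estimation model (assumed given here)."""
    rows = []
    for P, (x, y) in zip(projections, points_2d):
        # Each camera view contributes two linear constraints on X.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    # Least-squares solution: singular vector with the smallest value.
    _, _, vt = np.linalg.svd(np.stack(rows))
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Two toy cameras offset along the x-axis, both looking down z.
P1 = np.hstack([np.eye(3), [[0.0], [0.0], [5.0]]])
P2 = np.hstack([np.eye(3), [[-1.0], [0.0], [5.0]]])
joint = np.array([0.3, 0.2, 1.0, 1.0])  # ground-truth joint (homogeneous)
u1 = P1 @ joint; u1 = u1[:2] / u1[2]    # what camera 1 "detects"
u2 = P2 @ joint; u2 = u2[:2] / u2[2]    # what camera 2 "detects"
print(triangulate_joint([P1, P2], [u1, u2]))  # ≈ [0.3, 0.2, 1.0]
```

In a real pipeline, the AI model’s job is the hard part: producing those 2D detections reliably from ordinary video of people in regular clothes.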

“They spend less time worrying about everything being set up perfectly,” Edwards said, “and more time worrying about, ‘Is this the performance I need? Is this the animation my game needs?’”

During its first public demo of the technology at GDC, Vicon was careful to explain that its new AI-based motion capture isn’t intended to replace the “old way” with suits and markers — which is still more precise at gathering detailed information right down to the subtlest finger movements. Instead, the new technology is intended to speed the process of creating prototypes and rough drafts — a goal that’s supported by the International Game Developers Association.

“AI has great uses, such as quick iteration, quick prototyping, seeing if things are going to be fun to play,” said IGDA executive director Jakin Vela. “It’s when studio heads or executives think that AI should be replacing workers, that is the concern.”

Vela said there’s also concern about AI training data — the massive sets of real-world examples that are fed into AI models so they can learn to generate images, sounds and text. Vicon and Ovomind both say they’re gathering all those examples in-house, so there’s no question about who owns them. But when the source of the data isn’t so black-and-white — as has been the case for some online image generation AI models — that can be a concern for artists, including actors.

“Voice actors also are experiencing some concerns,” Vela added. “To ensure that AI is not ripping their voices off, or their likeness off.”

And then there’s the challenge of making sure generative AI models are appropriate for kids — and making sure their training data includes only clean, wholesome examples, so they learn to generate only G-rated content. That’s a priority for Roblox, a gaming platform that saw a huge surge in popularity during the COVID pandemic. In Roblox, players can create their own 3D interactive experiences — in essence, becoming game designers themselves. Now, Roblox has a new open-source generative AI model called Cube that’s poised to make that process easier.

“Creating in 3D is very, very hard,” explained Roblox senior engineering director Kiran Bhat. “So … it’s a way for our users, especially the younger users, to be able to experience the magic of creation.”

The new AI model can generate 3D objects from a text description, and Roblox is actively working on enabling it to design entire scenes all at once. Bhat sees a future where creators can try out dozens of ideas for a new Roblox experience, and pick the one they like the best — all by typing in words and phrases, and watching objects magically appear in 3D — and soon, what Roblox calls 4D.

“In the Roblox context, the fourth dimension is functionality,” Bhat said. “So if you think of a door, you’ve generated the geometry of the door, it’s 3D — but the ability to open and close the door is 4D.”
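Bhat’s door example maps neatly onto a simple data-structure idea: a generated asset is geometry plus attached behavior. On the platform itself that behavior is scripted in Roblox’s Luau language, and Cube’s actual interfaces aren’t shown here; this is just a conceptual Python sketch of pairing the two:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class GeneratedAsset:
    """A text-prompted asset: 3D geometry plus "4D" behaviors (hypothetical)."""
    prompt: str
    mesh: list                    # stand-in for generated triangle geometry
    behaviors: dict = field(default_factory=dict)

    def add_behavior(self, name: str, fn: Callable) -> None:
        self.behaviors[name] = fn

    def trigger(self, name: str) -> None:
        self.behaviors[name](self)

# The 3D part: geometry generated from a text description (stubbed out).
door = GeneratedAsset(prompt="a wooden door", mesh=["<triangles>"])
door.is_open = False

# The 4D part: functionality attached to that geometry.
def toggle(asset):
    asset.is_open = not asset.is_open
    print("door is now", "open" if asset.is_open else "closed")

door.add_behavior("open_close", toggle)
door.trigger("open_close")  # door is now open
door.trigger("open_close")  # door is now closed
```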

As a generation of gamers grows up building worlds with AI, Bhat hopes some of them will become the next generation of game developers. And Hewish thinks it will happen.

“Every new technology comes with a fear of people being displaced and replaced, all the way back to the printing press,” Hewish said. “I think we’ll have opportunities for creators who are versed in prompts with AI — who can actually work with AI as a partner.”

Because, as Vela points out, there’s no turning back the clock on AI. It’s even been called the Fourth Industrial Revolution.

“It’s here,” Vela said. “It’s not going anywhere. We have to learn how to work together.”

