A new artificial intelligence system can learn how a video game operates by watching less than two minutes of gameplay, and recreate the basic software of the game, which governs everything from character movement to rendering graphics. The tool may help game studios and enthusiasts speed up the process of game development and experiment with different styles of play. The AI, developed by researchers at the Georgia Institute of Technology in the US, watches less than two minutes of gameplay video and then builds its own model of how the game operates by studying the frames and predicting future events, such as what path a character will choose or how enemies might react.
To get their AI agent to create an accurate predictive model that could account for all the physics of a 2D platform-style game, the team trained the AI on a single “speedrunner” video, where a player heads straight for the goal. This made “the training problem for the AI as difficult as possible.”
The researchers trained their system on the popular Super Mario Bros and have begun replicating the experiments with games such as Mega Man and Sonic the Hedgehog. “Our AI creates the predictive model without ever accessing the game’s code, and makes significantly more accurate future event predictions than those of convolutional neural networks,” said Matthew Guzdial, lead researcher and PhD student in computer science.
“A single video won’t produce a perfect clone of the game engine, but by training the AI on just a few additional videos you get something that’s pretty close,” said Guzdial. The team next tested how well the cloned engine would perform in actual gameplay. They employed a second AI to play the game level and verify that the game’s protagonist would not fall through solid floors or escape damage when hit by an enemy. The AI playing with the cloned engine proved indistinguishable from an AI playing with the original game engine, the researchers said.
“The technique relies on a relatively simple search algorithm that looks through possible sets of rules to find the set that best predicts a series of frame transitions,” said Mark Riedl, associate professor of Interactive Computing at Georgia Tech. “To our knowledge this represents the first AI technique to learn a game engine and simulate a game world with gameplay footage,” Riedl said.
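The idea of searching for the rule set that best explains observed frame transitions can be sketched in a few lines of Python. This is an illustrative toy, not the researchers' actual system: the transitions, the rule functions, and the exhaustive-subset search are all simplifying assumptions standing in for their richer rule language over sprite facts.

```python
import itertools

# Hypothetical frame transitions: pairs of (state_before, state_after),
# where each state is a dict of simple facts observed in a frame.
transitions = [
    ({"hero_x": 0, "on_ground": True}, {"hero_x": 1, "on_ground": True}),
    ({"hero_x": 1, "on_ground": True}, {"hero_x": 2, "on_ground": True}),
    ({"hero_x": 2, "on_ground": True}, {"hero_x": 3, "on_ground": True}),
]

# Candidate rules: each maps a state to a predicted next state.
# These rules are invented for the example.
def move_right(s):
    return {**s, "hero_x": s["hero_x"] + 1}

def stand_still(s):
    return dict(s)

def fall(s):
    return {**s, "on_ground": False}

candidates = [move_right, stand_still, fall]

def score(rule_set, transitions):
    """Count transitions where at least one rule predicts the next frame."""
    return sum(
        any(rule(before) == after for rule in rule_set)
        for before, after in transitions
    )

# Exhaustive search over rule subsets: prefer the subset that explains the
# most transitions, breaking ties in favor of smaller rule sets.
best = max(
    (subset
     for r in range(1, len(candidates) + 1)
     for subset in itertools.combinations(candidates, r)),
    key=lambda s: (score(s, transitions), -len(s)),
)
print([f.__name__ for f in best])  # prints ['move_right']
```

Here the search settles on the single `move_right` rule, since it alone accounts for every observed transition; a real system would search a far larger rule space over many frames.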