GT Sophy, an AI developed by Sony, defeated four of the world's best players in the racing game Gran Turismo on PlayStation.
Notably, GT Sophy taught itself to play the game in just two days. It then outperformed 95% of human players and continued to improve over the following weeks.
Unlike chess or Go, which are turn-based, the racing game Gran Turismo requires players to judge, decide and react in real time based on what is happening on track. To play well, GT Sophy therefore has to make decisions in a split second.
Sony said that GT Sophy has found the fastest lines through corners to cut lap times, and after each race it learns more tactics for acceleration, deceleration, braking and so on. When it encounters obstacles, GT Sophy can automatically change lanes to pass.
The fact that GT Sophy beat top gamers in Gran Turismo shows that a trained AI can outrun the best racers on a virtual track.
The developers feed GT Sophy information from multiple input streams so it can learn to handle obstacles and choose the shortest route while minimizing collisions. This helps Sony's AI optimize its lap times on the track.
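One common way to combine several input streams into a single observation for a learning agent is to concatenate them into one feature vector. A minimal sketch of that idea follows; the stream names and sizes here are illustrative assumptions, not Sony's actual design:

```python
import numpy as np

def build_state(car_telemetry, track_curvature, nearby_cars):
    """Concatenate several observation streams into one state vector.

    car_telemetry:   e.g. [speed, heading, throttle, brake]  (hypothetical)
    track_curvature: curvature samples of the road ahead      (hypothetical)
    nearby_cars:     relative positions of opponents          (hypothetical)
    """
    return np.concatenate([
        np.asarray(car_telemetry, dtype=np.float32),
        np.asarray(track_curvature, dtype=np.float32),
        np.asarray(nearby_cars, dtype=np.float32),
    ])

state = build_state([42.0, 0.1, 0.8, 0.0],   # own car
                    [0.0, 0.02, 0.05],       # road ahead
                    [-3.0, 1.5])             # one opponent
print(state.shape)  # (9,)
```

The agent's policy network would then map such a vector to steering and throttle outputs each frame.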
According to experts, GT Sophy's achievement proves that an AI can learn strategies on its own for situations that demand very fast judgment and reaction, such as racing. It is also a breakthrough that could open up computational methods applicable to driverless cars.
Sony also acknowledges that GT Sophy still has plenty of room for improvement. For example, without collision penalties on the track, the AI would "become significantly more aggressive".
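The collision penalty Sony describes is a standard reward-shaping idea in reinforcement learning: the agent's score each step rewards progress and subtracts penalties for unwanted behavior. A minimal sketch, where all function names and weights are illustrative assumptions rather than Sony's implementation:

```python
def lap_reward(progress_m, collided, off_track,
               collision_penalty=5.0, off_track_penalty=2.0):
    """Toy shaped reward for a racing agent (illustrative only).

    progress_m: metres gained along the track this step.
    collided / off_track: flags reported by the simulator.
    """
    reward = progress_m              # main signal: move fast
    if collided:
        reward -= collision_penalty  # discourage aggressive contact
    if off_track:
        reward -= off_track_penalty  # stay on the racing surface
    return reward

# With the penalty, contact is costly: 3.0 - 5.0 = -2.0
print(lap_reward(3.0, collided=True, off_track=False))  # -2.0

# Setting collision_penalty=0 leaves pure progress maximization,
# which is why the agent would drive more aggressively without it.
print(lap_reward(3.0, collided=True, off_track=False,
                 collision_penalty=0.0))  # 3.0
```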
Racing is the latest game in which artificial intelligence has beaten the best human players. Previously, AI also defeated humans in games such as poker, chess, Go and StarCraft.