Alphabet’s DeepMind AI achieves Grandmaster status playing StarCraft II

Alphabet, the parent company of Google, had its DeepMind AI play anonymously against real gamers and, in so doing, achieved Grandmaster status in StarCraft II.

Alphabet sought the assistance of Blizzard, the developer of StarCraft, to access a massive database of past games and help train the algorithms. Apparently three separate neural networks were trained, one for each of the game’s three playable races. The AI agents were first ‘taught’ from the moves of the strongest players in the Blizzard database, and then copies of the agents competed against each other to improve their skills in a process known as reinforcement learning.
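To make the self-play idea concrete, here is a minimal sketch in Python of copies of an agent improving by playing against one another. It is purely illustrative: the toy rock-paper-scissors-style game, the payoff table and the REINFORCE-style update are stand-ins chosen to keep the example self-contained, and none of it comes from DeepMind’s actual AlphaStar implementation.

```python
# Toy self-play reinforcement learning sketch (illustrative only, not AlphaStar).
import random
import math

ACTIONS = 3  # stand-in for a small set of in-game strategies
PAYOFF = [[0, -1, 1],
          [1, 0, -1],
          [-1, 1, 0]]  # reward for the learner (rows) against the opponent (columns)

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample(probs):
    r, acc = random.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1

learner = [0.0] * ACTIONS      # logits of the agent being trained
league = [list(learner)]       # frozen copies of the agent to compete against
LR = 0.05

for step in range(20_000):
    opponent = random.choice(league)   # pick a past copy of the agent
    probs = softmax(learner)
    a = sample(probs)                  # learner's move
    b = sample(softmax(opponent))      # opponent copy's move
    reward = PAYOFF[a][b]
    # REINFORCE-style update: nudge the chosen action's probability
    # up or down in proportion to the reward it earned.
    for i in range(ACTIONS):
        grad = (1.0 if i == a else 0.0) - probs[i]
        learner[i] += LR * reward * grad
    if step % 2_000 == 0:
        league.append(list(learner))   # snapshot a new copy into the league

print("final strategy:", [round(p, 3) for p in softmax(learner)])
```

The key design point the sketch tries to capture is the “league”: rather than always playing its latest self, the learner is matched against a pool of earlier copies, which keeps it from overfitting to a single opponent.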

Apparently the equivalent of 200 years of human gameplay was condensed into 44 days of training using high-speed computing.
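For a rough sense of scale, and assuming that figure refers to a single agent’s accumulated experience, 200 years is roughly 73,000 days, so fitting it into 44 days implies learning at something like 1,600 times real-time speed.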

You will be interested to learn that although DeepMind did exceptionally well, a number of human gamers still did better.

While the backers of the project learned a lot from pitting DeepMind against StarCraft II players, they said DeepMind would never develop technologies for lethal autonomous weapons.

However, after a previous DeepMind system was pitted against South Korea’s top Go player and won, the Chinese government published a paper citing the achievement as having enormous potential for the use of AI in combat command!
