The above screenshot shows the status of my longest-running (background) project: an attempt to write software that can play Go on a personal computer. The right side of the screen displays the game in progress. On the left are the controls and the program’s analysis of point groups, stone groups, and connectivity, along with an evaluation of the position. I call the program “Mingo.” Mingo definitely plays Go and sometimes makes surprisingly astute moves, but it can also commit blunders. It might be able to win against someone just learning the game.
Getting a personal computer to play Go is challenging. But as the computing power that “personal” resources can buy grows, so does the ability to compute moves in an acceptable amount of time. Computers have been able to whip Chess grandmasters for some time now, but only very recently have computers been able to beat professional Go players. That feat required a team of computer science gurus and millions of dollars of neural-network parallel computing hardware. My current desktop runs a 3 GHz Intel Core i7 CPU with four cores capable of handling eight threads.
Go is an elegant game that originated in the Orient about 4,000 years ago and is the most difficult board game known. There are an estimated 10^100 reasonable Go games, roughly 10^30 times the number of reasonable Chess games! Go’s rules are extremely simple, but the strategies are killers. It takes very intelligent people many years of effort to reach a professional level of play, and comprehending the nuances of all the stone positions on a standard 19 × 19 board stretches the best human intellects.
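To put such numbers in perspective, game-tree size is roughly the average branching factor raised to the game length. The sketch below is a back-of-the-envelope calculation in Python using commonly cited ballpark figures (about 35 legal moves over 80 plies for Chess, about 250 moves over 150 plies for Go); the exact values vary by source.

```python
import math

# Rough game-tree size: average branching factor raised to typical game
# length in plies. These are ballpark estimates, not exact counts.
games = {
    "Chess": (35, 80),    # ~35 legal moves per turn, ~80 plies per game
    "Go":    (250, 150),  # ~250 legal moves per turn, ~150 plies per game
}

for name, (branching, plies) in games.items():
    # Work in logs: log10(b**d) = d * log10(b), avoiding enormous integers.
    exponent = plies * math.log10(branching)
    print(f"{name}: ~10^{exponent:.0f} possible move sequences")
```

Chess comes out around 10^124 and Go around 10^360, which is why the gap between the two games is measured in powers of ten, not percentages.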
The Go move tree is very deep and has an uncommonly large fan-out, so brute-force minimax look-ahead cannot possibly get you very far. The only approach likely to work is evaluating the subtleties of a board position with enough insight to prune the move tree quickly. That’s very hard. Recent progress in harnessing machine-learning techniques and the power of GPU floating-point acceleration looks promising.
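To make the idea concrete, here is a minimal sketch of forward-pruned, depth-limited minimax in Python. Everything in it is an assumption for illustration: legal_moves, apply_move, quick_score, and evaluate are trivial stand-ins for whatever board routines a program like Mingo actually uses, and the random scores merely mark where real heuristics would go. The structural point is that ranking moves with a cheap heuristic and keeping only the top k shrinks the tree from roughly 250^depth to k^depth.

```python
import random

# Hypothetical stand-ins for a real Go engine's board routines (assumptions):
def legal_moves(board):        # would enumerate the legal points on the board
    return list(board["open"])

def apply_move(board, move):   # would place a stone and handle captures
    return {"open": [p for p in board["open"] if p != move]}

def quick_score(board, move):  # cheap heuristic used only to order moves
    return random.random()

def evaluate(board):           # expensive static evaluation of a position
    return random.random()

def candidate_moves(board, k=10):
    """Forward pruning: rank all legal moves with the cheap heuristic and
    keep only the top k, shrinking the tree from ~250**depth to ~k**depth."""
    moves = sorted(legal_moves(board),
                   key=lambda m: quick_score(board, m), reverse=True)
    return moves[:k]

def search(board, depth, maximizing=True):
    """Depth-limited minimax over the pruned candidate list."""
    if depth == 0:
        return evaluate(board)
    moves = candidate_moves(board)
    if not moves:
        return evaluate(board)
    scores = [search(apply_move(board, m), depth - 1, not maximizing)
              for m in moves]
    return max(scores) if maximizing else min(scores)

# Toy usage: a "board" that is just the set of open points on a 19 x 19 grid.
board = {"open": list(range(361))}   # 19 * 19 = 361 points
print(search(board, depth=3))
```

All the difficulty hides in quick_score and evaluate: a pruner with poor judgment discards the winning move before the search ever sees it, which is exactly the problem the recent neural-network approaches attack.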