4 replies. Last post: 2018-11-29
Hello, I am Chicagos, creator and owner of the bot Pikachu_c.
First of all I want to thank YHW, who answered all my questions carefully and gave me a plugin to run my bot on LG.
Pikachu_c searches several moves ahead with Expectiminimax (around 6-7 plies in the middlegame) and evaluates leaf positions with a static evaluation function based on a self-learning neural network.
To further improve the strength of the bot, I want to build an endgame tablebase for all positions with 5 or fewer stones, containing the exact winning probability for each of these positions.
My first question is: which data structure should I use to access each position quickly?
My second question is: how can I calculate the exact winning probabilities efficiently?
Many thanks in advance,
Thanks for your intense interest in Ewn. At least two people have done
hard work on Ewn endgame tables:
(i) Ingo Schwab in his doctoral dissertation. You can find some of his
material on the following site:
(ii) Wesley Turner. He had full 7-piece tables in the Amazon cloud, and even
parts of the 8-piece space. It “turnered” out that these tables were not
enough to make an Ewn bot play almost perfectly.
I just saw your question. Hopefully you've figured something out by now, but here are some notes in case you find them helpful.
For 5-piece tables you don't need to worry too much about efficiency.
A totally naive database would have (12 choose 5) * (25!/20!) ≈ 5 billion positions, but with some symmetry arguments you can reduce the amount of work you do. The (12 choose 5) = 792 piece combinations collapse into 198 equivalence classes (having a 1 and a 2 is the same as having a 5 and a 6), and reflecting along the diagonal nearly halves the number of positions in each class. In the end you only need to calculate about 600 million positions.
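If you want to double-check those counts, the arithmetic is quick to verify (this is only a sanity check of the numbers above, not an enumeration of positions):

```python
from math import comb, perm

# Count the naive 5-piece state space on the 5x5 board.
pieces = comb(12, 5)          # which 5 of the 12 pieces survive: 792
placements = perm(25, 5)      # ordered placement on 25 squares: 25!/20! = 6,375,600
naive = pieces * placements   # roughly 5 billion raw positions

# Symmetry reductions from the argument above: the 792 piece sets
# collapse to 198 equivalence classes (a factor of 4), and the diagonal
# reflection nearly halves what remains.
reduced = naive * 198 // pieces // 2

print(naive, reduced)         # 5049475200 631184400
```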
Just about any technique you can think of will be fine to store and access them, though you may pay a penalty if the resulting data structure doesn't fit in memory.
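As for the second question in the thread (computing the exact probabilities), one standard approach, assuming EWN's property that every move either advances a piece toward the target corner or captures, is backward induction: the state graph is acyclic, so each position's value is the dice-average over the values of the resulting positions. Here is a toy sketch of that idea on a made-up one-piece race game (not EWN itself; `win_prob` and the 3-sided die are illustrative assumptions):

```python
from functools import lru_cache

# Toy game: each player has one counter some distance from the goal.
# The player to move rolls a fair 3-sided die and moves that many steps;
# whoever reaches distance 0 first wins. Every move strictly reduces the
# mover's distance, so the state graph is acyclic and plain backward
# induction yields exact win probabilities.

@lru_cache(maxsize=None)
def win_prob(my_dist, opp_dist):
    """Exact probability that the player to move eventually wins."""
    prob = 0.0
    for roll in (1, 2, 3):               # average over equally likely rolls
        new_dist = max(0, my_dist - roll)
        if new_dist == 0:
            prob += 1.0 / 3              # this roll wins immediately
        else:
            # Opponent moves next; our chance is the complement of theirs.
            prob += (1.0 - win_prob(opp_dist, new_dist)) / 3
    return prob
```

A real EWN table would additionally, for each of the six rolls, take the best value over the mover's legal moves (and the worst from the opponent's side) before averaging, since each roll usually allows several moves.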
Like most chess tablebases, Sybil gives each position a unique number that serves as an index into a giant array. To access the data, it memmapped the files and treated them just as though they were arrays living in memory; the operating system was smart enough to cache what was needed. In my implementation I didn't notice significant IO wait until I incorporated the 7-piece databases.
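To make the memmap pattern concrete, here is a minimal stdlib-only sketch (the file name, the toy 4-entry table, and `probe` are made up for illustration; a real implementation would keep the mapping open across lookups and pair it with a proper position-to-index function):

```python
import mmap
import os
import struct
import tempfile

RECORD = struct.calcsize("<d")   # 8 bytes per double-precision probability

# Build a toy table file: win probabilities for 4 hypothetical positions.
path = os.path.join(tempfile.mkdtemp(), "toy_table.bin")
with open(path, "wb") as f:
    for p in (0.0, 0.25, 0.5, 1.0):
        f.write(struct.pack("<d", p))

def probe(index):
    """Read one probability by position index; the OS page cache does
    the real caching, just as described for Sybil above."""
    with open(path, "rb") as f, \
         mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        return struct.unpack_from("<d", mm, index * RECORD)[0]

print(probe(2))  # 0.5
```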
Ok, thank you! In the meantime I have written an improved version of the neural network, but I still don't use endgame tables: when an endgame has 5 or fewer pieces, I just let the bot do a deep search, which also finds the perfect move.