Leela Chess Zero


Leela Chess Zero is a free, open-source, neural network-based chess engine and distributed computing project. Development has been spearheaded by programmer Gary Linscott, who is also a developer for the Stockfish chess engine. Leela Chess Zero was adapted from the Leela Zero Go engine, which in turn was based on Google's AlphaGo Zero project; the project is also intended to verify the methods in the AlphaZero paper as applied to the game of chess.
Like Leela Zero and AlphaGo Zero, Leela Chess Zero starts with no intrinsic chess-specific knowledge other than the basic rules of the game. Leela Chess Zero then learns how to play chess by reinforcement learning from repeated self-play, using a distributed computing network coordinated at the Leela Chess Zero website.
Leela Chess Zero has played over 300 million games against itself and is capable of play at a level comparable with Stockfish, the leading conventional chess program.

History

The Leela Chess Zero project was first announced on TalkChess.com on January 9, 2018, introducing it as an open-source, self-learning engine with the goal of becoming a strong chess engine. Within the first few months of training, Leela Chess Zero had already reached the Grandmaster level, surpassing the strength of early releases of Rybka, Stockfish, and Komodo, despite evaluating orders of magnitude fewer positions through its use of Monte Carlo tree search (MCTS).
In December 2018, the AlphaZero team published a paper in the journal Science revealing previously undisclosed details of the architecture and training parameters used for AlphaZero. These details were soon incorporated into Leela Chess Zero and increased both its strength and training efficiency.
The work on Leela Chess Zero has informed the similar AobaZero project for shogi.
The engine has been rewritten and carefully iterated upon since its inception, and now runs on multiple backends, allowing it to effectively utilize different types of hardware, both CPU and GPU.
The engine supports the Fischer Random Chess variant; as of May 2020, a network was being trained to test the viability of such a network.

Program and use

The method used by its designers to make Leela Chess Zero self-learn and play chess above human level is reinforcement learning, the machine-learning approach mirrored from AlphaZero, which the Leela Chess Zero training binary uses to maximize reward through self-play. As an open-source distributed computing project, it relies on volunteer users who run Leela Chess Zero to play hundreds of millions of games that are fed to the reinforcement-learning algorithm. To contribute to the advancement of the Leela Chess Zero engine, a volunteer downloads the latest non-release-candidate version of the engine as well as the client. The client connects to the current Leela Chess Zero server, where all of the information from the self-play chess games is stored, to obtain the latest network, generate self-play games, and upload the training data back to the server.
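This contribution loop can be summarized with a minimal conceptual sketch; the server URL, endpoints, and helper function below are hypothetical illustrations and do not describe the actual Leela Chess Zero client protocol:

    import requests

    SERVER = "https://training.example.org"  # hypothetical URL, not the real training server

    def contribute(num_games: int) -> None:
        # 1. Fetch the latest network weights published by the training server.
        weights = requests.get(f"{SERVER}/latest-network").content

        for _ in range(num_games):
            # 2. Generate one self-play game with the current network, recording
            #    (position, search policy, game outcome) tuples as training data.
            training_data = play_self_play_game(weights)

            # 3. Upload the data so the server can train the next, stronger network.
            requests.post(f"{SERVER}/upload-game", data=training_data)

    def play_self_play_game(weights: bytes) -> bytes:
        # Placeholder for the engine's MCTS-driven self-play (handled by lc0 itself).
        raise NotImplementedError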
To play against the Leela Chess Zero engine on a local machine, two components are needed: the engine binary and a network. The network contains Leela Chess Zero's evaluation function, which is needed to evaluate positions. Older networks can also be downloaded and used by placing them in the folder with the lc0 binary.
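As an illustration, the engine can then be driven over the standard UCI protocol from a small script; the file names below are placeholders, and it is assumed that the lc0 binary accepts a ``--weights`` option pointing at the downloaded network file:

    import subprocess

    # Paths are examples only; use the actual locations of the lc0 binary
    # and the downloaded network file.
    engine = subprocess.Popen(
        ["./lc0", "--weights=network.pb.gz"],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )

    def send(command: str) -> None:
        engine.stdin.write(command + "\n")
        engine.stdin.flush()

    send("uci")                           # UCI handshake
    send("position startpos moves e2e4")  # position after 1.e4
    send("go nodes 800")                  # search 800 nodes, then report the best move

    for line in engine.stdout:
        if line.startswith("bestmove"):
            print(line.strip())           # e.g. "bestmove c7c5"
            break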

Self-Play Elo

Self-play Elo is used to gauge relative network strength, to look for anomalies and general changes in network strength, and as a diagnostic tool when there are significant changes. Through test match games played with minimal temperature-based variation, lc0 engine clients test the most recent version of a network against other recent versions from the same run; the results are then sent to the training server to create an overall Elo assessment.
Standard Elo formulas are used to calculate relative Elo strength between the two players. More recent Self-Play Elo calculations use match game results against multiple network versions to calculate a more accurate Elo value.
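In the standard Elo model (the exact variant used by the server is not specified here), the expected score of network A against network B, and the rating difference implied by an observed match score, are

$$E_A = \frac{1}{1 + 10^{(R_B - R_A)/400}}, \qquad \Delta R = R_A - R_B = -400 \log_{10}\!\left(\frac{1}{s} - 1\right),$$

where $s$ is the fraction of points scored by A in the match.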
The self-play approach to gauging strength has several unintended consequences, the most prominent being rating inflation.
An example of self-play Elo inflation is the Test 71.4 run, a Fischer Random Chess run, which reached nearly 4000 cumulative self-play Elo only 76 nets after the start of its run. Self-play Elo estimates of this run can be roughly compared with other runs to gauge the impracticality of pure cumulative self-play Elo. A head-to-head comparison with one of the Test 60 networks 3000 nets into its run reveals that network 63000 can consistently beat network 714070 at most, if not all, "fair" time controls. Yet the 63000 nets from the Test 60 run have a self-play Elo around 2900, while the self-play Elo of early Test 71.4 is already near 4000. This contradiction supports the claim that self-play Elo is neither an objective measure of strength nor one that allows network strength to be easily compared to human strength.
Self-play ratings for the engine could be used as a rough approximation of conventional human Elo ratings; however, no universal conversion formula exists, for many reasons. These include, but are not limited to, the scale of initial self-play Elo inflation and the late-term self-play Elo inflation between trained runs, differing time controls, differing Elo measurement systems between chess tournament platforms, the resources allocated to the engine, network size and structure, a network's training data set, and the engine binary's own contribution to playing strength.
Setting up the engine to search a single node per move, using ``--minibatch-size=1`` and ``go nodes 1`` for each played move, creates deterministic play, and self-play Elo on such settings will always yield the same result between two copies of the same network from the same starting position: always a win, always a loss, or always a draw. Self-play Elo is therefore not reliable for determining strength in these deterministic circumstances.
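For example, such a deterministic single-node game can be reproduced by starting the engine with the flag above and issuing one fixed-node search per move over UCI; the weights file name and move list here are only placeholders:

    $ ./lc0 --minibatch-size=1 --weights=network.pb.gz
    uci
    position startpos
    go nodes 1
    position startpos moves e2e4
    go nodes 1

Each ``go nodes 1`` command evaluates exactly one node, so the engine replies with the same ``bestmove`` every time the same position is given.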

Variants

In season 15 of the Top Chess Engine Championship, the engine AllieStein competed alongside Leela. AllieStein is a combination of two different spinoffs from Leela: Allie, which uses the same evaluation network as Leela, but has a unique search algorithm for exploring different lines of play, and Stein, an evaluation network which has been trained using supervised learning based on existing game data featuring other engines. While neither of these projects would be admitted to TCEC separately due to their similarity to Leela, the combination of Allie's search algorithm with the Stein network, called AllieStein, is unique enough to warrant it competing alongside mainstream Lc0.

Competition results

In April 2018, Leela Chess Zero became the first neural network engine to enter the Top Chess Engine Championship, during season 12 in the lowest division, division 4. Leela did not perform well: in 28 games, it won one, drew two, and lost the remainder; its sole victory came from a position in which its opponent, Scorpio 2.82, crashed in three moves. However, it improved quickly. In July 2018, Leela placed seventh out of eight competitors at the 2018 World Computer Chess Championship. In August 2018, it won division 4 of TCEC season 13 with a record of 14 wins, 12 draws, and 2 losses. In Division 3, Leela scored 16/28 points, finishing third behind Ethereal, who scored 22.5/28 points, and Arasan on tiebreak.
By September 2018, Leela had become competitive with the strongest engines in the world. In the 2018 Chess.com Computer Chess Championship, Leela placed fifth out of 24 entrants. The top eight engines advanced to round 2, where Leela placed fourth. Leela then won the 30-game match against Komodo to secure third place in the tournament. Concurrently, Leela participated in the TCEC cup, a new event in which engines from different TCEC divisions can play matches against one another. Leela defeated higher-division engines Laser, Ethereal and Fire before finally being eliminated by Stockfish in the semi-finals.
In October and November 2018, Leela participated in the Chess.com Computer Chess Championship Blitz Battle. Leela finished third behind Stockfish and Komodo.
In December 2018, Leela participated in season 14 of the Top Chess Engine Championship. Leela dominated divisions 3, 2, and 1, easily finishing first in all of them. In the premier division, Stockfish dominated while Houdini, Komodo and Leela competed for second place. It came down to a final-round game where Leela needed to hold Stockfish to a draw with black to finish second ahead of Komodo. It successfully managed this and therefore contested the superfinal against Stockfish. It narrowly lost the superfinal against Stockfish with a 49.5-50.5 final score.
In February 2019, Leela scored its first major tournament win when it defeated Houdini in the final of the second TCEC cup. Leela did not lose a game the entire tournament. In April 2019, Leela won the Chess.com Computer Chess Championship 7: Blitz Bonanza, becoming the first neural-network project to take the title.
In May 2019, Leela defended its TCEC cup title, this time defeating Stockfish in the final 5.5-4.5 after Stockfish blundered a 7-man tablebase draw. Leela also won the Superfinal of season 15 of the Top Chess Engine Championship 53.5-46.5 versus Stockfish.
Season 16 of TCEC saw Leela finish in 3rd place in the premier division, missing out on qualification for the superfinal behind Stockfish and the new neural network engine AllieStein. Leela did not suffer any losses in the premier division, the only engine to do so, and defeated Stockfish in one of the six games they played. However, Leela only managed to score 9 wins, while AllieStein and Stockfish both scored 14 wins. This inability to defeat weaker engines led to Leela finishing 3rd, half a point behind AllieStein and a point behind Stockfish. In the fourth TCEC cup, Leela was seeded first as the defending champion, which placed it on the opposite half of the bracket from AllieStein and Stockfish. Leela was able to qualify for the final, where it faced Stockfish. After seven draws, Stockfish won the eighth game to win the match.
In Season 17 of TCEC, held in January-April 2020, Leela regained the championship by defeating Stockfish 52.5-47.5. It qualified for the superfinal again in Season 18, but this time was defeated by Stockfish 53.5-46.5. In the TCEC Cup 6 final, Leela lost to AllieStein, finishing 2nd.

Results summary

TCEC Cup

Event | Result | Opponent | Score
Cup 1 | 3rd | Komodo | 0-0
Cup 2 | 1st | Houdini | 4.5-3.5
Cup 3 | 1st | Stockfish | 5.5-4.5
Cup 4 | 2nd | Stockfish | 4.5-3.5
Cup 5 | 2nd | Stockfish | 2.5-1.5
Cup 6 | 2nd | AllieStein | 2.5-1.5

Chess.com Computer Chess Championship (CCC)

Event | Year | Time control (minutes + increment in seconds) | Result
CCC 1: Rapid Rumble | 2018 | 15+5 | 3rd
CCC 2: Blitz Battle | 2018 | 5+2 | 3rd
CCC 3: Rapid Redux | 2019 | 30+5 | 2nd
CCC 4: Bullet Brawl | 2019 | 1+2 | 2nd
CCC 5: Escalation | 2019 | 10+5 | 2nd
CCC 6: Winter Classic | 2019 | 10+10 | 2nd
CCC 7: Blitz Bonanza | 2019 | 5+2 | 1st
CCC 8: Deep Dive | 2019 | 15+5 | 2nd
CCC 9: The Gauntlet | 2019 | 5+2, 10+5 | 3rd
CCC 10: Double Digits | 2019 | 10+3 | 3rd
CCC 11 | 2019 | 30+5 | 1st
CCC 12: Bullet Madness! | 2020 | 1+1 | 1st
CCC 13 | 2020 | 3+2, 5+5, 10+5, 15+5 | 1st

Notable games