Update README.md

README.md CHANGED

```diff
@@ -20,7 +20,8 @@ datasets:
 A 29.2M parameter hybrid transformer trained to play chess, built from scratch. LCM uses a novel combination of GQA attention and LIV convolution blocks from Liquid AI's LFM2 architecture, trained with dual NTP + TOP objectives on ~8 million chess games.
 
 Play against it online [here](https://huggingface.co/spaces/MostLime/lcm-chess-playground).
-
+
+Read the blog post about it [here](https://mostlime.pages.dev/blog/i-trained-a-chess-engine-on-14-dollars).
 
 ---
 
```