{"id":22014573,"url":"https://github.com/unsignedarduino/chessbot","last_synced_at":"2026-04-10T20:49:10.984Z","repository":{"id":265154791,"uuid":"865722656","full_name":"UnsignedArduino/Chessbot","owner":"UnsignedArduino","description":"A Raspberry Pi that uses a camera to look at the board and tells you it's move on a monitor.","archived":false,"fork":false,"pushed_at":"2025-01-02T05:29:32.000Z","size":128422,"stargazers_count":1,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-03-23T08:42:32.448Z","etag":null,"topics":["chess","chess-ai","chess-bot","classifier","classifier-model","computer-vision","machine-learning","raspberry-pi","raspberry-pi-5","raspberry-pi-camera","roboflow","segmentation","segmentation-model","yolo","yolo-11","yolov11"],"latest_commit_sha":null,"homepage":"","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/UnsignedArduino.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-10-01T02:42:32.000Z","updated_at":"2025-01-02T05:29:36.000Z","dependencies_parsed_at":null,"dependency_job_id":"f4c54ae0-ebe9-46a1-83ee-ddc305777766","html_url":"https://github.com/UnsignedArduino/Chessbot","commit_stats":null,"previous_names":["unsignedarduino/chessbot"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/UnsignedArduino%2FChessbot","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/UnsignedArduino%2FChessbot/tags","releases_url":"https
://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/UnsignedArduino%2FChessbot/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/UnsignedArduino%2FChessbot/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/UnsignedArduino","download_url":"https://codeload.github.com/UnsignedArduino/Chessbot/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":245078149,"owners_count":20557279,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["chess","chess-ai","chess-bot","classifier","classifier-model","computer-vision","machine-learning","raspberry-pi","raspberry-pi-5","raspberry-pi-camera","roboflow","segmentation","segmentation-model","yolo","yolo-11","yolov11"],"created_at":"2024-11-30T04:15:29.823Z","updated_at":"2026-04-10T20:49:10.908Z","avatar_url":"https://github.com/UnsignedArduino.png","language":"Jupyter Notebook","readme":"# Chessbot\n\nA Raspberry Pi that uses a camera to look at the board and tells you its move\non a monitor.\n\n## Installation\n\n1. Have a Raspberry Pi (preferably a 5), a Raspberry Pi camera module, and a\n   monitor hooked up. Have Raspberry Pi OS and the desktop installed.\n2. Update the Raspberry Pi. This step may be required if your camera feed is\n   all messed up. (I wasted an entire day on this.)\n3. Use Python 3.11; at the time of writing, some of the required packages are\n   not compatible with Python 3.12.\n4. Clone this repo.\n5. 
Install [Stockfish](https://stockfishchess.org/) either by downloading it\n   from the website or using `sudo apt install stockfish` on the Pi. (Although\n   the packaged version is several major versions behind, it is better than\n   compiling Stockfish yourself.)\n\nThere is an (untested) [`Makefile`](Makefile) available which performs the\nrest of the installation steps. On the Pi, run `make install-for-pi`; on\nWindows, run `make install-for-windows`. (If you don't have `make` installed,\nyou can run the commands manually.)\n\n### Model training\n\nThe bot requires two models: one for segmenting the board and another for\nclassifying the pieces found on each square. This requires two Roboflow\ndatasets and ML models to be trained.\n\nThe camera should face the board from directly above, centered on the four\ncentral squares. See the board segmentation dataset for the setup I used.\n\n#### Training board segmentation\n\n[![View dataset on Roboflow](https://img.shields.io/badge/-View_dataset_on_Roboflow-gray?logo=roboflow\u0026logoColor=%236706CE\u0026labelColor=white\u0026color=%236706CE)](https://universe.roboflow.com/unsignedarduino-9db8i/chessbot-boards)\n[![View dataset on Kaggle](https://img.shields.io/badge/-View_dataset_on_Kaggle-gray?logo=kaggle\u0026logoColor=%2320BEFF\u0026labelColor=gray\u0026color=blue)](https://www.kaggle.com/datasets/unsignedarduino/chessbot-boards)\n[![Train on Colab](https://img.shields.io/badge/-Train_on_Colab-gray?logo=googlecolab\u0026logoColor=%23F9AB00\u0026labelColor=gray\u0026color=blue)](https://colab.research.google.com/github/UnsignedArduino/Chessbot/blob/main/src/train/train_board_segmentation.ipynb)\n\nAfter configuring your dataset on Roboflow, use\n[`gather_board_images.py`](src/train/gather_board_images.py) to gather board\nimages, which get uploaded to Roboflow automatically. 
Afterward, create a\ndataset version\nand train with\n[`train_board_segmentation.ipynb`](src/train/train_board_segmentation.ipynb).\nDownload `best.pt` to the [`src/models`](src/models)\ndirectory and rename it to\n[`board_segmentation_best.pt`](src/models/board_segmentation_best.pt).\n\nAfterward, run\n[`export_board_segmentation.py`](src/train/export_board_segmentation.py). This\nwill export the model to NCNN format. To preview, run\n[`test_board_segmentation.py`](src/train/test_board_segmentation.py).\n\n#### Training pieces classification\n\n[![View dataset on Roboflow](https://img.shields.io/badge/-View_dataset_on_Roboflow-gray?logo=roboflow\u0026logoColor=%236706CE\u0026labelColor=white\u0026color=%236706CE)](https://universe.roboflow.com/unsignedarduino-9db8i/chessbot-pieces-qxp5p)\n[![View dataset on Kaggle](https://img.shields.io/badge/-View_dataset_on_Kaggle-gray?logo=kaggle\u0026logoColor=%2320BEFF\u0026labelColor=gray\u0026color=blue)](https://www.kaggle.com/datasets/unsignedarduino/chessbot-pieces)\n[![Train on Colab](https://img.shields.io/badge/-Train_on_Colab-gray?logo=googlecolab\u0026logoColor=%23F9AB00\u0026labelColor=gray\u0026color=blue)](https://colab.research.google.com/github/UnsignedArduino/Chessbot/blob/main/src/train/train_piece_classification.ipynb)\n\nAfter configuring your dataset on Roboflow, use\n[`gather_piece_images.py`](src/train/gather_piece_images.py) or\n[`gather_piece_images2.py`](src/train/gather_piece_images2.py) to gather piece\nimages. Upload the directory to Roboflow. Afterward, create a dataset version\nand train with\n[`train_piece_classification.ipynb`](src/train/train_piece_classification.ipynb).\nDownload `best.pt` to the [`src/models`](src/models)\ndirectory and rename it to\n[`piece_classification_best.pt`](src/models/piece_classification_best.pt).\n\nAfterward, run\n[`export_piece_classification.py`](src/train/export_piece_classification.py).\nThis will export the model to NCNN format. 
To preview, run\n[`test_piece_classification.py`](src/train/test_piece_classification.py).\n\n## Usage\n\nWIP\n\n```commandline\npython src/main.py --verbose\n```\n\n### Debugging\n\n\u003e Check the [`Makefile`](Makefile) for more `run-*` commands for development.\n\nIf you don't/can't use a camera, use the `--debug-use-image-dir` flag to use a\ndirectory of static images, so you can run it on any computer. (The only\nRaspberry Pi-specific code is the camera access, which is replaced with image\nreading code.)\n\n```commandline\npython src/main.py --debug-use-image-dir \"test/scholars mate game white pov\" --verbose\n```\n\nThe images should be 800x606; see\n[`test/starting pos white pov`](test/starting%20pos%20white%20pov) for an\nexample. The images are ordered by name, so it is best to name them with\nnumbers (e.g. `01.png`, `02.png`, ...). Use \u003ckbd\u003ew\u003c/kbd\u003e or \u003ckbd\u003ed\u003c/kbd\u003e to\nadvance to the next image, and \u003ckbd\u003es\u003c/kbd\u003e or \u003ckbd\u003ea\u003c/kbd\u003e to go to the\nprevious image. Press \u003ckbd\u003eq\u003c/kbd\u003e to quit.\n\nYou can use the `--debug-play-image-dir` flag alongside the\n`--debug-use-image-dir` flag to play the specified directory of images like a\nslideshow, to save some key pressing. Press any key to stop\nthe \"slideshow\".\n\nYou can capture your own images on the Raspberry Pi with a Pi camera using\n[`test_camera.py`](src/train/test_camera.py) and transfer them to your\ncomputer:\n\n```commandline\npython src/train/test_camera.py -d \"/home/pi/Chessbot/test/new test\"\n```\n\nPress \u003ckbd\u003ec\u003c/kbd\u003e to save a numbered image to the directory specified with\n`-d`. 
(It will start with `001.jpg`, then `002.jpg`, etc.)\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Funsignedarduino%2Fchessbot","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Funsignedarduino%2Fchessbot","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Funsignedarduino%2Fchessbot/lists"}