{"id":13772502,"url":"https://github.com/SouthEugeneRoboticsTeam/vision","last_synced_at":"2025-05-11T04:31:31.103Z","repository":{"id":43790176,"uuid":"72891331","full_name":"SouthEugeneRoboticsTeam/vision","owner":"SouthEugeneRoboticsTeam","description":"👀 Powerful vision code for FRC","archived":false,"fork":false,"pushed_at":"2023-07-06T21:32:25.000Z","size":2763,"stargazers_count":25,"open_issues_count":16,"forks_count":1,"subscribers_count":7,"default_branch":"master","last_synced_at":"2024-10-29T22:46:34.461Z","etag":null,"topics":["camera","frc","hacktoberfest","hacktoberfest2021","opencv","python","vision"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/SouthEugeneRoboticsTeam.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":"support/99-webcam.rules","governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2016-11-04T22:54:15.000Z","updated_at":"2024-04-15T20:12:22.000Z","dependencies_parsed_at":"2024-08-03T17:16:05.272Z","dependency_job_id":null,"html_url":"https://github.com/SouthEugeneRoboticsTeam/vision","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SouthEugeneRoboticsTeam%2Fvision","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SouthEugeneRoboticsTeam%2Fvision/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SouthEugeneRoboticsTeam%2Fvision/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SouthEugen
eRoboticsTeam%2Fvision/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/SouthEugeneRoboticsTeam","download_url":"https://codeload.github.com/SouthEugeneRoboticsTeam/vision/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":253518941,"owners_count":21921074,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["camera","frc","hacktoberfest","hacktoberfest2021","opencv","python","vision"],"created_at":"2024-08-03T17:01:04.705Z","updated_at":"2025-05-11T04:31:26.094Z","avatar_url":"https://github.com/SouthEugeneRoboticsTeam.png","language":"Python","readme":"\u003cp align=\"center\"\u003e\n    \u003ca href=\"#\"\u003e\u003cimg alt=\"Logo\" src=\"./images/logo.png\" width=\"67%\" /\u003e\u003c/a\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n    \u003ca href=\"https://sert2521.org\"\u003e\u003cimg alt=\"team 2521\" src=\"https://img.shields.io/badge/team-2521-7d26cd.svg?style=flat-square\" /\u003e\u003c/a\u003e\n    \u003ca href=\"https://www.python.org/downloads\"\u003e\u003cimg alt=\"python 3.6\" src=\"https://img.shields.io/badge/python-3.6-blue.svg?style=flat-square\" /\u003e\u003c/a\u003e\n    \u003ca href=\"https://github.com/SouthEugeneRoboticsTeam/vision/blob/master/LICENSE\"\u003e\u003cimg alt=\"license\" src=\"https://img.shields.io/github/license/SouthEugeneRoboticsTeam/vision.svg?style=flat-square\" /\u003e\u003c/a\u003e\n    \u003ca href=\"https://travis-ci.org/SouthEugeneRoboticsTeam/vision\"\u003e\u003cimg alt=\"build status\" src=\"https://img.shields.io/travis/SouthEugeneRoboticsTeam/vision/master.svg?style=flat-square\" /\u003e\u003c/a\u003e\n\u003c/p\u003e\n\n## Usage\n\nBefore starting the program, you must install OpenCV \u003e= v3.1.x along with\nthe pip dependencies:\n\n```bash\n$ pip install -r requirements.txt\n```\n\nThen, you can run the program using:\n\n```bash\n$ python run.py\n```\n\nFull usage:\n\n```text\nusage: run.py [-h] [-i IMAGE] [-s SOURCE] [-t TEAM] [-d] [-na MIN_AREA]\n              [-xa MAX_AREA] [-nf MIN_FULL] [-xf MAX_FULL]\n              [-l LOWER_COLOR [LOWER_COLOR ...]]\n              [-u UPPER_COLOR [UPPER_COLOR ...]] [-tn] [-v]\n```\n\n## Usage with Docker\n\nIf you are using Docker, the installation process is much easier.\n\nNote: due to the size of the images pulled during the build process, it is recommended to have a few GB of free disk space available when building.\n\nFirst, `cd` to this repository and then run `docker build . -t vision`. All existing config files will be used in the build process to make the image.\n\nThis will produce a Docker image which can be run using:\n\n```bash\n$ docker run --device /dev/video0 vision\n```\n\nMake sure that you have a camera connected before running this!\n\n## Connection Status GUI\n\nSERT's vision software comes with a connection status GUI to help debug\nconnection issues. This GUI can be displayed on a monitor connected to\nthe co-processor. We use an [Adafruit 5\" screen](https://www.adafruit.com/product/2260).\n\n- `RADIO` checks connection to the robot's radio (at `10.XX.XX.1`)\n- `ROBOT` checks connection to the roboRIO (at `10.XX.XX.2`)\n- `NTABL` checks connection to the NetworkTables server on the roboRIO\n\n| Good    | Warning | Bad     |\n|---------|---------|---------|\n| Connection is good and system is ready. | Some connections are down. | No connection. Ensure Ethernet is plugged in. 
|\n|![good](./images/vision_good.png)|![warning](./images/vision_warn.png)|![bad](./images/vision_bad.png)|\n\n![installed GUI](./images/installed_gui.jpg)\n\n## Configuration\n\n### Command-Line Options\n\nAll command-line arguments may be configured in the `config.ini` file\n(located at `config/config.ini`). For example, the `--lower-color`\nargument may be set using the `lower-color` line in `config.ini`.\n\n```text\noptional arguments:\n  -h, --help            show this help message and exit\n  -i IMAGE, --image IMAGE\n                        path to image\n  -s SOURCE, --source SOURCE\n                        video source (default=0)\n  -t TEAM, --team TEAM  the team of the target roboRIO\n  -d, --display         display results of processing in a new window\n  -na MIN_AREA, --min-area MIN_AREA\n                        minimum area for blobs\n  -xa MAX_AREA, --max-area MAX_AREA\n                        maximum area for blobs\n  -nf MIN_FULL, --min-full MIN_FULL\n                        minimum fullness of blobs\n  -xf MAX_FULL, --max-full MAX_FULL\n                        maximum fullness of blobs\n  -l LOWER_COLOR [LOWER_COLOR ...], --lower-color LOWER_COLOR [LOWER_COLOR ...]\n                        lower color threshold in HSV\n  -u UPPER_COLOR [UPPER_COLOR ...], --upper-color UPPER_COLOR [UPPER_COLOR ...]\n                        upper color threshold in HSV\n  -tn, --tuning         open in tuning mode\n  -v, --verbose         for debugging, prints useful values\n```\n\n### Camera\n\nFor use with the Microsoft Lifecam 3000, the camera's exposure should be\nset manually because the Lifecam will auto-adjust otherwise, making\nthresholding difficult. This can be done with `v4l2-ctl` from v4l-utils:\n\n```bash\n$ sudo apt-get install v4l-utils\n$ v4l2-ctl -d /dev/video0 -c exposure_auto=1 # 1=manual, 3=auto\n$ v4l2-ctl -d /dev/video0 -c exposure_absolute=50\n```\n","funding_links":[],"categories":["Vision"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FSouthEugeneRoboticsTeam%2Fvision","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FSouthEugeneRoboticsTeam%2Fvision","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FSouthEugeneRoboticsTeam%2Fvision/lists"}