{"id":16494727,"url":"https://github.com/devrimcavusoglu/pybboxes","last_synced_at":"2025-04-08T01:39:10.803Z","repository":{"id":37043168,"uuid":"487397729","full_name":"devrimcavusoglu/pybboxes","owner":"devrimcavusoglu","description":"Light weight toolkit for bounding boxes providing conversion between bounding box types and simple computations.","archived":false,"fork":false,"pushed_at":"2024-10-07T07:24:00.000Z","size":117,"stargazers_count":149,"open_issues_count":0,"forks_count":14,"subscribers_count":3,"default_branch":"main","last_synced_at":"2025-04-01T00:37:48.466Z","etag":null,"topics":["computer-vision","deep-learning","image-recognition","machine-learning","numpy","python","pytorch","tensorflow"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/devrimcavusoglu.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2022-04-30T22:51:41.000Z","updated_at":"2025-02-01T18:58:55.000Z","dependencies_parsed_at":"2024-06-18T17:11:16.589Z","dependency_job_id":"7a207372-4049-40b6-8339-1c4bfce12ee7","html_url":"https://github.com/devrimcavusoglu/pybboxes","commit_stats":{"total_commits":53,"total_committers":6,"mean_commits":8.833333333333334,"dds":"0.39622641509433965","last_synced_commit":"2cd7b1b6d72e5ffcaf4d73e10d8e5d6bb98488b5"},"previous_names":[],"tags_count":10,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/devrimcavusoglu%2Fpybboxes","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/reposito
ries/devrimcavusoglu%2Fpybboxes/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/devrimcavusoglu%2Fpybboxes/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/devrimcavusoglu%2Fpybboxes/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/devrimcavusoglu","download_url":"https://codeload.github.com/devrimcavusoglu/pybboxes/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247761050,"owners_count":20991532,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["computer-vision","deep-learning","image-recognition","machine-learning","numpy","python","pytorch","tensorflow"],"created_at":"2024-10-11T14:15:22.963Z","updated_at":"2025-04-08T01:39:10.786Z","avatar_url":"https://github.com/devrimcavusoglu.png","language":"Python","funding_links":[],"categories":[],"sub_categories":[],"readme":"\u003ch1 align=\"center\"\u003ePyBboxes\u003c/h1\u003e\n\u003cp align=\"center\"\u003e\n\u003ca href=\"https://pypi.org/project/pybboxes\"\u003e\u003cimg src=\"https://img.shields.io/pypi/v/pybboxes?color=blue\" alt=\"Python versions\"\u003e\u003c/a\u003e\n\u003ca href=\"https://pepy.tech/project/pybboxes\"\u003e\u003cimg src=\"https://pepy.tech/badge/pybboxes\" alt=\"Total downloads\"\u003e\u003c/a\u003e\n\u003ca href=\"https://pypi.org/project/pybboxes\"\u003e\u003cimg src=\"https://img.shields.io/pypi/dm/pybboxes?color=blue\" alt=\"Monthly downloads\"\u003e\u003c/a\u003e\n\u003cbr\u003e\n\u003ca 
href=\"https://pypi.org/project/pybboxes\"\u003e\u003cimg src=\"https://img.shields.io/pypi/pyversions/pybboxes\" alt=\"Python versions\"\u003e\u003c/a\u003e\n\u003ca href=\"https://github.com/devrimcavusoglu/pybboxes/actions/workflows/ci.yml\"\u003e\u003cimg src=\"https://github.com/devrimcavusoglu/pybboxes/actions/workflows/ci.yml/badge.svg\" alt=\"Build\"\u003e\u003c/a\u003e\n\u003ca href=\"https://github.com/devrimcavusoglu/pybboxes/blob/main/LICENSE\"\u003e\u003cimg src=\"https://img.shields.io/github/license/devrimcavusoglu/pybboxes\" alt=\"License\"\u003e\u003c/a\u003e\n\u003c/p\u003e\n\nLightweight toolkit for bounding boxes, providing conversion between bounding box types and simple computations. Supported bounding box types (\u003cins\u003eitalicized text indicates normalized values\u003c/ins\u003e):\n\n- **albumentations** : [Albumentations Format](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#albumentations)\n  - **_[x-tl, y-tl, x-br, y-br]_** (Normalized VOC Format) Top-left coordinates \u0026 Bottom-right coordinates\n- **coco** : [COCO (Common Objects in Context)](http://cocodataset.org/)\n  - **[x-tl, y-tl, w, h]** Top-left corner \u0026 width \u0026 height\n- **fiftyone** : [FiftyOne](https://github.com/voxel51/fiftyone)\n  - **_[x-tl, y-tl, w, h]_** (Normalized COCO Format) Top-left coordinates \u0026 width \u0026 height\n- **voc** : [Pascal VOC](http://host.robots.ox.ac.uk/pascal/VOC/)\n  - **[x-tl, y-tl, x-br, y-br]** Top-left coordinates \u0026 Bottom-right coordinates\n- **yolo** : [YOLO](https://github.com/ultralytics/yolov5)\n  - **_[x-c, y-c, w, h]_** Center coordinates \u0026 width \u0026 height\n\n**Glossary**\n\n- **tl:** top-left\n- **br:** bottom-right\n- **h:** height\n- **w:** width\n- **c:** center\n\n\n## News 🔥\n\n* 2024/10/07 - [Annotations](#annotation-file-conversion) are supported for YOLO, COCO and VOC formats.\n\n## Roadmap 🛣️\n\n- [X] Annotation file support.\n- [ ] (Upcoming) 3D 
Bounding Box support.\n- [ ] (Upcoming) Polygon support.\n\n### Important Notice\nSupport for Python\u003c3.8 will be dropped starting with version `0.2`, though development for Python 3.6 and Python 3.7 may \ncontinue under the `0.1.x` versions. However, this may introduce certain \ndiscrepancies and/or unsupported operations in `0.1.x`. To fully utilize and benefit from the entire \npackage, we recommend using Python 3.8 at minimum (`Python\u003e=3.8`).\n\n## Installation\n\nThrough pip (recommended),\n\n    pip install pybboxes\n\nor build from source,\n\n    git clone https://github.com/devrimcavusoglu/pybboxes.git\n    cd pybboxes\n    python setup.py install\n\n## Bounding Boxes\n\nYou can create a bounding box as easily as\n\n```python\nfrom pybboxes import BoundingBox\n\nmy_coco_box = [98, 345, 322, 117]\ncoco_bbox = BoundingBox.from_coco(*my_coco_box)  # \u003c[98 345 322 117] (322x117) | Image: (?x?)\u003e\n# or alternatively\n# coco_bbox = BoundingBox.from_array(my_coco_box)\n```\n\n### Out of Bounds Boxes\nPybboxes supports OOB boxes: there is a keyword `strict` in both the Box classes (construction) and the functional \nmodules. When `strict=True`, out-of-bounds boxes cannot be constructed and an exception is raised, while \nwhen `strict=False`, out-of-bounds boxes can be constructed and used. Also, there is a property `is_oob` \nthat indicates whether a particular bounding box is OOB or not. \n\n**Important** Note that if the return value of `is_oob` is `None`, the OOB status is unknown \n(e.g. the image size is required to determine it, but is not given). 
Thus, the values `None` and `False` convey different information.\n\n```python\nfrom pybboxes import BoundingBox\n\nimage_size = (640, 480)\nmy_coco_box = [98, 345, 580, 245]  # OOB box for 640x480\ncoco_bbox = BoundingBox.from_coco(*my_coco_box, image_size=image_size)  # Exception\n# ValueError: Given bounding box values is out of bounds. To silently skip out of bounds cases pass 'strict=False'.\n\ncoco_bbox = BoundingBox.from_coco(*my_coco_box, image_size=image_size, strict=False)  # No Exception\ncoco_bbox.is_oob  # True\n```\n\nIf you want to allow OOB boxes but still check their OOB status, use `strict=False` and query `is_oob` where needed.\n\n### Conversion\n\nWith the `BoundingBox` class, conversion is as easy as one method call.\n\n```python\nfrom pybboxes import BoundingBox\n\nmy_coco_box = [98, 345, 322, 117]\ncoco_bbox = BoundingBox.from_coco(*my_coco_box)  # \u003c[98 345 322 117] (322x117) | Image: (?x?)\u003e\nvoc_bbox = coco_bbox.to_voc()  # \u003c[98 345 420 462] (322x117) | Image: (?x?)\u003e\nvoc_bbox_values = coco_bbox.to_voc(return_values=True)  # (98, 345, 420, 462)\n```\n\nHowever, if you try to convert between two bounding box types that require scaling/normalization, it will raise an error\n\n```python\nfrom pybboxes import BoundingBox\n\nmy_coco_box = [98, 345, 322, 117]\ncoco_bbox = BoundingBox.from_coco(*my_coco_box)  # \u003c[98 345 322 117] (322x117) | Image: (?x?)\u003e\n# yolo_bbox = coco_bbox.to_yolo()  # this will raise an exception\n\n# You need to set image_size for coco_bbox and then you're good to go\ncoco_bbox.image_size = (640, 480)\nyolo_bbox = coco_bbox.to_yolo()  # \u003c[0.4047 0.8406 0.5031 0.2437] (322x117) | Image: (640x480)\u003e\n```\n\nThe image size associated with the bounding box can be given at instantiation or while using classmethods, e.g. \n`from_coco()`.\n\n```python\nfrom pybboxes import BoundingBox\n\nmy_coco_box = [98, 345, 322, 117]\ncoco_bbox = BoundingBox.from_coco(*my_coco_box, image_size=(640, 480))  # 
\u003c[98 345 322 117] (322x117) | Image: (640x480)\u003e\n# no longer raises exception\nyolo_bbox = coco_bbox.to_yolo()  # \u003c[0.4047 0.8406 0.5031 0.2437] (322x117) | Image: (640x480)\u003e \n```\n\n### Box operations\n\nBox operations are available as of `v0.1.0`.\n\n```python\nfrom pybboxes import BoundingBox\n\nmy_coco_box = [98, 345, 322, 117]\nmy_coco_box2 = [90, 350, 310, 122]\ncoco_bbox = BoundingBox.from_coco(*my_coco_box, image_size=(640, 480))\ncoco_bbox2 = BoundingBox.from_coco(*my_coco_box2, image_size=(640, 480))\n\niou = coco_bbox.iou(coco_bbox2)  # 0.8117110631149508\narea_union = coco_bbox + coco_bbox2  # 41670 | alternative way: coco_bbox.union(coco_bbox2)\ntotal_area = coco_bbox.area + coco_bbox2.area  # 75494  (not union)\nintersection_area = coco_bbox * coco_bbox2  # 33824 | alternative way: coco_bbox.intersection(coco_bbox2)\nfirst_bbox_diff = coco_bbox - coco_bbox2  # 3850\nsecond_bbox_diff = coco_bbox2 - coco_bbox  # 3996\nbbox_ratio = coco_bbox / coco_bbox2  # 0.9961396086726599 (not IOU)\n```\n\n## Functional\n\n**Note**: functional computations are moved under `pybboxes.functional` starting with version `0.1.0`. The only \nexception is `convert_bbox()`, which can still be used by importing `pybboxes` only (for backward compatibility).\n\n### Conversion\nYou can convert from any bounding box type to another.\n\n```python\nimport pybboxes as pbx\n\ncoco_bbox = (1,2,3,4)  # COCO Format bbox as (x-tl,y-tl,w,h)\nvoc_bbox = (1,2,3,4)  # Pascal VOC Format bbox as (x-tl,y-tl,x-br,y-br)\npbx.convert_bbox(coco_bbox, from_type=\"coco\", to_type=\"voc\")  # (1, 2, 4, 6)\npbx.convert_bbox(voc_bbox, from_type=\"voc\", to_type=\"coco\")  # (1, 2, 2, 2)\n```\n\nSome formats require image width and height information for scaling, e.g. 
YOLO bbox (resulting coordinates \nare rounded to 2 decimals to ease reading).\n\n```python\nimport pybboxes as pbx\n\nvoc_bbox = (1,2,3,4)  # Pascal VOC Format bbox as (x-tl,y-tl,x-br,y-br)\npbx.convert_bbox(voc_bbox, from_type=\"voc\", to_type=\"yolo\", image_size=(28, 28))  # (0.07, 0.11, 0.07, 0.07)\n```\n\n### Computation\nYou can also make computations on supported bounding box formats.\n\n```python\nimport pybboxes.functional as pbf\n\ncoco_bbox = (1,2,3,4)  # COCO Format bbox as (x-tl,y-tl,w,h)\nvoc_bbox = (1,2,3,4)  # Pascal VOC Format bbox as (x-tl,y-tl,x-br,y-br)\npbf.compute_area(coco_bbox, bbox_type=\"coco\")  # 12\npbf.compute_area(voc_bbox, bbox_type=\"voc\")  # 4\n```\n\n## Annotation file conversion\n`pybboxes` now supports the conversion of annotation file(s) across different annotation formats (YOLO, VOC and COCO are currently supported).\n\nThis is a three-step process.\n\n### 1. Instantiate the Annotations class\n```python\nfrom pybboxes.annotations import Annotations\n\nanns = Annotations(annotation_type='yolo')\n```\n\n**Important** You have to explicitly declare `annotation_type` beforehand. After declaration, *you will only be able to load annotations in the declared format*, but you will be able to export to other annotation formats.\n\n### 2. Load the annotations file\nAfter you have instantiated the `Annotations` class declaring `annotation_type`, you can now load the annotations using the appropriate method of the `Annotations` class. 
\n\n#### 2.1 Load from yolo\n```python\nfrom pybboxes.annotations import Annotations\n\nanns = Annotations(annotation_type='yolo')\n\nanns.load_from_yolo(labels_dir='./labels', images_dir='./images', classes_file='./classes.txt')\n```\n\nAs YOLO normalizes the bounding box metadata, the path to the corresponding images directory must be provided (via `images_dir`) so that the physical dimensions of the images can be inferred.\n\nAlso, the path to a `classes_file` (usually classes.txt) listing all the class labels used for the annotations should be provided. Without this, `pybboxes` will fail to assign appropriate class labels when converting across different annotation formats.\n\n#### 2.2 Load from voc\n```python\nfrom pybboxes.annotations import Annotations\n\nanns = Annotations(annotation_type='voc')\n\nanns.load_from_voc(labels_dir='./labels')\n```\n\n#### 2.3 Load from coco\n```python\nfrom pybboxes.annotations import Annotations\n\nanns = Annotations(annotation_type='coco')\n\nanns.load_from_coco(json_path='./validation.json')\n```\n\n### 3. Saving annotations to different formats\n#### 3.1 Saving annotations to yolo format\nAs every image has its own corresponding annotation file in YOLO format, you have to provide the path to `export_dir` where all the annotation files will be written. \n\n```python\nfrom pybboxes.annotations import Annotations\n\nanns = Annotations(annotation_type='coco') # just for demonstration purposes\n\nanns.load_from_coco(json_path='./validation.json') # we could have loaded the annotation data from another format as well\n\nanns.save_as_yolo(export_dir='./labels')\n```\nThis will create all the required annotation files (in YOLO format) in the given directory. Additionally, it will create `classes.txt` in the same folder, listing all the class labels used for the annotations.\n\n#### 3.2 Saving annotations to voc format\nJust like the YOLO format, in VOC format every image has its own corresponding annotation file. 
So, you have to provide the path to `export_dir` where all the annotation files will be written.\n\n```python\nfrom pybboxes.annotations import Annotations\n\nanns = Annotations(annotation_type='coco') # just for demonstration purposes\n\nanns.load_from_coco(json_path='./validation.json') # we could have loaded the annotation data from another format as well\n\nanns.save_as_voc(export_dir='./labels')\n```\n\n#### 3.3 Saving annotations to coco format\nTo export annotations in COCO format, you just have to provide the name (or path) of the output JSON file via `export_file`.\n\n```python\nfrom pybboxes.annotations import Annotations\n\nanns = Annotations(annotation_type='voc') # just for demonstration purposes\n\nanns.load_from_voc(labels_dir='./labels') # we could have loaded the annotation data from another format as well\n\nanns.save_as_coco(export_file='./validation.json')\n```\n\n## Contributing\n\n### Installation\n\nInstall the package as follows, which will set you up for development mode.\n\n```shell\npip install -e .[dev]\n```\n\n### Tests\n\nTo run the tests,\n\n    python -m tests.run_tests\n\n### Code Style\n\nTo check code style,\n\n    python -m tests.run_code_style check\n\nTo format the codebase,\n\n    python -m tests.run_code_style format\n\n## License\n\nLicensed under the [MIT](LICENSE) License.","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdevrimcavusoglu%2Fpybboxes","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdevrimcavusoglu%2Fpybboxes","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdevrimcavusoglu%2Fpybboxes/lists"}