https://github.com/datitran/raccoon_dataset
The dataset is used to train my own raccoon detector, and I blogged about it on Medium
- Host: GitHub
- URL: https://github.com/datitran/raccoon_dataset
- Owner: datitran
- License: MIT
- Created: 2017-07-27T19:04:10.000Z (almost 8 years ago)
- Default Branch: master
- Last Pushed: 2021-10-12T12:49:25.000Z (over 3 years ago)
- Last Synced: 2025-04-01T05:32:26.667Z (2 months ago)
- Topics: dataset, tensorflow-experiments
- Language: Jupyter Notebook
- Homepage: https://medium.com/towards-data-science/how-to-train-your-own-object-detector-with-tensorflows-object-detector-api-bec72ecfe1d9
- Size: 46.9 MB
- Stars: 1,272
- Watchers: 32
- Forks: 971
- Open Issues: 84
Metadata Files:
- Readme: README.md
- License: LICENSE
# Raccoon Detector Dataset
This is a dataset that I collected to train my own Raccoon detector with [TensorFlow's Object Detection API](https://github.com/tensorflow/models/tree/master/research/object_detection). Images are from Google and Pixabay. In total, there are 200 images (160 are used for training and 40 for validation).
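The 160/40 split is a plain 80/20 random split of the 200 images. A minimal sketch of how such a split can be reproduced (the file names below are made up for illustration; the repository's own split lives in the "split labels" notebook):

```python
import random

def split_dataset(filenames, train_fraction=0.8, seed=1):
    """Shuffle a list of image file names and split it into train/validation sets."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = list(filenames)
    rng.shuffle(shuffled)
    cutoff = int(len(shuffled) * train_fraction)
    return shuffled[:cutoff], shuffled[cutoff:]

# 200 images split 80/20 gives the 160/40 split used here
images = [f"raccoon-{i}.jpg" for i in range(1, 201)]
train, val = split_dataset(images)
```

Fixing the random seed keeps the split stable across runs, which matters when the train and validation CSVs are regenerated.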
## Getting Started
### Folder Structure
```
+ annotations: contains the xml files in PASCAL VOC format
+ data: contains the input file for the TF object detection API and the label files (csv)
+ images: contains the image data in jpg format
+ training: contains the pipeline configuration file, frozen model and labelmap
- a few handy scripts: generate_tfrecord.py is used to generate the input files
for the TF API and xml_to_csv.py is used to convert the xml files into one csv
- a few jupyter notebooks: draw boxes is used to plot some of the data and
split labels is used to split the full labels into train and test labels
```

## Copyright
See [LICENSE](LICENSE) for details.
Copyright (c) 2017 [Dat Tran](http://www.dat-tran.com/).
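
For reference, the XML-to-CSV step that xml_to_csv.py performs can be sketched as follows. This is a minimal illustration, not the script itself: the real script walks the annotations folder, and the sample annotation and helper names below are made up. It assumes the standard PASCAL VOC tag names and one CSV row per bounding box:

```python
import csv
import io
import xml.etree.ElementTree as ET

COLUMNS = ["filename", "width", "height", "class", "xmin", "ymin", "xmax", "ymax"]

def voc_xml_to_rows(xml_string):
    """Extract one row per bounding box from a PASCAL VOC annotation."""
    root = ET.fromstring(xml_string)
    filename = root.findtext("filename")
    width = int(root.findtext("size/width"))
    height = int(root.findtext("size/height"))
    rows = []
    for obj in root.iter("object"):
        box = obj.find("bndbox")
        rows.append({
            "filename": filename,
            "width": width,
            "height": height,
            "class": obj.findtext("name"),
            "xmin": int(box.findtext("xmin")),
            "ymin": int(box.findtext("ymin")),
            "xmax": int(box.findtext("xmax")),
            "ymax": int(box.findtext("ymax")),
        })
    return rows

# A made-up annotation in PASCAL VOC format, for illustration only
SAMPLE = """<annotation>
  <filename>raccoon-1.jpg</filename>
  <size><width>650</width><height>417</height></size>
  <object>
    <name>raccoon</name>
    <bndbox><xmin>81</xmin><ymin>88</ymin><xmax>522</xmax><ymax>408</ymax></bndbox>
  </object>
</annotation>"""

# Write the extracted rows as CSV (here to an in-memory buffer)
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=COLUMNS)
writer.writeheader()
writer.writerows(voc_xml_to_rows(SAMPLE))
```

The resulting CSV is what the label files in the data folder look like, and it is the input that generate_tfrecord.py consumes to build the TFRecord files for the TF Object Detection API.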