Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/daniel-lima-lopez/road-identification-via-semantic-segmentation
Example of the use of the ENet model trained on the Cityscapes dataset, applied to the identification of the region of pixels belonging to the road and its edge.
neura opencv python semantic-segmentation
Last synced: 15 days ago
- Host: GitHub
- URL: https://github.com/daniel-lima-lopez/road-identification-via-semantic-segmentation
- Owner: daniel-lima-lopez
- Created: 2024-09-10T19:50:36.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2024-09-11T02:21:29.000Z (4 months ago)
- Last Synced: 2024-10-31T12:46:39.584Z (2 months ago)
- Topics: neura, opencv, python, semantic-segmentation
- Language: Jupyter Notebook
- Homepage:
- Size: 4.33 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# Road-identification-via-Semantic-Segmentation
This repository presents an example of the use of the ENet model trained on the Cityscapes dataset. The example shows how to identify the region of pixels belonging to the road and its edge.

## Installation
Clone this repository:
```bash
git clone [email protected]:daniel-lima-lopez/Road-identification-via-Semantic-Segmentation.git
```

Move to the installation folder:
```bash
cd Road-identification-via-Semantic-Segmentation
```

## Example
This example is applied to the following test image. The pixel region belonging to the road is characterized by the ENet model prediction; its edge is then identified from the gradient information of the image:
Finally, this information is highlighted in the original image:
This example can be run in the notebook [example.ipynb](example.ipynb).