MangaLineExtraction in transformers library format
- Host: GitHub
- URL: https://github.com/p1atdev/mangalineextraction-hf
- Owner: p1atdev
- License: apache-2.0
- Created: 2024-02-21T04:43:09.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2024-02-21T11:35:12.000Z (11 months ago)
- Last Synced: 2024-11-22T00:47:59.888Z (about 2 months ago)
- Language: Jupyter Notebook
- Homepage: https://huggingface.co/p1atdev/MangaLineExtraction-hf
- Size: 2.06 MB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# MangaLineExtraction-hf
A huggingface transformers compatible implementation of MangaLineExtraction (https://github.com/ljsabc/MangaLineExtraction_PyTorch).
The converted weights are available on [🤗 HuggingFace](https://huggingface.co/p1atdev/MangaLineExtraction-hf).
## Example usage with transformers
```py
from PIL import Image
import torch
from transformers import AutoModel, AutoImageProcessor

REPO_NAME = "p1atdev/MangaLineExtraction-hf"

# Load the model and its image processor from the Hub.
model = AutoModel.from_pretrained(REPO_NAME, trust_remote_code=True)
processor = AutoImageProcessor.from_pretrained(REPO_NAME, trust_remote_code=True)

image = Image.open("./sample.jpg")
inputs = processor(image, return_tensors="pt")

with torch.no_grad():
    outputs = model(inputs.pixel_values)

# Convert the output tensor to a grayscale line image and save it.
line_image = Image.fromarray(outputs.pixel_values[0].numpy().astype("uint8"), mode="L")
line_image.save("./line_image.png")
```

## Acknowledgements
We extend our gratitude to the authors of [Deep Extraction of Manga Structural Lines](https://www.cse.cuhk.edu.hk/~ttwong/papers/linelearn/linelearn.html) and the contributors to [MangaLineExtraction_PyTorch](https://github.com/ljsabc/MangaLineExtraction_PyTorch) for the pioneering work that served as the foundation for this adaptation. Our thanks also go to HuggingFace for developing [transformers](https://github.com/huggingface/transformers), which enabled us to take this project further.
For detailed information, please refer to:
- https://www.cse.cuhk.edu.hk/~ttwong/papers/linelearn/linelearn.html
- https://github.com/huggingface/transformers
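
A note on the conversion step in the usage example: `astype("uint8")` wraps or truncates floats outside [0, 255], so if the model's raw output ever falls outside that range, clamping first avoids wrap-around artifacts. A minimal sketch using a dummy NumPy array in place of `outputs.pixel_values[0]` (the values are illustrative, not real model output):

```py
import numpy as np

# Dummy stand-in for outputs.pixel_values[0]; real output requires the model.
pixels = np.array([[-5.0, 0.0], [128.7, 300.0]])

# Clamp to the valid 8-bit range before casting, so out-of-range values
# become pure black (0) or pure white (255) instead of wrapping around.
clamped = np.clip(pixels, 0, 255).astype("uint8")
print(clamped.tolist())  # [[0, 0], [128, 255]]
```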