Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/furk4neg3/ibm_hf-model-loading-inference
A practical guide on loading and performing inference with Hugging Face models. Demonstrates both manual and pipeline-based approaches for text classification and generation, completed as part of IBM’s Generative AI Engineering with LLMs Specialization course.
- Host: GitHub
- URL: https://github.com/furk4neg3/ibm_hf-model-loading-inference
- Owner: furk4neg3
- Created: 2024-11-10T14:38:38.000Z (about 2 months ago)
- Default Branch: main
- Last Pushed: 2024-11-10T14:41:50.000Z (about 2 months ago)
- Last Synced: 2024-11-10T15:32:31.548Z (about 2 months ago)
- Language: Jupyter Notebook
- Size: 0 Bytes
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Loading Models and Inference with Hugging Face
This project explores how to load and perform inference with pretrained models using the Hugging Face `transformers` library. Using models like DistilBERT and GPT-2, we carry out text classification and text generation tasks. This project also highlights the simplicity of the Hugging Face `pipeline()` function for efficient NLP model inference.
## Overview
This lab project demonstrates:
- Loading and tokenizing input text for model inference
- Performing text classification and text generation using Hugging Face models
- Comparing traditional inference with the Hugging Face `pipeline()` function for streamlined deployment

## Table of Contents
1. [Introduction to Model Loading](#introduction-to-model-loading)
2. [Text Classification with DistilBERT](#text-classification-with-distilbert)
3. [Text Generation with GPT-2](#text-generation-with-gpt-2)
4. [Using the Hugging Face Pipeline](#using-the-hugging-face-pipeline)
5. [Requirements](#requirements)
6. [References](#references)

## Introduction to Model Loading
In this section, we explore loading pretrained models and tokenizing text inputs for NLP tasks. This approach enables us to use models like DistilBERT and GPT-2 without relying solely on the pipeline method.
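The snippet below is a minimal sketch of that manual loading step. It assumes a DistilBERT checkpoint from the Hugging Face Hub; the exact checkpoint name is illustrative and may differ from the one used in the notebook.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Illustrative checkpoint name; the notebook may use a different one.
checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# Tokenize a sample sentence into PyTorch tensors ready for the model.
inputs = tokenizer("Hugging Face makes model loading straightforward.", return_tensors="pt")
print(inputs["input_ids"].shape)  # (batch_size, sequence_length)
```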
## Text Classification with DistilBERT
Using DistilBERT, we classify text into categories. This part demonstrates setting up the model, tokenizing inputs, and processing classification outputs.
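A minimal sketch of this flow is shown below, assuming the sentiment-analysis checkpoint `distilbert-base-uncased-finetuned-sst-2-english`; the checkpoint and example text are illustrative, not necessarily those from the lab.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Illustrative sentiment-analysis checkpoint.
checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

inputs = tokenizer("This tutorial was genuinely helpful.", return_tensors="pt")

with torch.no_grad():                  # inference only, no gradients needed
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1)  # turn raw logits into probabilities
label_id = int(probs.argmax(dim=-1))
print(model.config.id2label[label_id], round(probs.max().item(), 4))
```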
## Text Generation with GPT-2
In this section, we use the GPT-2 model to generate text. We cover input preparation, running inference, and processing generated sequences.
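The following sketch shows one way this can look with the base `gpt2` checkpoint; the sampling settings are illustrative defaults rather than the exact values from the notebook.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Generative AI engineering is"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation; these generation settings are illustrative.
output_ids = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```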
## Using the Hugging Face Pipeline
The `pipeline()` function in Hugging Face simplifies model loading and inference by providing an easy-to-use API for common NLP tasks like classification and generation.
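As a rough illustration (when no model is specified, the library selects a default checkpoint, which may differ from the one used in the lab):

```python
from transformers import pipeline

# Classification: the library picks a default checkpoint when none is given.
classifier = pipeline("sentiment-analysis")
print(classifier("The pipeline API is pleasantly concise."))

# Generation with an explicitly chosen GPT-2 checkpoint.
generator = pipeline("text-generation", model="gpt2")
print(generator("Generative AI engineering is", max_new_tokens=30)[0]["generated_text"])
```

Both calls replace the manual tokenize, forward pass, and decode steps from the previous sections with a single function call.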
## Requirements
- Python 3.7+
- Hugging Face Transformers Library
- PyTorch

## References
- [IBM AI Engineering Professional Certificate](https://www.coursera.org/professional-certificates/ai-engineer?)
- [Generative AI Engineering with LLMs Specialization](https://www.coursera.org/specializations/generative-ai-engineering-with-llms)
- [Generative AI Engineering and Fine-Tuning Transformers](https://www.coursera.org/learn/generative-ai-engineering-and-fine-tuning-transformers?specialization=generative-ai-engineering-with-llms)