Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Google Gemma Fine tune SQL Generator
https://github.com/theSuriya/Gemma-SQL-Generator
Last synced: 3 months ago
Google Gemma Fine tune SQL Generator
- Host: GitHub
- URL: https://github.com/theSuriya/Gemma-SQL-Generator
- Owner: theSuriya
- License: mit
- Created: 2024-04-21T10:02:36.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2024-07-30T08:27:27.000Z (5 months ago)
- Last Synced: 2024-07-30T11:37:45.098Z (5 months ago)
- Language: Jupyter Notebook
- Homepage: https://huggingface.co/suriya7/Gemma2B-Finetuned-Sql-Generator
- Size: 29.3 KB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Gemma 2B Fine-Tuned SQL Generator
## Introduction
The Gemma 2B SQL Generator is a specialized version of the Gemma 2B model, fine-tuned to generate SQL queries from a given SQL context. The model is intended to help developers and analysts generate accurate SQL queries automatically, improving productivity and reducing the scope for errors.
## Model Details
- **Model Type:** Gemma 2B
- **Fine-Tuning Details:** The model was fine-tuned specifically for generating SQL queries.
- **Training Loss:** The model reached a training loss of 0.3 during fine-tuning.
## Model Link
The model is available on Hugging Face: [Click Here](https://huggingface.co/suriya7/Gemma2B-Finetuned-Sql-Generator)
## Installation
To set up the necessary environment for using the SQL Generator, run the following commands:
```bash
pip install torch
pip install transformers
```
To set up the environment for fine-tuning the SQL Generator, run the following commands:
```bash
!pip install -q -U datasets==2.16.1
!pip install -q -U bitsandbytes==0.42.0
!pip install -q -U peft==0.8.2
!pip install -q -U trl==0.7.10
!pip install -q -U accelerate==0.27.1
!pip install -q -U transformers==4.38.0
```
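The repository does not include the training script itself, but with the packages above a QLoRA-style fine-tuning run can be sketched as follows. This is a minimal sketch, not the author's exact setup: the base model name, the `b-mc2/sql-create-context` dataset, and all hyperparameters are illustrative assumptions.

```python
# Minimal QLoRA fine-tuning sketch (illustrative; not the author's exact script).
import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, TrainingArguments)
from peft import LoraConfig
from trl import SFTTrainer

base_model = "google/gemma-2b"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)

# Load the base model in 4-bit to keep memory usage low (QLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
    base_model, quantization_config=bnb_config, device_map="auto"
)

# LoRA adapters on the attention projections; the base weights stay frozen.
peft_config = LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

# Illustrative text-to-SQL dataset (question + CREATE TABLE context + answer).
dataset = load_dataset("b-mc2/sql-create-context", split="train")

def to_text(example):
    # Format each row with the same chat layout used at inference time.
    return {
        "text": (
            "<start_of_turn>user\n"
            f"Prompt:\n{example['question']}\n"
            f"Context:\n{example['context']}<end_of_turn>\n"
            f"<start_of_turn>model\n{example['answer']}<end_of_turn>"
        )
    }

dataset = dataset.map(to_text)

trainer = SFTTrainer(
    model=model,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=512,
    peft_config=peft_config,
    tokenizer=tokenizer,
    args=TrainingArguments(
        output_dir="gemma2b-sql",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        learning_rate=2e-4,
        num_train_epochs=1,
        logging_steps=25,
        fp16=True,
    ),
)
trainer.train()
```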
## Inference
```python
# Load model directly
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("suriya7/Gemma2B-Finetuned-Sql-Generator")
model = AutoModelForCausalLM.from_pretrained("suriya7/Gemma2B-Finetuned-Sql-Generator")

prompt_input = input('enter the prompt: ')
context_input = input('enter the context: ')

# Gemma chat format: the user turn carries the instruction, prompt, and context.
prompt_template = """<start_of_turn>user
You are an intelligent AI specialized in generating SQL queries.
Your task is to assist users in formulating SQL queries to retrieve specific information from a database.
Please provide the SQL query corresponding to the given prompt and context:

Prompt:
{prompt}

Context:
{context}<end_of_turn>
<start_of_turn>model
"""
prompt = prompt_template.format(prompt=prompt_input, context=context_input)

encodeds = tokenizer(prompt, return_tensors="pt", add_special_tokens=True).input_ids
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model.to(device)
inputs = encodeds.to(device)

# Increase max_new_tokens if needed
generated_ids = model.generate(inputs, max_new_tokens=1000, do_sample=True,
                               temperature=0.7, pad_token_id=tokenizer.eos_token_id)

# Decode and keep only the text generated after the model turn marker.
decoded = tokenizer.decode(generated_ids[0], skip_special_tokens=True)
model_answer = decoded.split("model")[-1].strip()
print(model_answer)
```
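The two `input()` calls expect a natural-language question and the schema it should run against. A typical pair (illustrative only, not taken from the repository) looks like this:

```python
# Illustrative inputs for the script above (hypothetical example).
prompt_input = "List the names of all employees hired after 2020."
context_input = "CREATE TABLE employees (id INT, name TEXT, hire_date DATE)"
# For inputs like these, the fine-tuned model is expected to answer with a
# single SQL query, e.g. SELECT name FROM employees WHERE hire_date > '2020-12-31';
```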
## Issues and Contributions
If you encounter any issues or have suggestions for improving the model, please:
- Open an issue on the GitHub repository.
- Contact Suriya at [mail here]([email protected]).
## License
This model is released under the MIT License. See the LICENSE file for more details.
## Acknowledgments
- Thanks to the Hugging Face team for providing the pre-trained Google Gemma-2b model.