https://github.com/yukinagae/promptfoo-sample
Sample project demonstrating how to use Promptfoo, a test framework for evaluating the output of generative AI models
- Host: GitHub
- URL: https://github.com/yukinagae/promptfoo-sample
- Owner: yukinagae
- License: MIT
- Created: 2024-09-09T12:06:04.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2024-09-10T12:53:24.000Z (8 months ago)
- Last Synced: 2025-02-15T04:42:52.721Z (3 months ago)
- Topics: evaluation, evaluation-framework, llm, llm-eval, llm-evaluation, llm-evaluation-framework, llmops, prompt-testing, promptfoo, prompts, testing
- Homepage: https://medium.com/@yukinagae/generative-ai-evaluation-with-promptfoo-a-comprehensive-guide-e23ea95c1bb7
- Size: 334 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
---
To get started, set your `OPENAI_API_KEY` environment variable (or the keys required by whichever providers you selected).
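For example, on macOS/Linux with OpenAI as the provider (the key value below is a placeholder, not a real key):
```
export OPENAI_API_KEY=sk-your-key-here   # placeholder value
```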
Next, edit `promptfooconfig.yaml`, which defines your prompts, providers, and test cases.
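As a minimal sketch of what such a config might look like (the prompt text, provider id, and test values here are illustrative assumptions, not taken from the sample repo):
```
# promptfooconfig.yaml -- illustrative sketch, not the repo's actual config
description: Sample evaluation
prompts:
  - "Summarize the following text in one sentence: {{text}}"
providers:
  - openai:gpt-4o-mini
tests:
  - vars:
      text: "Promptfoo is a test framework for evaluating LLM output."
    assert:
      - type: icontains
        value: promptfoo
```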
Then run:
```
promptfoo eval
```

Afterwards, you can view the results by running `promptfoo view`.
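Putting it together, a typical session might look like the following; the global npm install is an assumption about how Promptfoo was set up, and `promptfoo view` opens the results in a local web viewer:
```
npm install -g promptfoo   # assumption: Promptfoo installed globally via npm
promptfoo eval             # run the tests defined in promptfooconfig.yaml
promptfoo view             # browse the results in the local web viewer
```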