https://github.com/neptune-ai/neptune-deployment-tests
Tests worth running to confirm successful setup & performance of a self-hosted neptune instance.
- Host: GitHub
- URL: https://github.com/neptune-ai/neptune-deployment-tests
- Owner: neptune-ai
- Created: 2025-07-29T09:15:38.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2025-07-31T16:30:16.000Z (2 months ago)
- Last Synced: 2025-08-29T11:54:08.834Z (about 1 month ago)
- Topics: neptune, neptune-ai
- Language: Python
- Homepage: https://neptune.ai/product/self-hosted
- Size: 17.6 KB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Neptune Deployment Tests
Tests worth running to confirm successful setup of a self-hosted neptune instance.
Tests are divided into:
1. `basic_test.py`, which checks only minimal logging functionality & can be parametrized by the number of metrics & steps logged (see the sketch below)
2. `feature_tests.py`, with comprehensive checks for most logging & retrieval features (recommended for deployment validation)
3. `performance_tests.py`, measuring logging throughput & retrieval performance (not implemented yet)

Some (<10%) of the tests will prompt you for manual validation actions, to be completed as a kind of post-deployment "checklist".
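For orientation, the kind of minimal check `basic_test.py` runs might look roughly like this (a hedged sketch using the `neptune-scale` client; the metric names, counts, and `run_id` are illustrative, not the repository's actual code):

```python
# Illustrative sketch of a minimal logging check -- not the repository's code.
# Assumes the neptune-scale client, with NEPTUNE_PROJECT & NEPTUNE_API_TOKEN
# set in the environment as described under "Quick start" below.
from neptune_scale import Run

N_METRICS = 2  # hypothetical knobs: how many metrics ...
N_STEPS = 10   # ... and how many steps to log

run = Run(run_id="deployment-smoke-test")  # project & token come from env vars
for step in range(N_STEPS):
    run.log_metrics(
        data={f"smoke/metric_{i}": float(step * i) for i in range(N_METRICS)},
        step=step,
    )
run.close()  # flushes & waits until the data is submitted
```

If this completes without errors & the run shows up in the web app, basic logging works.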
## Quick start
Setup:
1. Create a project called `deployment-tests` in your workspace using the neptune web app
2. Clone this repository & set up the necessary environment variables:
   - `export NEPTUNE_API_TOKEN=my-api-token`, using your user token copied from the neptune web app
   - `export NEPTUNE_PROJECT=my-workspace/deployment-tests`, using the workspace & project name from step 1

We recommend running the scripts using `uv`, which supports [inline script metadata with package dependencies](https://docs.astral.sh/uv/guides/scripts/#declaring-script-dependencies).
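Each script can declare its own dependencies in an inline PEP 723 comment block at the top of the file, which `uv run` reads to build a throwaway environment on the fly. A minimal sketch of such a header (the `neptune-scale` dependency shown is an assumption; the real list lives in the scripts themselves):

```python
# /// script
# dependencies = [
#     "neptune-scale",  # assumed dependency; check the actual scripts
# ]
# ///
"""Script body follows; uv installs the declared dependencies before running it."""
```

With the metadata in place, the tests can be run directly: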
```bash
uv run basic_test.py
uv run feature_tests.py # this one will prompt you for 2-3 manual validation actions in the web app
```

You can retry specific tests using filtering:
```bash
uv run feature_tests.py -f fetcher # only runs tests that contain "fetcher" in their function name - no manual actions required
```
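The `-f` flag suggests plain substring matching on test names. A hypothetical sketch of how such a filter could work (invented for illustration; this is not the repository's code):

```python
import argparse

def test_fetcher_metrics():  # hypothetical test function
    print("fetcher metrics: OK")

def test_logging_basic():  # hypothetical test function
    print("logging basic: OK")

ALL_TESTS = [test_fetcher_metrics, test_logging_basic]

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("-f", "--filter", default="",
                        help="substring matched against test function names")
    args = parser.parse_args()
    for test in ALL_TESTS:
        if args.filter in test.__name__:  # empty filter matches every test
            test()
```

Alternatively, you can use `pip` & `virtualenv`: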
```bash
python3 -m venv .env
source .env/bin/activate
pip install -r requirements.txt
```

And then run the tests using:
```bash
python basic_test.py
python feature_tests.py
```

To make the test output less verbose, set the [`NEPTUNE_LOGGER_LEVEL` environment variable](https://docs.neptune.ai/environment_variables/neptune_scale/#neptune_logger_level) to `warning`.
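For example:

```bash
export NEPTUNE_LOGGER_LEVEL=warning
uv run feature_tests.py
```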