Ecosyste.ms: Awesome
A Kubernetes operator to enable GitOps style deploys for Databricks resources
https://github.com/mach-kernel/databricks-kube-operator
ci cicd databricks gitops helm kubernetes operators rust spark
Last synced: 2 months ago
- Host: GitHub
- URL: https://github.com/mach-kernel/databricks-kube-operator
- Owner: mach-kernel
- License: apache-2.0
- Created: 2022-10-30T18:25:49.000Z (about 2 years ago)
- Default Branch: master
- Last Pushed: 2024-09-04T19:39:27.000Z (5 months ago)
- Last Synced: 2024-09-06T04:12:40.949Z (5 months ago)
- Topics: ci, cicd, databricks, gitops, helm, kubernetes, operators, rust, spark
- Language: Rust
- Homepage:
- Size: 1.53 MB
- Stars: 14
- Watchers: 2
- Forks: 3
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: LICENSE
README
---
description: A Kubernetes operator for Databricks
coverY: 0
---

# 🦀 databricks-kube-operator
[![Rust](https://github.com/mach-kernel/databricks-kube-operator/actions/workflows/rust.yml/badge.svg?branch=master)](https://github.com/mach-kernel/databricks-kube-operator/actions/workflows/rust.yml)
[![FOSSA Status](https://app.fossa.com/api/projects/custom%2B34302%2Fgithub.com%2Fmach-kernel%2Fdatabricks-kube-operator.svg?type=shield)](https://app.fossa.com/projects/custom%2B34302%2Fgithub.com%2Fmach-kernel%2Fdatabricks-kube-operator?ref=badge_shield)

A [kube-rs](https://kube.rs/) operator to enable GitOps style management of Databricks resources. It supports the following APIs:
| API | CRD |
| ------------------- | --------------------------------------- |
| Jobs 2.1 | DatabricksJob |
| Git Credentials 2.0 | GitCredential |
| Repos 2.0 | Repo |
| Secrets 2.0         | DatabricksSecretScope, DatabricksSecret |

Experimental, headed towards stable. See the GitHub project board for the roadmap. Contributions and feedback are welcome!
[Read the docs](https://databricks-kube-operator.gitbook.io/doc)
## Quick Start
Looking for a more in-depth example? Read the [tutorial](tutorial.md).
### Installation
Add the Helm repository and install the chart:
```bash
helm repo add mach https://mach-kernel.github.io/databricks-kube-operator
helm install databricks-kube-operator mach/databricks-kube-operator
```

Create a config map in the same namespace as the operator. To override the configmap name, pass `--set configMapName=my-custom-name`:

```bash
# Sketch: the key names and values below are illustrative; see the operator
# docs for the expected keys, and substitute your workspace URL and token.
cat <<EOF | kubectl apply -f -
apiVersion: v1
kind: ConfigMap
metadata:
  name: databricks-kube-operator
data:
  access_token: shhhh
  databricks_url: https://my-tenant.cloud.databricks.com/api
EOF
```
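With the operator running, resources are deployed by applying manifests for the CRDs in the table above. A minimal sketch for a `DatabricksJob` (names, cluster id, and notebook path are placeholders; the `spec.job` body mirrors the generated Jobs 2.1 `Job` model, so exact fields depend on the SDK version):

```bash
# Hypothetical example manifest, not from the upstream docs
cat <<EOF | kubectl apply -f -
apiVersion: com.dstancu.databricks/v1
kind: DatabricksJob
metadata:
  name: hello-job
spec:
  job:
    settings:
      name: hello-job
      max_concurrent_runs: 1
      tasks:
        - task_key: hello
          existing_cluster_id: my-cluster-id
          notebook_task:
            notebook_path: /Users/someone@example.com/hello
EOF
```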
### Generating API clients

TODO: Manual client 'fixes'

```bash
# Hey!! This uses GNU sed
# brew install gnu-sed

# Jobs API
openapi-generator generate -g rust -i openapi/jobs-2.1-aws.yaml -c openapi/config-jobs.yaml -o dbr_jobs

# Derive JsonSchema for all models and add schemars as dep
gsed -i -e 's/derive(Clone/derive(JsonSchema, Clone/' dbr_jobs/src/models/*
gsed -i -e 's/\/\*/use schemars::JsonSchema;\n\/\*/' dbr_jobs/src/models/*
gsed -r -i -e 's/(\[dependencies\])/\1\nschemars = "0.8.11"/' dbr_jobs/Cargo.toml

# Missing import?
gsed -r -i -e 's/(use reqwest;)/\1\nuse crate::models::ViewsToExport;/' dbr_jobs/src/apis/default_api.rs

# Git Credentials API
openapi-generator generate -g rust -i openapi/gitcredentials-2.0-aws.yaml -c openapi/config-git.yaml -o dbr_git_creds

# Derive JsonSchema for all models and add schemars as dep
gsed -i -e 's/derive(Clone/derive(JsonSchema, Clone/' dbr_git_creds/src/models/*
gsed -i -e 's/\/\*/use schemars::JsonSchema;\n\/\*/' dbr_git_creds/src/models/*
gsed -r -i -e 's/(\[dependencies\])/\1\nschemars = "0.8.11"/' dbr_git_creds/Cargo.toml

# Repos API
openapi-generator generate -g rust -i openapi/repos-2.0-aws.yaml -c openapi/config-repos.yaml -o dbr_repo

# Derive JsonSchema for all models and add schemars as dep
gsed -i -e 's/derive(Clone/derive(JsonSchema, Clone/' dbr_repo/src/models/*
gsed -i -e 's/\/\*/use schemars::JsonSchema;\n\/\*/' dbr_repo/src/models/*
gsed -r -i -e 's/(\[dependencies\])/\1\nschemars = "0.8.11"/' dbr_repo/Cargo.toml

# Secrets API
openapi-generator generate -g rust -i openapi/secrets-aws.yaml -c openapi/config-secrets.yaml -o dbr_secrets
gsed -i -e 's/derive(Clone/derive(JsonSchema, Clone/' dbr_secrets/src/models/*
gsed -i -e 's/\/\*/use schemars::JsonSchema;\n\/\*/' dbr_secrets/src/models/*
gsed -r -i -e 's/(\[dependencies\])/\1\nschemars = "0.8.11"/' dbr_secrets/Cargo.toml
```
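After the `gsed` passes, each generated model carries the extra derive and import. Roughly, a patched model looks like this (illustrative only; the real fields come from the OpenAPI spec):

```rust
use schemars::JsonSchema;
/*
 * Jobs API 2.1 (header comment emitted by openapi-generator)
 */

use serde::{Deserialize, Serialize};

// The sed pass prepends JsonSchema to the derive list so kube can
// build a CRD schema from the generated model.
#[derive(JsonSchema, Clone, Debug, PartialEq, Serialize, Deserialize)]
pub struct Job {
    #[serde(rename = "job_id", skip_serializing_if = "Option::is_none")]
    pub job_id: Option<i64>,
}
```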
### Expand CRD macros

Deriving `CustomResource` uses macros to generate another struct. For this example, the output struct name would be `DatabricksJob`:
```rust
#[derive(Clone, CustomResource, Debug, Default, Deserialize, PartialEq, Serialize, JsonSchema)]
#[kube(
group = "com.dstancu.databricks",
version = "v1",
kind = "DatabricksJob",
derive = "Default",
namespaced
)]
pub struct DatabricksJobSpec {
pub job: Job,
}
```

`rust-analyzer` shows squiggles when you `use crds::databricks_job::DatabricksJob`, since the struct is generated at compile time, but one may want to look inside. To see what the macro generates, use [cargo-expand](https://github.com/dtolnay/cargo-expand):
```bash
rustup default nightly
cargo expand --bin databricks_kube
```
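The generated type then behaves like any other kube-rs resource. A minimal sketch (assuming the CRDs are installed, and using `tokio` and `anyhow`, which this README does not otherwise mention):

```rust
use crds::databricks_job::DatabricksJob;
use kube::{api::ListParams, Api, Client};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Connect with the ambient kubeconfig / in-cluster credentials
    let client = Client::try_default().await?;

    // The macro-generated DatabricksJob works like a built-in resource
    let jobs: Api<DatabricksJob> = Api::namespaced(client, "default");
    for job in jobs.list(&ListParams::default()).await? {
        println!("found job: {}", job.metadata.name.unwrap_or_default());
    }

    Ok(())
}
```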
### Adding a new CRD

Want to add support for a new API? Provided it has an OpenAPI definition, these are the steps; look for existing examples in the codebase, and see the spec sketch after this list:
* Download API definition into `openapi/` and make a [Rust generator configuration](https://openapi-generator.tech/docs/generators/rust/) (feel free to copy the others and change name)
* Generate the SDK, add it to the Cargo workspace and dependencies for `databricks-kube/`
* Implement `RestConfig` for your new client
* Define the new CRD Spec type ([follow kube-rs tutorial](https://kube.rs/getting-started/))
* `impl RemoteAPIResource for MyNewCRD`
* `impl StatusAPIResource for MyNewCRD` and [specify `TStatusType` in your CRD](https://github.com/kube-rs/kube/blob/main/examples/crd_derive.rs#L20)
* Add the new resource to the operator context's ensure-CRDs condition
* Add the new resource to `crdgen.rs`
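As a sketch of the spec-definition step, a hypothetical Clusters CRD would mirror `DatabricksJobSpec` above (the kind, spec name, and `Cluster` model are stand-ins, not part of the codebase):

```rust
use kube::CustomResource;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};

// Stand-in for the model a generated clusters SDK would provide
#[derive(Clone, Debug, Default, Deserialize, PartialEq, Serialize, JsonSchema)]
pub struct Cluster {
    pub cluster_name: Option<String>,
}

// Hypothetical CRD spec, following the DatabricksJobSpec pattern
#[derive(Clone, CustomResource, Debug, Default, Deserialize, PartialEq, Serialize, JsonSchema)]
#[kube(
    group = "com.dstancu.databricks",
    version = "v1",
    kind = "DatabricksCluster",
    derive = "Default",
    namespaced
)]
pub struct DatabricksClusterSpec {
    pub cluster: Cluster,
}
```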
### Running tests

Tests must be run with a single thread since we use a stateful singleton to 'mock' the state of a remote API. Eventually it would be nice to have integration tests targeting Databricks.
```bash
$ cargo test -- --test-threads=1
```

## License
[![FOSSA Status](https://app.fossa.com/api/projects/custom%2B34302%2Fgithub.com%2Fmach-kernel%2Fdatabricks-kube-operator.svg?type=large)](https://app.fossa.com/projects/custom%2B34302%2Fgithub.com%2Fmach-kernel%2Fdatabricks-kube-operator?ref=badge_large)