Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/databricks/ide-best-practices
Best practices for working with Databricks from an IDE
- Host: GitHub
- URL: https://github.com/databricks/ide-best-practices
- Owner: databricks
- License: apache-2.0
- Created: 2022-06-03T15:28:08.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2023-04-19T11:20:14.000Z (over 1 year ago)
- Last Synced: 2024-08-02T13:21:41.928Z (3 months ago)
- Language: Python
- Size: 25.4 KB
- Stars: 47
- Watchers: 10
- Forks: 47
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Best practices for using an IDE with Databricks
This repository is a companion to the article "Use an IDE with Databricks" for [AWS](https://docs.databricks.com/dev-tools/ide-how-to.html), [Azure](https://docs.microsoft.com/azure/databricks/dev-tools/ide-how-to), and [GCP](https://docs.gcp.databricks.com/dev-tools/ide-how-to.html).
This example features the use of Visual Studio Code, Python, `dbx` by Databricks Labs (for [AWS](https://docs.databricks.com/dev-tools/dbx.html), [Azure](https://docs.microsoft.com/azure/databricks/dev-tools/dbx), and [GCP](https://docs.gcp.databricks.com/dev-tools/dbx.html)), `pytest`, and GitHub Actions.
You can adapt this example for use with other Python-compatible IDEs such as PyCharm, IntelliJ IDEA with the Python plugin, and Eclipse with the PyDev plugin.
Going through the example, you will use your IDE to:
* Get data from the [owid/covid-19-data](https://github.com/owid/covid-19-data) repo in GitHub.
* Filter the data for a specific ISO country code.
* Create a pivot table from the data.
* Cleanse the data (these data steps are sketched in the first code example after this list).
* Modularize the code logic into reusable functions.
* Unit test the functions (see the `pytest` sketch below).
* Provide `dbx` project configurations and settings to enable the code to write the data to a Delta table in a remote Databricks workspace (the Delta write itself is sketched below).

The only time you need to use the Databricks user interface for this example is to see the results of writing the data to your Databricks workspace. (And even then, you can use the Databricks Jobs REST API or the Databricks Jobs CLI for [AWS](https://docs.databricks.com/data-engineering/jobs/jobs.html), [Azure](https://docs.microsoft.com/azure/databricks/data-engineering/jobs/jobs), or [GCP](https://docs.gcp.databricks.com/data-engineering/jobs/jobs.html) to get some high-level information about those data writes programmatically.) Otherwise, you can complete all of these development tasks from within your favorite Python-compatible IDE.
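To make the data steps concrete, here is a minimal pandas sketch of fetching, filtering, pivoting, and cleansing. It is not the repo's actual code: the CSV path and the helper names are assumptions, while the column names (`iso_code`, `date`, `new_cases`) come from the OWID dataset.

```python
import pandas as pd

# Assumed public CSV location within the owid/covid-19-data repo.
OWID_CSV = (
    "https://raw.githubusercontent.com/owid/covid-19-data/"
    "master/public/data/owid-covid-data.csv"
)

def filter_country(df: pd.DataFrame, iso_code: str) -> pd.DataFrame:
    """Keep only rows for one ISO country code (e.g. 'USA')."""
    return df[df["iso_code"] == iso_code]

def pivot_cases(df: pd.DataFrame) -> pd.DataFrame:
    """Pivot to one row per date with summed new-case counts."""
    return df.pivot_table(index="date", values="new_cases", aggfunc="sum")

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Drop missing values and clip spurious negative counts to zero."""
    return df.dropna().clip(lower=0)

if __name__ == "__main__":
    raw = pd.read_csv(OWID_CSV, usecols=["iso_code", "date", "new_cases"])
    result = clean(pivot_cases(filter_country(raw, "USA")))
    print(result.head())
```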
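Once the logic is modularized like this, each helper can be unit tested with `pytest` against a small in-memory frame. The module path in the import is hypothetical; substitute wherever your functions actually live:

```python
import pandas as pd

# Hypothetical module path; adjust to your project layout.
from covid_analysis.transforms import filter_country

def test_filter_country_keeps_only_requested_code():
    df = pd.DataFrame(
        {
            "iso_code": ["USA", "FRA", "USA"],
            "date": ["2021-01-01", "2021-01-01", "2021-01-02"],
            "new_cases": [10, 20, 30],
        }
    )
    out = filter_country(df, "USA")
    assert set(out["iso_code"]) == {"USA"}
    assert len(out) == 2
```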
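The final step, the Delta write that the `dbx`-deployed job executes remotely, boils down to something like the following PySpark sketch. It assumes it runs on a Databricks cluster, where the Delta format is built in; the table name `covid_trends` is made up for illustration.

```python
import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Stand-in for the cleaned pivot table produced by the first sketch.
clean_df = pd.DataFrame({"date": ["2021-01-01"], "new_cases": [10]})

# Persist the result as a Delta table in the workspace metastore.
spark.createDataFrame(clean_df).write.format("delta").mode(
    "overwrite"
).saveAsTable("covid_trends")
```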
The example is hands-on. We recommend working through the example article to learn how to apply these techniques to your own Databricks development tasks from within your favorite Python-compatible IDE.