https://github.com/mlverse/pysparklyr
Extension to {sparklyr} that allows you to interact with Spark & Databricks Connect
- Host: GitHub
- URL: https://github.com/mlverse/pysparklyr
- Owner: mlverse
- License: other
- Created: 2023-06-22T18:02:09.000Z (almost 3 years ago)
- Default Branch: main
- Last Pushed: 2025-06-30T21:10:15.000Z (10 months ago)
- Last Synced: 2025-06-30T22:23:58.012Z (10 months ago)
- Topics: databricks, pyspark, r, spark, spark-connect
- Language: R
- Homepage: https://spark.posit.co/deployment/databricks-connect.html
- Size: 714 KB
- Stars: 17
- Watchers: 7
- Forks: 4
- Open Issues: 17
Metadata Files:
- Readme: README.Rmd
- Changelog: NEWS.md
- License: LICENSE
README
---
output: github_document
---
```{r, include = FALSE}
knitr::opts_chunk$set(
collapse = TRUE,
comment = "#>",
fig.path = "man/figures/README-",
out.width = "100%"
)
```
# pysparklyr
[R-CMD-check](https://github.com/mlverse/pysparklyr/actions/workflows/R-CMD-check.yaml)
[Spark tests](https://github.com/mlverse/pysparklyr/actions/workflows/spark-tests.yaml)
[Codecov test coverage](https://app.codecov.io/gh/mlverse/pysparklyr)
[CRAN status](https://CRAN.R-project.org/package=pysparklyr)
Integrates `sparklyr` with PySpark and Databricks. This package exists because
the new Spark Connect and Databricks Connect connection methods do not work
with the standard `sparklyr` integration.
## Installing
To install the released version from CRAN, use:
```r
install.packages("pysparklyr")
```
To install the development version from GitHub, use:
```r
remotes::install_github("mlverse/pysparklyr")
```
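`pysparklyr` works through a Python environment that has `pyspark` (or `databricks-connect`) available. As a hedged sketch, the package's `install_pyspark()` and `install_databricks()` helpers can set one up; the version numbers below are illustrative assumptions, and should match your Spark cluster or Databricks Runtime:

```r
library(pysparklyr)

# For Spark Connect: creates a Python environment with PySpark
# (version shown is a placeholder; match your Spark version)
install_pyspark(version = "3.5")

# For Databricks Connect: match your cluster's Databricks Runtime
install_databricks(version = "14.1")
```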
## Using
To learn how to use `pysparklyr`, see the Spark / Databricks Connect article
on the official `sparklyr` website: [Spark Connect, and Databricks Connect v2](https://spark.posit.co/deployment/databricks-connect.html)
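As a quick orientation, a minimal sketch of connecting with `sparklyr` through both methods; the server address and cluster ID are placeholders, and the Databricks example assumes credentials are supplied via the usual `DATABRICKS_HOST` / `DATABRICKS_TOKEN` environment variables:

```r
library(sparklyr)

# Spark Connect: point at a running Spark Connect server
sc <- spark_connect(
  master  = "sc://localhost:15002",  # placeholder address
  method  = "spark_connect",
  version = "3.5"                    # placeholder Spark version
)

# Databricks Connect v2: target a specific cluster
sc <- spark_connect(
  cluster_id = "1234-567890-abcde123",  # placeholder cluster ID
  method     = "databricks_connect"
)

# Use sparklyr as usual, then disconnect
spark_disconnect(sc)
```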