Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/neo4j/neo4j-spark-connector
Neo4j Connector for Apache Spark, which provides bi-directional read/write access to Neo4j from Spark, using the Spark DataSource APIs
- Host: GitHub
- URL: https://github.com/neo4j/neo4j-spark-connector
- Owner: neo4j
- License: apache-2.0
- Created: 2016-03-03T16:01:08.000Z (almost 9 years ago)
- Default Branch: 5.0
- Last Pushed: 2025-01-24T16:38:04.000Z (9 days ago)
- Last Synced: 2025-01-26T00:02:18.803Z (7 days ago)
- Topics: bolt, cypher, hacktoberfest, neo4j-connector, neo4j-driver, spark
- Language: Scala
- Homepage: https://neo4j.com/developer/spark/
- Size: 2.91 MB
- Stars: 314
- Watchers: 34
- Forks: 114
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: LICENSE.txt
- Codeowners: .github/CODEOWNERS
# Neo4j Connector for Apache Spark
This repository contains the Neo4j Connector for Apache Spark.
## License
The Neo4j Connector for Apache Spark is licensed under the Apache License 2.0.
## Documentation
The documentation for the Neo4j Connector for Apache Spark is maintained in the https://github.com/neo4j/docs-spark repository.
## Building for Spark 3
You can build for Spark 3.x with both Scala 2.12 and Scala 2.13:
```
./maven-release.sh package 2.12
./maven-release.sh package 2.13
```

These commands will generate the corresponding targets:

* `spark-3/target/neo4j-connector-apache-spark_2.12-<version>_for_spark_3.jar`
* `spark-3/target/neo4j-connector-apache-spark_2.13-<version>_for_spark_3.jar`

## Integration with Apache Spark Applications
**spark-shell, pyspark, or spark-submit**
`$SPARK_HOME/bin/spark-shell --jars neo4j-connector-apache-spark_2.12-<version>_for_spark_3.jar`

`$SPARK_HOME/bin/spark-shell --packages org.neo4j:neo4j-connector-apache-spark_2.12:<version>_for_spark_3`
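Once the connector jar is on the classpath, Neo4j can be queried through the standard Spark DataSource API. Below is a minimal read sketch; the connection URL, credentials, and the `Person` label are placeholder assumptions for illustration, not values defined by this repository.

```scala
import org.apache.spark.sql.SparkSession

object Neo4jReadExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("neo4j-read-example")
      .getOrCreate()

    // The connector registers itself as the "org.neo4j.spark.DataSource" format.
    val df = spark.read
      .format("org.neo4j.spark.DataSource")
      .option("url", "bolt://localhost:7687")                  // assumed local instance
      .option("authentication.basic.username", "neo4j")        // placeholder credentials
      .option("authentication.basic.password", "password")
      .option("labels", "Person")                              // read all nodes with the :Person label
      .load()

    df.show()
    spark.stop()
  }
}
```

Writing back works the same way with `df.write.format("org.neo4j.spark.DataSource")` and the matching write options; see the documentation linked above for the full option reference.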
**sbt**
If you use the [sbt-spark-package plugin](https://github.com/databricks/sbt-spark-package), in your sbt build file, add:
```scala
resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"
libraryDependencies += "org.neo4j" % "neo4j-connector-apache-spark_2.12" % "<version>_for_spark_3"
```

**maven**
In your pom.xml, add:
```xml
<dependency>
  <groupId>org.neo4j</groupId>
  <artifactId>neo4j-connector-apache-spark_2.12</artifactId>
  <version>[version]_for_spark_3</version>
</dependency>
```
For more information about the available versions, visit https://neo4j.com/developer/spark/overview/#_compatibility