Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/malexer/nose2-spark
nose2 plugin to run the tests with support of pyspark.
- Host: GitHub
- URL: https://github.com/malexer/nose2-spark
- Owner: malexer
- License: mit
- Created: 2016-11-14T18:07:16.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2016-11-15T22:55:35.000Z (almost 8 years ago)
- Last Synced: 2024-04-29T09:03:23.938Z (7 months ago)
- Language: Python
- Homepage:
- Size: 3.91 KB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.rst
- License: LICENSE
Awesome Lists containing this project
README
nose2-spark
===========

`nose2`_ plugin to run tests with support of pyspark (`Apache Spark`_).

Features:

1. Make "pyspark" importable in your code executed by nose2.
2. Add a list of `py-files`_ dependencies of your pyspark application (which
   is usually supplied as an option ``spark-submit --py-files ...``).

Install
-------

.. code-block:: shell

    $ pip install nose2-spark

Usage
-----

Load the "nose2-spark" plugin into nose2 by creating ``nose2.cfg`` in your project
directory::

    [unittest]
    plugins = nose2_spark

Run tests with nose2-spark activated (pyspark and friends are added to
pythonpath)::

    $ nose2 --pyspark

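For illustration, a test module run this way can import pyspark directly. The
following is only a sketch of such a test, not part of nose2-spark itself; the
file name, class, and data are made up::

    # test_spark_job.py -- hypothetical test module, run via ``nose2 --pyspark``
    import unittest

    from pyspark import SparkConf, SparkContext


    class WordCountTest(unittest.TestCase):

        def test_count(self):
            # a local master keeps the test self-contained on one machine
            conf = SparkConf().setMaster('local[2]').setAppName('nose2-spark-test')
            sc = SparkContext(conf=conf)
            try:
                rdd = sc.parallelize(['a', 'b', 'a'])
                self.assertEqual(rdd.count(), 3)
            finally:
                sc.stop()
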
nose2-spark will try to import pyspark by looking into:

1. The SPARK_HOME environment variable
2. Some common Spark locations

If all of these methods fail to find Spark, you can set its location manually by
adding a "nose2-spark" section to ``nose2.cfg``::

    [nose2-spark]
    spark_home = /opt/spark

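Equivalently (an assumption based on the lookup order above, not a separate
nose2-spark feature), pointing the ``SPARK_HOME`` environment variable at a local
Spark distribution avoids touching ``nose2.cfg``::

    $ export SPARK_HOME=/opt/spark
    $ nose2 --pyspark
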
You can add a list of required `py-files`_ to run your code::

    [nose2-spark]
    pyfiles = package1.zip
        package2.zip

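As a sketch of what this enables (the package name below is hypothetical and used
only for illustration), test code can then import modules shipped inside the
listed archives, because they are added to the pythonpath::

    # test_with_pyfiles.py -- assumes package1.zip ships a module named
    # ``mypackage``; that name is made up for illustration.
    import unittest

    import mypackage  # importable only because ``pyfiles`` lists package1.zip


    class PyFilesTest(unittest.TestCase):

        def test_dependency_is_importable(self):
            self.assertTrue(hasattr(mypackage, '__name__'))
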
Example
-------

Example of ``nose2.cfg`` with spark_home defined, one `py-files`_ dependency, and
the nose2-spark plugin auto-activated::

    [unittest]
    plugins = nose2_spark

    [nose2-spark]
    always-on = True
    spark_home = /opt/spark
    pyfiles = package1.zip

This allows running the tests with a single command::

    $ nose2

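Assuming a project layout like the following (the file names are illustrative,
tying together the pieces above), running ``nose2`` from the project directory
picks up ``nose2.cfg`` automatically::

    project/
        nose2.cfg
        package1.zip
        test_spark_job.py
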
.. _nose2: http://nose2.readthedocs.io/
.. _Apache Spark: https://spark.apache.org/
.. _py-files: http://spark.apache.org/docs/latest/submitting-applications.html#bundling-your-applications-dependencies