{"id":13488691,"url":"https://github.com/databricks/spark-sklearn","last_synced_at":"2025-09-30T07:30:55.232Z","repository":{"id":57469606,"uuid":"41820544","full_name":"databricks/spark-sklearn","owner":"databricks","description":"(Deprecated) Scikit-learn integration package for Apache Spark","archived":true,"fork":false,"pushed_at":"2019-12-03T18:37:45.000Z","size":801,"stargazers_count":1078,"open_issues_count":15,"forks_count":228,"subscribers_count":93,"default_branch":"master","last_synced_at":"2025-09-17T03:51:49.059Z","etag":null,"topics":["apache-spark","grid-search","machine-learning","parameter-tuning","scikit-learn"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/databricks.png","metadata":{"files":{"readme":"README.rst","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2015-09-02T18:44:51.000Z","updated_at":"2025-06-24T07:52:16.000Z","dependencies_parsed_at":"2022-09-19T10:12:08.151Z","dependency_job_id":null,"html_url":"https://github.com/databricks/spark-sklearn","commit_stats":null,"previous_names":[],"tags_count":7,"template":false,"template_full_name":null,"purl":"pkg:github/databricks/spark-sklearn","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/databricks%2Fspark-sklearn","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/databricks%2Fspark-sklearn/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/databricks%2Fspark-sklearn/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/databricks%2Fspark-sklearn/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/h
osts/GitHub/owners/databricks","download_url":"https://codeload.github.com/databricks/spark-sklearn/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/databricks%2Fspark-sklearn/sbom","scorecard":{"id":324241,"data":{"date":"2024-06-17","repo":{"name":"github.com/databricks/spark-sklearn","commit":"201c4e6cccee5927aa379297ce6656b8f4ed46f0"},"scorecard":{"version":"v5.0.0-rc2-62-gda0f2b4e","commit":"da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8"},"score":3.6,"checks":[{"name":"Code-Review","score":4,"reason":"Found 9/21 approved changesets -- score normalized to 4","details":null,"documentation":{"short":"Determines if the project requires human code review before pull requests (aka merge requests) are merged.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#code-review"}},{"name":"Maintained","score":0,"reason":"project is archived","details":["Warn: Repository is archived."],"documentation":{"short":"Determines if the project is \"actively maintained\".","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#maintained"}},{"name":"CII-Best-Practices","score":0,"reason":"no effort to earn an OpenSSF best practices badge detected","details":null,"documentation":{"short":"Determines if the project has an OpenSSF (formerly CII) Best Practices Badge.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#cii-best-practices"}},{"name":"License","score":10,"reason":"license file detected","details":["Info: project has a license file: LICENSE:0","Info: FSF or OSI recognized license: Apache License 2.0: LICENSE:0"],"documentation":{"short":"Determines if the project has defined a license.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#license"}},{"name":"Signed-Releases","score":-1,"reason":"no releases 
found","details":null,"documentation":{"short":"Determines if the project cryptographically signs release artifacts.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#signed-releases"}},{"name":"Branch-Protection","score":-1,"reason":"internal error: error during branchesHandler.setup: internal error: githubv4.Query: Resource not accessible by integration","details":null,"documentation":{"short":"Determines if the default and release branches are protected with GitHub's branch protection settings.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#branch-protection"}},{"name":"Packaging","score":-1,"reason":"packaging workflow not detected","details":["Warn: no GitHub/GitLab publishing workflow detected."],"documentation":{"short":"Determines if the project is published as a package that others can easily download, install, easily update, and uninstall.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#packaging"}},{"name":"Dangerous-Workflow","score":-1,"reason":"no workflows found","details":null,"documentation":{"short":"Determines if the project's GitHub Action workflows avoid dangerous patterns.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#dangerous-workflow"}},{"name":"Token-Permissions","score":-1,"reason":"No tokens found","details":null,"documentation":{"short":"Determines if the project's workflows follow the principle of least privilege.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#token-permissions"}},{"name":"Binary-Artifacts","score":10,"reason":"no binaries found in the repo","details":null,"documentation":{"short":"Determines if the project has generated executable (binary) artifacts in the source 
repository.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#binary-artifacts"}},{"name":"Pinned-Dependencies","score":-1,"reason":"no dependencies found","details":null,"documentation":{"short":"Determines if the project has declared and pinned the dependencies of its build process.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#pinned-dependencies"}},{"name":"Fuzzing","score":0,"reason":"project is not fuzzed","details":["Warn: no fuzzer integrations found"],"documentation":{"short":"Determines if the project uses fuzzing.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#fuzzing"}},{"name":"Security-Policy","score":0,"reason":"security policy file not detected","details":["Warn: no security policy file detected","Warn: no security file to analyze","Warn: no security file to analyze","Warn: no security file to analyze"],"documentation":{"short":"Determines if the project has published a security policy.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#security-policy"}},{"name":"Vulnerabilities","score":7,"reason":"3 existing vulnerabilities detected","details":["Warn: Project is vulnerable to: GHSA-jjw5-xxj6-pcv5 / PYSEC-2020-107","Warn: Project is vulnerable to: GHSA-jw8x-6495-233v","Warn: Project is vulnerable to: PYSEC-2020-108"],"documentation":{"short":"Determines if the project has open, known unfixed vulnerabilities.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#vulnerabilities"}},{"name":"SAST","score":0,"reason":"SAST tool is not run on all commits -- score normalized to 0","details":["Warn: 0 commits out of 30 are checked with a SAST tool"],"documentation":{"short":"Determines if the project uses static code 
analysis.","url":"https://github.com/ossf/scorecard/blob/da0f2b4ebca563a6f7f1ca2f4099c336f7ce8bd8/docs/checks.md#sast"}}]},"last_synced_at":"2025-08-18T02:05:52.800Z","repository_id":57469606,"created_at":"2025-08-18T02:05:52.800Z","updated_at":"2025-08-18T02:05:52.800Z"},"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":277544182,"owners_count":25836481,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-09-29T02:00:09.175Z","response_time":84,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["apache-spark","grid-search","machine-learning","parameter-tuning","scikit-learn"],"created_at":"2024-07-31T18:01:20.169Z","updated_at":"2025-09-30T07:30:54.940Z","avatar_url":"https://github.com/databricks.png","language":"Python","readme":"Deprecation\n===========\n\nThis project is deprecated.\nWe now recommend using scikit-learn and `Joblib Apache Spark Backend \u003chttps://github.com/joblib/joblib-spark\u003e`_\nto distribute scikit-learn hyperparameter tuning tasks on a Spark cluster:\n\nYou need ``pyspark\u003e=2.4.4`` and ``scikit-learn\u003e=0.21`` to use Joblib Apache Spark Backend, which can be installed using ``pip``:\n\n.. code:: bash\n\n    pip install joblibspark\n\nThe following example shows how to distribute ``GridSearchCV`` on a Spark cluster using ``joblibspark``.\nThe same applies to ``RandomizedSearchCV``.\n\n.. 
code:: python\n\n    from sklearn import svm, datasets\n    from sklearn.model_selection import GridSearchCV\n    from joblibspark import register_spark\n    from sklearn.utils import parallel_backend\n\n    register_spark() # register spark backend\n\n    iris = datasets.load_iris()\n    parameters = {'kernel':('linear', 'rbf'), 'C':[1, 10]}\n    svr = svm.SVC(gamma='auto')\n\n    clf = GridSearchCV(svr, parameters, cv=5)\n\n    with parallel_backend('spark', n_jobs=3):\n        clf.fit(iris.data, iris.target)\n\n\nScikit-learn integration package for Apache Spark\n=================================================\n\nThis package contains some tools to integrate the `Spark computing framework \u003chttps://spark.apache.org/\u003e`_\nwith the popular `scikit-learn machine learning library \u003chttps://scikit-learn.org/stable/\u003e`_. Among other things, it can:\n\n- train and evaluate multiple scikit-learn models in parallel. It is a distributed analog to the\n  `multicore implementation \u003chttps://pythonhosted.org/joblib/parallel.html\u003e`_ included by default in ``scikit-learn``\n- convert Spark's DataFrames seamlessly into numpy ``ndarray`` or sparse matrices\n- (experimental) distribute Scipy's sparse matrices as a dataset of sparse vectors\n\nIt focuses on problems that have a small amount of data and that can be run in parallel.\nFor small datasets, it distributes the search for estimator parameters (``GridSearchCV`` in scikit-learn),\nusing Spark. 
For datasets that do not fit in memory, we recommend using the distributed implementation in\n`Spark MLlib \u003chttps://spark.apache.org/docs/latest/api/python/pyspark.mllib.html\u003e`_.\n\nThis package distributes simple tasks like grid-search cross-validation.\nIt does not distribute individual learning algorithms (unlike Spark MLlib).\n\nInstallation\n------------\n\nThis package is available on PyPI:\n\n::\n\n\tpip install spark-sklearn\n\nThis project is also available as a `Spark package \u003chttps://spark-packages.org/package/databricks/spark-sklearn\u003e`_.\n\nThe developer version has the following requirements:\n\n- scikit-learn 0.18 or 0.19. Later versions may work, but tests currently are incompatible with 0.20.\n- Spark \u003e= 2.1.1. Spark may be downloaded from the `Spark website \u003chttps://spark.apache.org/\u003e`_.\n  In order to use this package, you need to use the pyspark interpreter or another Spark-compliant Python\n  interpreter. See the `Spark guide \u003chttps://spark.apache.org/docs/latest/programming-guide.html#overview\u003e`_\n  for more details.\n- `nose \u003chttps://nose.readthedocs.org\u003e`_ (testing dependency only)\n- pandas, if using the pandas integration or testing. pandas==0.18 has been tested.\n\nIf you want to use a developer version, you just need to make sure the ``python/`` subdirectory is in the\n``PYTHONPATH`` when launching the pyspark interpreter:\n\n::\n\n\tPYTHONPATH=$PYTHONPATH:./python $SPARK_HOME/bin/pyspark\n\nYou can directly run the tests:\n\n::\n\n    cd python \u0026\u0026 ./run-tests.sh\n\nThis requires the environment variable ``SPARK_HOME`` to point to your local copy of Spark.\n\nExample\n-------\n\nHere is a simple example that runs a grid search with Spark. See the `Installation \u003c#installation\u003e`_ section\non how to install the package.\n\n.. 
code:: python\n\n    from sklearn import svm, datasets\n    from spark_sklearn import GridSearchCV\n    iris = datasets.load_iris()\n    parameters = {'kernel':('linear', 'rbf'), 'C':[1, 10]}\n    svr = svm.SVC(gamma='auto')\n    clf = GridSearchCV(sc, svr, parameters)  # sc is the SparkContext, predefined in the pyspark shell\n    clf.fit(iris.data, iris.target)\n\nThis classifier can be used as a drop-in replacement for any scikit-learn classifier, with the same API.\n\n\nDocumentation\n-------------\n\n`API documentation \u003chttp://databricks.github.io/spark-sklearn-docs\u003e`_ is currently hosted on GitHub Pages. To\nbuild the docs yourself, see the instructions in ``docs/``.\n\n.. image:: https://travis-ci.org/databricks/spark-sklearn.svg?branch=master\n    :target: https://travis-ci.org/databricks/spark-sklearn\n","funding_links":[],"categories":["Python"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdatabricks%2Fspark-sklearn","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdatabricks%2Fspark-sklearn","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdatabricks%2Fspark-sklearn/lists"}