Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/radanalyticsio/jupyter-notebook-py3.5
jupyter-notebook openshift-templates python35 spark
Last synced: 5 days ago
JSON representation
- Host: GitHub
- URL: https://github.com/radanalyticsio/jupyter-notebook-py3.5
- Owner: radanalyticsio
- Created: 2017-11-07T14:08:50.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2018-04-30T17:16:09.000Z (almost 7 years ago)
- Last Synced: 2024-12-24T01:30:38.391Z (about 2 months ago)
- Topics: jupyter-notebook, openshift-templates, python35, spark
- Language: Shell
- Size: 6.84 KB
- Stars: 0
- Watchers: 5
- Forks: 3
- Open Issues: 1
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# jupyter-notebook-py3.5
This is a container image intended to make it easy to run Jupyter Notebook or JupyterLab with Apache Spark on OpenShift. You can use it as-is (by adding it to a project), or you can use it as the basis for another image. In the latter case, you'll probably want to add notebooks, data, and/or additional packages to the derived image.
The image uses Python 3.5.

## Usage
### As a standalone image
For your convenience, binary image builds are available from Docker Hub.
* Add the image `radanalyticsio/jupyter-notebook-py3.5` to an OpenShift project.
* Set `JUPYTER_NOTEBOOK_PASSWORD` in the pod environment to something you can remember (this step is optional but highly recommended; if you don't do this, you'll need to trawl the logs for an access token for your new notebook).
* Set `JUPYTERLAB` to `true` in the pod environment to use the JupyterLab UI.
* Create a route to the pod.

### Deploy on OpenShift
```
oc new-app radanalyticsio/jupyter-notebook-py3.5 -e JUPYTER_NOTEBOOK_PASSWORD=developer -e JUPYTERLAB=false
```
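The route-creation step above can also be done from the CLI. A minimal sketch, assuming the service created by `oc new-app` is named `jupyter-notebook-py3.5` (the name OpenShift derives from the image by default); this requires a logged-in `oc` session against a running cluster:

```
# Expose the service as a route, then print the resulting hostname
oc expose svc/jupyter-notebook-py3.5
oc get route jupyter-notebook-py3.5
```

The hostname shown by `oc get route` is where the notebook UI becomes reachable.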
### As a base image

* As `nbuser` (uid 1011), add notebooks to `/notebooks` and data to `/data`.
* This process should be easier in the future; stay tuned!

## Notes
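The derived-image workflow above can be sketched with a Dockerfile. The `/notebooks` and `/data` paths and the `nbuser` uid come from the README; the local `notebooks/` and `data/` directory names are placeholders:

```
FROM radanalyticsio/jupyter-notebook-py3.5

# Copy content in as uid 1011 (nbuser) so the notebook server can read it
USER 1011
COPY --chown=1011 notebooks/ /notebooks/
COPY --chown=1011 data/ /data/
```

Additional Python packages could be installed with a `RUN pip install ...` layer before switching back to `USER 1011` if root is needed for the install.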
Make sure that this notebook image is running the same version of Spark as the external cluster you want to connect it to.
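One way to check which Spark version the image ships is to ask Spark itself from inside a container. This is a sketch that assumes `pyspark` is on the image's `PATH` (not confirmed by the README) and that a local Docker daemon is available:

```
# Print the Spark version bundled in the image
docker run --rm radanalyticsio/jupyter-notebook-py3.5 pyspark --version
```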
## Credits
This image was initially based on [Graham Dumpleton's images](https://github.com/getwarped/jupyter-stacks), which have some additional functionality (notably s2i support) that we'd like to incorporate in the future.