Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/hibuz/hadoop-docker
🐳 hadoop ecosystems docker image
data-engineering docker docker-compose flink hadoop hbase hive spark zeppelin
- Host: GitHub
- URL: https://github.com/hibuz/hadoop-docker
- Owner: hibuz
- Created: 2024-03-21T13:56:28.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2024-10-30T21:06:14.000Z (20 days ago)
- Last Synced: 2024-10-30T22:17:51.757Z (20 days ago)
- Topics: data-engineering, docker, docker-compose, flink, hadoop, hbase, hive, spark, zeppelin
- Language: Dockerfile
- Homepage:
- Size: 149 KB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Quick usage for the hadoop-dev Docker image
- Docker build and run
``` bash
git clone https://github.com/hibuz/hadoop-docker
cd hadoop-docker
docker compose up hadoop-dev --no-build
```

## Docker build & run for custom hadoop user and version
- see [Dockerfile](Dockerfile)
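The usual Dockerfile pattern for this kind of customization is a pair of build arguments. The `ARG` names and default values below are hypothetical, shown only to illustrate the mechanism; check the repo's actual [Dockerfile](Dockerfile) for the real ones:

``` dockerfile
# Hypothetical build arguments -- the real names live in the repo's Dockerfile
ARG HADOOP_USER=hadoop
ARG HADOOP_VERSION=3.3.6
# Later steps would then reference them, e.g.:
# RUN useradd -m ${HADOOP_USER}
```

Such arguments would be overridden at build time with `docker build --build-arg HADOOP_VERSION=3.4.0 -t hibuz/hadoop-dev .` (argument names again hypothetical).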
Hadoop build order:

``` bash
# hadoop
hadoop-docker$ docker build -t hibuz/hadoop-dev .
# hbase|spark|hive|flink
hadoop-docker/(hbase|spark|hive|flink)$ docker compose up --build
# flink-base for zeppelin
hadoop-docker/zeppelin$ docker compose build flink-base
# zeppelin
hadoop-docker/zeppelin$ docker compose up --build
```
### Attach to running container
``` bash
docker exec -it hadoop bash
```

### Prepare input files into the distributed filesystem
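A note on the paths below: in HDFS, a relative path like `input` resolves under the current user's home directory (`/user/<username>/`), which is why that directory is created first. The `-mkdir -p` flag behaves like its coreutils namesake, creating any missing parents; a purely local illustration:

``` bash
# Local analogy for `hdfs dfs -mkdir -p` (illustration only -- no cluster needed)
mkdir -p /tmp/hdfs-demo/user/hadoop/input   # creates every missing parent
ls -d /tmp/hdfs-demo/user/hadoop/input
rm -r /tmp/hdfs-demo                        # clean up
```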
``` bash
# Make the HDFS directories
hdfs dfs -mkdir -p /user/hadoop/input
# Copy the input files
hdfs dfs -put $HADOOP_HOME/etc/hadoop/*.xml input
```

### Run some of the examples provided
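As a rough local sketch of what the two example jobs below compute (using coreutils rather than MapReduce; the sample input lines are made up):

``` bash
# wordcount: count occurrences of each whitespace-separated token
printf 'fs hdfs fs\n' | tr -s ' ' '\n' | sort | uniq -c
# prints counts per token: 2 for "fs", 1 for "hdfs"

# grep job: extract tokens matching the same pattern used below, 'dfs[a-z.]+'
printf 'dfs.replication\ndfsadmin\nyarn.rm\n' | grep -oE 'dfs[a-z.]+'
# -> dfs.replication, dfsadmin
```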
``` bash
# Run example wordcount job:
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar wordcount input output
# View the output files on the distributed filesystem:
hdfs dfs -cat output/*

# Run example grep job:
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar grep input output/count 'dfs[a-z.]+'
# View the output files on the distributed filesystem:
hdfs dfs -cat output/count/*
# Example contents of the output files:
1 dfsadmin
1 dfs.replication

# Remove the output dir:
hdfs dfs -rm -r output
```

# Visit hadoop dashboard
- Hadoop Dashboard: http://localhost:9870
- Yarn Dashboard: http://localhost:8088 (run start-yarn.sh or uncomment command props in [docker-compose.yml](docker-compose.yml))
- Hadoop Job History: http://localhost:19888

### Stop containers and remove the containers, networks, and volumes created by `up`
``` bash
docker compose down -v
[+] Running 3/3
✔ Container hbase Removed
✔ Volume hbase_hbase-vol Removed
✔ Network hbase_default Removed
```

# Reference
- [Execute MapReduce jobs](https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html#Execution)
- https://github.com/rancavil/hadoop-single-node-cluster
- https://github.com/big-data-europe/docker-hadoop