Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
API and command line interface for HDFS
https://github.com/mtth/hdfs
- Host: GitHub
- URL: https://github.com/mtth/hdfs
- Owner: mtth
- License: MIT
- Created: 2014-03-10T19:49:09.000Z (almost 11 years ago)
- Default Branch: master
- Last Pushed: 2024-09-24T03:58:23.000Z (4 months ago)
- Last Synced: 2024-12-26T01:04:01.782Z (13 days ago)
- Topics: cli, hdfs, python
- Language: Python
- Homepage: https://hdfscli.readthedocs.io
- Size: 611 KB
- Stars: 271
- Watchers: 15
- Forks: 103
- Open Issues: 22
Metadata Files:
- Readme: README.md
- Changelog: CHANGES
- License: LICENSE
- Authors: AUTHORS
README
# HdfsCLI [![CI](https://github.com/mtth/hdfs/actions/workflows/ci.yml/badge.svg)](https://github.com/mtth/hdfs/actions/workflows/ci.yml) [![Pypi badge](https://badge.fury.io/py/hdfs.svg)](https://pypi.python.org/pypi/hdfs/) [![Downloads badge](https://img.shields.io/pypi/dm/hdfs.svg)](https://pypistats.org/packages/hdfs)
API and command line interface for HDFS.
```
$ hdfscli --alias=dev

Welcome to the interactive HDFS python shell.
The HDFS client is available as `CLIENT`.

In [1]: CLIENT.list('models/')
Out[1]: ['1.json', '2.json']

In [2]: CLIENT.status('models/2.json')
Out[2]: {
  'accessTime': 1439743128690,
  'blockSize': 134217728,
  'childrenNum': 0,
  'fileId': 16389,
  'group': 'supergroup',
  'length': 48,
  'modificationTime': 1439743129392,
  'owner': 'drwho',
  'pathSuffix': '',
  'permission': '755',
  'replication': 1,
  'storagePolicy': 0,
  'type': 'FILE'
}

In [3]: with CLIENT.read('models/2.json', encoding='utf-8') as reader:
   ...:     from json import load
   ...:     model = load(reader)
   ...:
```
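Outside the shell, the same calls are available from plain Python code. Below is a minimal sketch using the library's `InsecureClient`; the namenode URL, user, and paths are placeholders for your cluster:

```python
from json import load

from hdfs import InsecureClient

# Placeholder namenode URL and user; adjust for your cluster.
client = InsecureClient('http://localhost:50070', user='drwho')

# Equivalent to the CLIENT calls in the shell session above.
print(client.list('models/'))          # e.g. ['1.json', '2.json']
print(client.status('models/2.json'))  # WebHDFS FileStatus dictionary

with client.read('models/2.json', encoding='utf-8') as reader:
    model = load(reader)
```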
## Features

* Python 3 bindings for the [WebHDFS][] (and [HttpFS][]) API,
  supporting both secure and insecure clusters.
* Command line interface to transfer files and start an interactive client
  shell, with aliases for convenient namenode URL caching.
* Additional functionality through optional extensions (sketched below):
  + `avro`, to [read and write Avro files directly from HDFS][].
  + `dataframe`, to [load and save Pandas dataframes][].
  + `kerberos`, to [support Kerberos authenticated clusters][].

See the [documentation][] to learn more.
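For a rough idea of what the extensions look like in use, here is a sketch that assumes a Kerberized cluster and an existing `events.avro` file (both placeholders), with the extras installed via `pip install hdfs[avro,dataframe,kerberos]`:

```python
from hdfs.ext.avro import AvroReader
from hdfs.ext.dataframe import read_dataframe
from hdfs.ext.kerberos import KerberosClient

# Kerberos-authenticated client; the URL is a placeholder.
client = KerberosClient('http://namenode:50070')

# Stream Avro records directly from HDFS.
with AvroReader(client, 'events.avro') as reader:
    for record in reader:
        print(record)

# Load the same Avro file into a Pandas dataframe.
df = read_dataframe(client, 'events.avro')
```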
## Getting started

```sh
$ pip install hdfs
```

Then hop on over to the [quickstart][] guide. A [Conda
feedstock](https://github.com/conda-forge/python-hdfs-feedstock) is also
available.
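The `dev` alias used throughout this README resolves through HdfsCLI's configuration file, `~/.hdfscli.cfg` by default. A minimal sketch of such a config; the URL and user below are placeholders:

```ini
[global]
default.alias = dev

[dev.alias]
url = http://dev.namenode:50070
user = ann
```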
## Testing

HdfsCLI is tested against both [WebHDFS][] and [HttpFS][]. There are two ways
of running tests (see `scripts/` for helpers to set up a test HDFS cluster):

```sh
$ HDFSCLI_TEST_URL=http://localhost:50070 pytest # Using a namenode's URL.
$ HDFSCLI_TEST_ALIAS=dev pytest # Using an alias.
```

## Contributing
We'd love to hear what you think on the [issues][] page. Pull requests are also
most welcome!

[HttpFS]: http://hadoop.apache.org/docs/current/hadoop-hdfs-httpfs/
[WebHDFS]: http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/WebHDFS.html
[read and write Avro files directly from HDFS]: https://hdfscli.readthedocs.io/en/latest/api.html#module-hdfs.ext.avro
[load and save Pandas dataframes]: https://hdfscli.readthedocs.io/en/latest/api.html#module-hdfs.ext.dataframe
[support Kerberos authenticated clusters]: https://hdfscli.readthedocs.io/en/latest/api.html#module-hdfs.ext.kerberos
[documentation]: https://hdfscli.readthedocs.io/
[quickstart]: https://hdfscli.readthedocs.io/en/latest/quickstart.html
[issues]: https://github.com/mtth/hdfs/issues