https://github.com/mortoray/shelljob
Python process and file system module
- Host: GitHub
- URL: https://github.com/mortoray/shelljob
- Owner: mortoray
- License: MIT
- Created: 2020-07-25T14:23:57.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2022-12-08T07:45:33.000Z (about 2 years ago)
- Last Synced: 2024-08-10T10:28:15.156Z (5 months ago)
- Topics: capture, parallel, python, shell-script, subprocesses
- Language: Python
- Size: 287 KB
- Stars: 6
- Watchers: 2
- Forks: 3
- Open Issues: 4
- Metadata Files:
  - Readme: README.md
  - License: license.txt
# shelljob
shelljob provides a clean way to execute subprocesses, one at a time or several in parallel, capturing their output and monitoring progress:
- Single sub process `call` with optional timeout
- High level `FileMonitor` to execute several processes in parallel and store output in a file
- Low level `Group` execution to execute jobs in parallel and capture output

Additional tools for working with the filesystem are also included:
- `find` which offers much of the functionality of the shell find utility
- `shelljob.fs.NamedTempFile` provides a _with_ block wrapper for temporary named files

[API Documentation](https://mortoray.github.io/shelljob/)
I use this actively in my own projects, such as my online escape games at [Edaqa's Room](https://edaqasroom.com/). I've not needed any new features in a while -- it's a fairly stable package.
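The `NamedTempFile` wrapper mentioned above follows the same _with_-block pattern the stdlib offers as `tempfile.NamedTemporaryFile`. A minimal sketch of the idea, using only the stdlib (this is an illustration of the pattern, not shelljob's own API):

```python
import os
import tempfile

# A context manager yields a named file that is cleaned up on exit --
# the same "with-block" pattern shelljob.fs.NamedTempFile provides.
with tempfile.NamedTemporaryFile(mode='w', suffix='.txt', delete=False) as f:
    f.write('scratch data')
    path = f.name

# The file persists here only because delete=False; remove it explicitly.
existed = os.path.exists(path)
os.remove(path)
```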
# Install
```
pip install shelljob
```

# Parallel Subprocesses
Using the Job system is the quickest way to run processes and log their output (by default to files named `/tmp/job_ID.log`):
````
from shelljob import job

jm = job.FileMonitor()
jm.run([
    [ 'ls', '-alR', '/usr/local' ],
    'my_prog',
    'build output input',
])
````

An array will be passed directly to `subprocess.Popen`; a string is first parsed with `shlex.split`.
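For reference, the string form is tokenized the same way the stdlib does it, honoring shell-style quoting:

```python
import shlex

# A string command is split into argv-style tokens, honoring quotes.
args = shlex.split("my_prog --name 'hello world' input.txt")
print(args)
```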
The lower level `Group` class provides a simple container for more manual job management.
````
import sys

from shelljob import proc

g = proc.Group()
p1 = g.run( [ 'ls', '-al', '/usr/local' ] )
p2 = g.run( [ 'program', 'arg1', 'arg2' ] )

while g.is_pending():
    lines = g.readlines()
    for p, line in lines:
        sys.stdout.write( "{}:{}".format( p.pid, line ) )
````

## Encoding
By default the output is binary. You can pass `encoding='utf-8'` to the `run` command to get a decoded text stream instead. Be aware that if decoding fails (the program emits an invalid sequence), reading will be interrupted; use the `on_error` function to check for this.
Line-endings will always be preserved.
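The interruption on a bad sequence mirrors plain Python decoding, which raises `UnicodeDecodeError` on invalid input. A minimal illustration (stdlib only, not shelljob itself):

```python
# b'\xff' is not valid UTF-8, so strict decoding raises UnicodeDecodeError.
raw = b'partial output \xff more'
try:
    text = raw.decode('utf-8')
except UnicodeDecodeError as e:
    text = None
    print("decode failed at byte offset", e.start)
```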
# Simple Subprocess calls
A simplified `call` function allows timeouts on subprocesses and easy access to their output.
````
from shelljob import proc

# capture the output
output = proc.call( 'ls /tmp' )

# this raises a proc.Timeout exception
proc.call( 'sleep 10', timeout = 0.1 )
````

# Find
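The timeout semantics are analogous to what the stdlib provides on `subprocess.run`; a rough sketch of the same behavior without shelljob:

```python
import subprocess
import sys

# Run a child that sleeps for 10s, but give up after 0.5s.
# subprocess.run raises TimeoutExpired, much like proc.Timeout above.
timed_out = False
try:
    subprocess.run([sys.executable, '-c', 'import time; time.sleep(10)'],
                   timeout=0.5)
except subprocess.TimeoutExpired:
    timed_out = True
```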
The `find` function is a multi-faceted approach to generating listings of files.
````
from shelljob import fs

files = fs.find( '/usr/local', name_regex = '.*\\.so' )
print( "\n".join(files) )
````

Refer to the [API docs](https://mortoray.github.io/shelljob/) for all parameters. Just let me know if there is some additional option you need.
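For readers without the package handy, the core of `name_regex` matching can be approximated with the stdlib (`os.walk` plus `re`). This is a hypothetical sketch of the idea, not shelljob's actual implementation:

```python
import os
import re
import tempfile

def find_by_regex(root, name_regex):
    """Walk `root` and return paths whose basename matches `name_regex`."""
    pattern = re.compile(name_regex)
    matches = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if pattern.fullmatch(name):
                matches.append(os.path.join(dirpath, name))
    return matches

# Demo against a throwaway directory tree.
root = tempfile.mkdtemp()
open(os.path.join(root, 'libfoo.so'), 'w').close()
open(os.path.join(root, 'notes.txt'), 'w').close()
found = find_by_regex(root, r'.*\.so')
```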