Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Capture Stdout/Stderr simultaneously as if it were one stream, painlessly.
https://github.com/kentnl/ipc-run-fused
Last synced: 4 days ago
- Host: GitHub
- URL: https://github.com/kentnl/ipc-run-fused
- Owner: kentnl
- License: other
- Created: 2009-10-20T22:46:11.000Z (about 15 years ago)
- Default Branch: master
- Last Pushed: 2017-03-08T15:26:27.000Z (over 7 years ago)
- Last Synced: 2023-08-20T23:06:51.601Z (about 1 year ago)
- Language: Perl
- Homepage:
- Size: 258 KB
- Stars: 2
- Watchers: 3
- Forks: 1
- Open Issues: 1
Metadata Files:
- Readme: README.mkdn
- Changelog: Changes
- License: LICENSE
Awesome Lists containing this project
README
# NAME
IPC::Run::Fused - Capture Stdout/Stderr simultaneously as if it were one stream, painlessly.
# VERSION
version 1.000002
# SYNOPSIS
use IPC::Run::Fused qw( run_fused );

run_fused( my $fh, $stderror_filled_program, '--file', $tricky_filename, @moreargs ) || die "Argh $@";

# Simple implementation of 'tee'-like behaviour,
# sending to stdout and to a file.
open my $out_fh, '>', 'somefile.txt' or die "NOO $!";

while ( my $line = <$fh> ) {
  print {$out_fh} $line;
  print $line;
}

# DESCRIPTION
Have you ever tried to do, in Perl, essentially what you would do in bash with this:

parentapp <( app 2>&1 )

and found massive roadworks getting in the way?
Sure, you can always do this style syntax:
open my $fh, 'someapp --args foo 2>&1 |';
But that's not very nice, because:

- 1. you're relying on a sub-shell to do that for you
- 2. you have to manually escape everything
- 3. you can't use list context.

And none of this is very **Modern** or **Nice**.
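By contrast, the same capture with `run_fused` needs no sub-shell and no manual escaping. A minimal sketch, where `someapp` and its arguments are placeholders:

use IPC::Run::Fused qw( run_fused );

# Program and arguments passed as a list: no shell, no string interpolation.
run_fused( my $fh, 'someapp', '--args', 'foo' ) || die "$@";

while ( my $line = <$fh> ) {
  print $line;   # stdout and stderr arrive interleaved on one handle
}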
# SIMPLEST THING THAT JUST WORKS
This code is barely tested; it's here because I spent hours griping about how the existing ways suck.
# FEATURES
- 1. No String Interpolation.
Arguments after the first work as if you'd passed them directly to 'system'. You can be as dangerous or as safe as you want with them. We recommend passing a list, but a string ( as a scalar reference ) should work. But if you're using a string, this module probably isn't affording you much.
- 2. No dicking around with managing multiple file handles yourself.
I looked at [`IPC::Run`](https://metacpan.org/pod/IPC::Run), [`IPC::Run3`](https://metacpan.org/pod/IPC::Run3) and [`IPC::Open3`](https://metacpan.org/pod/IPC::Open3), and they all seemed very unfriendly, and none did what I wanted.
- 3. Non-global file-handles supported by design.
All the competition seem to still have this thing for global file handles and you having to use them. Yuck!
We have a few global file-handles inside our code, but they're only `STDERR` and `STDOUT`; at present I don't think I can circumvent that. If I ever can, I'll endeavor to do so ☺
# EXPORTED FUNCTIONS
No functions are exported by default; be explicit about what you want.
At this time there is only one, and there are no plans for more.
## run\_fused
run_fused( $fh, $executable, @params ) || die "$@";
run_fused( $fh, \$command_string ) || die "$@";
run_fused( $fh, sub { ... } ) || die "$@";

# Recommended
run_fused( my $fh, $executable, @params ) || die "$@";

# Somewhat supported
run_fused( my $fh, \$command_string ) || die "$@";
$fh will be clobbered as 'open' does, and $executable and @params will be passed, as-is, through to exec() or system().
$fh will point to an IO::Handle attached to the end of a pipe running back to the called application.
The command will be run in a fork, with `STDERR` and `STDOUT` "fused" into a single pipe.
**NOTE:** at present, `STDIN`'s file-descriptor is left unchanged; child processes will inherit the parent's `STDIN`, and will thus block ( somewhere ) waiting for a response.
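Putting that together, a minimal end-to-end sketch; the child command here is just a placeholder that writes to both streams:

use IPC::Run::Fused qw( run_fused );

# Placeholder child process that emits on both STDOUT and STDERR.
run_fused( my $fh, 'perl', '-e', 'print "to stdout\n"; warn "to stderr\n";' )
  || die "run_fused failed: $@";

# Both streams arrive, interleaved, on the one read handle.
while ( my $line = <$fh> ) {
  print "captured: $line";
}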
# AUTHOR
Kent Fredric
# COPYRIGHT AND LICENSE
This software is copyright (c) 2017 by Kent Fredric.
This is free software; you can redistribute it and/or modify it under
the same terms as the Perl 5 programming language system itself.