https://github.com/oatpp/funnel-proxy
Transparent multiplexing proxy
- Host: GitHub
- URL: https://github.com/oatpp/funnel-proxy
- Owner: oatpp
- License: apache-2.0
- Created: 2020-05-19T21:51:21.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2020-05-22T23:40:53.000Z (over 5 years ago)
- Last Synced: 2024-10-29T22:48:55.339Z (about 1 year ago)
- Topics: async, http, multiplexing, oatpp, proxy
- Language: C++
- Homepage: https://oatpp.io/
- Size: 15.6 KB
- Stars: 3
- Watchers: 3
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Funnel-Proxy
Transparent multiplexing proxy.
It can handle hundreds of thousands of connections and proxy them to downstream services using a fixed connection pool.
It can also be embedded in Oat++ services via [monolithization](https://oatpp.io/docs/monolithization/).
See more:
- [Oat++ Website](https://oatpp.io/)
- [Oat++ Github Repository](https://github.com/oatpp/oatpp)
- [Oat++ Monolithization](https://oatpp.io/docs/monolithization/)
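
The following is a minimal, illustrative C++ sketch of the multiplexing idea described above. It is *not* funnel-proxy's actual implementation (which builds on Oat++), and all names in it are hypothetical stand-ins; it only shows how many inbound requests can share a small fixed pool of upstream connections, with each request borrowing a connection for a single request/response exchange and returning it immediately afterwards.

```cpp
// Illustrative sketch only: many inbound client requests multiplexed over a
// small, fixed pool of upstream connections.
#include <condition_variable>
#include <cstddef>
#include <iostream>
#include <memory>
#include <mutex>
#include <queue>
#include <string>

// Hypothetical stand-in for an open connection to the downstream service.
struct UpstreamConnection {
  std::string exchange(const std::string& request) {
    return "response-to: " + request;  // placeholder round trip
  }
};

// Fixed-size pool: acquire() blocks until one of the pooled connections is idle.
class FixedConnectionPool {
 public:
  explicit FixedConnectionPool(std::size_t size) {
    for (std::size_t i = 0; i < size; ++i) {
      idle_.push(std::make_unique<UpstreamConnection>());
    }
  }

  std::unique_ptr<UpstreamConnection> acquire() {
    std::unique_lock<std::mutex> lock(mutex_);
    cv_.wait(lock, [this] { return !idle_.empty(); });
    auto conn = std::move(idle_.front());
    idle_.pop();
    return conn;
  }

  void release(std::unique_ptr<UpstreamConnection> conn) {
    {
      std::lock_guard<std::mutex> lock(mutex_);
      idle_.push(std::move(conn));
    }
    cv_.notify_one();
  }

 private:
  std::mutex mutex_;
  std::condition_variable cv_;
  std::queue<std::unique_ptr<UpstreamConnection>> idle_;
};

// Each inbound request borrows a connection only for one exchange, so
// thousands of clients can be funneled through a handful of connections.
std::string handleClientRequest(FixedConnectionPool& pool, const std::string& request) {
  auto conn = pool.acquire();
  std::string response = conn->exchange(request);
  pool.release(std::move(conn));  // reusable as soon as the exchange ends
  return response;
}

int main() {
  FixedConnectionPool pool(4);  // e.g. 4 upstream connections for all clients
  std::cout << handleClientRequest(pool, "GET /hello") << "\n";
}
```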
## When To Use
- You have a simple request-response API and want to increase the level of concurrency it can handle.
## When Not To Use
- Streaming. You can't stream through funnel-proxy, because a stream would hold a connection from the connection pool indefinitely, limiting your concurrency to the number of connections in the pool.
*However, you can implement custom streaming on the proxy side, where the proxy streams continuously to the client while requesting data chunk-by-chunk from the source service (see the sketch after this list).*
- Long-polling - same as for streaming.
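
To illustrate the proxy-side workaround mentioned above, here is a self-contained C++ sketch. All function names are hypothetical and not part of funnel-proxy or Oat++; the point is only that the proxy keeps writing one continuous stream to the client while fetching the payload from the source service in short, chunk-sized requests, so no pooled connection is held for the lifetime of the stream.

```cpp
// Illustrative sketch of proxy-side "custom streaming": fetch chunk-by-chunk,
// so an upstream connection is only needed for each brief fetch.
#include <cstddef>
#include <iostream>
#include <string>

// Hypothetical: one short request/response exchange returning the chunk that
// starts at `offset` (empty once the source is exhausted). In a real proxy this
// would acquire a pooled connection, fetch the chunk, then release it.
std::string fetchChunkFromSource(std::size_t offset, std::size_t chunkSize) {
  static const std::string payload = "example payload served in small pieces";
  if (offset >= payload.size()) return "";
  return payload.substr(offset, chunkSize);
}

// Hypothetical: write a piece of the response to the waiting client.
void writeToClient(const std::string& chunk) {
  std::cout << chunk;
}

// The client sees one continuous stream, while upstream connections are held
// only for the moment each chunk is requested.
void streamThroughProxy(std::size_t chunkSize) {
  std::size_t offset = 0;
  for (;;) {
    std::string chunk = fetchChunkFromSource(offset, chunkSize);
    if (chunk.empty()) break;  // source exhausted
    writeToClient(chunk);
    offset += chunk.size();
  }
}

int main() {
  streamThroughProxy(8);  // stream in 8-byte pieces for illustration
  std::cout << "\n";
}
```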