https://github.com/FrancoisChabot/variadic_future

Variadic, completion-based futures for C++17

[![CircleCI](https://circleci.com/gh/FrancoisChabot/variadic_future.svg?style=svg)](https://circleci.com/gh/FrancoisChabot/variadic_future)
[![Build status](https://ci.appveyor.com/api/projects/status/b7ppx6xmmor89h4q/branch/master?svg=true)](https://ci.appveyor.com/project/FrancoisChabot/variadic-future/branch/master)
[![Codacy Badge](https://api.codacy.com/project/badge/Grade/862b964980034316abf5d3d02c9ee63e)](https://www.codacy.com/app/FrancoisChabot/variadic_future?utm_source=github.com&utm_medium=referral&utm_content=FrancoisChabot/variadic_future&utm_campaign=Badge_Grade)
[![Total alerts](https://img.shields.io/lgtm/alerts/g/FrancoisChabot/variadic_future.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/FrancoisChabot/variadic_future/alerts/)
[![Language grade: C/C++](https://img.shields.io/lgtm/grade/cpp/g/FrancoisChabot/variadic_future.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/FrancoisChabot/variadic_future/context:cpp)
[![Documentation](https://img.shields.io/badge/docs-doxygen-blue.svg)](https://francoischabot.github.io/variadic_future/annotated.html)
# Variadic futures

High-performance variadic completion-based futures for C++17.

* No external dependency
* Header-only
* Lockless

## Why?

This was needed to properly implement [Easy gRPC](https://github.com/FrancoisChabot/easy_grpc), and it was an interesting exercise.

## What

Completion-based futures are a non-blocking, callback-based, synchronization mechanism that hides the callback logic from the asynchronous code, while properly handling error conditions.

A fairly common pattern is to have some long operation perform a callback upon its completion. At first glance, this seems pretty straightforward:

```cpp
void do_something(int x, int y, std::function<void(int)> on_complete);

void foo() {
  do_something(1, 12, [](int val) {
    std::cout << val << "\n";
  });
}
```

However, there are a few hidden complexities at play here. The code within `do_something()` has to make decisions about what to do with `on_complete`. Should `on_complete` be called inline or put in a work pool? Can we accept a default-constructed `on_complete`? What should we do with error conditions? The path of least resistance leads to writing code with no error handling whatsoever...

With Futures, these decisions are delegated to the *caller* of `do_something()`, which prevents `do_something()` from having to know much about the context within which it is operating. Error handling is also not optional, so you will never have an error dropped on the floor.

```cpp
Future<int> do_something(int x, int y);

void foo() {
  do_something(1, 12).finally([](expected<int> val) {
    if(val.has_value()) {
      std::cout << *val << "\n";
    }
  });
}
```

It *looks* essentially the same, but now implementing `do_something()` is a lot more straightforward, less error-prone, and supports many more operation modes out of the box.

Once you start combining things, you can express some fairly complicated synchronization relationships in a clear and concise manner:

```cpp
Future<void> foo() {
  Future<int> fut_a = do_something_that_produces_an_int();
  Future<bool> fut_b = do_something_that_produces_a_bool();

  // Create a future that triggers once both fut_a and fut_b are ready.
  Future<int, bool> combined_fut = join(fut_a, fut_b);

  // This callback will only be invoked if both fut_a and fut_b are successfully
  // fulfilled. Otherwise, the failure gets automatically propagated to the
  // resulting future.
  Future<void> result = combined_fut.then([](int a, bool b) {
    std::cout << a << " - " << b;
  });

  return result;
}
```

## Documentation

You can find the auto-generated API reference [here](https://francoischabot.github.io/variadic_future/annotated.html).

## Installation

* Make the contents of the include directory available to your project.
* Have a look at `var_future/config.h` and make changes as needed.
* If you are from the future, you may want to use `std::expected` instead of `expected_lite`.

## Usage
### Prerequisites

I am assuming you are already familiar with the [expected<>](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0323r7.html) concept/syntax. `aom::expected<T>` is simply an `expected<T, std::exception_ptr>`.

### Consuming futures

Let's say that you are using a function that happens to return a `Future<...>`, and you want to execute a callback when the values becomes available:

```cpp
Future<int, float> get_value_eventually();
```

The `Future<int, float>` will **eventually** be **fulfilled** with an `int` and a `float`, or **failed** with one or more `std::exception_ptr`, up to one per field.

The simplest thing you can do is call `finally()` on it. This will register a callback that is invoked once every field is either fulfilled or failed:

```cpp
auto f = get_value_eventually();

f.finally([](expected<int> v, expected<float> f) {
  if(v.has_value() && f.has_value()) {
    std::cout << "values are " << *v << " and " << *f << "\n";
  }
});
```

Alternatively, if you want to create a future that is **completed** once the callback has **completed**, you can use `then_expect()`.

Like `finally()`, `then_expect()` invokes its callback when all values are either fulfilled or failed. However, this time, the return value of the callback is used to populate the resulting `Future` (even if the callback returns `void`). If the callback happens to throw an exception (for example, by invoking `value()` on an `expected` containing an error), that exception becomes the resulting future's failure.

Rules:

- If the callback returns a `Future<T>`, that produces a `Future<T>`.
- If the callback returns an `expected<T>`, that produces a `Future<T>`.
- If the callback returns `segmented(T, U)`, that produces a `Future<T, U>`.
- Otherwise, if the callback returns `T`, that produces a `Future<T>`.

```cpp
auto f = get_value_eventually();

Future<float> result = f.then_expect([](expected<int> v, expected<float> f) {
  // Reminder: expected<T>::value() throws an exception if it contains an error.
  return f.value() * v.value();
});
```

Finally, this pattern of propagating a future's failure as the failure of its callback's result is so common that a third method does that all at once: `then()`.

Here, if `f` contains one or more **failures**, then the callback is never invoked at all, and the first error is immediately propagated as the `result`'s failure.

The same return value rules as `then_expect()` apply.

```cpp
auto f = get_value_eventually();

Future<float> result = f.then([](int v, float f) {
  return f * v;
});
```

In short:

| | error-handling | error-propagating |
|----------------|-------------------------|-------------------|
| **chains** | `then_expect()` | `then()` |
| **terminates** | `finally()` | N/A |

#### Void fields

If a callback attached to `then_expect()` or `then()` returns `void`, that produces a `Future<void>`.

`Future<>::then()` has special handling of `void` fields: they are omitted entirely from the callback's arguments:

```cpp
Future<void> f_a;
Future<void, int> f_b;
Future<float, void, int> f_c;

f_a.then([](){});
f_b.then([](int v){});
f_c.then([](float f, int v){});
```

#### The Executor

The callback can either

1. Be executed directly wherever the future is fulfilled (**immediate**)
2. Be posted to a work pool to be executed by some worker (**deferred**)

**immediate** mode is used by default: just pass your callback to your chosen method and you are done.

N.B. If the future is already fulfilled by the time a callback is attached in **immediate** mode, the callback is invoked immediately, in the thread that attaches it.

For **deferred** mode, you need to pass your queue (or an adapter) as the first parameter to the method. The queue only needs to be some type that implements `void push(T&&)` where `T` is a `Callable`.

```cpp

struct Queue {
  // In almost all cases, this needs to be thread-safe.
  void push(std::function<void()> cb);
};

void foo(Queue& queue) {
  get_value_eventually()
    .then([](int v){ return v * v; })
    .finally(queue, [](expected<int> v) {
      if(v.has_value()) {
        std::cerr << "final value: " << *v << "\n";
      }
    });
}
```

### Producing futures

Futures can be created by `Future::then()` or `Future::then_expect()`, but the chain has to start somewhere.

#### Promises

`Promise<T>` is a lightweight interface you can use to create a future that will eventually be fulfilled (or failed).

```cpp
Promise<int> prom;
Future<int> fut = prom.get_future();

std::thread thread([p = std::move(prom)]() mutable {
  p.set_value(3);
});
thread.detach();
```

#### async

`async()` will post the passed operation to the queue, and return a future to the value returned by that function.

```cpp
aom::Future<double> fut = aom::async(queue, [](){ return 12.0; });
```

#### Joining futures

You can wait on multiple futures at the same time using the `join()` function.

```cpp

#include "var_future/future.h"

void foo() {
aom::Future<int> fut_a = ...;
aom::Future<double> fut_b = ...;

aom::Future<int, double> combined = join(fut_a, fut_b);

combined.finally([](aom::expected<int> a, aom::expected<double> b){
  // Do something with a and/or b.
});
}
```

#### Posting callbacks to an ASIO context

This example shows how to use [ASIO](https://think-async.com/Asio/), but the same idea can be applied to other contexts easily.

```cpp
#include "asio.hpp"
#include "var_future/future.h"

// This can be any type that has a thread-safe push(Callable) method.
struct Work_queue {
  template<typename T>
  void push(T&& cb) {
    asio::post(ctx_, std::forward<T>(cb));
  }

  asio::io_context& ctx_;
};

int int_generating_operation();

void foo() {
  asio::io_context io_ctx;
  Work_queue asio_adapter{io_ctx};

  // Queue the operation in the asio context, and get a future to the result.
  aom::Future<int> fut = aom::async(asio_adapter, int_generating_operation);

  // Push the execution of this callback to the io_context when ready.
  fut.finally(asio_adapter, [](aom::expected<int> v) {
    // Do something with v.
  });
}
```

#### get_std_future()

`Future<>` provides `get_std_future()`, as well as `get()`, which is equivalent to `get_std_future().get()`, as conveniences for explicit synchronization.

This was added primarily to simplify writing unit tests, and using it extensively in other contexts is probably a bit of a code smell. If you find yourself doing that a lot, then perhaps you should just be using `std::future<>` directly instead.

```cpp
Future<int> f1 = ...;
std::future<int> x_f = f1.get_std_future();

Future<int> f2 = ...;
int x = f2.get();
```

### Future Streams

**Warning:** The stream API and performance are not nearly as mature and tested as `Future<>`/`Promise<>`.

#### Producing Future streams
```cpp
aom::Stream_future<int> get_stream() {
  aom::Stream_promise<int> prom;
  auto result = prom.get_future();

  std::thread worker([p = std::move(prom)]() mutable {
    p.push(1);
    p.push(2);
    p.push(3);
    p.push(4);

    // If p is destroyed without calling complete(), the stream is implicitly failed.
    p.complete();
  });

  worker.detach();

  return result;
}
}
```

#### Consuming Future streams
```cpp
auto all_done = get_stream().for_each([](int v) {
  std::cout << v << "\n";
}).then([](){
  std::cout << "all done!\n";
});

all_done.get();
```

## Performance notes

The library assumes that, more often than not, a callback is attached to the
future before a value or error is produced, and is tuned this way. Everything
will still work if the value is produced before the callback arrives, but
perhaps not as fast as possible.

The library also assumes that it is much more likely that a future will be
fulfilled successfully rather than failed.

## FAQs

**Is there a std::shared_future<> equivalent?**

Not yet. If someone needs it, it can be added to the library; we just don't want to add features that would not be used anywhere.

**Why is there no terminating+error propagating method?**

We have to admit that it would be nice to just do `fut.finally([](int a, float b){ ... })`, but the problem is that errors would have nowhere to go. Having the path of least resistance lead to dropping errors on the ground by default is just a recipe for disaster in the long run.