From 3cff5d09617c0de74a77df49ee94f90b1d58a1cc Mon Sep 17 00:00:00 2001
From: Philipp Oppermann
Date: Thu, 20 Feb 2020 11:48:03 +0100
Subject: [PATCH] Small improvements

---
 .../second-edition/posts/12-async-await/index.md | 12 +++++++-----
 1 file changed, 7 insertions(+), 5 deletions(-)

diff --git a/blog/content/second-edition/posts/12-async-await/index.md b/blog/content/second-edition/posts/12-async-await/index.md
index 968720df..be42893f 100644
--- a/blog/content/second-edition/posts/12-async-await/index.md
+++ b/blog/content/second-edition/posts/12-async-await/index.md
@@ -145,13 +145,13 @@ The `poll` method takes two arguments: `self: Pin<&mut Self>` and `cx: &mut Cont
 
 [_pinned_]: https://doc.rust-lang.org/nightly/core/pin/index.html
 
-The purpose of the `cx: &mut Context` parameter is to pass a [`Waker`] instance to the asynchronous task, e.g. the file system load. This `Waker` allows the asynchronous task to signal that it (or a part of it) is finished, e.g. that the file was loaded from disk. Since the main task knows that it will be notified when the `Future` is ready, it does not need to call `poll` over and over again. We will explain this process in more detail later when we implement an own `Waker` type.
+The purpose of the `cx: &mut Context` parameter is to pass a [`Waker`] instance to the asynchronous task, e.g. the file system load. This `Waker` allows the asynchronous task to signal that it (or a part of it) is finished, e.g. that the file was loaded from disk. Since the main task knows that it will be notified when the `Future` is ready, it does not need to call `poll` over and over again. We will explain this process in more detail later in this post when we implement our own `Waker` type.
 
 [`Waker`]: https://doc.rust-lang.org/nightly/core/task/struct.Waker.html
 
 ### Working with Futures
 
-We now know how futures are defined and the rough idea behind the `poll` method. However, we still don't know how to effectively work with futures. The problem is that futures represent results of asynchronous tasks, which might be not available yet. In practice, however, we often need these values directly for further calculations. So the question is: How can we efficiently retrieve the value of a future when we need it?
+We now know how futures are defined and understand the basic idea behind the `poll` method. However, we still don't know how to work with futures effectively. The problem is that futures represent the results of asynchronous tasks, which might not be available yet. In practice, however, we often need these values directly for further calculations. So the question is: How can we efficiently retrieve the value of a future when we need it?
 
 #### Waiting on Futures
 
@@ -169,7 +169,7 @@ let file_content = loop {
 
 Here we _actively_ wait for the future by calling `poll` over and over again in a loop. The arguments to `poll` don't matter here, so we omitted them. While this solution works, it is very inefficient because we keep the CPU busy until the value becomes available.
 
-A more effective approach could be to _block_ the current thread until the future becomes available. This is of course only possible if you have threads, so this solution does not work for kernel, at least not yet. Even on systems where blocking is supported, it is often not desired because it turns an asynchronous task into a synchronous task again, thereby inhibiting the potential performance benefits.
+A more efficient approach could be to _block_ the current thread until the future becomes available. This is of course only possible if you have threads, so this solution does not work for our kernel, at least not yet. Even on systems where blocking is supported, it is often not desired because it turns an asynchronous task into a synchronous task again, thereby inhibiting the potential performance benefits of parallel tasks.
 
 #### Future Combinators
 
@@ -214,7 +214,9 @@ This code does not quite work because it does not handle [_pinning_], but it suf
 
 [_pinning_]: https://doc.rust-lang.org/stable/core/pin/index.html
 
-Manually writing correct combinator methods is difficult, therefore they are often provided by libraries. While the Rust standard library itself provides no combinator methods yet, the semi-official (and `no_std` compatible) [`futures`] crate does. Its [`FutureExt`] trait provides high-level combinator methods such as [`map`] or [`then`], which can be used to manipulate the result with arbitrary closures.
+With this `string_len` function, we can calculate the length of an asynchronous string without waiting for it. Since the function itself returns a `Future`, the caller can't work directly on the returned value, but needs to use combinator functions again. This way, the whole call graph becomes asynchronous and we can efficiently wait for multiple futures at once at some point, e.g. in the main function.
+
+Manually writing combinator functions is difficult, so they are often provided by libraries. While the Rust standard library itself provides no combinator methods yet, the semi-official (and `no_std` compatible) [`futures`] crate does. Its [`FutureExt`] trait provides high-level combinator methods such as [`map`] or [`then`], which can be used to manipulate the result with arbitrary closures.
 
 [`futures`]: https://docs.rs/futures/0.3.4/futures/
 [`FutureExt`]: https://docs.rs/futures/0.3.4/futures/future/trait.FutureExt.html
@@ -223,7 +225,7 @@ Manually writing correct combinator methods is difficult, therefore they are oft
 
 ##### Advantages
 
-The big advantage of future combinators is that they keep the operations asynchronous. In combination with asynchronous I/O interfaces, this approach can lead to very high performance. The fact that future combinators are implemented as normal structs with trait implementations allows the compiler to excessively optimizing them to a efficient state machine. For more details, see the [_Zero-cost futures in Rust_] post, which announced the addition of futures to the Rust ecosystem.
+The big advantage of future combinators is that they keep the operations asynchronous. In combination with asynchronous I/O interfaces, this approach can lead to very high performance. The fact that future combinators are implemented as normal structs with trait implementations allows the compiler to optimize them heavily. For more details, see the [_Zero-cost futures in Rust_] post, which announced the addition of futures to the Rust ecosystem.
 
 [_Zero-cost futures in Rust_]: https://aturon.github.io/blog/2016/08/11/futures/
 
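
For reference, a minimal sketch of the `poll` contract that the reworded `cx: &mut Context` paragraph describes. The `Countdown` type and its behavior are invented for this illustration and are not part of the post; the `futures` crate (0.3) is assumed only for its `block_on` executor:

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll};

// A hypothetical future that needs several polls before it completes.
struct Countdown(u32);

impl Future for Countdown {
    type Output = &'static str;

    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
        if self.0 == 0 {
            Poll::Ready("done")
        } else {
            self.0 -= 1;
            // A real future would store the `Waker` from `cx` and call `wake()`
            // once progress is possible; here we wake immediately so that the
            // executor polls us again instead of polling in a busy loop.
            cx.waker().wake_by_ref();
            Poll::Pending
        }
    }
}

fn main() {
    // `block_on` from the `futures` crate drives the future to completion.
    let result = futures::executor::block_on(Countdown(3));
    assert_eq!(result, "done");
}
```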
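
The busy-waiting loop discussed around the `@@ -169` hunk can also be spelled out in full. This is only a sketch under the assumption that the `futures` crate (0.3) is available: `future::ready` stands in for the post's `async_read_file`, and `noop_waker` supplies the `poll` arguments that the post deliberately omits:

```rust
use std::future::Future;
use std::task::{Context, Poll};

use futures::{future, pin_mut, task::noop_waker};

fn main() {
    // Stand-in for an asynchronous file read; it completes immediately,
    // which keeps the sketch self-contained.
    let file_future = future::ready(String::from("file content"));
    pin_mut!(file_future); // pin on the stack so that `poll` can be called

    let waker = noop_waker();
    let mut context = Context::from_waker(&waker);

    // Actively wait by calling `poll` over and over again.
    let file_content = loop {
        match file_future.as_mut().poll(&mut context) {
            Poll::Ready(value) => break value,
            // Keep spinning; this is exactly the inefficiency the post describes.
            Poll::Pending => {}
        }
    };

    println!("{}", file_content);
}
```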
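
Finally, the `string_len` idea from the newly added paragraph can be expressed with the `FutureExt::map` combinator mentioned right after it. Again a sketch assuming `futures` 0.3; `ready` and `block_on` are only there to make it runnable:

```rust
use std::future::Future;

use futures::executor::block_on;
use futures::future::{ready, FutureExt};

// Roughly what the post's `string_len` combinator does, expressed with the
// `map` combinator from the `futures` crate instead of a hand-written struct.
fn string_len(string_future: impl Future<Output = String>) -> impl Future<Output = usize> {
    string_future.map(|s| s.len())
}

fn main() {
    let length_future = string_len(ready(String::from("hello")));
    let length = block_on(length_future);
    assert_eq!(length, 5);
}
```

Because `map` just wraps the inner future in another plain struct that implements `Future`, the whole chain stays a normal value type, which is what lets the compiler optimize it as described in the _Advantages_ hunk.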