Read, transform and write data concurrently from async sources to destinations.
Reads, writes, and transforms happen in parallel when sources and destinations are asynchronous. This library does not allocate any memory; all buffers are supplied by the caller.
Async Streams are largely inspired by node.js Streams, a powerful tool for processing large amounts of data concurrently.
The basic idea behind an async stream is to create a Source / Sink abstraction (also called Readable and Writable) and to process small buffers of data at a time.
The state machine that coordinates this interaction handles data buffering and, more importantly, back-pressure: a slow destination automatically throttles the source, so data is never produced faster than it can be consumed.
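To make the abstraction concrete, below is a minimal, synchronous sketch of the Source / Sink idea in C. The type and function names (source_read_fn, sink_write_fn, mem_read, slow_write, pump) are hypothetical and not this library's API; the sink deliberately accepts only a few bytes per call to show how back-pressure throttles the source, and the buffer is owned entirely by the caller.

```c
/* Illustrative sketch only: these names are hypothetical, not the library's API.
 * The sink accepts at most 4 bytes per call to demonstrate back-pressure. */
#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* A source fills the caller-supplied buffer and returns the number of
 * bytes produced, or 0 once it is exhausted. */
typedef size_t (*source_read_fn)(void *ctx, char *buf, size_t len);

/* A sink consumes up to len bytes and returns how many it accepted;
 * accepting fewer bytes than offered is how it applies back-pressure. */
typedef size_t (*sink_write_fn)(void *ctx, const char *buf, size_t len);

/* Example source: reads from a fixed in-memory string. */
struct mem_source { const char *data; size_t pos; };
static size_t mem_read(void *ctx, char *buf, size_t len) {
    struct mem_source *s = ctx;
    size_t left = strlen(s->data) - s->pos;
    size_t n = left < len ? left : len;
    memcpy(buf, s->data + s->pos, n);
    s->pos += n;
    return n;
}

/* Example sink: a "slow" sink that accepts at most 4 bytes per call. */
static size_t slow_write(void *ctx, const char *buf, size_t len) {
    (void)ctx;
    size_t n = len < 4 ? len : 4;
    fwrite(buf, 1, n, stdout);
    return n;
}

/* Pump: moves data from source to sink using only the caller's buffer.
 * No new data is pulled until the sink has drained the previous chunk. */
static void pump(source_read_fn rd, void *rctx, sink_write_fn wr, void *wctx,
                 char *buf, size_t buflen) {
    size_t got;
    while ((got = rd(rctx, buf, buflen)) > 0) {
        size_t off = 0;
        while (off < got)
            off += wr(wctx, buf + off, got - off);
    }
}

int main(void) {
    char buf[16];                              /* buffer supplied by the caller */
    struct mem_source src = { "hello async streams\n", 0 };
    pump(mem_read, &src, slow_write, NULL, buf, sizeof buf);
    return 0;
}
```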
By implementing streams on top of async operations, many of them can run concurrently and very efficiently. For example, a properly implemented async pipeline can read from disk, compress the data, and write to a socket all at the same time.
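Along the same lines, the toy pipeline below shows how a transform stage slots between a source and a sink, processing one small chunk at a time from a caller-supplied buffer. It runs synchronously and uses an uppercase transform as a stand-in for a real compressor; all names are illustrative, not part of the library.

```c
/* Illustrative sketch only: a toy three-stage pipeline (source -> transform
 * -> sink). In the real library these stages run as concurrent async
 * operations; here they run synchronously for clarity. */
#include <ctype.h>
#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* Transform stage: rewrites the chunk in place. A real compressor would
 * instead produce output into a second caller-supplied buffer. */
static void transform(char *buf, size_t len) {
    for (size_t i = 0; i < len; i++)
        buf[i] = (char)toupper((unsigned char)buf[i]);
}

int main(void) {
    const char *input = "stream data flows through the pipeline in chunks\n";
    size_t pos = 0, total = strlen(input);
    char buf[8];                        /* small caller-supplied chunk buffer */

    while (pos < total) {
        /* Stage 1: "read" a chunk from the source. */
        size_t n = total - pos < sizeof buf ? total - pos : sizeof buf;
        memcpy(buf, input + pos, n);
        pos += n;

        /* Stage 2: transform the chunk. */
        transform(buf, n);

        /* Stage 3: "write" the chunk to the sink. */
        fwrite(buf, 1, n, stdout);
    }
    return 0;
}
```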
The most notable differences from node.js streams are currently: