through2-concurrent alternatives and similar modules
Based on the "Streams" category.
Alternatively, view through2-concurrent alternatives based on common mentions on social networks and blogs.
- Highland.js: High-level streams library for Node.js and the browser
- through2: Tiny wrapper around Node streams2 Transform to avoid explicit subclassing noise
- concat-stream: Writable stream that concatenates strings or data and calls a callback with the result
- multistream: A stream that emits multiple other streams one after another (streams3)
- scramjet: Public tracker for Scramjet Cloud Platform, a platform that brings data from many environments together
- duplexify: Turn a writable and readable stream into a streams2 duplex stream with support for async initialization and streams1/streams2 input
- pumpify: Combine an array of streams into a single duplex stream using pump and duplexify
- into-stream: Convert a string/promise/array/iterable/asynciterable/buffer/typedarray/arraybuffer/object into a stream
- from2: Convenience wrapper for ReadableStream, with an API lifted from "from" and "through2"
- stream-combiner2: Turn a pipeline into a single stream
- binary-split: A fast newline (or any delimiter) splitter stream, like require('split') but specific to binary data
- graphicsmagick-stream: Fast conversion/scaling of images using a pool of long-lived GraphicsMagick processes
- peek-stream: Transform stream that lets you peek the first line before deciding how to parse it
- first-chunk-stream: Transform the first chunk in a stream
README
through2-concurrent
A simple way to create a Node.js Transform stream which processes chunks in parallel. You can limit the concurrency (the default is 16); order is not preserved, so chunks/objects can end up in a different order from the one they started in if the transform functions take different amounts of time.
Built using through2, it has the same API with the addition of a maxConcurrency option. Non-objectMode streams are supported for completeness, but I'm not sure they'd be useful for anything.
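For illustration, here is a minimal sketch of a non-objectMode (buffer) stream, assuming the module mirrors through2's plain form (through2Concurrent(options, transform)); the slowUppercase helper is hypothetical:
var through2Concurrent = require('through2-concurrent');
// Hypothetical async helper: uppercases a Buffer after a short delay.
function slowUppercase(buf, cb) {
  setTimeout(function () {
    cb(Buffer.from(buf.toString().toUpperCase()));
  }, 100);
}
process.stdin
  .pipe(through2Concurrent(
    {maxConcurrency: 4},
    function (chunk, enc, callback) {
      var self = this;
      slowUppercase(chunk, function (upper) {
        self.push(upper);
        callback();
      });
    }))
  .pipe(process.stdout);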
Written by Thomas Parslow (almostobsolete.net and tomparslow.co.uk) as part of Active Inbox (activeinboxhq.com).
Install
npm install --save through2-concurrent
Examples
Process lines from a CSV in parallel. The order the results end up in the all variable is not deterministic.
var fs = require('fs');
var csv2 = require('csv2'); // CSV-to-rows transform stream
var through2Concurrent = require('through2-concurrent');

var all = [];

fs.createReadStream('data.csv')
  .pipe(csv2())
  .pipe(through2Concurrent.obj(
    {maxConcurrency: 10},
    function (chunk, enc, callback) {
      var self = this;
      // Up to 10 of these async calls can be in flight at once;
      // push the result and signal completion when each finishes.
      someThingAsync(chunk, function (newChunk) {
        self.push(newChunk);
        callback();
      });
    }))
  .on('data', function (data) {
    all.push(data);
  })
  .on('end', function () {
    doSomethingSpecial(all);
  });
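Since the API mirrors through2, a final flush function should also be accepted after the transform; a sketch under that assumption (someObjectStream is a placeholder for any object stream), where the flush should only run once all pending transforms have completed:
var through2Concurrent = require('through2-concurrent');

var count = 0;

someObjectStream
  .pipe(through2Concurrent.obj(
    {maxConcurrency: 10},
    function (chunk, enc, callback) {
      // Up to 10 chunks are processed at once.
      count += 1;
      this.push(chunk);
      callback();
    },
    function (callback) {
      // Flush: runs after the transforms above have all finished.
      console.log('processed ' + count + ' objects');
      callback();
    }));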
Contributing
Fixed or improved stuff? Great! Send me a pull request through GitHub or get in touch on Twitter @almostobsolete or email at [email protected]