through2-concurrent alternatives and similar modules
Based on the "Streams" category.
concat-stream: Writable stream that concatenates strings or data and calls a callback with the result.
scramjet: Public tracker for the Scramjet Cloud Platform, a platform that brings data from many environments together.
duplexify: Turn a writable and readable stream into a streams2 duplex stream, with support for async initialization and streams1/streams2 input.
into-stream: Convert a string/promise/array/iterable/asynciterable/buffer/typedarray/arraybuffer/object into a stream.
binary-split: A fast newline (or any delimiter) splitter stream, like require('split') but specific to binary data.
graphicsmagick-stream: DISCONTINUED. Fast conversion/scaling of images using a pool of long-lived GraphicsMagick processes.

README
through2-concurrent
A simple way to create a Node.js Transform stream which processes chunks in parallel. You can limit the concurrency (the default is 16). Order is not preserved: chunks/objects can end up in a different order from the one they started in if the transform functions take different amounts of time.
Built using through2, it has the same API with the addition of a maxConcurrency option. Non-objectMode streams are supported for completeness, but I'm not sure they'd be useful for anything.
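The maxConcurrency idea can be pictured without streams at all. The sketch below is illustrative only (the names are not the library's internals): at most `limit` async workers run at once, and results are collected in completion order, which is why output order is not guaranteed.

```javascript
// Run `worker` over `items` with at most `limit` workers in flight.
// Results arrive in completion order, not input order.
function mapConcurrent(items, limit, worker, done) {
  var results = [];
  var inFlight = 0;
  var index = 0;

  function launch() {
    // Dispatch new workers until the limit is hit or the input is drained.
    while (inFlight < limit && index < items.length) {
      var item = items[index++];
      inFlight++;
      worker(item, function (result) {
        inFlight--;
        results.push(result); // completion order, not input order
        if (results.length === items.length) {
          done(results);
        } else {
          launch(); // a slot freed up; dispatch the next item
        }
      });
    }
  }
  launch();
}
```

This is the same back-pressure-by-counting pattern the stream applies per chunk: each incoming chunk occupies a slot until its callback fires.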
Written by Thomas Parslow (almostobsolete.net and tomparslow.co.uk) as part of Active Inbox (activeinboxhq.com).
Install
npm install --save through2-concurrent
Examples
Process lines from a CSV in parallel. The order in which results end up in the all variable is not deterministic.
var fs = require('fs');
var csv2 = require('csv2'); // CSV parser stream used by this example
var through2Concurrent = require('through2-concurrent');

var all = [];

fs.createReadStream('data.csv')
  .pipe(csv2())
  .pipe(through2Concurrent.obj(
    {maxConcurrency: 10},
    function (chunk, enc, callback) {
      var self = this;
      someThingAsync(chunk, function (newChunk) {
        self.push(newChunk);
        callback();
      });
    }))
  .on('data', function (data) {
    all.push(data);
  })
  .on('end', function () {
    doSomethingSpecial(all);
  });
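If you do need the original order back after parallel processing, one common pattern (not part of through2-concurrent itself) is to tag each chunk with its input index before transforming, then sort once everything has arrived. A minimal sketch of the sorting step:

```javascript
// Restore input order from results collected in completion order.
// `tagged` is an array of { index, value } pairs, where `index` was
// attached to each chunk before it entered the concurrent transform.
function restoreOrder(tagged) {
  return tagged
    .slice()
    .sort(function (a, b) { return a.index - b.index; })
    .map(function (entry) { return entry.value; });
}

// Results finished in the order 2, 0, 1 -- restored to input order.
var out = restoreOrder([
  { index: 2, value: 'c' },
  { index: 0, value: 'a' },
  { index: 1, value: 'b' }
]);
```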
Contributing
Fixed or improved stuff? Great! Send me a pull request through GitHub or get in touch on Twitter @almostobsolete or email at [email protected]