# peek-stream alternatives and similar modules

Based on the "Streams" category. Alternatively, view peek-stream alternatives based on common mentions on social networks and blogs.
- Highland.js: High-level streams library for Node.js and the browser
- through2: Tiny wrapper around Node streams2 Transform to avoid explicit subclassing noise
- concat-stream: Writable stream that concatenates strings or data and calls a callback with the result
- get-stream: Get a stream as a string, Buffer, ArrayBuffer or array
- multistream: A stream that emits multiple other streams one after another (streams3)
- scramjet: Public tracker for Scramjet Cloud Platform, a platform that brings data from many environments together
- pumpify: Combine an array of streams into a single duplex stream using pump and duplexify
- duplexify: Turn a writable and readable stream into a streams2 duplex stream with support for async initialization and streams1/streams2 input
- into-stream: Convert a string/promise/array/iterable/asynciterable/buffer/typedarray/arraybuffer/object into a stream
- from2: Convenience wrapper for ReadableStream, with an API lifted from "from" and "through2"
- binary-split: A fast newline (or any delimiter) splitter stream, like require('split') but specific for binary data
- through2-concurrent: Simple Node.js stream (streams2) Transform that runs the transform functions concurrently (with a set max concurrency)
- graphicsmagick-stream: Fast conversion/scaling of images using a pool of long-lived GraphicsMagick processes
- first-chunk-stream: Transform the first chunk in a stream
README
# peek-stream

Transform stream that lets you peek the first line before deciding how to parse it.

```sh
npm install peek-stream
```

## Usage
```js
var peek = require('peek-stream')
var ldjson = require('ldjson-stream')
var csv = require('csv-parser')

var isCSV = function (data) {
  return data.toString().indexOf(',') > -1
}

var isJSON = function (data) {
  try {
    JSON.parse(data)
    return true
  } catch (err) {
    return false
  }
}

// call parser to create a new parser
var parser = function () {
  return peek(function (data, swap) {
    // maybe it is JSON?
    if (isJSON(data)) return swap(null, ldjson())

    // maybe it is CSV?
    if (isCSV(data)) return swap(null, csv())

    // we do not know - bail
    swap(new Error('No parser available'))
  })
}
```
The above parser will be able to parse both line-delimited JSON and CSV:
```js
var parse = parser()

parse.write('{"hello":"world"}\n{"hello":"another"}\n')
parse.on('data', function (data) {
  console.log(data) // prints {hello:'world'} and {hello:'another'}
})
```
Or
```js
var parse = parser()

parse.write('test,header\nvalue-1,value-2\n')
parse.on('data', function (data) {
  console.log(data) // prints {test:'value-1', header:'value-2'}
})
```
By default `data` is the first line (or the first 65535 bytes if no newline is found).
To change the max buffer, pass an options map to the constructor:

```js
var parse = peek({
  maxBuffer: 10000
}, function (data, swap) {
  ...
})
```
If you want to emit an error when no newline is found, set `strict: true` as well.
## License

MIT