Split a stream into multiple streams by defining a flexible delimiter, or a delimiting function that returns the index of separation.
Each new resulting substream starts when the reading of the previous one is finished.
There are two methods: a stream of streams split by a delimiter, or an explicit function that returns a substream ending at the next delimiter.
$ yarn add split-into-streams
$ npm i split-into-streams
First way: (stream of streams)
```js
const SplitStream = require('split-into-streams');

// note: the constructor call shape shown here is an assumption
const rs = new SplitStream(readableStream, {
  explicitRead: false, // set as non-explicit
  splitAt: '\n' // split at newlines
});
```
Second way: (explicit function)
```js
const SplitStream = require('split-into-streams');

// note: the constructor call shape shown here is an assumption
const rs = new SplitStream(readableStream, {
  explicitRead: true // set as explicit
});

// method name below is an assumption; the original snippet omits it
const stream = await rs.nextStream('\n');
// received stream will end after the next line break
stream.pipe(process.stdout);
```
With this method you can also provide a different delimiter to each next substream.
NOTE: this method will automatically pause the given stream on creation, and resume & pause it when reading each next chunk. This forces the main stream to stay open until everything is read, for example when reading from the stdout of a spawned process.
`explicitRead` — specifies which of the two ways above is used.

`splitAt` — given as a constructor option, or as an argument to the explicit read call.
The delimiter value that separates streams. It can be a string, a regex, an array of numbers, or a function that returns the point of separation.
- when a string, it will separate at the place where the toString() values of the bytes in the buffer match the string.
- when a regex, it will separate at the place where the toString() values of the bytes in the buffer match the regex.
- when an array of numbers, it will separate at the place where the bytes match the values.
- when a function, it will call that function on each chunk of data and expect an index of separation to be returned.
Example: to separate immediately after a line break, you can pass `'\n'`, or provide a function:

```js
// parameter name taken from the original docs; exact index semantics assumed
splitAt: (nextChunkData) => nextChunkData.indexOf('\n')
```

To separate before the delimiter, simply decrease the index by 1 position:

```js
splitAt: (nextChunkData) => nextChunkData.indexOf('\n') - 1
```
To split the next stream by a different delimiter than the first, you can keep a counter inside this function and provide a different implementation on the second call. Return -1 if you don't want to split yet and want to continue passing chunks to the currently read substream.
Sometimes long delimiters can begin at the end of one chunk (as read internally) and end at the start of the next. To handle these, the library doesn't push an entire chunk into the substream right after reading it from the main stream, but rather leaves out some bytes at the end, to be pushed before the next chunk. The length of that retained ending is configurable.
Use this if you are dealing with fairly long delimiters, and set it to the maximum possible length of your delimiter.