crawl-fs


Recursively crawls a folder and returns a list of relative filenames. Iteration is flat and does not use recursive function calls.
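The package's source is not reproduced here, but a flat, queue-driven crawl can be sketched roughly like this (illustrative only; crawlFlat and its synchronous fs calls are not part of crawl-fs):

// Illustrative sketch: a flat crawl driven by an explicit queue of pending
// directories, so no recursive function calls are needed.
const fs = require('fs');
const path = require('path');

function crawlFlat(basePath) {
  const pending = ['.'];          // directories still to visit, relative to basePath
  const filenames = [];

  while (pending.length) {
    const dir = pending.shift();

    for (const entry of fs.readdirSync(path.join(basePath, dir))) {
      const relative = path.join(dir, entry);

      if (fs.statSync(path.join(basePath, relative)).isDirectory()) {
        pending.push(relative);   // visit this folder later
      } else {
        filenames.push(relative); // only files end up in the result
      }
    }
  }

  return filenames;
}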

Usage

There are a few ways to use crawl-fs. The following examples assume the folder structure looks like this:

./folder-to-crawl/abc.txt
./folder-to-crawl/def/
./folder-to-crawl/def/ghi.txt
./folder-to-crawl/xyz/

Only files are returned; empty folders are not included in the result. Filenames are always relative to the base path specified in the function call, which is itself relative to process.cwd().
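In other words, the crawl starts at the base path resolved against the current working directory, and each returned filename can be joined back onto that base. A small illustration using Node's path module:

const path = require('path');

// 'folder-to-crawl/' is resolved against the current working directory,
// and every returned filename is relative to that resolved base.
const base = path.resolve(process.cwd(), 'folder-to-crawl/');
const fullPath = path.join(base, 'def/ghi.txt'); // absolute path of a returned entry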

Default: returns an array

const assert = require('assert');

require('crawl-fs')('folder-to-crawl/', (err, filenames) => {
  // filenames are relative to 'folder-to-crawl/'
  assert.deepEqual(
    filenames,
    [
      'abc.txt',
      'def/ghi.txt'
    ]
  );
});

Returns a Promise with an iterator

Where applicable, this approach is preferred because filenames are delivered progressively and it requires less memory.

const assert = require('assert');

let numCalled = 0;

require('crawl-fs').withIterator('folder-to-crawl/', filename => {
  // First iteration
  numCalled === 0 && assert.equal(filename, 'abc.txt');

  // Second iteration
  numCalled === 1 && assert.equal(filename, 'def/ghi.txt');

  numCalled++;
}).then(() => {
  // Completion: the Promise resolves after every file has been visited
  assert.equal(numCalled, 2);
});

Design considerations

If you think these design decisions are not sound, please challenge us.

  • We did not use Promise progression; instead, we prefer an iterator pattern with Promise resolution as the finisher (see the sketch after this list)
  • We did not implement an ES6 generator
    • Our iteration is asynchronous, and a synchronous ES6 generator has no real benefit over an array in terms of time/space complexity
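To make the first point concrete, the pattern can be sketched roughly as follows (illustrative only; this is not crawl-fs's actual implementation, and it uses the fs.promises API purely for brevity):

// Illustrative sketch of the pattern: the iterator callback is invoked once per
// file, and a single Promise resolves only after the whole tree has been walked.
const fs = require('fs').promises;
const path = require('path');

async function withIterator(basePath, iterator) {
  const pending = ['.'];

  while (pending.length) {
    const dir = pending.shift();

    for (const entry of await fs.readdir(path.join(basePath, dir))) {
      const relative = path.join(dir, entry);
      const stats = await fs.stat(path.join(basePath, relative));

      if (stats.isDirectory()) {
        pending.push(relative);
      } else {
        iterator(relative);  // progressive: the caller sees each file as it is found
      }
    }
  }
  // The returned Promise acts as the finisher; it resolves with no value.
}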

Changelog

  • 0.0.1 (2016-02-13) - Initial commit

Wishlist

  • When async generators become official in a future ECMAScript release, we should add a new .withAsyncGenerator() function (a hypothetical usage sketch follows)
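If that were added, usage might look something like this (hypothetical API; .withAsyncGenerator() does not exist yet):

// Hypothetical usage of a future .withAsyncGenerator() (not implemented yet):
// each filename would be consumed with for await...of instead of a callback.
async function listFiles() {
  const filenames = [];

  for await (const filename of require('crawl-fs').withAsyncGenerator('folder-to-crawl/')) {
    filenames.push(filename);
  }

  return filenames; // ['abc.txt', 'def/ghi.txt'] for the example folder above
}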

Contribution

Want to contribute? There are a few easy ways:
