csv-parser

Streaming CSV parser that aims for maximum speed as well as compatibility with the csv-spectrum CSV acid test suite

npm install csv-parser

csv-parser can convert CSV into JSON at a rate of around 90,000 rows per second (performance varies with the data; try bench.js with your own data).

Simply instantiate csv, pipe a CSV file into it, and get the rows out as objects.

You can use csv-parser in the browser with browserify.

var csv = require('csv-parser')
var fs = require('fs')
 
fs.createReadStream('some-csv-file.csv')
  .pipe(csv())
  .on('data', function(data) {
    console.log('row', data)
  })

The data emitted for each row is a normalized JSON object, keyed by the column headers.
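
For example, given a hypothetical some-csv-file.csv whose first line is NAME,AGE followed by data rows (the file name, columns, and values here are made up for illustration), each row comes out as an object keyed by those headers:

var csv = require('csv-parser')
var fs = require('fs')
 
// some-csv-file.csv (hypothetical contents):
//   NAME,AGE
//   Daffy Duck,24
fs.createReadStream('some-csv-file.csv')
  .pipe(csv())
  .on('data', function(data) {
    console.log(data) // e.g. { NAME: 'Daffy Duck', AGE: '24' }, one property per column
  })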

The csv constructor also accepts the following options:

var stream = csv({
  raw: false,     // set to true to skip decoding cells into utf-8 strings
  separator: ',', // specify an optional cell separator
  newline: '\n'   // specify a newline character
})
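
As a sketch of the separator option, a tab-separated file could be parsed like this (the file name is hypothetical):

var csv = require('csv-parser')
var fs = require('fs')
 
fs.createReadStream('some-tsv-file.tsv')  // hypothetical tab-separated input
  .pipe(csv({ separator: '\t' }))         // split cells on tabs instead of commas
  .on('data', function(data) {
    console.log('row', data)
  })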

The constructor also accepts an array that specifies the headers for the returned objects:

var stream = csv(['index', 'message'])
 
// Source from somewhere with the format 12312,Hello World 
origin.pipe(stream)
  .on('data', function(data) {
    console.log(data) // Should output { "index": 12312, "message": "Hello World" } 
  })

The headers can also be passed in the options object:

var stream = csv({
  raw: false,     // set to true to skip decoding cells into utf-8 strings
  separator: ',', // specify an optional cell separator
  newline: '\n',  // specify a newline character
  headers: ['index', 'message'] // specify the headers explicitly
})

If you do not specify the headers, csv-parser will take the first line of the CSV and treat it as the headers.

There is also a command-line tool available. It converts CSV to line-delimited JSON.

npm install -g csv-parser

Open a shell and run

$ csv-parser --help # prints all options
$ printf "a,b\nc,d\n" | csv-parser # parses input
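
Assuming the first line a,b is treated as the header row, the second command should print one line of JSON per data row, along the lines of:

{"a":"c","b":"d"}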

MIT