🚀 A high-performance, drop-in replacement for csv-parser, powered by Rust

A streaming CSV parser that aims for maximum speed while maintaining 100% compatibility with the original csv-parser API. Built with Rust and N-API for blazing-fast performance on large datasets.

`fast-csv-parser` is a drop-in replacement: simply change your import and enjoy better performance with zero code changes required.
⚡️ Up to 1.3x faster on large datasets (7K+ rows)
🔧 100% API compatible with the original csv-parser
🦀 Rust-powered native performance
🌐 Cross-platform support (Windows, macOS, Linux, ARM)
📦 Zero dependencies in production
🔤 UTF-16 support - handles UTF-16 LE/BE with BOM detection
Performance characteristics vary by file size. Here are real benchmark results:
| Dataset | Rows | Size | csv-parser | fast-csv-parser | Speedup |
|---|---|---|---|---|---|
| large-dataset.csv | 7,268 | 1.1MB | 59ms | 47ms | 🚀 1.26x faster |
| option-maxRowBytes.csv | 4,577 | 700KB | 36ms | 29ms | 🚀 1.24x faster |
Small files show the Node.js ↔ Rust boundary overhead:
- Files <100 rows: ~0.6x performance (0.1ms → 0.2ms)
- The overhead is constant (~0.1ms) regardless of file size
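To reproduce these numbers on your own files, here is a minimal timing sketch. It assumes both `csv-parser` and `fast-csv-parser` are installed, and `data.csv` is a placeholder path:

```js
// Minimal timing sketch - assumes both packages are installed and
// 'data.csv' is replaced with a real file
const fs = require('fs')

function timeParse(parserName, path) {
  const csv = require(parserName)
  const start = process.hrtime.bigint()
  return new Promise((resolve, reject) => {
    fs.createReadStream(path)
      .pipe(csv())
      .on('data', () => {}) // drain rows; we only care about elapsed time
      .on('end', () => resolve(Number(process.hrtime.bigint() - start) / 1e6))
      .on('error', reject)
  })
}

async function main() {
  for (const name of ['csv-parser', 'fast-csv-parser']) {
    console.log(`${name}: ${(await timeParse(name, 'data.csv')).toFixed(2)}ms`)
  }
}

main()
```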
```
Performance Ratio (higher = better)

  2x ┤
     │                  ╭─ Peak performance zone
1.5x ┤                 ╱
     │                ╱
  1x ┼──────────────╱──────── Break-even point (~1KB)
     │
0.5x ┤  Overhead zone
     └─────────────────────────────────────
       0KB    1KB    10KB    100KB    1MB+
                   File Size
```
💡 Recommendation: Use `fast-csv-parser` for files >10KB or high-throughput scenarios.
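If a codebase handles both tiny and large files, one illustrative option (not a library feature; assumes both packages are installed) is to pick the parser by file size:

```js
// Illustrative helper (not part of either library): route by file size
const fs = require('fs')

function pickParser(path) {
  const THRESHOLD = 10 * 1024 // ~10KB, the recommendation above
  return fs.statSync(path).size > THRESHOLD
    ? require('fast-csv-parser')
    : require('csv-parser')
}

// The two modules share the same API, so everything downstream is identical
const csv = pickParser('data.csv')
```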
Using npm:

```sh
npm install fast-csv-parser
```

Using yarn:

```sh
yarn add fast-csv-parser
```

Using pnpm:

```sh
pnpm add fast-csv-parser
```
Simply replace your csv-parser import:
```js
// Before
// const csv = require('csv-parser')

// After - that's it!
const csv = require('fast-csv-parser')

// Your existing code works unchanged
const fs = require('fs')
const results = []

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results)
    // [
    //   { NAME: 'Daffy Duck', AGE: '24' },
    //   { NAME: 'Bugs Bunny', AGE: '22' }
    // ]
  })
```
Using custom options:

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.csv')
  .pipe(csv({
    headers: ['name', 'age', 'city'],
    skipLines: 1,
    mapHeaders: ({ header }) => header.toUpperCase(),
    mapValues: ({ header, value }) => {
      // mapValues receives the header as transformed by mapHeaders
      if (header === 'AGE') return parseInt(value, 10)
      return value.trim()
    }
  }))
  .on('data', (row) => console.log(row))
```
Parsing TSV files:

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.tsv')
  .pipe(csv({ separator: '\t' }))
  .on('data', (row) => console.log(row))
```
Strict mode with event handling:

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.csv')
  .pipe(csv({ strict: true }))
  .on('data', (row) => console.log(row))
  .on('headers', (headers) => console.log('Headers:', headers))
  .on('error', (err) => {
    console.error('Parse error:', err.message)
  })
  .on('end', () => console.log('Parsing complete'))
```
Processing rows in a stream pipeline:

```js
const csv = require('fast-csv-parser')
const fs = require('fs')
const { Transform } = require('stream')

const processor = new Transform({
  writableObjectMode: true, // accepts parsed row objects, emits NDJSON text
  transform(row, encoding, callback) {
    // Process each row, then serialize it so it can be written to a file
    row.processed_at = new Date().toISOString()
    callback(null, JSON.stringify(row) + '\n')
  }
})

fs.createReadStream('input.csv')
  .pipe(csv())
  .pipe(processor)
  .pipe(fs.createWriteStream('output.json'))
```
`fast-csv-parser` implements the exact same API as the original csv-parser. All options, events, and behaviors are identical.
Returns: Transform Stream
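Because the return value is a standard `Transform` stream, you can also write CSV text to it directly instead of piping from a file:

```js
const csv = require('fast-csv-parser')

const parser = csv()
parser.on('data', (row) => console.log(row)) // { NAME: 'Daffy Duck', AGE: '24' }

// Feed the stream directly - no file required
parser.write('NAME,AGE\n')
parser.write('Daffy Duck,24\n')
parser.end()
```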
All original csv-parser options are supported:
- `separator` (String, default: `,`) - Column separator
- `quote` (String, default: `"`) - Quote character
- `escape` (String, default: `"`) - Escape character
- `newline` (String, default: `\n`) - Line ending
- `headers` (Array | Boolean) - Custom headers, or disable header parsing
- `mapHeaders` (Function) - Transform header names
- `mapValues` (Function) - Transform cell values
- `skipLines` (Number, default: `0`) - Skip initial lines
- `skipComments` (Boolean | String, default: `false`) - Skip comment lines
- `maxRowBytes` (Number) - Maximum bytes per row
- `strict` (Boolean, default: `false`) - Strict column count validation
- `raw` (Boolean, default: `false`) - Disable UTF-8 decoding
Example combining several options:

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.tsv')
  .pipe(csv({
    separator: '\t',
    mapHeaders: ({ header }) => header.toLowerCase(),
    mapValues: ({ value }) => value.trim()
  }))
  .on('data', (row) => console.log(row))
```
- `data` - Emitted for each parsed row (excluding headers).
- `headers` - Emitted after the header row is parsed, with an `Array<string>` of header names.
- `end` - Emitted when parsing is complete.
- `error` - Emitted on parsing errors.
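These events compose naturally into a Promise-based helper. `parseFile` below is a sketch, not part of the library:

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

// Sketch of a helper (not part of the library) built on the events above
function parseFile(path, options = {}) {
  return new Promise((resolve, reject) => {
    const rows = []
    fs.createReadStream(path)
      .pipe(csv(options))
      .on('data', (row) => rows.push(row)) // one event per parsed row
      .on('end', () => resolve(rows))      // parsing complete
      .on('error', reject)                 // parse or stream error
  })
}

// parseFile('data.csv').then((rows) => console.log(`${rows.length} rows`))
```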
- Use on large files - the performance benefits are most apparent with files >1MB
- Enable `raw: true` for maximum speed if you don't need UTF-8 processing (see the sketch below)
- Avoid complex `mapValues` functions - they can negate performance gains
- Set an appropriate `maxRowBytes` to avoid memory issues with malformed data
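A sketch combining the `raw: true` and `maxRowBytes` tips above; the file name and size limit are placeholder values:

```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('huge.csv') // placeholder file name
  .pipe(csv({
    raw: true,               // skip UTF-8 decoding for maximum speed
    maxRowBytes: 1024 * 1024 // cap rows at ~1MB (limit is an example value)
  }))
  .on('data', (row) => {
    // with raw: true, cell values arrive undecoded - convert only what you need
  })
  .on('error', (err) => console.error(err.message))
```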
Pre-built binaries are available for:
- macOS: x64, ARM64 (Apple Silicon)
- Linux: x64, ARM64, ARMv7 (GNU & musl)
- Windows: x64, x86, ARM64
- FreeBSD: x64
- Android: ARM64, ARMv7
Migration is seamless:
```js
// Old code
// const csvParser = require('csv-parser')

// New code - just change the import!
const csvParser = require('fast-csv-parser')

// Everything else stays the same
```
`fast-csv-parser` automatically detects and handles multiple text encodings:
- UTF-8 (default) - with automatic BOM stripping
- UTF-16 LE (Little Endian) - with BOM detection
- UTF-16 BE (Big Endian) - with BOM detection
No configuration needed - encoding is detected automatically from Byte Order Marks (BOM):
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

// Works with UTF-8, UTF-16 LE, and UTF-16 BE files automatically
fs.createReadStream('data-utf16.csv') // UTF-16 file
  .pipe(csv())
  .on('data', (row) => console.log(row))
```
| Encoding | BOM | Detection | Status |
|---|---|---|---|
| UTF-8 | `EF BB BF` | Auto-detected, BOM stripped | ✅ Supported |
| UTF-16 LE | `FF FE` | Auto-detected | ✅ Supported |
| UTF-16 BE | `FE FF` | Auto-detected | ✅ Supported |
| ASCII | None | Treated as UTF-8 | ✅ Supported |
| Other | - | Not supported | ❌ |
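For illustration, BOM sniffing boils down to checking the first bytes of the input. The library does this internally, so you never need code like this yourself:

```js
// Conceptual sketch of BOM sniffing - handled internally by the library
function sniffEncoding(buf) {
  if (buf.length >= 3 && buf[0] === 0xEF && buf[1] === 0xBB && buf[2] === 0xBF) {
    return 'UTF-8 (BOM stripped)'
  }
  if (buf.length >= 2 && buf[0] === 0xFF && buf[1] === 0xFE) return 'UTF-16 LE'
  if (buf.length >= 2 && buf[0] === 0xFE && buf[1] === 0xFF) return 'UTF-16 BE'
  return 'UTF-8 (assumed - no BOM present)'
}

console.log(sniffEncoding(Buffer.from([0xFF, 0xFE, 0x68, 0x00]))) // UTF-16 LE
```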
Run benchmarks yourself:
```sh
# Clone the repository
git clone https://github.com/jonaylor89/fast-csv-parser
cd fast-csv-parser

# Install dependencies
npm install

# Run performance comparison against original csv-parser
./benchmark-comparison.js

# Run individual benchmarks
npm run bench
```
Sample output:
```
🏁 CSV Parser Performance Comparison
Comparing Rust implementation vs Original JavaScript csv-parser

📊 PERFORMANCE COMPARISON
========================================
File                      Original    Rust      Speedup
large-dataset.csv         59ms        47ms      1.26x ⚡
option-maxRowBytes.csv    36ms        29ms      1.24x ⚡
basic.csv                 0.098ms     0.43ms    0.23x
```
Errors are handled identically to the original csv-parser:
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('error', (err) => {
    if (err instanceof RangeError) {
      console.log('Row length mismatch')
    } else {
      console.log('Parse error:', err.message)
    }
  })
```
Full TypeScript definitions are included:
```ts
import fs from 'fs'
import csv from 'fast-csv-parser'

interface Row {
  name: string
  age: number
}

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (row: Row) => {
    console.log(row.name, row.age)
  })
```
The CLI is fully compatible with csv-parser:
```sh
# Parse CSV to NDJSON
fast-csv-parser data.csv

# Parse TSV
fast-csv-parser -s $'\t' data.tsv

# Custom options
fast-csv-parser --separator=';' --quote='"' data.csv
```
For most cases, encoding is handled automatically. For advanced scenarios:
```js
const csv = require('fast-csv-parser')
const fs = require('fs')

// Automatic encoding detection (recommended)
fs.createReadStream('data.csv') // Any supported encoding
  .pipe(csv())
  .on('data', (row) => console.log(row))

// For unsupported encodings, preprocess with iconv-lite
const iconv = require('iconv-lite')

fs.createReadStream('latin1-data.csv')
  .pipe(iconv.decodeStream('latin1'))
  .pipe(csv())
  .on('data', (row) => console.log(row))
```
Use fast-csv-parser when:
- Processing large CSV files (>1MB)
- Performance is critical
- You want a drop-in replacement for csv-parser
- Working with data pipelines or ETL processes
Stick with csv-parser when:
- Processing many small files (<1KB each) where startup overhead matters
- Bundle size is extremely critical (csv-parser is pure JS)
- You need cutting-edge features that haven't been ported yet
Q: Is it really a drop-in replacement?

A: Yes! Just change your import from `csv-parser` to `fast-csv-parser`. All options, events, and behaviors are identical.
Q: When should I use `fast-csv-parser`?

A: Use `fast-csv-parser` when:

- Processing files larger than 10KB
- Performance is critical (ETL pipelines, data processing)
- You're already using csv-parser and want better performance
Q: Are there any breaking changes?

A: No breaking changes. The API is 100% compatible. UTF-16 files are now properly supported with automatic encoding detection.

Q: Why is it slower on tiny files?

A: There's a ~0.1ms overhead from the Node.js ↔ Rust boundary. For tiny files, this overhead exceeds the parsing time. The crossover point is around 1KB.

Q: Does it work in the browser?

A: Not currently. This is a native Node.js addon. For browser use, stick with the original csv-parser.

Q: How stable is it?

A: Very stable. It passes all csv-parser tests and maintains the same error handling. The Rust core uses battle-tested CSV parsing libraries.
Requirements:
- Node.js 16+
- Rust 1.60+
- Supported platform (see Platform Support above)
```sh
# Clone the repository
git clone https://github.com/jonaylor89/fast-csv-parser
cd fast-csv-parser

# Install dependencies
npm install

# Build the native addon
npm run build

# Run tests
npm test

# Run benchmarks
npm run bench
```
```
fast-csv-parser/
├── src/          # Rust source code
│   ├── lib.rs    # N-API bindings
│   └── parser.rs # Core CSV parsing logic
├── __test__/     # Test files and fixtures
├── examples/     # Usage examples
├── bin/          # CLI tools
├── main.js       # Main entry point with Stream API
└── index.js      # Auto-generated native binding loader (build-safe)
```
- Rust Core (`src/parser.rs`) - High-performance CSV parsing
- N-API Bridge (`src/lib.rs`) - Node.js ↔ Rust interface
- Native Loader (`index.js`) - Auto-generated cross-platform binary loading (see the sketch below)
- Main Entry (`main.js`) - Stream API compatibility layer (build-safe)
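For orientation, here is a simplified, hypothetical version of what a napi-rs-generated loader does. The real `index.js` is auto-generated and more thorough, and the fallback package name below is purely illustrative:

```js
// Hypothetical, simplified loader - the real index.js is generated by napi-rs
const { platform, arch } = process

let binding
try {
  // Prefer a locally built binary (e.g. after `npm run build`)
  binding = require(`./fast-csv-parser.${platform}-${arch}.node`)
} catch (err) {
  // Fall back to a prebuilt per-platform package (name is made up here)
  binding = require(`fast-csv-parser-${platform}-${arch}`)
}

module.exports = binding
```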
Contributions welcome! This project maintains:
- 100% API compatibility with csv-parser
- Comprehensive test coverage (23+ test files)
- Performance benchmarks
- Cross-platform CI/CD
- Rust best practices
```sh
# Run all tests
npm test

# Run specific test
npm test headers.test.mjs

# Run benchmarks
./benchmark-comparison.js
```
MIT License - same as the original csv-parser.
Built on the shoulders of giants:
- csv-parser by @mafintosh - The original, excellent CSV parser
- napi-rs - Rust N-API framework for Node.js addons
Special thanks to the csv-parser community for creating such a robust and well-designed API that made this drop-in replacement possible.
🚀 Ready to speed up your CSV processing?
`npm install fast-csv-parser`
Just replace your import and enjoy the performance boost!