When you need an array with a few trillion elements
A resizable array using non-sequential block memory allocation. Growing or shrinking the array does not require reallocation of the entire array. Useful when you need to track a few trillion data points.
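To illustrate the idea of non-sequential block allocation (a minimal sketch, not big-array's actual internals): elements live in fixed-size blocks, and growing or shrinking only adds or drops blocks, never copying the existing ones. All names here are illustrative.

```javascript
// Illustrative sketch only -- not big-array's implementation.
// Elements are stored in fixed-size Float64Array blocks; resize()
// adds or drops whole blocks without touching existing data.
function BlockArray(blockSize) {
  this.blockSize = blockSize;
  this.blocks = [];   // each entry is one block of blockSize elements
  this.length = 0;
}
BlockArray.prototype.resize = function (size) {
  var blocksNeeded = Math.ceil(size / this.blockSize);
  while (this.blocks.length < blocksNeeded) {
    this.blocks.push(new Float64Array(this.blockSize)); // grow: append blocks
  }
  this.blocks.length = blocksNeeded;                    // shrink: drop blocks
  this.length = size;
};
BlockArray.prototype.set = function (i, v) {
  this.blocks[Math.floor(i / this.blockSize)][i % this.blockSize] = v;
};
BlockArray.prototype.get = function (i) {
  return this.blocks[Math.floor(i / this.blockSize)][i % this.blockSize];
};
```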
Note: For best results, a 64-bit system and enough RAM to hold your data are recommended. If your data set grows slowly over time, paging to virtual memory may be acceptable.
```bash
npm install big-array
```
Usage: bigArray.Char(numberOfInitialElements, sizeOfElementBlocks)
- numberOfInitialElements: Number of array elements to allocate initially, rounded up to the nearest multiple of sizeOfElementBlocks. Default: 1
- sizeOfElementBlocks: Number of elements to allocate in each memory block. If this number is too small, the per-block overhead drastically increases memory usage for large arrays. Default: 1048576 (1 MB for char)
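The rounding rule above can be sketched as a one-liner (the function name and numbers here are hypothetical, for illustration only): requesting 250 initial elements with blocks of 100 allocates 3 blocks, i.e. capacity 300.

```javascript
// Rounding rule described above: initial capacity is the requested
// element count rounded up to a multiple of the block size.
function roundedCapacity(numberOfInitialElements, sizeOfElementBlocks) {
  return Math.ceil(numberOfInitialElements / sizeOfElementBlocks) * sizeOfElementBlocks;
}
```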
```javascript
var bigArray = require('big-array');
var ba, val;

// Numeric array types
ba = bigArray.Char(10, 100);
// other numeric element types are constructed the same way
```
Tip: When testing on Linux, use "ulimit -v" to limit the maximum memory consumption.
```bash
# Limit memory usage to 10GB
ulimit -v 10485760
```
```javascript
ba.set(index, value);
val = ba.get(index);

// inc/dec by 1
ba.inc(index);
ba.dec(index);

ba.push(value);
val = ba.pop();

// modify push/pop index
ba.setIndex(index);
val = ba.getIndex();

// resize array to [size]
ba.resize(size);
```
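To clarify the push/pop index semantics (a self-contained sketch of the behavior described above, not big-array's code): push writes at the current index and advances it, pop retreats it and reads, and setIndex repositions where the next push or pop happens. The backing store here is a plain array purely for illustration.

```javascript
// Sketch of push/pop-index semantics; method names mirror the API above,
// internals are illustrative only.
function IndexedArray() {
  this.data = [];
  this.index = 0;   // next push writes here; pop reads from index - 1
}
IndexedArray.prototype.push = function (value) { this.data[this.index++] = value; };
IndexedArray.prototype.pop = function () { return this.data[--this.index]; };
IndexedArray.prototype.setIndex = function (i) { this.index = i; };
IndexedArray.prototype.getIndex = function () { return this.index; };
```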