# JSON Mangler
JSON Mangler is a compressor with name mangling for JSON objects. Given a schema, the compressor generates key mappings for the provided objects and compresses them, making the objects smaller in size (and, as a side effect, harder to read).
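To show what mangling does before diving into the API, here is a minimal, library-independent sketch. The short keys are chosen by hand for illustration; the library derives them from a schema:

```javascript
// Conceptual illustration only: replace verbose keys with one-letter keys.
const original = { name: 'Jack', age: 43, occupation: 'Frontend Developer' };
const mangled = { a: original.name, b: original.age, c: original.occupation };

// The mangled form serializes to fewer bytes than the original,
// because the long key names are gone from every object instance.
console.log(JSON.stringify(original).length); // longer
console.log(JSON.stringify(mangled).length);  // shorter
```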
## API
Compression:
```javascript
const Compressor = require('json-mangler').Compressor;

const data = [
  { name: 'Jack', age: 43, occupation: 'Frontend Developer', employed: true },
  { name: 'Steve', age: 28, occupation: 'Backend Developer', employed: true },
  { name: 'Bob', age: 23, occupation: 'Frontend Developer', employed: false }
];

// Create compressor with schema
const compressor = new Compressor({
  name: true,
  age: true,
  occupation: true,
  employed: true
});

// Compress data with mutation and compression info
const result = compressor.compress(data, true, true);

console.log(result.compressed); // [{ a: 'Jack', b: 43, c: 'Frontend Developer', d: true }, ...]
console.log(result.mappings); // a:*.name;b:*.age;c:*.occupation;d:*.employed; (Use for decompressing this object)
console.log(result.info); // { originalSize: 226, newSize: 163, diff: 63, time: 0 }
```
Decompression:
```javascript
const Decompressor = require('json-mangler').Decompressor;

// Compressed object and mappings can be obtained from the compressor results
const compressed = [
  { a: 'Jack', b: 43, c: 'Frontend Developer', d: true },
  { a: 'Steve', b: 28, c: 'Backend Developer', d: true },
  { a: 'Bob', b: 23, c: 'Frontend Developer', d: false }
];
const mappings = 'a:*.name;b:*.age;c:*.occupation;d:*.employed;';

// Decompress a compressed object using its mappings
const decompressor = new Decompressor(mappings);

// Decompress data with mutation and decompression info
const results = decompressor.decompress(compressed, true, true);

console.log(results.decompressed); // The original object
console.log(results.info); // { originalSize: 163, newSize: 226, diff: -63, time: 0 }
```
## Compressor
CommonJS:
```javascript
const Compressor = require('json-mangler').Compressor;
```
ES6:
```javascript
import { Compressor } from 'json-mangler';
```
A constructor that takes a schema for the objects. Any properties included in the schema will be considered in the compression. The constructed object has the following members:
- `mappings`: A read-only string which contains the mappings generated by the provided schema.
- `compress(json, noClone, calcInfo)`: A method which takes an object to compress, with two flags:
  - `noClone` (default: `false`): If `true`, compression will avoid cloning the object, which makes the compression faster but mutates the original object by reference.
  - `calcInfo` (default: `false`): If `true`, the compression information will be calculated and made available on the returned object under the key `info`.
The compression result will contain the following properties:
- `compressed`: The compressed object.
- `mappings`: The mappings generated for the object.
- `info`: The compression info (only if the `calcInfo` flag was `true`).
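The mappings string produced by the compressor reads as semicolon-separated `shortKey:*.originalKey` pairs. As a rough illustration of the idea, this hypothetical sketch (not the library's internals) parses such a string and applies it to an array of objects:

```javascript
// Hypothetical sketch: parse a mappings string like the one the compressor
// emits into a rename table, then apply it to an array of flat objects.
function parseMappings(mappings) {
  // 'a:*.name;b:*.age;' -> { name: 'a', age: 'b' }
  const table = {};
  for (const pair of mappings.split(';').filter(Boolean)) {
    const [short, path] = pair.split(':');
    table[path.replace('*.', '')] = short;
  }
  return table;
}

function mangle(objects, table) {
  return objects.map(obj => {
    const out = {};
    for (const [key, value] of Object.entries(obj)) {
      out[table[key] ?? key] = value; // keep keys missing from the schema
    }
    return out;
  });
}

const table = parseMappings('a:*.name;b:*.age;');
console.log(mangle([{ name: 'Jack', age: 43 }], table));
// [ { a: 'Jack', b: 43 } ]
```

Note that this sketch only handles flat objects under the `*.` wildcard path; the actual library works from its schema and handles cloning and info calculation as described above.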
## Decompressor
CommonJS:
```javascript
const Decompressor = require('json-mangler').Decompressor;
```
ES6:
```javascript
import { Decompressor } from 'json-mangler';
```
A constructor that takes a mappings string. The constructed object has the following members:
- `mappings`: A read-only string which contains the mappings provided to the constructor.
- `decompress(json, noClone, calcInfo)`: A method which takes a compressed object to decompress, with two flags:
  - `noClone` (default: `false`): If `true`, decompression will avoid cloning the object, which makes the decompression faster but mutates the original object by reference.
  - `calcInfo` (default: `false`): If `true`, the decompression information will be calculated and made available on the returned object under the key `info`.
The decompression result will contain the following properties:
- `decompressed`: The decompressed object.
- `info`: The decompression info (only if the `calcInfo` flag was `true`).
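Conceptually, decompression just inverts the rename table encoded in the mappings string. A hypothetical sketch of that inversion for flat objects (again, not the library's actual implementation):

```javascript
// Hypothetical sketch: reverse a mappings string to restore original keys.
function unmangle(objects, mappings) {
  // Build short -> original lookup: 'a:*.name;' -> { a: 'name' }
  const table = {};
  for (const pair of mappings.split(';').filter(Boolean)) {
    const [short, path] = pair.split(':');
    table[short] = path.replace('*.', '');
  }
  return objects.map(obj => {
    const out = {};
    for (const [key, value] of Object.entries(obj)) {
      out[table[key] ?? key] = value; // keep unmapped keys as-is
    }
    return out;
  });
}

console.log(unmangle([{ a: 'Jack', b: 43 }], 'a:*.name;b:*.age;'));
// [ { name: 'Jack', age: 43 } ]
```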
## Real-Life Application
A good application for this module is when a large number of objects with the same structure is served to clients from a server. Consider the following example:

- We have a service (web application + server) connected to a database of movie information.
- A complete movie object, such as the one for the title "Pulp Fiction", is 1031 bytes in size (with whitespace removed).
- The database contains 500,000 movie titles (approximately 515.5 MB), and the service has on average 200,000 daily users.
- If each user fetches 50 movies per visit (consider a front page showcasing new movies in a list), the server serves ~300 GB of data each month.
- Compressing the database with this module would reduce the storage size to 434.5 MB and the monthly bandwidth to ~262.3 GB.
- This data would be decompressed on the client side before being displayed to users.

Tabular data converted to JSON usually ends up as an array of objects with the same properties; mangling can greatly reduce the total size.
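The figures in the example above can be checked with some quick arithmetic (using decimal MB/GB, as in the quoted numbers):

```javascript
// Back-of-the-envelope arithmetic for the example above.
const bytesPerMovie = 1031;   // "Pulp Fiction" object, whitespace removed
const movies = 500000;
const usersPerDay = 200000;
const moviesPerVisit = 50;
const daysPerMonth = 30;

// Database size: 1031 B * 500,000 titles ~= 515.5 MB
const dbSizeMB = (bytesPerMovie * movies) / 1e6;

// Monthly bandwidth: 1031 B * 50 movies * 200,000 users * 30 days ~= 309 GB
const monthlyGB =
  (bytesPerMovie * moviesPerVisit * usersPerDay * daysPerMonth) / 1e9;

console.log(dbSizeMB.toFixed(1) + ' MB, ~' + monthlyGB.toFixed(1) + ' GB/month');
```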
## Tests

```shell
npm test
```