Lex Luthor
This package is a lexical scanner written in JavaScript, based on a talk by Rob Pike. In the video he describes a method that uses state functions to tokenize input and Go channels to emit the tokens. In Node, the same thing can be achieved with streams and event emitters.
Example
This basic example finds comments in the multi-line format (`/* ... */`). The general idea is that each state function returns the next state function, or null at the end of the input.
```javascript
// Register the default state function
// Register the inside-comment state function

// Create the lexer and give it the input file
var lex = /* ... */;

// Create an array to store the tokens.
// In real life you would probably have a parser listen for this.
var tokens = [];

// Run the lexer
```
Tests
There are unit tests and end-to-end tests. To run both you can use `grunt test`, or you can run them individually with `npm run-script test-unit` and `npm run-script test-e2e`.
To generate the coverage report you can run `grunt test-coverage`. This will run both the unit and end-to-end tests and generate a report for each in the `coverage` directory. This task also starts a server where you can view the HTML output of the report at: