transform streamed glsl tokens into an ast
a through stream that takes tokens from glsl-tokenizer and turns them into an AST.
```javascript
var tokenizer = require('glsl-tokenizer')
  , fs = require('fs')
  , parser = require('./index')

fs.createReadStream('test.glsl')
  .pipe(tokenizer())
  .pipe(parser())
  .on('data', function(x) {
    console.log('ast of', x.type)
  })
```
similar to JSONStream, you may pass selectors
into the constructor to match only AST elements at that level. viable selectors are strings
and regexen, and they'll be matched against the emitted node's `type`.
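to illustrate what that matching amounts to, here's a self-contained sketch -- `matches` is a hypothetical helper, not the parser's actual API, but it shows how string and regex selectors would be checked against a node's `type`:

```javascript
// strings match a node type exactly; regexen match by pattern.
// `matches` is an illustrative helper, not part of glsl-parser.
function matches(selectors, nodeType) {
  return selectors.some(function(sel) {
    return sel instanceof RegExp ? sel.test(nodeType) : sel === nodeType
  })
}

console.log(matches(['function_declaration'], 'function_declaration')) // true
console.log(matches([/^decl/], 'declarator'))                          // true
console.log(matches(['stmt'], 'stmtlist'))                             // false
```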
because i am not smart enough to write a fully streaming parser, the current parser "cheats" a bit when it encounters an `expr` node: it actually waits until it has all the tokens it needs to build a tree for a given expression, then builds it and emits the constituent child nodes in the expected order. the `expr` parsing is heavily influenced by crockford's tdop article. the rest of the parser is heavily influenced by fever dreams.
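the core of the tdop (pratt) technique can be sketched in a few lines: each operator gets a binding power, and expression parsing recurses while the next operator binds tighter. the names and token shapes here are illustrative only, not glsl-parser's internals:

```javascript
// minimal tdop sketch: numbers and the binary operators + and *.
// bp maps each operator to its binding power.
function parse(tokens) {
  var pos = 0
  var bp = { '+': 10, '*': 20 }

  function expr(rbp) {
    var left = tokens[pos++]          // assume a number token here
    while (pos < tokens.length && bp[tokens[pos]] > rbp) {
      var op = tokens[pos++]
      left = [op, left, expr(bp[op])] // right side parsed at op's power
    }
    return left
  }

  return expr(0)
}

// '*' binds tighter than '+', so 1 + 2 * 3 nests as ["+",1,["*",2,3]]
console.log(JSON.stringify(parse([1, '+', 2, '*', 3])))
```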
the parser might hit a state where it's looking at something that could be either an expression or a declaration -- that is, the statement starts with the name of a previously declared struct. it'll opt to pretend it's a declaration, but that might not be the case -- it might be a user-defined constructor call starting a statement!
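a hypothetical example of the ambiguity: with a struct `Foo` in scope, both of these statements start with the same tokens, and only what comes later disambiguates them:

```glsl
struct Foo { float x; };

void main() {
  Foo f = Foo(1.0);  // `Foo` starts a declaration here...
  Foo(2.0);          // ...but here it starts a constructor expression statement
}
```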
`#ifdef` / `#endif` macros are completely unhandled at the moment, since they're a bit of a pain. if you've got unhygienic macros in your code, move the `#if` / `#endif`s to statement level, and have them surround wholly parseable code. this sucks, and i am sorry.
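concretely, here's a hypothetical example of the difference -- a macro that surrounds a statement fragment leaves neither branch parseable on its own, while one that surrounds whole statements works:

```glsl
// problematic: the #ifdef splits a single statement
float x =
#ifdef FAST
  cheap();
#else
  expensive();
#endif

// workable: each branch wraps a wholly parseable statement
#ifdef FAST
float x = cheap();
#else
float x = expensive();
#endif
```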