...

To use a grammar T.g:

Code Block
import antlr3
from TLexer import TLexer
from TParser import TParser

input = '...what you want to feed into the parser...'
char_stream = antlr3.ANTLRStringStream(input)
# or to parse a file:
# char_stream = antlr3.ANTLRFileStream(path_to_input)
# or to parse an opened file or any other file-like object:
# char_stream = antlr3.ANTLRInputStream(file)

lexer = TLexer(char_stream)
tokens = antlr3.CommonTokenStream(lexer)
parser = TParser(tokens)
parser.entry_rule()

If you want to access the token types in your code, you'll have to import the lexer or parser module and access them there (in Java they are members of the lexer/parser classes; in Python they are defined at module level):

...

e.g. TLexer.EOF, TLexer.IDENTIFIER):
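
For illustration, here is a minimal sketch of working with those module-level constants. IDENTIFIER is a hypothetical token name used only in this example; substitute whatever token types your grammar actually defines:

Code Block
import antlr3
import TLexer   # import the generated module itself; the token type constants live at its top level

char_stream = antlr3.ANTLRStringStream('foo bar baz')
lexer = TLexer.TLexer(char_stream)

# Token types are plain integers, so they can be compared directly
# against each token's type.
identifiers = []
token = lexer.nextToken()
while token.type != TLexer.EOF:
    if token.type == TLexer.IDENTIFIER:
        identifiers.append(token.text)
    token = lexer.nextToken()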

Using tree parsers

For grammars T.g (parser and lexer) and TWalker.g (the tree parser):

...
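
A minimal sketch of the usual ANTLR 3 driver for this setup, assuming T.g is built with output=AST and that entry_rule stands in for the actual start rules of TParser and TWalker:

Code Block
import antlr3
import antlr3.tree
from TLexer import TLexer
from TParser import TParser
from TWalker import TWalker

char_stream = antlr3.ANTLRStringStream('...what you want to feed into the parser...')
lexer = TLexer(char_stream)
tokens = antlr3.CommonTokenStream(lexer)
parser = TParser(tokens)
r = parser.entry_rule()

# With output=AST, the rule return value carries the tree in its .tree attribute.
nodes = antlr3.tree.CommonTreeNodeStream(r.tree)
nodes.setTokenStream(tokens)

walker = TWalker(nodes)
walker.entry_rule()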