This page details the important classes in Lark.
Lark
The Lark class is the main interface for the library. It's mostly a thin wrapper for the many different parsers and other features.
Methods
__init__(self, grammar, **options)
The Lark class accepts a grammar string and keyword options.
- start - The symbol in the grammar that begins the parse (Default: "start")
- parser - Decides which parser engine to use: "earley" or "lalr" (Default: "earley")
- lexer - Decides whether or not to use a lexer stage:
  - lexer="dynamic" - A dynamic lexer, capable of lexing anything (used only for Earley; default for Earley)
  - lexer=None - No lexer. Same effect as "dynamic", but slower (deprecated; used only for Earley)
  - lexer="standard" - A standard lexer. Very fast, but the user is responsible for resolving token collisions (default for LALR)
  - lexer="contextual" - An enhancement over the standard lexer. Can resolve some collisions, but not all (used only for LALR)
- transformer - Applies the transformer to every parse, without building a parse tree (only allowed with parser="lalr")
- postlex - Lexer post-processing (Default: None; only works when lexer is "standard" or "contextual")
- ambiguity - Decides how to handle ambiguous parses (only relevant for Earley):
  - "explicit" - Return all derivations inside an "_ambig" data node
  - "resolve" - Let the parser choose the best derivation (greedy for tokens, non-greedy for rules; default)
- debug - Display warnings (such as Shift-Reduce warnings for LALR)
- keep_all_tokens - Don't throw away any terminals from the tree (Default: False)
- propagate_positions - Propagate line/column counts to tree nodes (experimental; Default: False)
lex(self, text)
Returns the stream of terminals before parsing (only relevant when lexer is "standard" or "contextual")
parse(self, text)
Returns a complete parse tree for the text (of type Tree)
Tree
The main tree class
Properties
- data - The name of the rule or alias
- children - List of matched sub-rules and terminals
- meta - Line & column numbers, if using propagate_positions
Methods
__init__(self, data, children)
Creates a new tree, and stores "data" and "children" in attributes of the same name.
pretty(self, indent_str=' ')
Returns an indented string representation of the tree. Great for debugging.
find_pred(self, pred)
Returns all nodes of the tree that evaluate pred(node) as true.
find_data(self, data)
Returns all nodes of the tree whose data equals the given data.
iter_subtrees(self)
Iterates over all the subtrees, never returning to the same node twice (Lark's parse-tree is actually a DAG)
__eq__, __hash__
Trees can be hashed and compared.
Transformers & Visitors
Transformers & Visitors provide a convenient interface to process the parse-trees that Lark returns.
They are used by inheriting from the correct class (visitor or transformer), and implementing methods corresponding to the rules you wish to process. Each method accepts the children as an argument. This can be modified using the v_args decorator, which allows inlining the arguments (akin to *args), or adding the tree's meta property as an argument.
See: https://github.com/lark-parser/lark/blob/master/lark/visitors.py
Visitors
Visitors visit each node of the tree, and run the appropriate method on it according to the node's data.
They work bottom-up, starting with the leaves and ending at the root of the tree.
Example
```python
class IncreaseAllNumbers(Visitor):
    def number(self, tree):
        assert tree.data == "number"
        tree.children[0] += 1

IncreaseAllNumbers().visit(parse_tree)
```
There are two classes that implement the visitor interface:
- Visitor - Visit every node (without recursion)
- Visitor_Recursive - Visit every node using recursion. Slightly faster.
Transformers
Transformers visit each node of the tree, and run the appropriate method on it according to the node's data, replacing the node with the method's return value.
They work bottom-up, starting with the leaves and ending at the root of the tree.
Transformers can be used to implement map & reduce patterns.
Because nodes are reduced from leaf to root, at any point the callbacks may assume the children have already been transformed (if applicable).
Transformers can be chained into a new transformer by using multiplication.
Example:
```python
from lark import Tree, Transformer

class EvalExpressions(Transformer):
    def expr(self, args):
        return eval(args[0])

t = Tree('a', [Tree('expr', ['1+2'])])
print(EvalExpressions().transform(t))
# Prints: Tree(a, [3])
```
Here are the classes that implement the transformer interface:
- Transformer - Recursively transforms the tree. This is the one you probably want.
- Transformer_InPlace - Non-recursive. Changes the tree in-place instead of returning new instances
- Transformer_InPlaceRecursive - Recursive. Changes the tree in-place instead of returning new instances
Token
When using a lexer, the resulting tokens in the trees will be of the Token class, which inherits from Python's str. Normal string comparisons work as expected, and tokens carry a few extra useful attributes:
- type - Name of the token (as specified in grammar).
- pos_in_stream - the index of the token in the text
- line - The line of the token in the text (starting with 1)
- column - The column of the token in the text (starting with 1)
- end_line - The line where the token ends
- end_column - The column where the token ends