What is the difference between Left Factoring and Left Recursion? I understand that Left Factoring is a predictive top-down parsing technique, but I get confused when I hear these two terms.
Left factoring is removing the common left factor that appears in two productions of the same non-terminal. It is done to avoid backtracking by the parser. Suppose the parser has a one-token look-ahead; consider this example:
A -> qB | qC
where A, B, C are non-terminals and q is a string of terminals. In this case, the parser cannot tell which of the two productions to choose and might have to backtrack. After left factoring, the grammar is converted to:
A -> qD
D -> B | C
In this case, a parser with a one-token look-ahead will always choose the right production.
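To make this concrete, here is a minimal recursive-descent sketch in Python for the factored grammar. The productions B -> b and C -> c are my own additions for illustration; the point is only the shape of the choice made in D.

```python
# Recursive-descent sketch for the left-factored grammar:
#   A -> qD
#   D -> B | C
# with the assumed illustrative productions B -> b and C -> c.
# Each parse_* function returns the position after what it consumed.

def parse_A(tokens, pos=0):
    # A -> qD: a single production, so no choice is needed here.
    if pos < len(tokens) and tokens[pos] == 'q':
        return parse_D(tokens, pos + 1)
    raise SyntaxError("expected 'q'")

def parse_D(tokens, pos):
    # D -> B | C: one token of look-ahead picks the branch,
    # so the parser never has to backtrack.
    if pos < len(tokens) and tokens[pos] == 'b':
        return parse_B(tokens, pos)
    return parse_C(tokens, pos)

def parse_B(tokens, pos):
    # B -> b (assumed for the sketch)
    if pos < len(tokens) and tokens[pos] == 'b':
        return pos + 1
    raise SyntaxError("expected 'b'")

def parse_C(tokens, pos):
    # C -> c (assumed for the sketch)
    if pos < len(tokens) and tokens[pos] == 'c':
        return pos + 1
    raise SyntaxError("expected 'c'")

print(parse_A(['q', 'b']))  # 2: parsed via D -> B, no backtracking
print(parse_A(['q', 'c']))  # 2: parsed via D -> C, no backtracking
```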
Left recursion is the case where the left-most symbol in a production of a non-terminal is that non-terminal itself (direct left recursion), or where the non-terminal rewrites to itself through some other non-terminals' definitions (indirect left recursion). Consider these examples:
(1) A -> Aq (direct)
(2) A -> Bq, B -> Ar (indirect)
Left recursion has to be removed if the parser performs top-down parsing; otherwise, expanding the recursive non-terminal never terminates.
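A minimal Python sketch of why, assuming a naive recursive-descent parser for the left-recursive production A -> Aq (the function and the call counter are illustrative):

```python
import sys

# Naive recursive-descent procedure for the left-recursive grammar
#   A -> Aq | q
# Expanding A requires expanding A first, before any token is
# consumed, so the function recurses at the same input position.

sys.setrecursionlimit(1000)  # keep the demonstration small
calls = 0

def parse_A(tokens, pos=0):
    global calls
    calls += 1
    # Try A -> Aq: the leftmost symbol is A itself, so we recurse
    # without having advanced pos at all.
    return parse_A(tokens, pos)  # never returns normally

try:
    parse_A(['q', 'q'])
except RecursionError:
    print(f"stack exhausted after {calls} calls; no input was consumed")
```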
Left recursion: when the left-hand non-terminal is the same as the leftmost symbol of a right-hand side. Example: A -> Aα | B. We can remove the left recursion by rewriting this production as:
A -> BA'
A' -> αA' | ε
Left factoring means a production should not be non-deterministic. Example: A -> αA | αB | αC, where every alternative begins with the same prefix α.
Left Recursion: A grammar is left recursive if it has a non-terminal A with a production A -> Aα | β, where α and β are sequences of terminals and non-terminals and β does not start with A.
While designing a top-down parser, if left recursion exists in the grammar, the parser falls into an infinite loop: A keeps trying to match A itself without consuming any input. We can eliminate the left recursion by rewriting the offending production as:
A -> βA'
A' -> αA' | ε
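As a rough sketch, that rewrite can be mechanized. The grammar encoding below ({non-terminal: list of right-hand sides}, with 'eps' standing for ε) is my own assumption for illustration, not a standard API:

```python
# Direct left-recursion elimination sketch:
#   A -> Aα | β   becomes   A -> βA',  A' -> αA' | ε
# Grammar encoding (assumed): {nonterminal: [rhs, ...]}, where each
# rhs is a list of symbol strings and ['eps'] marks epsilon.

def eliminate_direct_left_recursion(grammar):
    result = {}
    for a, rhss in grammar.items():
        alphas = [rhs[1:] for rhs in rhss if rhs and rhs[0] == a]  # A -> A alpha
        betas = [rhs for rhs in rhss if not rhs or rhs[0] != a]    # A -> beta
        if not alphas:
            result[a] = rhss          # nothing to do for this non-terminal
            continue
        a_prime = a + "'"
        # A -> beta A' for every non-recursive alternative
        result[a] = [beta + [a_prime] for beta in betas]
        # A' -> alpha A' | eps
        result[a_prime] = [alpha + [a_prime] for alpha in alphas] + [['eps']]
    return result

# A -> Aq | b   becomes   A -> bA',  A' -> qA' | eps
print(eliminate_direct_left_recursion({'A': [['A', 'q'], ['b']]}))
# {'A': [['b', "A'"]], "A'": [['q', "A'"], ['eps']]}
```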
Left Factoring: Left factoring is required to eliminate the non-determinism of a grammar. Suppose the grammar: S -> abS | aSb
Here, both alternatives for S derive the same terminal a first (two alternative choices for S), which causes non-determinism. We can rewrite the productions to defer the decision as:
S -> aS'
S' -> bS | Sb
Thus, S' can be expanded to bS or Sb once the next token is seen.
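Here is a similar sketch of one round of left factoring under the same kind of assumed grammar encoding; it groups alternatives by their first symbol and factors each shared prefix through a fresh non-terminal:

```python
from collections import defaultdict

# One round of left factoring:
#   S -> abS | aSb   becomes   S -> aS',  S' -> bS | Sb
# Grammar encoding (assumed): {nonterminal: [rhs, ...]}, each rhs a
# non-empty list of symbol strings; ['eps'] marks an empty remainder.

def left_factor_once(grammar):
    result = {}
    for a, rhss in grammar.items():
        groups = defaultdict(list)
        for rhs in rhss:
            groups[rhs[0]].append(rhs)     # bucket alternatives by first symbol
        new_rhss = []
        fresh = 0
        for first, group in groups.items():
            if len(group) == 1:
                new_rhss.append(group[0])  # unique prefix: keep as-is
            else:
                fresh += 1
                a_prime = a + "'" * fresh  # fresh non-terminal per factored group
                new_rhss.append([first, a_prime])                        # A -> first A'
                result[a_prime] = [rhs[1:] or ['eps'] for rhs in group]  # remainders
        result[a] = new_rhss
    return result

print(left_factor_once({'S': [['a', 'b', 'S'], ['a', 'S', 'b']]}))
# {"S'": [['b', 'S'], ['S', 'b']], 'S': [['a', "S'"]]}
```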
Left Factoring is a grammar transformation technique. It consists of "factoring out" prefixes that are common to two or more productions.
For example, going from:
A -> qB | qC
to:
A -> qD
D -> B | C
Left Recursion is a property a grammar has whenever, from a given variable (non-terminal), you can derive in one or more steps a sentential form that begins with that same variable.
For example:
A -> Aq
or
A -> Bq
B -> Ar
There is a grammar transformation technique called Elimination of left recursion, which provides a method to generate, given a left recursive grammar, another grammar that is equivalent and is not left recursive.
The relationship/confusion between the two terms probably derives from the fact that both transformation techniques may need to be applied to a grammar before a predictive top-down parser can be derived for it.
Left factoring:
Consider the grammar: A -> ab1 | ab2 | ab3
1) We can see that every production shares the common prefix a, and if we choose any one production here, it is not guaranteed that we will not need to backtrack.
2) The grammar is non-deterministic, because we cannot choose a production and be assured of reaching the desired string with the correct parse tree. But if we rewrite the grammar so that it is deterministic, while still flexible enough to generate every string it could before, without backtracking, we get:
A -> aA'
A' -> b1 | b2 | b3
Now, if we are asked to build the parse tree for the string ab2, we don't need backtracking: we can always choose the correct production when we reach A', and thus we will generate the correct parse tree.
Left recursion:
A -> Aa | b
Here it is clear that the left child of A will always be A if we choose the first production; this is left recursion, because A keeps calling itself. The language generated by this grammar is ba*. Since a top-down parser cannot handle such a grammar, we eliminate the left recursion by writing:
A -> bA'
A' -> ε | aA'
Now there is no left recursion, and we can still generate ba*.
This is the way I've seen the two terms used: