The goal of this lecture is to show you the pushdown-automaton alternative for defining exactly the context-free languages. So let's get to it. Besides the comfort of knowing
that two seemingly unrelated concepts are really the same, the grammar-PDA equivalence will let us jump between the two notations when we talk about properties of context-free languages. The ability to jump between different representations, regular expressions and deterministic finite automata, was important when we addressed properties of regular languages. We shall find the ability essential for context-free languages as well. We also find it easier sometimes to
describe a PDA for a language rather than
a grammar. For example, you might find it
hard to invent a grammar for balanced parentheses, but a PDA is easy to think of. Just push left parentheses onto the stack, and pop the stack once every time you see a right parenthesis. If the bottom-of-stack marker is exposed, then the parentheses were balanced. And you never pop the bottom marker, because that would mean you have more right parens than left.
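
Here is a minimal sketch of that idea in Python (my own illustration, not code from the lecture); Z0 plays the role of the bottom-of-stack marker, and the deterministic check at the end stands in for exposing the marker with no input left.

def balanced(s):
    # Push every '(' and pop one stack symbol for every ')'.
    # Z0 is the bottom-of-stack marker; having to pop it means too many ')'.
    stack = ['Z0']
    for c in s:
        if c == '(':
            stack.append('(')
        elif c == ')':
            if stack[-1] == 'Z0':
                return False          # would have to pop the marker
            stack.pop()
    return stack == ['Z0']            # only the marker exposed: balanced

# For example, balanced('(()())') is True and balanced('())(') is False.
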
Let's start with a language L that has a context-free grammar G. We'll convert grammar G to a PDA P that accepts L by empty stack. And if you want a PDA that accepts L by final state, we know how to convert to one of those. P will have only one state, q; that's all we need. Naturally, q is the start state. There are no final states, because we are accepting by empty stack. The input symbols of P are the terminals of G. The stack symbols of P are all the terminals and variables of G. And the start stack symbol of P is the start symbol of G. Our intent is that P step through a leftmost derivation of input w from the start symbol S.
The secret is that each left-sentential form is represented in a subtle way: it is whatever input P has so far consumed, followed by whatever is on P's stack. When P reaches an empty stack, the sentential form it represents is whatever input it has consumed, followed by nothing, that is, followed by the empty string. That means that P has found a leftmost derivation of the input string it has read, so acceptance of the string is justified. If no sequence of choices of the nondeterministic P leads to an empty stack after consuming w from the input, then w is not a terminal string derived by the grammar, and P rightly does not accept w.
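
To make this representation concrete, here is a small worked trace (my own example, not one from the lecture). Take the grammar S -> (S)S | epsilon for balanced parentheses and the input (). The leftmost derivation is S => (S)S => ()S => (), and at every point in P's run the consumed input followed by the stack spells out the current left-sentential form:

    consumed    stack     consumed + stack
    (nothing)   S         S
    (nothing)   (S)S      (S)S
    (           S)S       (S)S
    (           )S        ()S
    ()          S         ()S
    ()          (empty)   ()    <- stack empty, input consumed: accept
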
There are two kinds of rules in the transition function of P, depending on whether a terminal or a variable of G is at the top of P's stack. The type-one rules handle the case where a terminal, say a, is on top of P's stack. There had better be an a as the next input symbol, or P has guessed wrongly about the leftmost derivation of the input as it actually exists. In effect, we cancel the a on the stack against the a on the input. The left-sentential form represented does not change: we have consumed one more input symbol, a, from the input, so that becomes part of the left-sentential form, but the a that was at the stack top is removed, so it no longer participates in the left-sentential form. The type-two rules handle a variable, say A, at the top of the stack. We need to expand that variable by the body of one of its productions, and thus move to the next left-sentential form. Of course, we're only guessing; we have to allow any of A's productions to be used. If A goes to alpha is one of these productions, then a choice for P, using epsilon input and with A on top of the stack, is to replace A by alpha.
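
As a sketch in Python of the construction just described (my own rendering; the data layout for the grammar and the transition table is an assumption, not from the lecture), the transition function of the one-state PDA can be read straight off the grammar:

def grammar_to_pda(productions, terminals):
    # productions: dict mapping each variable to a list of production bodies,
    #   each body a tuple of terminals and variables,
    #   e.g. {'S': [('(', 'S', ')', 'S'), ()]}.
    # Since P has only one state, delta is keyed just by
    #   (input symbol, or '' for epsilon, paired with the stack top)
    #   and maps to a list of tuples of symbols to push in place of the top.
    delta = {}
    # Type-two rules: with variable A on top, on epsilon input,
    # replace A by the body of any production A -> alpha.
    for A, bodies in productions.items():
        delta[('', A)] = list(bodies)
    # Type-one rules: with terminal a on top and a as the next input symbol,
    # cancel them: consume a, pop a, push nothing.
    for a in terminals:
        delta[(a, a)] = [()]
    return delta

# Example: the balanced-parentheses grammar S -> (S)S | epsilon.
delta = grammar_to_pda({'S': [('(', 'S', ')', 'S'), ()]}, {'(', ')'})

A nondeterministic simulator would start with just S on the stack and accept by empty stack, exactly as described above.
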
We're going to prove that P accepts exactly what G generates. Formally, we will show something more general; it seems we always have to show something more general than what we really want. Here we show that if P consumes w from its input, starting with only S on its stack, and ends up with stack alpha, that is, this ID becomes that ID, then in G there is a leftmost derivation of the string w alpha.
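
Since the two IDs referred to on the slide are not reproduced in this transcript, here is one standard way of writing the full statement being proved, both directions (my rendering; q is P's single state and x is the unread remainder of the input):

    (q, wx, S) goes to (q, x, alpha) in zero or more moves
        if and only if S derives w alpha by a leftmost derivation in G.
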
Incidentally, notice that as we describe the moves of P, we allow any string x to follow w on the input. Since no part of x is consumed, x cannot have any effect on the moves P makes in reaching the ID shown, so if the statement is true for one x, it is true for any other; that is, x does not matter. We start with the only-if part. That is, if P makes the transition shown, then S derives w alpha in G. The basis is zero steps. Then w is obviously epsilon, since nothing can have been consumed from the input, and alpha is S, since the stack doesn't change. We need to show that S derives w alpha in a leftmost derivation. But w alpha is just S, and surely S derives itself.
Now let's do the induction. We'll consider the result of n steps of P, that is, this ID has become that ID, and we'll assume the inductive hypothesis for sequences of n-1 steps. We must consider type-one and type-two moves as the last step separately. First, consider the case where the last of the n moves is a type-one move, where the a at the top of the stack is canceled against the a on the input. Then the w consumed by the n-move sequence must be of the form ya; that's this. And before the last move, only the y had been consumed, leaving ax on the input. Further, just before the last step, the stack of P is a alpha, that is, the terminal a on top of alpha. Since the inductive hypothesis applies to the first n-1 moves, we can conclude that there is a leftmost derivation from S of ya alpha; that's this, and it corresponds to the fact that that ID is reached in n-1 moves. But ya is w, so we already know that there is a leftmost derivation of w alpha. That is the needed conclusion for the full sequence of n steps.
Now let's look at the case where the last step is a type-two move, where there is a variable A on the top of the stack after the (n-1)st move. After n-1 moves, P has consumed w from the input and has A beta on its stack; that's that. This, of course, is the ID after n-1 moves. At the nth move, no input is consumed, but A is replaced by gamma, one of its production bodies; that is, we assume A goes to gamma is a production. Notice that alpha is gamma beta here; that is, this stack string is really alpha. Again we apply the inductive hypothesis, to the first n-1 steps. We thus know that there is a leftmost derivation from S of wA beta. Since A is clearly the leftmost variable, and A goes to gamma is a production, there is also a leftmost derivation of w gamma beta. And of course gamma beta is alpha, so that is really a derivation of w alpha, as we wanted to prove.
We also should prove the converse, but we won't. That is, we would need to show that if there is a leftmost derivation of w alpha, that's this, then P can consume w from its input, with any unseen x following, and turn stack S into stack alpha. The proof is an induction on the number of steps of the derivation, but that's as far as we will take it. Assuming we complete the proof of the converse, we have the statement we set out to prove: P can consume w from its input, with any x following, and turn stack S into stack alpha, if and only if G has a leftmost derivation of w alpha. Now we can restrict the statement to what we really care about: the case where x is empty, that is, P has consumed all its input, and alpha is also empty, that is, P has emptied its stack and has accepted. We conclude that P consumes w while emptying its stack if and only if there is a leftmost derivation of w in G. That is, w is in N(P) if and only if w is in L(G).
For our next trick, we'll show how to convert PDAs to grammars. Assume language L is accepted by PDA P by empty stack. If it were accepted by final state, we already know how to construct a new PDA that accepts L by empty stack, so we're entitled to assume acceptance is by empty stack. We'll construct a grammar G for L, and the idea is to give G variables, which we'll denote by pXq with brackets around them, like this. The job of this variable is to generate all and only the strings w such that, while reading w from the input, P goes from state p to state q and appears to pop X from the stack. While doing so, P can grow the stack well above where X was, but it can never go below where X was. And at the end, the stack is shorter by one than it was when it started; that is, the net effect is that X has been popped.
Here's a picture showing the height of the stack while X is effectively popped, while reading w. Note that X might be replaced, at the first move or later, by another symbol Y. It could even be replaced many times. But the position on the stack that originally held X is never popped until the last move, right at the end here.
As we mentioned, for every pair of states p and q and stack symbol X, there is a variable that we represent by the composite symbol [pXq]. Although this expression consists of five characters, you must think of it as a single symbol in the set of variables of G. Also, as we hinted, the job of [pXq] is to generate all strings w that have the effect of taking PDA P from the ID where it is in state p with only X on the stack to the ID where the state is q, the input w has been consumed, and X has been popped; that's that. Notice that, since the initial ID shows nothing below X on the stack, you know that X can't be popped until the last step, because PDA P cannot make any moves when its stack is empty. And there's one more variable in G, the start symbol S.
There may be many productions for a variable [pXq]. For each move of the PDA from state p with X as the top of the stack, we produce one or more productions. There are several cases, and they get increasingly more complex depending on how long the stack string is that replaces X on the first move. The easiest case is that of a rule that says: in state p, with input a, which could be epsilon or a real input symbol, we pop X and go to state q; that's that. Here X is replaced by zero symbols. Then there is a production [pXq] goes to a. The reason this is correct is that reading only a is one way to have the net effect of popping X while going from state p to state q.
The next simplest case is when a move replaces X by a string of length one, say Y. Suppose that rule also changes the state to r. Then there's a production [pXq] goes to a [rYq]. That is, one way to pop X while going from state p to state q is to read input a, going to state r and replacing the X by Y at the top of the stack; then some input string w has the net effect of popping the Y while going from state r to q. As a consequence, the net effect of reading a followed by w is to take state p to state q while popping the original X. Here's a picture of the case where X is replaced by a single symbol Y. How the Y gets popped, we don't know, but when it does, the effect is that the symbol a, followed by whatever w popped the Y, has the effect of popping X while going from state p to state q.
Now it's getting a little more complicated. Suppose there is a move that replaces X by two symbols, Y and Z, while going to state r and reading a from the input. As the new stack has YZ replacing X, in order for X to be popped there must be some input string u that has the net effect of popping the Y, and u must take the PDA from state r to some state s, which unfortunately we don't know. As a result, we're going to have one production for each possible state s. But after reaching state s, we must have some additional input v that takes the PDA from state s to state q while popping the Z from the stack. The net effect is that a, followed by u and then v, pops X from the stack while going from state p to q.
Here's a picture of that action. Initially you see X got replaced by Y and Z on the stack. Then u had the net effect of popping the Y, exposing the Z; now we're in state s. And then v has the net effect of popping the Z and winding up in state q. So we generate many productions for this case, where on input a, state p becomes r, and X gets replaced on the stack by Y and Z. For every state s, there's a production with head [pXq]. The body starts with the a that causes the first move; remember, a could be empty. Then comes [rYs], to effectively pop the Y, winding up in the state s that we really don't know; that's why there's one production for each s. And then [sZq], to effectively pop the Z. We finally wind up in state q, which is the state we wanted to wind up in, because that's the state that appears in the head. As a result of these productions, you can see that [pXq] can derive any string auv, provided that [rYs] derives the u and [sZq] derives the v.
In the general case, on input a in state p, X is replaced by a string of three or more stack symbols, Y1 through Yk, and the state becomes r. We need a family of productions in which there are k-1 unknown states, s1 through s(k-1). The productions all have this form: [pXq] can be replaced by an a, which again may be epsilon, followed by the variables [r Y1 s1], [s1 Y2 s2], and so on, with the last of the variables being [s(k-1) Yk q], ending in the state q from the head that we want to wind up in. With productions constructed in this manner, we can prove that P accepts w by empty stack, that is, the ID (q0, w, Z0) goes to (p, epsilon, epsilon), if and only if the variable [q0 Z0 p] derives w. We're not going to give the proof; it is two easy inductions, one for each direction. The only problem is, we don't know state p. But remember, G has one more variable, S, and that is the start symbol. So we add the production S goes to [q0 Z0 p] for every state p, and now we have a grammar that generates exactly the strings that the PDA P accepts.
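
To summarize the whole construction, here is a compact sketch in Python (again my own rendering; the dictionary representation of the PDA's transitions is an assumption, not the lecture's notation):

from itertools import product

def pda_to_grammar(states, delta, q0, Z0):
    # Grammar productions for a PDA that accepts by empty stack.
    # delta maps (state p, input a or '' for epsilon, stack symbol X)
    #   to a list of moves (state r, tuple of symbols Y1..Yk replacing X).
    # Variables are the triples (p, X, q), written [pXq] above,
    #   plus the start symbol 'S'; a production is a (head, body) pair.
    prods = []
    for (p, a, X), moves in delta.items():
        prefix = (a,) if a else ()                 # a may be epsilon
        for r, pushed in moves:
            k = len(pushed)
            if k == 0:
                # X is popped outright: [pXr] -> a
                prods.append(((p, X, r), prefix))
            else:
                # One production for every guess of the states q1..qk;
                # the head's third component must be the last guess, qk.
                for qs in product(states, repeat=k):
                    starts = (r,) + qs[:-1]        # (r, q1, ..., q(k-1))
                    body = prefix + tuple(
                        (starts[i], pushed[i], qs[i]) for i in range(k))
                    prods.append(((p, X, qs[-1]), body))
    # Finally, S -> [q0 Z0 p] for every state p.
    for p in states:
        prods.append(('S', ((q0, Z0, p),)))
    return prods

# Example: the one-state PDA for balanced parentheses, accepting by empty stack.
delta = {('q', '(', 'Z0'): [('q', ('(', 'Z0'))],
         ('q', '(', '('):  [('q', ('(', '('))],
         ('q', ')', '('):  [('q', ())],
         ('q', '',  'Z0'): [('q', ())]}
grammar = pda_to_grammar({'q'}, delta, 'q', 'Z0')
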
