
By

V V L Divakar Allavarapu

Asst. Professor, GITAM University, Visakhapatnam

UNIT II Predicate Logic

Quantifiers

There are two types of quantifiers:

1. Universal quantifier ∀ (pronounced "For All")

2. Existential quantifier ∃ (pronounced "There Exists")

Universal Quantifier

1. All kings are persons: ∀x: king(x) → person(x)

2. All people are literate: ∀x: person(x) → literate(x)

3. All men are people: ∀x: man(x) → person(x)

4. All Pompeians were Romans: ∀x: Pompeian(x) → Roman(x)

Existential Quantifier (∃)

1. There is a person who wrote games. ∃x: person(x) & wrote(x, games)

2. There is a person who wrote chess. ∃x: person(x) & wrote(x, chess)

3. Everyone is loyal to someone. ∀x: ∃y: loyalto(x, y)

Predicate Sentences

1. Marcus was a man

man(Marcus) 2. Marcus was a Pompeian Pompeian(Marcus)

3. All Pompeians were Romans x: Pompeian(x) → Roman(x)

4. Caeser was a ruler ruler(Caeser)

Predicate Sentences

5. All Romans were either loyal to Caesar or hated him. ∀x: Roman(x) → loyalto(x, Caesar) ∨ hate(x, Caesar)

6. Everyone is loyal to someone. ∀x: ∃y: loyalto(x, y)

Predicate Sentences

7. People only try to assassinate rulers they are not loyal to. ∀x: ∀y: person(x) & ruler(y) & tryassassinate(x, y) → ~loyalto(x, y)

8. Marcus tried to assassinate Caesar. tryassassinate(Marcus, Caesar)

9. All men are people. ∀x: man(x) → person(x)

Predicate Sentences

Answer the Question

Was Marcus loyal to Caesar? We need to prove either ~loyalto(Marcus, Caesar) or loyalto(Marcus, Caesar).

Predicate Sentences

1. Marcus was a man (1)

2. All men are people (9)

3.Conclusion Marcus was a person

4. Marcus tried to assassinate Caesar(8)

5. Caesar was a ruler (4)

6. People only try to assassinate rulers they are not loyal to. (7)

Conclude from (3), (4), and (5): Marcus was not loyal to Caesar.

Predicate Sentences

(Predicate 1) man(Marcus)

| (Predicate 9)

person(Marcus)

| (Predicate 8)

person(Marcus) & tryassassinate(Marcus, Caesar)

| (Predicate 4)

person(Marcus) & ruler(Caesar) & tryassassinate(Marcus, Caesar)

| (Predicate 7)

~loyalto(Marcus, Caesar)
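The proof chain above can be sketched as simple forward chaining over ground facts. This is a minimal illustration, not a general theorem prover; the predicate names follow the slides, and rules 7 and 9 are hard-coded.

```python
# Forward-chaining sketch of the Marcus proof (illustrative only).
facts = {("man", "Marcus"), ("ruler", "Caesar"),
         ("tryassassinate", "Marcus", "Caesar")}

def chain(facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        new = set()
        # Rule 9: all men are people.
        for f in facts:
            if f[0] == "man":
                new.add(("person", f[1]))
        # Rule 7: people only try to assassinate rulers they are not loyal to.
        for f in facts:
            if f[0] == "tryassassinate":
                x, y = f[1], f[2]
                if ("person", x) in facts and ("ruler", y) in facts:
                    new.add(("not-loyalto", x, y))
        if not new <= facts:      # keep going while something new is derived
            facts |= new
            changed = True
    return facts

derived = chain(facts)
print(("not-loyalto", "Marcus", "Caesar") in derived)  # True
```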

instance and isa Relationship

Knowledge can be represented as classes, objects, attributes, and superclass/subclass relationships.

Knowledge can be inferred using property inheritance, in which elements of specific classes inherit the attributes and values of their superclasses.

instance and isa Relationship

The attribute instance is used to represent the relationship "class membership" (element of a class). The attribute isa is used to represent the relationship "class inclusion" (superclass/subclass relationship).

instance and isa Relationship

1. man(Marcus)

2. Pompeian(Marcus)

3. ∀x: Pompeian(x) → Roman(x)

4. ruler(Caesar)

5. ∀x: Roman(x) → loyalto(x, Caesar) ∨ hate(x, Caesar)

Using instance Attribute

1. instance(Marcus, man)

2. instance(Marcus, Pompeian)

3. ∀x: instance(x, Pompeian) → instance(x, Roman)

4. instance(Caesar, ruler)

5. ∀x: instance(x, Roman) → loyalto(x, Caesar) ∨ hate(x, Caesar)

Using isa Attribute

1. instance(Marcus, man)

2. instance(Marcus, Pompeian)

3. isa(Pompeian, Roman)

4. instance(Caesar, ruler)

5. ∀x: instance(x, Roman) → loyalto(x, Caesar) ∨ hate(x, Caesar)

6. ∀x: ∀y: ∀z: instance(x, y) & isa(y, z) → instance(x, z)
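Rule 6 above, which propagates class membership up the isa hierarchy, can be sketched as a small fixed-point computation (a minimal illustration with the slide's facts):

```python
# Rule 6: instance(x, y) & isa(y, z) -> instance(x, z), computed to a fixed point.
isa = {("Pompeian", "Roman")}
instance = {("Marcus", "man"), ("Marcus", "Pompeian"), ("Caesar", "ruler")}

def close_instances(instance, isa):
    instance = set(instance)
    changed = True
    while changed:
        changed = False
        for (x, y) in list(instance):
            for (y2, z) in isa:
                if y == y2 and (x, z) not in instance:
                    instance.add((x, z))   # x is also an instance of z
                    changed = True
    return instance

closed = close_instances(instance, isa)
print(("Marcus", "Roman") in closed)  # True
```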

Computable functions and Predicates

Computational predicates such as less-than and greater-than are used in knowledge representation.

They return true or false for their inputs.

Examples of computable predicates: gt(1, 0), lt(0, 1), gt(5, 4). Computable functions can appear inside them: gt(2+4, 5).

cont

1. Marcus was a man. man(Marcus)

2. Marcus was a Pompeian. Pompeian(Marcus)

3. Marcus was born in 40 A.D. born(Marcus, 40)

4. All men are mortal. ∀x: man(x) → mortal(x)

5. All Pompeians died when the volcano erupted in 79 A.D. erupted(volcano, 79) & ∀x: Pompeian(x) → died(x, 79)

cont

6. No mortal lives longer than 150 years.

∀x: ∀t1: ∀t2: mortal(x) & born(x, t1) & gt(t2 − t1, 150) → dead(x, t2)

7. It is now 1991.

now = 1991

8. Alive means not dead.

∀x: ∀t: [alive(x, t) → ~dead(x, t)] & [~dead(x, t) → alive(x, t)]

9. If someone dies, then he is dead at all later times. ∀x: ∀t1: ∀t2: died(x, t1) & gt(t2, t1) → dead(x, t2)

1. man(Marcus)

2. Pompeian(Marcus)

3. born(Marcus, 40)

4. ∀x: man(x) → mortal(x)

5. erupted(volcano, 79)

6. ∀x: Pompeian(x) → died(x, 79)

7. ∀x: ∀t1: ∀t2: mortal(x) & born(x, t1) & gt(t2 − t1, 150) → dead(x, t2)

8. now = 1991

9. ∀x: ∀t: [alive(x, t) → ~dead(x, t)] & [~dead(x, t) → alive(x, t)]

10. ∀x: ∀t1: ∀t2: died(x, t1) & gt(t2, t1) → dead(x, t2)

cont

Is Marcus alive?

cont

~alive(Marcus, now)

| (9, substitution)

dead(Marcus, now)

| (10, substitution)

died(Marcus, t1) & gt(now, t1)

| (6, substitution)

Pompeian(Marcus) & gt(now, 79)

| (2)

gt(now, 79)

| (8, substitute equals)

gt(1991, 79)

| (compute gt)

True

cont

Disadvantage:

Many steps are required to prove even simple conclusions.

A variety of processes, such as matching and substitution, are used to prove simple conclusions.

Resolution

Resolution is a proof procedure by refutation.

To prove a statement using resolution, we attempt to show that the negation of that statement leads to a contradiction.

Conversion to Conjunctive Normal Form

All Romans who know Marcus either hate Caesar or think that anyone who hates anyone is crazy. ∀x: [Roman(x) & know(x, Marcus)] → [hate(x, Caesar) ∨ (∀y: (∃z: hate(y, z)) → thinkcrazy(x, y))]

CNF equivalent:

~Roman(x) ∨ ~know(x, Marcus) ∨ hate(x, Caesar) ∨ ~hate(y, z) ∨ thinkcrazy(x, y)

Algorithm : converting to CNF

1. Eliminate →, using a → b ≡ ~a ∨ b

∀x: ~[Roman(x) & know(x, Marcus)] ∨ [hate(x, Caesar) ∨ (∀y: ~(∃z: hate(y, z)) ∨ thinkcrazy(x, y))]

CNF

2. Reduce the scope of ~, using:

~(~p) ≡ p

~(a & b) ≡ ~a ∨ ~b

~(a ∨ b) ≡ ~a & ~b

∀x: [~Roman(x) ∨ ~know(x, Marcus)] ∨ [hate(x, Caesar) ∨ (∀y: ∀z: ~hate(y, z) ∨ thinkcrazy(x, y))]

CNF

3. Standardize variables so that each quantifier binds a unique variable: ∀x: P(x) ∨ ∀x: Q(x)

becomes, by renaming, ∀x: P(x) ∨ ∀y: Q(y)

CNF

4. Move all quantifiers to the left of the formula (prenex form):

∀x: ∀y: ∀z: [~Roman(x) ∨ ~know(x, Marcus)] ∨ [hate(x, Caesar) ∨ (~hate(y, z) ∨ thinkcrazy(x, y))]

CNF

5. Eliminate existential quantifiers (∃) by substituting Skolem functions. For example, ∃y: president(y) becomes

president(S1)

and ∀x: ∃y: father-of(y, x) becomes ∀x: father-of(S2(x), x)

CNF

6. Drop the quantifier prefix:

[~Roman(x) ∨ ~know(x, Marcus)] ∨ [hate(x, Caesar) ∨ (~hate(y, z) ∨ thinkcrazy(x, y))]

7. Convert the statement into a conjunction of disjuncts, using (a & b) ∨ c ≡ (a ∨ c) & (b ∨ c)
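Steps 1, 2, and 7 can be sketched for the propositional case. This is a minimal illustration; the nested-tuple encoding of formulas is my own convention, not from the slides, and the functions assume binary `and`/`or` and no quantifiers.

```python
# Propositional CNF sketch: eliminate ->, push ~ inward, distribute 'or'
# over 'and'.  A formula is an atom (string) or a tuple (op, arg, ...).
def elim_imp(f):
    if isinstance(f, str):
        return f
    if f[0] == "->":                       # a -> b  ==  ~a v b
        return ("or", ("not", elim_imp(f[1])), elim_imp(f[2]))
    return (f[0],) + tuple(elim_imp(a) for a in f[1:])

def push_not(f, neg=False):
    if isinstance(f, str):
        return ("not", f) if neg else f
    if f[0] == "not":
        return push_not(f[1], not neg)     # ~~p == p
    flip = {"and": "or", "or": "and"}[f[0]] if neg else f[0]
    return (flip, push_not(f[1], neg), push_not(f[2], neg))

def distribute(f):
    if isinstance(f, str) or f[0] == "not":
        return f
    a, b = distribute(f[1]), distribute(f[2])
    if f[0] == "or":                       # (a & b) v c == (a v c) & (b v c)
        if isinstance(a, tuple) and a[0] == "and":
            return ("and", distribute(("or", a[1], b)), distribute(("or", a[2], b)))
        if isinstance(b, tuple) and b[0] == "and":
            return ("and", distribute(("or", a, b[1])), distribute(("or", a, b[2])))
    return (f[0], a, b)

f = ("->", ("and", "p", "q"), "r")         # (p & q) -> r
cnf = distribute(push_not(elim_imp(f)))
print(cnf)  # ('or', ('or', ('not', 'p'), ('not', 'q')), 'r')
```

The result is the clause ~P ∨ ~Q ∨ R, matching the clause form used in the resolution example below.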

Resolution in Propositional Logic

Given Axioms:

1. P

2. (P & Q) → R

3. (S ∨ T) → Q

4. T

Step 1: Convert all axioms into clause form

1. P

2. ~P ∨ ~Q ∨ R

3. (a) ~S ∨ Q

(b) ~T ∨ Q

4. T

Propositional Resolution

Step 2: Negate the proposition we want to prove and add it to the existing clauses.

Example: from the above, we want to prove R,

so we add ~R to the clauses.

Propositional Resolution

Step 3: Select clauses and resolve them, trying to show that the assumption is wrong.

(~R) & (~P ∨ ~Q ∨ R) [clause 2]

|

~P ∨ ~Q

| [clause 1]

(~P ∨ ~Q) & P

|

~Q

| [clause 3(b)]

(~Q) & (~T ∨ Q)

|

~T

| [clause 4]

(~T) & (T)

|

Contradiction
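The refutation above can be sketched as a small saturation loop over propositional clauses. This is a minimal illustration (no subsumption, no tautology elimination); clauses are frozensets of signed literals.

```python
# Resolution-by-refutation sketch for the example above.
# A literal is (atom, sign); a clause is a frozenset of literals.
def resolve(c1, c2):
    for (atom, sign) in c1:
        if (atom, not sign) in c2:
            yield frozenset((c1 - {(atom, sign)}) | (c2 - {(atom, not sign)}))

def refute(clauses):
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a != b:
                    for r in resolve(a, b):
                        if not r:          # empty clause: contradiction found
                            return True
                        new.add(r)
        if new <= clauses:                 # nothing new: cannot refute
            return False
        clauses |= new

clauses = [
    frozenset({("P", True)}),                              # P
    frozenset({("P", False), ("Q", False), ("R", True)}),  # ~P v ~Q v R
    frozenset({("S", False), ("Q", True)}),                # ~S v Q
    frozenset({("T", False), ("Q", True)}),                # ~T v Q
    frozenset({("T", True)}),                              # T
    frozenset({("R", False)}),                             # ~R (negated goal)
]
print(refute(clauses))  # True: R is proved by contradiction
```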

Unification

Unification is the process of finding substitutions that make different logical expressions look identical.

Propositional logic: R & ~R is a direct contradiction.

Predicate logic:

man(Marcus) & ~man(Marcus) is a contradiction,

but man(Marcus) & ~man(Spot) is not.

Cont…

The solution to this problem is matching and substitution. Example:

Unify P(x, x) and P(y, z), where x, y, z are variables.
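A minimal unification sketch is shown below. The term encoding is my own convention (lowercase strings are variables, capitalized strings are constants/functors, tuples are compound terms), and the occurs check is omitted for brevity.

```python
# Unification sketch: returns a substitution (dict) or None on clash.
def is_var(t):
    return isinstance(t, str) and t[:1].islower()

def walk(t, s):
    while is_var(t) and t in s:   # follow variable bindings
        t = s[t]
    return t

def unify(a, b, s=None):
    s = {} if s is None else s
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        return {**s, a: b}
    if is_var(b):
        return {**s, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None                   # constant clash: no unifier

print(unify(("P", "x", "x"), ("P", "y", "z")))    # {'x': 'y', 'y': 'z'}
print(unify(("Man", "Marcus"), ("Man", "Spot")))  # None
```

For P(x, x) and P(y, z) the result binds x to y and y to z, so all three variables denote the same term; man(Marcus) and man(Spot) fail to unify because the constants clash.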

UNIT III Symbolic Reasoning Under Uncertainty

Non-Monotonic Reasoning

Example

ABC Murder Story:

Let Abbott (A), Babbitt (B), and Cabot (C) be suspects in a murder case.

1. A has an alibi: the register of a respectable hotel. 2. B also has an alibi: his brother-in-law testified that B was visiting him at the time. 3. C pleads an alibi too, claiming to have been watching a live match at the ground (but we have only his word).

Example

So we can believe:

1. That A did not commit the crime. 2. That B did not commit the crime. 3. That A or B or C did.

Conclusion ?

Example

But C was caught on camera by live television at the match. So the new belief is:

4. That C did not commit the crime.

Monotonic Reasoning

1. It is complete with respect to the domain of interest. 2. It is consistent. 3. Knowledge increases monotonically as new facts are added.

Ex: KB1 = KBL

KB2 = KBL ∪ F (F is a set of new facts); then

KB1 is a subset of KB2.

Non Monotonic Reasoning

1. It may not be complete; it allows inferences to be made on the basis of a lack of knowledge.

2. It may be inconsistent.

3. Knowledge may decrease when new facts are added.

Approaches

Approaches to handle these problems: 1. Non-Monotonic Reasoning (belief) 2. Statistical Reasoning (certainty)

Logics for Non Monotonic Reasoning

Different kinds of reasoning:

1. Default Reasoning

a) Non Monotonic Logic(NML)

b) Default Logic(DL)

2. Minimalist Reasoning

a) Closed World Assumption (CWA)

Non Monotonic Logic

This is predicate logic augmented with the modal operator M, which can be read as

"is consistent".

NML Example

∀x: ∀y: Related(x, y) & M GetAlong(x, y) → WillDefend(x, y)

For all x and y: if x and y are related, and the fact that x gets along with y is consistent with everything else that is believed, then conclude that x will defend y.

NML Example

1. ∀x: Republican(x) & M ~Pacifist(x) → ~Pacifist(x) 2. ∀x: Quaker(x) & M Pacifist(x) → Pacifist(x) 3. Republican(Marcus) 4. Quaker(Marcus)

Default Logic

It is an alternative logic. In it, rules are represented in the form

A : M B / C

If A is provable and it is consistent to assume B, then conclude C.
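Applying one such default rule can be sketched as below. This is a minimal illustration: "provable" is simplified to set membership, and the instantiated rule Beneficiary(Abbott) : M ~Alibi(Abbott) / Suspect(Abbott) follows the murder-story example.

```python
# Default-rule sketch:  A : M B / C  — if A is believed and B is consistent
# with current beliefs (its negation is not believed), conclude C.
def neg(f):
    return f[1] if isinstance(f, tuple) and f[0] == "not" else ("not", f)

def apply_default(beliefs, a, b, c):
    if a in beliefs and neg(b) not in beliefs:
        return beliefs | {c}
    return beliefs

rule = (("Beneficiary", "Abbott"),        # A
        ("not", ("Alibi", "Abbott")),     # B: it is consistent that A has no alibi
        ("Suspect", "Abbott"))            # C

print(apply_default({("Beneficiary", "Abbott")}, *rule))
# Suspect Abbott is concluded; once Alibi Abbott is believed, the rule no longer fires:
print(apply_default({("Beneficiary", "Abbott"), ("Alibi", "Abbott")}, *rule))
```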

Abductive Reasoning

Deduction:

∀x: A(x) → B(x)

Given A(Marcus),

we conclude B(Marcus).

Abduction is the reverse process:

given B(Marcus),

we conclude A(Marcus).

But this is sometimes wrong.

Closed World Assumption(CWA)

It is a simple kind of minimalist reasoning. Courses offered: CS 101, CS 203, CS 503. How many courses will be offered? Answer: 3?

Or some other number?

CWA

It may be anywhere from one to infinity.

The reason: the course assertions do not deny that unmentioned courses are also offered (incomplete information), and nothing asserts that the listed courses are different from each other.

CWA

The assumption is that the provided information is complete.

So facts not asserted to be true are assumed to be false.

Example: Airline KB Application

Is there any flight from Vskp to Hyd?

~Connect(Vskp, Hyd) is asserted when we cannot prove Connect(Vskp, Hyd).
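The closed-world assumption amounts to negation as failure, sketched below. The specific route facts are made up for illustration, and "provable" is simplified to plain assertion lookup.

```python
# CWA sketch: a ground fact is taken to be false whenever it cannot be
# proved (here: is simply not asserted in the KB).
kb = {("Connect", "Vskp", "Delhi"), ("Connect", "Delhi", "Hyd")}

def provable(kb, fact):
    return fact in kb

def cwa_false(kb, fact):
    # ~fact is asserted exactly when fact cannot be proved.
    return not provable(kb, fact)

print(cwa_false(kb, ("Connect", "Vskp", "Hyd")))    # True: assumed false
print(cwa_false(kb, ("Connect", "Vskp", "Delhi")))  # False: it is asserted
```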

Implementation Issues

1. How do we update knowledge incrementally? 2. Many facts are eliminated when new knowledge becomes available.

How should this be managed? 3. The theories are not computationally effective.

These issues can be handled by search control. Depth-first search? Breadth-first search?

Depth First Search

Chronological Backtracking

It is depth-first search with backtracking.

It makes a guess at something, thus creating a branch in the search space.

If the guess turns out to be wrong, we back up to that point and try an alternative, abandoning everything derived after the guess.

Example

We need to know a fact 'F',

which can be derived by making some assumption 'A'.

We also derive some additional facts 'G' and 'H' from 'F'.

Later we derive new facts 'M' and 'N'; they are independent of 'A' and 'F'.

Example

A → F → G, H (facts derived from assumption A)

M, N (independent of A and F)

At some point a new fact invalidates 'A'.

Chronological backtracking then invalidates all of F, G, H, M, and N, even though M and N do not depend on the assumption.

Example 2

Problem:

Finding a time at which three busy people can all attend a meeting.

Assumption:

The meeting is held on Wednesday.

Found a fact: all are free at 2:00.

So choose 2:00 as the meeting time.

Example

Assume day = Wednesday.

After many steps, we find that the only time all people are available is 2:00 PM.

We repeat the same time-finding process and again decide on 2:00 PM, for the same reasons. We try to find a room:

FAIL (a special conference has all the rooms booked on Wednesday)

SUCCEED (only once the day assumption itself is changed)

Problem

Because guesses are undone in the order they were generated by the search process, instead of by their responsibility for the inconsistency, we may waste a great deal of effort.

Dependency Directed Backtracking

It makes a guess at something and associates with each node one or more justifications in the search space.

Two approaches to dependency-directed backtracking:

Justification-based Truth Maintenance Systems (JTMS)

Logic-based Truth Maintenance Systems (LTMS)

Justification based Truth Maintenance Systems (JTMS)

A JTMS has the ability to provide dependency-directed backtracking and so to support non-monotonic reasoning.

Example: ABC Murder Story

Initially we believe that A is the primary suspect, because he was a beneficiary and he had no alibi.

contd

Using default logic:

Beneficiary(x) : M ~Alibi(x) / Suspect(x)

Dependency Network

Suspect A [IN]

IN-list: Beneficiary A

OUT-list: Alibi Abbott

Abbott should be a suspect when it is believed that he is a beneficiary and it is not believed that he has an alibi.

Dependency Network

There are three assertions:

1. Suspect A (primary murder suspect) 2. Beneficiary A (he is a beneficiary of the victim) 3. Alibi Abbott (A was at a hotel at the time)

Dependency Network

Suspect A [OUT]

+ Beneficiary A [IN]

- Alibi A [IN]

    + Registered A [IN]

    + Far Away [IN]

    - Registered Forged A [OUT]

Abbott should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.

Dependency Network

Suspect B [OUT]

+ Beneficiary B [IN]

- Alibi B [IN]

    + Say So B-I-L [IN]

    - Lies B-I-L [OUT]

B should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.

Dependency Network

Suspect C [IN]

+ Beneficiary C [IN]

- Alibi C [OUT]

    + Tells Truth Cabot [OUT]

C should be a suspect when it is believed that he is a beneficiary and it is not believed that he has an alibi.

Dependency Network

Suspect C [OUT]

+ Beneficiary C [IN]

- Alibi C [IN]

    + Tells Truth Cabot [IN]

        + C Seen on TV [IN]

        - TV Forgery [OUT]

C should not be a suspect when it is believed that he is a beneficiary and it is believed that he has an alibi.

Dependency Network

Contradiction [IN]

- Suspect A

- Suspect B

- Suspect C

- Suspect Other

Logical Based Truth Maintainance System (LTMS)

It is similar to the JTMS.

In a JTMS the nodes in the network are treated as atoms,

which assumes no relationships among them except the ones that are explicitly stated in the justifications.

Example: we could represent both Lies B-I-L and not Lies B-I-L and label both of them IN.

No contradiction would be detected automatically.

LTMS

In an LTMS a contradiction will be detected automatically. In this case we need not create an explicit contradiction node.

Breadth First Search

Statistical Reasoning

Basic Probability

1. 0 ≤ P(a) ≤ 1

2. P(a) + P(~a) = 1

3. P(a ∨ b) = P(a) + P(b) − P(a & b)

Prior Probability

It is the degree of belief in a proposition in the absence of any other information.

P(A) = 0.3

P(Cavity) = 0.1

It is used only when no other related information is available.

Conditional Probability

Once we have obtained some evidence concerning a previously unknown random variable, conditional probabilities should be used.

P(a|b) = 0.2

is the probability of a given known evidence b.

P(Cavity|Toothache) = 0.8

Conditional Probability

Product rule:

P(a & b) = P(a|b) P(b) and P(a & b) = P(b|a) P(a)

P(a|b) = P(a & b) / P(b)

Bayes Theorem

Bayes' rule states:

The probability of the hypothesis H being true given known observations E is

P(H|E) = P(H & E) / P(E)

=> P(H|E) = P(E|H) P(H) / P(E)

For N events, if P(A1) + P(A2) + … + P(An) = 1:

P(Ai|B) = P(B|Ai) P(Ai) / [P(B|A1) P(A1) + … + P(B|An) P(An)]

Bayes Theorem

A doctor knows that a cavity causes the patient to have a toothache, say, 50% of the time. The prior probability that any patient has a toothache is 1/20, and a cavity, 1/1000.

P(Toothache|Cavity) = 0.5, P(Cavity) = 0.001, P(Toothache) = 0.05

Finding P(Cavity|Toothache):

= 0.5 × 0.001 / 0.05 = 0.01
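The calculation above is a direct application of Bayes' rule:

```python
# Bayes' rule with the slide's cavity/toothache numbers.
p_t_given_c = 0.5    # P(Toothache | Cavity)
p_c = 0.001          # P(Cavity)
p_t = 0.05           # P(Toothache)

p_c_given_t = p_t_given_c * p_c / p_t   # P(Cavity | Toothache)
print(round(p_c_given_t, 4))  # 0.01
```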

Bayes Network

S: Sprinkler was on last night. W: Grass is wet. R: It rained last night.

Sprinkler → Wet ← Rain

P(Wet|Sprinkler) = 0.9

P(Rain|Wet) = 0.7

Bayes Network

Network: Cloudy → Sprinkler, Cloudy → Rain, Sprinkler → Wet ← Rain

P(C) = 0.5

P(S|C):  C = t: .10,  C = f: .50

P(R|C):  C = t: .80,  C = f: .20

P(W|S,R):

S = t, R = t: .99

S = t, R = f: .90

S = f, R = t: .90

S = f, R = f: .00
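With these conditional probability tables, any joint probability factors as P(C) P(S|C) P(R|C) P(W|S,R), and queries can be answered by enumeration. A minimal sketch (the CPT values are taken from the slide; the query P(W = true) is my own example):

```python
# Inference by enumeration for the Cloudy/Sprinkler/Rain/Wet network.
from itertools import product

P_C = {True: 0.5, False: 0.5}
P_S = {True: 0.10, False: 0.50}              # P(S=t | C)
P_R = {True: 0.80, False: 0.20}              # P(R=t | C)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.00}  # P(W=t | S, R)

def joint(c, s, r, w):
    p = P_C[c]
    p *= P_S[c] if s else 1 - P_S[c]
    p *= P_R[c] if r else 1 - P_R[c]
    p *= P_W[(s, r)] if w else 1 - P_W[(s, r)]
    return p

# Sanity check: the joint distribution sums to 1.
total = sum(joint(*v) for v in product([True, False], repeat=4))
print(round(total, 10))  # 1.0

# P(W = true), summing out the other variables.
p_wet = sum(joint(c, s, r, True) for c, s, r in product([True, False], repeat=3))
print(round(p_wet, 4))  # 0.6471
```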

Bayes Theorem

Disadvantages:

1. Too many probabilities need to be provided. 2. Space is needed to store all the probabilities. 3. Time is required to compute the probabilities. 4. The theory is good for well-structured situations in which all the data is available and the assumptions are satisfied; unfortunately, these conditions may not occur in reality.

Certainty Factors and Rule-based systems

cont

In the MYCIN expert system, each rule is associated with a certainty factor, which is a measure of the extent to which the evidence is to be believed.

A MYCIN rule looks like:

If 1. the stain of the organism is gram-positive, and 2. the morphology is coccus, and 3. the growth is clumps, then there is suggestive evidence (0.7) that the organism is staphylococcus.

Certainty Factor

The certainty factor CF[h,e] is defined in terms of two components:

MB[h,e] – a measure (between 0 and 1) of belief in hypothesis h given evidence e.

MD[h,e] – a measure (between 0 and 1) of disbelief in hypothesis h given evidence e.

CF[h,e] = MB[h,e] − MD[h,e]

Cont…

In the MYCIN model, for two pieces of evidence e1 and e2 bearing on hypothesis h,

the measures of belief and disbelief combine as:

MB[h, e1 & e2] = MB[h,e1] + MB[h,e2] × (1 − MB[h,e1])

MD[h, e1 & e2] = MD[h,e1] + MD[h,e2] × (1 − MD[h,e1])

Cont…

If MD[h, e1 & e2] = 0 and MB[h, e1 & e2] = 1, the combined evidence (e1 and e2) confirms the hypothesis h.

If MB[h, e1 & e2] = 0 and MD[h, e1 & e2] = 1, the combined evidence (e1 and e2) disconfirms the hypothesis h.

Example for CF

A set of rules r1, r2, … r7 each gives a degree of support of the evidence for the hypothesis.

H is the conclusion that it is an elephant.

e1: r1: It has a tail 0.3

e2: r2: It has a trunk 0.8

e3: r3: It has a heavy body 0.4

e4: r4: It has four legs 0.2

e5: r5: It has black colour 0.1

e6: r6: It has stripes 0.6

Cont…

For rule 1: MB = 0.3 and MD = 0

Including the effect of rule 2:

MB = 0.3 + 0.8 × (1 − 0.3) = 0.86 and MD = 0

Including rule 3:

MB = 0.86 + 0.4 × (1 − 0.86) = 0.916 and MD = 0

Including rule 4:

MB = 0.916 + 0.2 × (1 − 0.916) = 0.9328 and MD = 0

Cont…

Including rule 5:

MB = 0.9328 + 0.1 × (1 − 0.9328) = 0.93952 and MD = 0

Including rule 6 (stripes count against the elephant hypothesis):

MB = 0.93952 and MD = 0.6

Including rule 7 (strength 0.6):

MB = 0.93952 + 0.6 × (1 − 0.93952) ≈ 0.9758 and MD = 0.6
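The rule-by-rule combination can be sketched as a simple fold of the MYCIN formula. Note the treatment of r6 as disbelief and r7 as a belief of strength 0.6 follows the computation above; the slide's rule list for r7 is incomplete, so that interpretation is an assumption.

```python
# MYCIN-style combination of belief measures, one rule at a time.
def combine(m_old, m_new):
    return m_old + m_new * (1 - m_old)

mb, md = 0.0, 0.0
for strength in [0.3, 0.8, 0.4, 0.2, 0.1, 0.6]:  # r1-r5 and r7: belief
    mb = combine(mb, strength)
md = combine(md, 0.6)                             # r6 (stripes): disbelief
cf = mb - md                                      # CF[h,e] = MB - MD

print(round(mb, 6), round(md, 6), round(cf, 6))   # 0.975808 0.6 0.375808
```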

Dempster Shafer Theory

DST is designed to deal with the distinction between uncertainty and ignorance.

It is very useful for handling epistemic information as well as ignorance, or lack of information.

cont

It is represented by Belief and Plausibility. Belief (Bel) measures the strength of the evidence and ranges from 0 to 1. Plausibility is defined as Pl(s) = 1 − Bel(~s).
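The belief/plausibility relationship can be illustrated with a tiny numeric example (the numbers are illustrative, not from the slides):

```python
# Dempster-Shafer sketch: Pl(s) = 1 - Bel(~s).  Belief committed to s is
# 0.5, belief committed against s is 0.2; the remaining 0.3 is
# uncommitted (ignorance), so s is plausible up to 0.8.
bel_s, bel_not_s = 0.5, 0.2

pl_s = 1 - bel_not_s
print(pl_s)                     # 0.8
print(round(pl_s - bel_s, 2))   # 0.3: the belief interval [0.5, 0.8] spans the ignorance
```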

Weak Slot-and-Filler Structures

Introduction

Knowledge can be represented in a slot-and-filler system as a set of entities and their attributes.

The structure is useful because it supports inheritance.

It enables attribute values to be retrieved quickly.

Properties of relations are easy to describe.

It allows easy incorporation of object-oriented programming ideas.

Introduction…

A slot is an attribute-value pair in its simplest form.

A filler is a value that a slot can take: a numeric or string (or any data type) value, or a pointer to another slot.

A weak slot-and-filler structure does not consider the content of the representation.

Semantic Nets

Semantic nets consist of:

Nodes, denoting objects.

Links, denoting relations between objects.

Link labels, denoting particular relations.

Example

Person --isa--> Mammal

Person --has-part--> Nose

Pee-Wee-Reese --instance--> Person

Pee-Wee-Reese --team--> Brooklyn-Dodgers

Pee-Wee-Reese --uniform-color--> Blue

Representing Binary predicates

Some of the predicates that can be asserted from the above figure are:

isa(person, Mammal)

Instance(Pee-Wee-Reese, person)

team(Pee-Wee-Reese,Brooklyn-Dodgers)

uniform-color(Pee-Wee-Reese,Blue)

has-part(Pee-Wee-Reese, Nose)

Representing Nonbinary predicates

A predicate with three or more places can be represented by creating one new object.

Example: score(Cubs, Dodgers, 5-3)

G23 --isa--> Game

G23 --visiting-team--> Cubs

G23 --home-team--> Dodgers

G23 --score--> 5-3

Representing a sentence

John gave the book to Mary.

EV7 --instance--> Give

EV7 --agent--> John

EV7 --object--> BK23

EV7 --beneficiary--> Mary

BK23 --instance--> Book

Relating Entities

 

John --height--> 72

Bill --height--> 52

If we want to relate these two entities with the fact that John is taller than Bill:

Relating Entities…

John --height--> H1

Bill --height--> H2

H1 --greater-than--> H2

Relating the two entities by creating two objects H1 and H2 connected by greater-than.

Relating Entities…

John --height--> H1, H1 --value--> 72

Bill --height--> H2, H2 --value--> 52

H1 --greater-than--> H2

This assigns values to John's and Bill's heights.

Partitioned Semantic Nets

Consider the simple statement "The dog bit the mail carrier", which can be represented as:

d --isa--> Dogs

b --isa--> Bite

m --isa--> Mail-carrier

b --assailant--> d

b --victim--> m

Partitioned Semantic Nets…

If you want to represent quantified expressions in semantic nets, one way is to partition the semantic net into a hierarchical set of spaces.

Consider the sentence:

Every dog has bitten a mail carrier: ∀x: dog(x) → ∃y: mail-carrier(y) & bite(x, y)

Partitioned Semantic Nets…

(Figure: space SA contains the general classes GS, Dogs, Bite, and Mail-carrier. Space S1 contains g --isa--> GS with a form arc to S1 and a ∀ connection to d, together with d --isa--> Dogs, b --isa--> Bite, m --isa--> Mail-carrier, b --assailant--> d, b --victim--> m.)

Partitioned Semantic Nets…

In the above figure, g is an instance of the special class GS of general statements.

Every general statement has two attributes:

form, which states the relation that is being asserted, and

one or more universal quantifier (∀) connections.

Partitioned Semantic Nets…

"Every dog in town has bitten the constable" can be represented:

(Figure: space SA contains Dogs, Town-dog, Bite, constables, and GS, with the constable c --isa--> constables outside the quantified space. Space S1 contains g --isa--> GS with a form arc to S1 and a ∀ connection to d, together with d --isa--> Town-dog, b --isa--> Bite, b --assailant--> d, b --victim--> c.)

Partitioned Semantic Nets…

"Every dog has bitten every mail carrier" can be represented:

(Figure: space S1 contains g --isa--> GS with a form arc to S1 and two ∀ connections, one to d and one to m, together with d --isa--> Dogs, b --isa--> Bite, m --isa--> Mail-carrier, b --assailant--> d, b --victim--> m.)

Frames

Semantic nets were initially used to represent labeled connections between objects. As tasks became more complex, the representation needed to be more structured. The more structured the system, the more beneficial it becomes to use frames.

Frames…

A frame is a collection of attributes, or slots, and associated values that describe some real-world entity.

Each frame represents:

a class (a set), or

an instance (an element of a class).

Frame system example

Person

Isa : Mammal

Cardinality : 6,000,000,000

*handed : Right

Adult-Male

Isa : Person

Cardinality : 2,000,000,000

*handed : Right

ML-Baseball-Player

Isa : Adult-Male

Cardinality : 624

*height : 6-1

*bats : equal to handed

*batting-average : .252

*team :

*uniform-colour :

Fielder

Isa : ML-Baseball-Player

Cardinality : 376

*batting-average : .262

Frame system example

Pee-Wee-Reese

Instance : Fielder

Height : 5-10

Bats : Right

Batting-Average : .309

Team : Brooklyn-Dodgers

Uniform-color : Blue

ML-Baseball-Team

Isa : Team

Cardinality : 26

*team-size : 24

*manager :

Brooklyn-Dodgers

Instance : ML-Baseball-Team

Team-size : 24

Manager : Leo-Durocher

Players : {Pee-Wee-Reese, …}

Class of All Teams As a Metaclass

Class

Instance : Class

Isa : Class

Team

Instance : Class

Isa : Class

Cardinality : {the number of teams that exist}

*team-size : {each team has a size}

ML-Baseball-Team

Instance : Class

Isa : Team

Cardinality : 26 {the number of baseball teams that exist}

*team-size : 24 {the default: 24 players per team}

*manager :

cont

Brooklyn-Dodgers

Instance : ML-Baseball-Team

Isa : ML-Baseball-Player

Team-size : 24

Manager : Leo-Durocher

*uniform-colour : Blue

Pee-Wee-Reese

Instance : Brooklyn-Dodgers

Instance : Fielder

Uniform-colour : Blue

Batting-Average : .309

Classes and Metaclasses

Class (set of sets)

Team --instance--> Class

ML-Baseball-Team --instance--> Class, --isa--> Team

Brooklyn-Dodgers --instance--> ML-Baseball-Team, --isa--> ML-Baseball-Player

Pee-Wee-Reese --instance--> Brooklyn-Dodgers

Representing Relationships among classes

Classes can be related to each other. Class1 can be a subset of Class2. Mutually-disjoint-with relates a class to one or more other classes that are guaranteed to have no elements in common with it. Is-covered-by relates a class to a set S of mutually disjoint classes; S is then called a partition of the class.

Representing Relationships among classes

Catcher --isa--> ML-Baseball-Player

Pitcher --isa--> ML-Baseball-Player

Fielder --isa--> ML-Baseball-Player

American-Leaguer --isa--> ML-Baseball-Player

National-Leaguer --isa--> ML-Baseball-Player

Three-Finger-Brown --instance--> Pitcher

Three-Finger-Brown --instance--> National-Leaguer

ML-Baseball-Player

Cont…

Is-covered-by : {Pitcher, Catcher, Fielder}, {National-Leaguer, American-Leaguer}

Pitcher

Isa : ML-Baseball-Player

Mutually-disjoint-with : {Catcher, Fielder}

Fielder

Isa : ML-Baseball-Player

Mutually-disjoint-with : {Pitcher, Catcher}

Catcher

Isa : ML-Baseball-Player

Mutually-disjoint-with : {Pitcher, Fielder}

National-Leaguer

Isa : ML-Baseball-Player

Three-Finger-Brown

Instance : Pitcher

Instance : National-Leaguer

Slot-Values as Objects

John

height : 72

Bill

height :

We could attempt to compare the slots by making the slots themselves into objects. We use lambda (λ) notation for creating such objects:

Cont…

John

height : 72; λx (x.height > Bill.height)

Bill

height : λx (x.height < John.height)
λ x( x.height < John.height )

Inheritance

Bird (fly: yes)

Ostrich (fly: no) --isa--> Bird

Pet-Bird --isa--> Bird

Fifi --instance--> Ostrich

Fifi --instance--> Pet-Bird

Fifi (fly: ?)

Cont…

Quaker (pacifist: true)

Republican (pacifist: false)

Dick --instance--> Quaker

Dick --instance--> Republican

Dick (pacifist: ?)

Solution

The solution to this problem is to use inferential distance instead of path length. Class1 is closer to Class2 than to Class3 if and only if Class1 has an inference path through Class2 to Class3 (Class2 is between Class1 and Class3).

Property inheritance

The set of competing values for a slot S in a frame F contains all those values that:

can be derived from some frame X that is above F in the isa hierarchy, and

are not contradicted by some frame Y that has a shorter inferential distance to F than X does.
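On a simple isa chain, "shorter inferential distance wins" reduces to taking the value from the nearest frame that defines the slot. A minimal sketch (the frame table is a simplification of the Fifi example, collapsing instance and isa links into one chain):

```python
# Slot lookup by climbing isa links: the nearest defining frame wins,
# so Ostrich's fly:no overrides Bird's fly:yes for Fifi.
frames = {
    "Bird":                 {"isa": None, "fly": "yes"},
    "Ostrich":              {"isa": "Bird", "fly": "no"},
    "Plumed-Ostrich":       {"isa": "Ostrich"},
    "White-Plumed-Ostrich": {"isa": "Plumed-Ostrich"},
    "Fifi":                 {"isa": "White-Plumed-Ostrich"},
}

def get_slot(name, slot):
    while name is not None:
        frame = frames[name]
        if slot in frame:
            return frame[slot]      # closest definition found
        name = frame["isa"]         # otherwise climb one level
    return None

print(get_slot("Fifi", "fly"))  # 'no' — inherited from Ostrich, not Bird
print(get_slot("Bird", "fly"))  # 'yes'
```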

Bird (fly: yes)

Ostrich (fly: no) --isa--> Bird

Pet-Bird --isa--> Bird

Plumed-Ostrich --isa--> Ostrich

White-Plumed-Ostrich --isa--> Plumed-Ostrich

Fifi --instance--> White-Plumed-Ostrich

Fifi --instance--> Pet-Bird

Fifi (fly: ?)

Cont…

Republican (pacifist: false)

Conservative-Republican --isa--> Republican

Quaker (pacifist: true)

Dick --instance--> Conservative-Republican

Dick --instance--> Quaker

Dick (pacifist: ?)

Reasoning Capabilities of Frames

Consistency checking, to verify a slot value when it is added to a frame. Propagation of definitional values along isa and instance links. Inheritance of default values along isa and instance links.

Frame Languages

The idea of the frame system as a way to represent declarative knowledge has been encapsulated in a series of frame-oriented knowledge representation languages:

KRL [Bobrow and Winograd, 1977], FRL [Roberts and Goldstein, 1977],

RLL,

KL-ONE,

KRYPTON,

NIKL,

CYCL,

Conceptual Graphs, THEO and FRAMEKIT

UNIT IV Strong-Slots and Filler Structures

Conceptual Dependency

Semantic networks and frame systems may have specialized links and inference procedures, but there are no rules about what kinds of objects and links are good in general for knowledge representation. Conceptual Dependency (CD) is a theory of how to represent the events described in natural language sentences.

Facilitates drawing inferences from sentences

Independent of the language in which sentences were originally stated

Conceptual Dependency …

CD provides

a structure into which nodes representing information can be placed

a specific set of primitives at a given level of granularity

Sentences are represented as a series of diagrams

The agent and the objects are represented

The actions are built up from a set of primitive acts which can be modified by tense.

Primitive Acts

ATRANS -- Transfer of an abstract relationship (e.g., give)
PTRANS -- Transfer of the physical location of an object (e.g., go)
PROPEL -- Application of physical force to an object (e.g., push)
MOVE -- Movement of a body part by its owner (e.g., kick)
GRASP -- Grasping of an object by an actor (e.g., clutch)
INGEST -- Ingestion of an object by an animal (e.g., eat)
EXPEL -- Expulsion of something from the body of an animal (e.g., cry)

Conceptual Dependency …

MTRANS -- Transfer of mental information (e.g., tell)
MBUILD -- Building new information out of old (e.g., decide)
SPEAK -- Production of sounds (e.g., say)
ATTEND -- Focusing of a sense organ toward a stimulus (e.g., listen)

Primitive Concepts

Conceptual categories provide the building blocks which are the allowable set of dependencies in the concepts in a sentence:

PP -- Real world objects (picture producers)
ACT -- Real world actions
PA -- Attributes of objects (modifiers of PPs)
AA -- Attributes of actions (modifiers of ACTs)
T -- Times
LOC -- Locations

Example

[Diagram] Raju ⇔(p) ATRANS, with o-link to book and R-link from Raju to man: "Raju gave the man a book"

Letters above the arrows indicate certain relationships:

o -- object
R -- recipient-donor
I -- instrument, e.g. eat with a spoon
D -- destination, e.g. going home

Double arrows (⇔) indicate the two-way link between the actor (PP) and the action (ACT)

Arrows indicate the direction of dependency
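The conceptualization for "Raju gave the man a book" can be written down as a plain record. This is an illustrative sketch with invented key names, not a standard CD library; the keys mirror the o- and R-links.

```python
# Hypothetical record for the conceptualization above; key names are mine.
cd = {
    "actor": "Raju",
    "act": "ATRANS",                              # transfer of an abstract relationship
    "tense": "p",                                 # p = past
    "object": "book",                             # the o-link
    "recipient": {"to": "man", "from": "Raju"},   # the R-link
}

def infer_possession(c):
    """One inference that a CD structure makes directly available:
    after an ATRANS, the 'to' filler of the R-link possesses the object."""
    if c["act"] == "ATRANS":
        return (c["recipient"]["to"], "poss", c["object"])
    return None

print(infer_possession(cd))   # ('man', 'poss', 'book')
```

The same inference rule applies to any sentence mapped to ATRANS (give, take, buy, ...), which is the point of reducing verbs to primitives.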

Modifiers…

The use of tense and mood in describing events is extremely important. The modifiers are:

p -- past
f -- future
t -- transition
ts -- start transition
tf -- finished transition
k -- continuing
delta -- timeless
c -- conditional
/ -- negative
? -- interrogative

The absence of any modifier implies the present tense.

Conceptual Dependency …

Arrows indicate the direction of dependency

The double arrow (⇔) is a two-way link between an actor (PP) and an action (ACT), i.e. PP ⇔ ACT

The triple arrow is also a two-way link, but between an object (PP) and its attribute (PA), i.e. PP ⇔ PA. It represents isa-type dependencies

1. PP ⇔ ACT: John ⇔ PTRANS -- "John ran"

2. PP ⇔ PA: John ⇔ height (> average) -- "John is tall"

3. PP ⇔ PP: John ⇔ doctor -- "John is a doctor"

4. PP ← PA: boy ← nice -- "A nice boy"

5. PP ← PP: dog ←(poss-by) John -- "John's dog"

6. ACT with an o-link: John ⇔ PROPEL, object cart -- "John pushed the cart"

7. ACT with an R-link: John ⇔(p) ATRANS, object book, recipient from Mary to John -- "John took the book from Mary"

8. ACT with an I-link: John ⇔(p) INGEST, object ice cream, instrument spoon -- "John ate ice cream with a spoon"

9. ACT with a D-link: John ⇔(p) PTRANS, object fertilizer, destination from bag to field -- "John fertilized the field"

10. State change: plants ⇔ size > x (from size = x) -- "The plants grew"

11. Causality: Bill ⇔(p) PROPEL, object bullet, R from gun to Bob, causing Bob ⇔ health(-10) -- "Bill shot Bob"

12. Time modifier: John ⇔(p) PTRANS, time yesterday -- "John ran yesterday"

13. While going home (I ⇔(p) PTRANS, destination home), I ⇔ MTRANS frog to CP via eyes -- "While going home, I saw a frog"

14. Location: I ⇔ MTRANS, object frog, from ears to CP, location the woods -- "I heard a frog in the woods"

[Diagram] one ⇔ INGEST, object smoke, R from cigarette, conditionally (c) causing one ⇔ dead (from alive); hence I ⇔(tf) INGEST smoke from cigarette -- "Since smoking can kill you, I stopped."

[Diagram] A CD structure built from MTRANS (Bill ⇔(p) MTRANS to John), believe, a conditional on what John does (do1), Bill's possible act (do2), and the state broken applied to the nose poss-by John -- "Bill threatened John with a broken nose."

Advantages with CD

Using these primitives involves fewer inference rules.

Many inference rules are already represented in the CD structure itself.

The holes in the initial structure help to focus on the points still to be established.

Disadvantages with CD

Knowledge must be decomposed into fairly low-level primitives.

It is impossible or difficult to find the correct set of primitives.

A lot of inference may still be required.

Representations can be complex even for relatively simple actions.

Scripts

Scripts are generally used to represent knowledge about common sequences of events. A script is a structure that describes a stereotyped sequence of events in a particular context. A script consists of a set of slots, each associated with some information.

Components of Scripts

Entry conditions: Conditions that must, in general, be satisfied before the events described in the script can occur.

Result: Conditions that will, in general, be true after the events described in the script have occurred.

Props: Slots representing objects that are involved in the events described in the script. The presence of these objects can be inferred even if they are not mentioned explicitly.

Roles: Slots representing people who are involved in the events described in the script. The presence of these people, too, can be inferred even if they are not mentioned explicitly. If specific individuals are mentioned, they can be inserted into the appropriate slots.

Track: The specific variation on a more general pattern that is represented by this particular script. Different tracks of the same script will share many but not all components.

Scenes: The actual sequences of events that occur. The events are represented in the conceptual dependency formalism.
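These components can be laid out as a plain structure, using the classic restaurant script as the example. The slot fillers are abbreviated and the helper is illustrative, not a standard library.

```python
# A sketch of the classic restaurant script as a plain structure
# (slot fillers abbreviated; names are illustrative, not a standard library).
restaurant_script = {
    "track": "coffee shop",
    "props": ["tables", "menu", "food", "bill", "money"],
    "roles": ["customer", "waiter", "cook", "cashier"],
    "entry_conditions": ["customer is hungry", "customer has money"],
    "results": ["customer has less money", "customer is not hungry"],
    "scenes": ["entering", "ordering", "eating", "exiting"],
}

def can_enter(script, facts):
    """A script applies only when all of its entry conditions hold."""
    return all(cond in facts for cond in script["entry_conditions"])

print(can_enter(restaurant_script, {"customer is hungry", "customer has money"}))  # True
```

Once the script is activated, the props, roles, and results license exactly the kind of default inferences described above, even when the story never mentions them.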

Planning

Contents

Introduction to Planning Blocks world Problem Components of Planning system

Green’s approach STRIPS

Goal Stack Planning

Sussman Anomaly

Non Linear Planning Using Constraint Posting

TWEAK Algorithm

Hierarchical Planning

Planning

Planning problems are hard; they are certainly non-trivial. Methods that focus on ways of decomposing the original problem into appropriate subparts, and on ways of handling interactions among the subparts during the problem-solving process, are often called planning. Planning refers to the process of computing several steps of a problem-solving procedure before executing any of them.

Block World Problem

There are a number of square blocks, all the same size. They can be stacked one upon another. There is a robot arm that can manipulate the blocks; the arm can hold only one block at a time.

Robot Actions

UNSTACK(A,B)- Pick up block A from its current position on block B. The arm must be empty and block A must have no block on top of it.

STACK(A,B)- Place block A on block B. The arm must already be holding block A and the surface of B must be clear.

Robot Actions

PICKUP(A)- Pick up block A from the table and hold it. The arm must be empty and there must be nothing on top of block A.

PUTDOWN(A)- Put block A down on the table. The arm must have been holding block A.

Set of Predicates

ON(A,B)- Block A is on block B. ONTABLE(A)- Block A is on the table. CLEAR(A)-There is nothing on top of block A. HOLDING(A)- The arm is holding block A. ARMEMPTY- The arm is holding nothing.

Logical Statements

[∃x: HOLDING(x)] → ¬ARMEMPTY

∀x: ONTABLE(x) → ¬∃y: ON(x,y)

∀x: [¬∃y: ON(y,x)] → CLEAR(x)
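These axioms can be checked mechanically against a concrete blocks-world state, with the state represented as a set of ground predicates. This is an illustrative sketch; the helper names are mine.

```python
# Sketch: a state as a set of ground predicates, with the three axioms
# written as checks over that set (helper names are mine).
state = {("ON", "A", "B"), ("ONTABLE", "B"), ("CLEAR", "A"), ("ARMEMPTY",)}
blocks = {"A", "B"}

def holding(s):
    """Blocks currently held by the arm."""
    return {f[1] for f in s if f[0] == "HOLDING"}

def on_top_of(s, x):
    """Blocks sitting directly on block x."""
    return {f[1] for f in s if f[0] == "ON" and f[2] == x}

# Axiom 1: if anything is held, the arm is not empty
assert not (holding(state) and ("ARMEMPTY",) in state)
# Axiom 2: a block on the table sits on no other block
assert all(not any(f[0] == "ON" and f[1] == x for f in state)
           for x in blocks if ("ONTABLE", x) in state)
# Axiom 3: a block with nothing on top of it is clear
assert all(("CLEAR", x) in state for x in blocks if not on_top_of(state, x))
print("axioms hold in this state")
```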

Components of Planning System


1. Choose the best rule to apply
2. Apply the chosen rule
3. Detect when a solution has been found
4. Detect dead ends
5. Repair an almost correct solution

1. Choose the Best Rule

Isolate a set of differences between the desired goal state and the current state

Identify those rules that are relevant to reducing those differences (Means-Ends Analysis)

If several rules are found, then choose the best one using heuristic information

2. Apply Rules

In simple systems, applying rules is easy: each rule simply specifies the problem state that would result from its application. In complex systems, we must be able to deal with rules that specify only a small part of the complete problem state. One way is to describe, for each action, each of the changes it makes to the state description.

Green's Approach (Applying Rules)

The changes to a state produced by the application of a rule are described by giving each predicate an extra situation argument. For example:

UNSTACK(x,y): [CLEAR(x,s) ∧ ON(x,y,s)] → [HOLDING(x,DO(UNSTACK(x,y),s)) ∧ CLEAR(y,DO(UNSTACK(x,y),s))]

The initial state of the problem is S0. If we execute UNSTACK(A,B) in state S0, the state S1 that results from the unstacking operation satisfies HOLDING(A,S1) ∧ CLEAR(B,S1).
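A rough sketch of this style in code: each fact carries the situation it holds in, and DO builds the new situation term. The names are mine; this illustrates the idea rather than Green's actual system.

```python
# Rough sketch of the situation-variable idea (names are mine; this is an
# illustration of the style, not Green's actual system).
def DO(action, s):
    """Build the situation term DO(action, s)."""
    return (action, s)

def unstack(x, y, s, facts):
    """If CLEAR(x,s) and ON(x,y,s) hold, the resulting situation
    s1 = DO(UNSTACK(x,y), s) satisfies HOLDING(x,s1) and CLEAR(y,s1)."""
    if ("CLEAR", x, s) in facts and ("ON", x, y, s) in facts:
        s1 = DO(("UNSTACK", x, y), s)
        return facts | {("HOLDING", x, s1), ("CLEAR", y, s1)}, s1
    raise ValueError("preconditions do not hold in situation %r" % (s,))

facts = {("CLEAR", "A", "S0"), ("ON", "A", "B", "S0")}
facts, s1 = unstack("A", "B", "S0", facts)
print(("HOLDING", "A", s1) in facts)   # True
```

Note that facts about old situations are never erased, which is exactly why so many frame axioms are needed in this approach.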

Green's Approach (Applying Rules)

Advantages: Resolution can be applied on the state description.

Disadvantages: Many rules are required to represent a problem; it is difficult to represent complex problems.

STRIPS (Applying Rules)

In the STRIPS approach, each operator is described by a set of lists of predicates. STRIPS has three lists: ADD, DELETE and PRECONDITION.

ADD -- a list of things that become TRUE
DELETE -- a list of things that become FALSE
PRECONDITION -- a set of prerequisites that must be true before the operator can be applied

STRIPS (Applying Rules)

STACK(x,y)
P: CLEAR(y) ∧ HOLDING(x)
D: CLEAR(y) ∧ HOLDING(x)
A: ARMEMPTY ∧ ON(x,y)

UNSTACK(x,y)
P: ON(x,y) ∧ CLEAR(x) ∧ ARMEMPTY
D: ON(x,y) ∧ ARMEMPTY
A: HOLDING(x) ∧ CLEAR(y)

STRIPS (Applying Rules)

PICKUP(x)
P: CLEAR(x) ∧ ONTABLE(x) ∧ ARMEMPTY
D: ONTABLE(x) ∧ ARMEMPTY
A: HOLDING(x)

PUTDOWN(x)
P: HOLDING(x)
D: HOLDING(x)
A: ONTABLE(x) ∧ ARMEMPTY

STRIPS (Applying Rules)

If a new attribute is introduced, we do not need to add new axioms for existing operators. Unlike in Green's method, we remove the state indicator and use a database of predicates to indicate the current state.

Thus, if the last state was

ONTABLE(B) ∧ ON(A,B) ∧ CLEAR(A) ∧ ARMEMPTY

then after the UNSTACK(A,B) operation the new state is

ONTABLE(B) ∧ CLEAR(B) ∧ HOLDING(A) ∧ CLEAR(A)
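The same UNSTACK step can be sketched with explicit P/D/A sets over a set-of-predicates state. The function names are mine; this is an illustration of the STRIPS rule format, not the original system.

```python
# Sketch: the UNSTACK(A,B) step with explicit P/D/A sets over a
# set-of-predicates state (function names are mine).
UNSTACK_A_B = {
    "P": {("ON", "A", "B"), ("CLEAR", "A"), ("ARMEMPTY",)},   # preconditions
    "D": {("ON", "A", "B"), ("ARMEMPTY",)},                   # delete list
    "A": {("HOLDING", "A"), ("CLEAR", "B")},                  # add list
}

def apply_op(state, op):
    """Apply op when its preconditions hold: remove DELETE, then union ADD."""
    if not op["P"] <= state:
        raise ValueError("preconditions not satisfied")
    return (state - op["D"]) | op["A"]

s0 = {("ONTABLE", "B"), ("ON", "A", "B"), ("CLEAR", "A"), ("ARMEMPTY",)}
s1 = apply_op(s0, UNSTACK_A_B)
# s1 is {ONTABLE(B), CLEAR(B), HOLDING(A), CLEAR(A)}, as in the text
```

Anything not mentioned in the DELETE or ADD list simply persists, which is how STRIPS avoids Green's frame axioms.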

3. Detecting a Solution

A planning system has succeeded in finding a solution to a problem when it has found a sequence of operators that transforms the initial problem state into the goal state. In simple problem-solving systems, we recognize the solution by a straightforward match of the state descriptions. But in complex problems, where different reasoning mechanisms are used to describe the problem states, those same reasoning mechanisms can be used to discover when a solution has been found.

4. Detecting Dead Ends

A planning system must be able to detect when it is exploring a path that can never lead to a solution. The same reasoning mechanisms can be used to detect dead ends. If the search process is reasoning forward from the initial state, it can prune any path that leads to a state from which the goal state cannot be reached. Similarly, in backward reasoning, some states can be pruned from the search space.

5. Repairing an Almost Correct Solution

In completely decomposable problems, we can solve the subproblems and combine the subsolutions to yield a solution to the original problem.

When trying to solve nearly decomposable problems, one way is to use the Means-Ends Analysis technique to minimize the difference between the initial state and the goal state.

A better way is to represent knowledge about what went wrong and then apply a direct patch.

Goal Stack Planning

Goal stack planning uses a goal stack for solving compound goals, taking advantage of the STRIPS method.

The problem solver makes use of a single stack that contains both goals and operators, together with a database:

Database: describes the current situation

Set of operators: described by PRECONDITION, ADD and DELETE lists
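A minimal sketch of the goal-stack loop: pop a goal; if it already holds in the database, discard it; otherwise push an operator that adds it, followed by that operator's preconditions. This is heavily simplified (one goal, one matching operator, no conjunction re-checking), and the names are mine.

```python
# Heavily simplified goal-stack loop (names are mine): one goal, one
# matching operator, preconditions assumed achievable directly.
OPS = {("HOLDING", "A"): {
    "P": {("CLEAR", "A"), ("ONTABLE", "A"), ("ARMEMPTY",)},
    "D": {("ONTABLE", "A"), ("ARMEMPTY",)},
    "A": {("HOLDING", "A")},
    "name": "PICKUP(A)",
}}

def plan(db, goal):
    stack, steps = [goal], []
    while stack:
        g = stack.pop()
        if isinstance(g, dict):          # an operator: apply it to the database
            db = (db - g["D"]) | g["A"]
            steps.append(g["name"])
        elif g not in db:                # unsatisfied goal: expand it
            op = OPS[g]
            stack.append(op)             # apply after the preconditions
            stack.extend(op["P"])        # push preconditions as subgoals
        # goals already satisfied in db are simply popped
    return steps, db

db = {("CLEAR", "A"), ("ONTABLE", "A"), ("ARMEMPTY",)}
steps, db = plan(db, ("HOLDING", "A"))
print(steps)   # ['PICKUP(A)']
```

The full algorithm must also re-verify compound goals after their components are achieved, which is where interactions like the Sussman anomaly show up.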

Goal Stack Planning…

Simple Blocks world problem

Goal Stack Planning…  Simple Blocks world problem

Example

We can describe the start state or the goal state as

ON(