
MODULE – LECTURE NOTES ITS462 UNCERTAINTY REASONING

CHAPTER 5
REASONING IN UNCERTAIN SITUATIONS

CERTAINTY FACTOR
  CERTAINTY FACTOR FOR SINGLE ANTECEDENT RULE
  CERTAINTY FACTOR FOR MULTIPLE ANTECEDENTS RULE
  CERTAINTY FACTOR FOR MULTIPLE RULES WITH SAME HYPOTHESIS

FUZZY LOGIC
  FUZZY SETS
  HEDGES
  FUZZY INFERENCES

PRACTICE MAKES PERFECT


5 REASONING IN UNCERTAIN SITUATIONS

Through most of this module, the inference procedures have followed the model of reasoning used
in predicate calculus: from correct premises, sound inference rules produce new, guaranteed
correct conclusions. However, there are many situations this approach does not fit, because the
evidence is poorly formed or uncertain and the inference rules used may be unsound.

In almost every aspect of our daily lives, we successfully draw useful conclusions from
incomplete and imprecise data. Doctors deliver correct medical diagnoses and recommend
treatment from various ambiguous symptoms. People recognize other people from their voice
or their gestures. All of these are examples of reasoning in uncertain situations.

The example below demonstrates the problem of reasoning in an ambiguous situation:

IF speed = high AND load = high THEN burn = very fast.


IF speed = moderate AND load = high THEN burn = fast.

The use of words like 'high', 'moderate', 'fast' and 'very fast' shows that the situation is
uncertain. Several techniques can be used to reason in such situations; however, only two are
discussed in this module: Certainty Factor and Fuzzy Logic.


CERTAINTY FACTOR

Certainty factor theory is a popular technique for reasoning under uncertainty. The basic
principles of this theory were first introduced in MYCIN, an expert system for the diagnosis
and therapy of blood infections and meningitis. The developers of MYCIN found that medical
experts expressed the strength of their belief in terms that were neither logically nor
mathematically consistent. In addition, there was no reliable statistical data about the
problem domain. Table 1 shows how several uncertain terms are interpreted as certainty factors.

Table 1. Uncertain terms and their interpretation


Term                    Certainty Factor (cf)
Definitely not          -1.0
Almost certainly not    -0.8  (-0.99 to -0.75)
Probably not            -0.6  (-0.74 to -0.55)
Maybe not               -0.4  (-0.54 to -0.25)
Unknown                 -0.2 to +0.2  (-0.24 to +0.24)
Maybe                   +0.4  (+0.25 to +0.54)
Probably                +0.6  (+0.55 to +0.74)
Almost certainly        +0.8  (+0.75 to +0.99)
Definitely              +1.0

The maximum value of a certainty factor (cf) is +1.0 (definitely true) and the minimum is -1.0
(definitely false). A positive value represents a degree of belief and a negative value
represents a degree of disbelief.

The certainty factors theory is based on two functions:

Measure of Belief (MB)

MB(H,E) = 1                                                      if p(H) = 1
        = [max(p(H|E), p(H)) - p(H)] / [max(1, 0) - p(H)]        otherwise

where:
p(H) is the prior probability that hypothesis H is true
p(H|E) is the probability that hypothesis H is true given evidence E.

Measure of belief indicates the degree to which belief in hypothesis (H) would be
increased if evidence (E) were observed.


Measure of Disbelief (MD)

MD(H,E) = 1                                                      if p(H) = 0
        = [min(p(H|E), p(H)) - p(H)] / [min(1, 0) - p(H)]        otherwise

where:
p(H) is the prior probability that hypothesis H is true
p(H|E) is the probability that hypothesis H is true given evidence E.

Measure of disbelief indicates the degree to which disbelief in hypothesis (H) would be
increased if evidence (E) were observed.

Both measures range between 0 and 1. Combining the two measures produces the certainty factor,
which determines the total strength of belief or disbelief in a hypothesis. The certainty
factor is computed using the following equation:

cf = [MB(H,E) - MD(H,E)] / (1 - min[MB(H,E), MD(H,E)])

Thus, the cf value will always range between -1 and +1, indicating the total belief in hypothesis H.
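As a minimal Python sketch, the three definitions above can be written directly as functions
(the function names are illustrative only, not part of the notes):

def mb(p_h, p_h_given_e):
    """Measure of belief MB(H,E), computed from the prior p(H) and p(H|E)."""
    if p_h == 1:
        return 1.0
    # max[1, 0] - p(H) simplifies to 1 - p(H)
    return (max(p_h_given_e, p_h) - p_h) / (1.0 - p_h)

def md(p_h, p_h_given_e):
    """Measure of disbelief MD(H,E), computed from the prior p(H) and p(H|E)."""
    if p_h == 0:
        return 1.0
    # min[1, 0] - p(H) simplifies to -p(H)
    return (min(p_h_given_e, p_h) - p_h) / (0.0 - p_h)

def certainty_factor(p_h, p_h_given_e):
    """Combine MB and MD into a single cf in the range [-1, +1]."""
    b, d = mb(p_h, p_h_given_e), md(p_h, p_h_given_e)
    return (b - d) / (1.0 - min(b, d))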

Usually, the focus is on finding the net certainty of the rule consequent when the evidence in
the rule antecedent is uncertain. This is shown as follows:

IF A is X ( given that this is uncertain )


THEN B is Y ( how certain will this be? )

Certainty Factor for Single Antecedent Rule

The net certainty for a single antecedent rule, cf(H,E), can be easily computed by multiplying
the certainty factor of the antecedent, cf(E), with the rule certainty factor, cf.

cf(H,E) = cf(E) x cf

For example,

IF the sky is clear


THEN the forecast is sunny {cf 0.8}

and the current certainty factor of 'the sky is clear' is 0.5, then

cf(H,E) = 0.5 x 0.8 = 0.4

This result, according to Table 1, would read as 'it may be sunny'.
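As a small Python sketch of this propagation rule (the worked example above appears in the comment):

def cf_single(cf_evidence, cf_rule):
    """Net certainty of the consequent for a single-antecedent rule."""
    return cf_evidence * cf_rule

# cf(sky is clear) = 0.5, rule cf = 0.8  ->  ≈ 0.4 ('maybe')
print(cf_single(0.5, 0.8))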


Certainty Factor for Multiple Antecedents Rule

There are two possible cases for multiple-antecedent rules: conjunctive rules and disjunctive
rules.

Conjunctive rule
The net certainty for a conjunctive multiple-antecedent rule, cf(H, E1 ∧ E2 ∧ E3 ∧ … ∧ En), can be
computed as follows:

cf(H, E1 ∧ E2 ∧ E3 ∧ … ∧ En) = min [cf(E1), cf(E2), cf(E3), … , cf(En)] x cf

Example Given the following rule:

IF the sky is clear


AND the temperature is hot
AND the forecast is sunny
THEN action is wear sunglasses {cf 0.8}

and given that the current certainty factors of the evidence are as follows:
cf(sky is clear) = 0.9, cf(temperature is hot) = 0.8, cf(forecast is sunny) = 0.7
Find the cf for action is wear sunglasses.

Solution: cf(H, E1 ∧ E2 ∧ E3) = min[0.9, 0.8, 0.7] x 0.8
                              = 0.7 x 0.8
                              = 0.56

Therefore it can be interpreted as 'it would probably be a good idea to wear sunglasses'.

Disjunctive rule
The net certainty for a disjunctive multiple-antecedent rule, cf(H, E1 ∨ E2 ∨ E3 ∨ … ∨ En), can be
computed as follows:

cf(H, E1 ∨ E2 ∨ E3 ∨ … ∨ En) = max [cf(E1), cf(E2), cf(E3), … , cf(En)] x cf

Example Given the following rule:

IF the sky is clear


OR the temperature is hot
OR the forecast is sunny
THEN action is wear sunglasses {cf 0.8}

and given that the current certainty factors of the evidence are as follows:

cf(sky is clear) = 0.9, cf(temperature is hot) = 0.8, cf(forecast is sunny) = 0.7


Find the cf for action is wear sunglasses.


Solution: cf(H, E1 ∨ E2 ∨ E3) = max[0.9, 0.8, 0.7] x 0.8
                              = 0.9 x 0.8
                              = 0.72

Therefore it can be interpreted as 'sunglasses should probably be worn today'.
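A minimal Python sketch of the conjunctive and disjunctive cases above (function names are
illustrative, not from the notes):

def cf_conjunctive(cf_evidence, cf_rule):
    """cf(H, E1 ∧ ... ∧ En): minimum of the evidence cfs, scaled by the rule cf."""
    return min(cf_evidence) * cf_rule

def cf_disjunctive(cf_evidence, cf_rule):
    """cf(H, E1 ∨ ... ∨ En): maximum of the evidence cfs, scaled by the rule cf."""
    return max(cf_evidence) * cf_rule

print(cf_conjunctive([0.9, 0.8, 0.7], 0.8))  # ≈ 0.56
print(cf_disjunctive([0.9, 0.8, 0.7], 0.8))  # ≈ 0.72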

Certainty Factor for Multiple Rules with Same Hypothesis

Sometimes, two or more rules can affect the same hypothesis. This occurs when the same
consequent is obtained as a result of the execution of two or more rules.

For example:
Rule 1 : IF A is X
THEN C is Z {cf 0.8}

Rule 2 : IF B is Y
THEN C is Z {cf 0.6}

In the example above, both rules produce the same consequent, C is Z. Therefore, the individual
certainty factors obtained from each rule should be combined using the following equations:
cf(H) = cf1 + cf2 x (1 - cf1)                    if cf1 > 0 and cf2 > 0 (both positive values)
      = (cf1 + cf2) / (1 - min[|cf1| , |cf2|])   if cf1 and cf2 have opposite signs (one positive and one negative value)
      = cf1 + cf2 x (1 + cf1)                    if cf1 < 0 and cf2 < 0 (both negative values)

where:
cf1 is the confidence in hypothesis H established by Rule 1.
cf2 is the confidence in hypothesis H established by Rule 2.
|cf1| and |cf2| are the absolute magnitudes of cf1 and cf2, respectively.
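As a minimal Python sketch of this combination scheme (the three test values correspond to the
cases worked through below):

def cf_combine(cf1, cf2):
    """Combine two certainty factors that support the same hypothesis."""
    if cf1 > 0 and cf2 > 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    # one positive and one negative value
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

print(cf_combine(0.72, 0.05))    # ≈ 0.734  (Case 1)
print(cf_combine(0.72, -0.05))   # ≈ 0.705  (Case 2)
print(cf_combine(-0.72, -0.05))  # ≈ -0.734 (Case 3)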


Case 1: Both positive cf

Example Given the following rules:

Rule 1 : IF today is rain


THEN tomorrow is rain {cf 0.8}

Rule 2 : IF today is dry


THEN tomorrow is rain {cf 0.5}

and given that the current certainty factors of the evidence are as follows:

cf(today is rain) = 0.9, cf(today is dry) = 0.1.

Find the cf for tomorrow is rain.

Solution: cf1(tomorrow is rain) = 0.9 x 0.8
                                = 0.72

cf2(tomorrow is rain) = 0.1 x 0.5
                      = 0.05

cf(tomorrow is rain) = cf1 + cf2 x (1 - cf1)
                     = 0.72 + 0.05 x (1 - 0.72)
                     = 0.734

Therefore it can be interpreted as 'it will probably rain tomorrow'.

Case 2: One positive cf and one negative cf

Example Given the following rules:

Rule 1 : IF today is rain


THEN tomorrow is rain {cf 0.8}

Rule 2 : IF today is dry


THEN tomorrow is rain {cf 0.5}

and given that the current certainty factors of the evidence are as follows:

cf(today is rain) = 0.9, cf(today is dry) = -0.1.

Find the cf for tomorrow is rain.


Solution: cf1(tomorrow is rain) = 0.9 x 0.8
                                = 0.72

cf2(tomorrow is rain) = -0.1 x 0.5
                      = -0.05

cf(tomorrow is rain) = (cf1 + cf2) / (1 - min[|cf1| , |cf2|])
                     = (0.72 + (-0.05)) / (1 - min[|0.72| , |-0.05|])
                     = 0.67 / (1 - 0.05)
                     = 0.705

Therefore it can be interpreted as 'it will probably rain tomorrow'.

Case 3: Both negative cf

Example Given the following rules:

Rule 1 : IF today is rain


THEN tomorrow is rain {cf 0.8}

Rule 2 : IF today is dry


THEN tomorrow is rain {cf 0.5}

and given that the current certainty factors of the evidence are as follows:

cf(today is rain) = -0.9, cf(today is dry) = -0.1.

Find the cf for tomorrow is rain.

Solution: cf1(tomorrow is rain) = -0.9 x 0.8
                                = -0.72

cf2(tomorrow is rain) = -0.1 x 0.5
                      = -0.05

cf(tomorrow is rain) = cf1 + cf2 x (1 + cf1)
                     = -0.72 + (-0.05) x (1 + (-0.72))
                     = -0.734

Therefore it can be interpreted as 'it will probably not rain tomorrow'.


FUZZY LOGIC

Besides certainty factors, fuzzy logic is another way to reason in uncertain situations. Experts
usually rely on common sense when solving a problem. First-order predicate calculus is a logic
in which an interpretation requires mapping symbols into sets in order to assign a truth value.
For example, you are an astronaut only if you are a member of the set that lists all
astronauts. This kind of logic is crisp: either an object is a member of a set or it is not.

Fuzzy logic, introduced by Lotfi Zadeh in the 1960s, is a form of logic that allows partial set
membership. The idea was to provide a reasoning mechanism that could use fuzzy variables.

In fuzzy logic, linguistic variables are usually used to describe the quantities of interest,
and they are assigned linguistic values. Sometimes, hedges are used to modify the values
assigned.

Terms & Definitions:

Fuzzy logic          A branch of logic that uses degrees of membership in sets rather than a
                     strict true/false value.

Linguistic variable  A term used in our natural language to describe a concept that usually has
                     vague, fuzzy values.

Example:

IF speed is high
AND load is high
THEN burn is very fast.

Linguistic variables : speed, load and burn
Linguistic values    : high, fast
Hedges               : very

FUZZY SETS

The concept of a set is fundamental to mathematics. For example, the word car denotes the set
of all cars; when we say 'a car', we mean one element of the set of cars.

Let X be a classical (crisp) set and x an element. Then x either belongs to X (x ∈ X) or does
not belong to X (x ∉ X). A crisp set therefore imposes a sharp, clear-cut boundary: each element
x can be assigned the value 1 or 0, meaning that it belongs or does not belong to the set,
respectively. For example, we could ask the question, 'Is the man tall?'. With a crisp set, the
answer can only be yes (1) or no (0).


However, fuzzy logic lets us express set membership more flexibly than classical sets do. The
question we should ask is, 'How tall is the man?', and the answer could be 'quite tall', 'very
tall', and so on.

The following figure illustrates the difference between a crisp set and a fuzzy set. According
to figure (a), if Halim's height is 160 cm, he is not a tall man. However, if we consider the
fuzzy set in figure (b), Halim belongs to the 'tall men' set with a degree of membership of 0.2.

The fuzzy membership value is always calculated using the membership function represented by
the graph.

[Figure: (a) crisp set of 'tall men' and (b) fuzzy set of 'tall men'; degree of membership
(0.0 to 1.0) plotted against height (150 to 185 cm).]

A fuzzy set can also be represented as a list of elements together with their degrees of
membership. Assume we have a universe of discourse X and a fuzzy set A defined on it:

X = { x1, x2, x3, … , xn }

The fuzzy set A defines a membership function μA(x) that maps each element xi of X to a degree
of membership in [0, 1]:

A = { a1, a2, a3, … , an } where ai = μA(xi)

The symbol '/' is used to associate each membership value ai with its element xi,
as follows:

A = { a1/x1, a2/x2, a3/x3, … , an/xn }


For example, consider the following fuzzy sets of short, medium and tall people (refer to the
graph below):

Short  = { 1/130, 1/140, 0.5/150, 0/160 }
Medium = { 0/140, 0.5/150, 1/160, 0.5/170, 0/180 }
Tall   = { 0/160, 0.5/170, 1/180, 1/190 }

[Figure: membership functions of 'short', 'medium' and 'tall' plotted against height
(130 to 190 cm).]

FUZZY SET OPERATIONS

As with classical sets, several operations can be performed on fuzzy sets, such as
intersection, union and complementation.

Refer to the sets below to understand each operation:

Short = { 1/130, 1/140, 0.5/150, 0/160 }


Medium = { 0/140, 0.5/150, 1/160, 0.5/170, 0/180 }
Tall = { 0/160, 0.5/170, 1/180, 1/190 }

INTERSECTION

In classical set theory, the intersection of two sets contains the elements common to both.
In fuzzy sets, an element may be partially in both sets.

μA∩B(x) = min( μA(x), μB(x) )

Example:
Medium ∩ Tall = { 0/160, 0.5/170, 0/180 }
(as shown in the shaded area of the graph)

[Figure: shaded intersection of 'medium' and 'tall' on the height axis (130 to 190 cm).]


UNION

The union of two sets contains the elements that belong to one or both sets.

μA∪B(x) = max( μA(x), μB(x) )

Example:
Medium ∪ Tall = { 0/140, 0.5/150, 1/160, 0.5/170, 1/180, 1/190 }
(as shown in the shaded area of the graph)

[Figure: shaded union of 'medium' and 'tall' on the height axis (130 to 190 cm).]

COMPLEMENTATION

μ¬A(x) = 1 - μA(x)

Example:
¬Medium = { 1/140, 0.5/150, 0/160, 0.5/170, 1/180 }
(as shown in the shaded area of the graph)

[Figure: shaded complement of 'medium' on the height axis (130 to 190 cm).]
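A minimal Python sketch of the three operations, representing a fuzzy set as a dictionary that
maps elements to degrees of membership (this representation and the function names are
illustrative only):

# Fuzzy sets written as {element: degree of membership}
medium = {140: 0.0, 150: 0.5, 160: 1.0, 170: 0.5, 180: 0.0}
tall   = {160: 0.0, 170: 0.5, 180: 1.0, 190: 1.0}

def fuzzy_intersection(a, b):
    """Intersection: minimum membership of the elements present in both sets."""
    return {x: min(a[x], b[x]) for x in a if x in b}

def fuzzy_union(a, b):
    """Union: maximum membership over all elements (missing elements count as 0)."""
    return {x: max(a.get(x, 0.0), b.get(x, 0.0)) for x in sorted(set(a) | set(b))}

def fuzzy_complement(a):
    """Complement: one minus the membership of each element."""
    return {x: 1.0 - m for x, m in a.items()}

print(fuzzy_intersection(medium, tall))  # {160: 0.0, 170: 0.5, 180: 0.0}
print(fuzzy_union(medium, tall))         # matches Medium ∪ Tall above
print(fuzzy_complement(medium))          # matches ¬Medium above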


HEDGES

Another term to be familiarized is ‘hedges’. A hedge is a qualifier of a fuzzy set used to


modify its shape. Hedges includes adverbs such as ‘very’, ‘somewhat’, ‘quite’, ‘more or less’,
and ‘slightly’. They perform mathematical operations of concentration, dilation or
intensification by reducing or increasing the degree of membership of fuzzy elements.

Hedge                     Mathematical Expression
A little                  [μA(x)]^1.3
Slightly                  [μA(x)]^1.7
Very                      [μA(x)]^2
Extremely                 [μA(x)]^3
Very very                 [μA(x)]^4
More or less / Somewhat   √μA(x)
Indeed                    2[μA(x)]^2 if 0 ≤ μA(x) ≤ 0.5;  1 - 2[1 - μA(x)]^2 if 0.5 < μA(x) ≤ 1
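A minimal Python sketch of a few hedges from the table (the function names are illustrative):

import math

def very(mu):
    """Concentration: 'very' squares the membership value."""
    return mu ** 2

def more_or_less(mu):
    """Dilation: 'more or less' / 'somewhat' takes the square root."""
    return math.sqrt(mu)

def indeed(mu):
    """Intensification: sharpen membership values around 0.5."""
    return 2 * mu ** 2 if mu <= 0.5 else 1 - 2 * (1 - mu) ** 2

mu_tall = 0.2                 # e.g. Halim's membership in 'tall men'
print(very(mu_tall))          # ≈ 0.04 -> 'very tall' to a much smaller degree
print(more_or_less(mu_tall))  # ≈ 0.45
print(indeed(mu_tall))        # 0.08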


FUZZY INFERENCES

There are several stages in fuzzy inference:


1. Fuzzification
2. Match the rules
3. Determine the degree to which the conclusion is supported
4. Combining all rule outputs
5. Defuzzification

[Figure: fuzzy inference engine. A crisp input is fuzzified (stage 1) into a fuzzy input; the
rules in the rule base are matched using the membership functions and the degree to which each
conclusion is supported is determined (stages 2 and 3); all rule outputs are combined (stage 4);
and the resulting fuzzy output is defuzzified (stage 5) into a crisp output.]

Assuming the rules and fuzzy sets below are given:


Rule 1: IF speed_error is zero
AND acceleration is positive
THEN throttle is ‘reduce small amount’
Rule 2: IF speed_error is positive
OR acceleration is negative
THEN throttle is ‘reduce high amount’
Rule 3: IF speed_error is negative
THEN throttle is ‘increase small amount’
[Figure: (a) fuzzy sets (negative, zero, positive) for speed_error and (b) fuzzy sets
(negative, zero, positive) for acceleration, both defined over the range -15 to 15.]


[Figure: (c) fuzzy sets for throttle (reduce high, reduce small, constant, increase small,
increase high), defined over the range -15 to 15.]

1. FUZZIFICATION

This stage converts (fuzzifies) a crisp value into a fuzzy value. For example, refer to the
figure below. Suppose that, from the readings on the sensors, the speed error is 1 and the
acceleration is 8.

[Figure: reading the membership values off the fuzzy sets for speed_error = 1 and
acceleration = 8.]

From the graphs above, we obtain μ speed_error=positive(1) = 0.2, μ speed_error=zero(1) = 0.85,
μ acceleration=negative(8) = 0.8, and μ acceleration=zero(8) = 0.25.
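The notes define these membership functions only graphically; as an illustration, the
fuzzification step can be sketched in Python with triangular membership functions whose
breakpoints are assumed here, so they only approximate the values read from the graphs:

def triangular(x, left, peak, right):
    """Triangular membership function: rises from `left` to `peak`, then falls to `right`."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

# Hypothetical breakpoints for two of the speed_error sets (for illustration only)
mu_speed_error_zero     = triangular(1, -7.5, 0.0, 7.5)   # ≈ 0.87 (graph reads 0.85)
mu_speed_error_positive = triangular(1,  0.0, 7.5, 15.0)  # ≈ 0.13 (graph reads 0.2)
print(mu_speed_error_zero, mu_speed_error_positive)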

2. MATCHING RULES

During the inference process, the consequent receives the membership value of its antecedent.

Example: IF speed_error is negative


THEN throttle is ‘increase small amount’

μ speed_error=negative = 0.0

Therefore, μ throttle='increase small' = 0.0


If more than one antecedent is linked with an AND operator, take the minimum value.

Example: IF speed_error is zero


AND acceleration is zero
THEN throttle is ‘reduce small amount’

μ speed_error=zero = 0.85 and μ acceleration=zero = 0.25

Therefore, μ throttle='reduce small' = min(0.85, 0.25)
                                     = 0.25

If more than one antecedent is linked with an OR operator, take the maximum value.

Rule 2: IF speed_error is positive


OR acceleration is negative
THEN throttle is ‘reduce high amount’

μ speed_error=positive = 0.2, and μ acceleration=negative = 0.8

Therefore, μ throttle='reduce high' = max(0.2, 0.8)
                                    = 0.8

3. DETERMINE THE DEGREE TO WHICH THE CONCLUSION IS SUPPORTED

From all three rules fired, we can determine the degree to which the conclusion is supported.

Rule 1:
μ throttle='reduce small' = 0.25

Rule 2:
μ throttle='reduce high' = 0.8

Rule 3:
μ throttle='increase small' = 0.0
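As a minimal Python sketch of steps 2 and 3, following the worked antecedents above (the
dictionary layout is illustrative only):

# Membership values obtained in the fuzzification step
mu = {
    ("speed_error", "negative"): 0.0,
    ("speed_error", "zero"): 0.85,
    ("speed_error", "positive"): 0.2,
    ("acceleration", "negative"): 0.8,
    ("acceleration", "zero"): 0.25,
}

# Firing strength of each rule: AND -> min, OR -> max
reduce_small   = min(mu[("speed_error", "zero")], mu[("acceleration", "zero")])          # 0.25
reduce_high    = max(mu[("speed_error", "positive")], mu[("acceleration", "negative")])  # 0.8
increase_small = mu[("speed_error", "negative")]                                         # 0.0
print(reduce_small, reduce_high, increase_small)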


4. COMBINING ALL RULE OUTPUTS

The rule outputs are combined by clipping each output fuzzy set at the degree to which its
conclusion is supported, as shown below:

[Figure: combined output, with 'reduce high' clipped at 0.8 and 'reduce small' clipped at 0.25
on the throttle axis (-15 to 15).]

5. DEFUZZIFICATION

Defuzzification is the last stage, where the fuzzy output is converted (defuzzified) into a
crisp value, which is more meaningful to the user. The Center of Gravity (COG) method can be
used for this calculation.

[Figure: the combined (clipped) output fuzzy set sampled over the range -25 to 15.]

COG = Σ x · μ(x) / Σ μ(x)

    = [(-25)(0.8) + (-20)(0.8) + (-15)(0.5) + (-10)(0.25) + (-5)(0.25) + (0)(0.0)]
      / [0.8 + 0.8 + 0.5 + 0.25 + 0.25 + 0.0]

    = -47.25 / 2.6

    = -18.17

Therefore, the crisp output should be -18.17.
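A minimal Python sketch of the COG calculation above (the sample points are taken straight from
the worked example):

def centre_of_gravity(samples):
    """Defuzzify a clipped fuzzy output given as (x, membership) sample points."""
    numerator = sum(x * m for x, m in samples)
    denominator = sum(m for _, m in samples)
    return numerator / denominator

clipped_output = [(-25, 0.8), (-20, 0.8), (-15, 0.5), (-10, 0.25), (-5, 0.25), (0, 0.0)]
print(centre_of_gravity(clipped_output))  # ≈ -18.17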


PRACTICE MAKES PERFECT

1. Given the following rules:


R1: if have children = yes
and enjoy gambling = yes
then risk tolerance = high {0.9}

R2: if have children = yes


and children headed for college = yes
and children's education already funded = no
then risk tolerance = high {0.4}

R3: if budgeting very important = yes


then risk tolerance = low {0.2}

R4: if worry about money at night = yes


then risk tolerance = low {0.7}

R5: if budget but splurge sometimes = yes


then risk tolerance = medium {0.9}

R6: if have children = yes


or enjoy gambling = yes
then risk tolerance = medium {0.8}

Given the certainty factors below,


have children = yes {1.0}
enjoy gambling = yes {0.8}
children headed for college = yes {0.3}
children's education already funded = no {0.4}
budgeting very important = yes {0.7}
worry about money at night = yes {0.1}
budget but splurge sometimes = yes {0.5}

a) Calculate the certainty factor for risk tolerance = high.

b) Calculate the certainty factor for risk tolerance = medium.

c) Calculate the certainty factor for risk tolerance = low.


2. Given the following rules:

R1: if today is saturday


and week is odd
then day is off-day

R2: if day is off-day


and weather is sunny
then go to market

R3: if day is off-day


and weather is rain
then go to hypermarket

R4: if day is working-day


then go to office

R5: if week is first


or week is third
then week is odd

R6: if today is monday


or today is tuesday
then today is working-day

Given the certainty factors below:

R1 = 0.8, R2 = 0.5, R3 = 0.7, R4 = 0.9, R5 = 1.0, R6 = 0.2

today is saturday {0.2}
week is first {0.1}
week is third {1.0}
weather is sunny {0.6}

Calculate the certainty factor for ‘go to market’.


3. Given the following rules:


R1: if today is rain
then tomorrow is rain {cf 0.5}

R2: if today is dry


then tomorrow is dry {cf 0.5}

R3: if today is rain


and rainfall is low
then tomorrow is dry {cf 0.6}

R4: if today is rain


and rainfall is low
and temperature is cold
then tomorrow is dry {cf 0.7}

R5: if today is dry


and temperature is warm
then tomorrow is rain {cf 0.65}

R6: if today is dry


and rainfall is low
and sky is overcast
then tomorrow is rain {cf 0.55}

Given the certainty factors below,


today is rain {cf 1.0}
rainfall is low {cf 0.8}
temperature is cold {cf 0.9}

a) Calculate the certainty factor for tomorrow is rain.

b) Calculate the certainty factor for tomorrow is dry.

4. Given two fuzzy sets of A and B.


A = { 0.3/1, 0.5/3, 1/4, 0.7/6, 0.2/7 }
B = { 0.2/1, 0.8/2, 0.9/4, 0.6/5, 0.3/7 }

a) Find AB and AB.

b) Find A (B  A).


5. Given the fuzzy sets:

Tall(X)  = 0,                   if X < 1.6m
         = (X - 1.6m) / 0.2,    if 1.6m <= X < 1.8m
         = 1,                   if X >= 1.8m

Short(X) = 1,                   if X < 1.6m
         = (1.8m - X) / 0.2,    if 1.6m <= X < 1.8m
         = 0,                   if X >= 1.8m

a) Sketch the graphs of Tall(X) and Short(X).

b) Calculate the Union of the fuzzy sets Tall(X) and Short(X).

c) Calculate the Intersection of the fuzzy sets Tall(X) and Short(X).

d) Show that the complement of Tall(X) is Short(X).


6. Given the following rules, show the fuzzy inferences for all the rules and the
aggregation if temperature is 65 and pressure is 30.

IF temperature is normal
OR pressure is low
THEN velocity is medium

IF temperature is normal
AND pressure is normal
THEN velocity is low

IF temperature is high
THEN velocity is high

[Figure: fuzzy sets (low, normal, high) for temperature over the range 20 to 100, for pressure
over 0 to 100, and for velocity over 0 to 100.]

Show the inference process and determine the velocity.

