
Name : …………………………………………….

Roll No. : …………………………………………...……………..
Invigilator’s Signature : ………………………………………..

CS/B.TECH(CSE)/SEM-8/CS-801B/2012
2012
SOFT COMPUTING
Time Allotted : 3 Hours Full Marks : 70

The figures in the margin indicate full marks.


Candidates are required to give their answers in their own words
as far as practicable.

GROUP – A

( Multiple Choice Type Questions )

1. Choose the correct alternatives for the following : 10 × 1 = 10



i) For a network with inputs [ x1, x2, x3 ] = [ 0·3, 0·5, 0·6 ] and weights [ w1, w2, w3 ] = [ 0·2, 0·1, – 0·3 ], the net input to the output M-P neuron is



a) 0·07 b) – 0·07

c) 0·7 d) – 0·7.


ii) For a 3-input neuron representing a perceptron, where [ x1, x2, x3 ] = [ 0·8, 0·6, 0·4 ], weights [ w1, w2, w3 ] = [ 0·1, 0·3, – 0·2 ] and bias b = 0·35, the output of the neuron using bipolar sigmoid activation function is

a) 0·625 b) 0·764
c) 0·259 d) 0·346.
iii) The boundary of the fuzzy set A is defined by those
elements x of the universe such that

a) μA ( x ) = 1 b) 0 < μA ( x ) < 1

c) μA ( x ) = 0 d) 0 ≤ μA ( x ) ≤ 1 .

iv) The Hebbian rule is .................. type of learning.

a) supervised b) competitive

c) unsupervised d) reinforced.

v) Single layer perceptron is used for



a) linear separability b) error minimization

c) back propagation d) annealing.



vi) In a fuzzy logic controller, the rule base is updated with the help of

a) Fuzzy logic b) ANN and GA


c) Hebb's rule d) none of these.

vii) If a problem is linearly separable, then we can find the weight vector Wi such that WiT · X > 0 for each X with desired output '+1', and WiT · X < 0 for each X with desired output '–1'.

a) True b) False

c) Can't say d) None of these.



viii) A combination of genes representing a particular property of an individual is known as

a) gene b) genome

c) allele d) chromosome.

ix) The size of each chromosome for the problem of maximizing the function f(x) = x² in the interval 0 ≤ x ≤ 31 is

a) 8 b) 5

c) 4 d) none of these.

x) Let A and B be two fuzzy sets with membership function μ ; then μA∪B(x) is equal to

a) μA(x) + μB(x) b) μA(x) − μB(x)

c) MAX { μA(x), μB(x) } d) MIN { μA(x), μB(x) }.
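
A minimal Python sketch of the calculations behind sub-questions (i) and (ii), assuming the usual net-input definition (weighted sum of the inputs) and the bipolar sigmoid f(net) = 2 / (1 + e^–net) – 1 ; the variable names are illustrative :

import math

# (i) net input to the M-P neuron : weighted sum of the inputs
x = [0.3, 0.5, 0.6]
w = [0.2, 0.1, -0.3]
net_i = sum(xk * wk for xk, wk in zip(x, w))
print(round(net_i, 2))        # -0.07

# (ii) bipolar sigmoid output of a perceptron with bias b = 0.35
x2 = [0.8, 0.6, 0.4]
w2 = [0.1, 0.3, -0.2]
b = 0.35
net = b + sum(xk * wk for xk, wk in zip(x2, w2))    # 0.53
out = 2.0 / (1.0 + math.exp(-net)) - 1.0            # bipolar sigmoid
print(round(out, 3))          # 0.259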

GROUP – B
( Short Answer Type Questions )
Answer any three of the following. 3 × 5 = 15
2. Compare supervised learning and unsupervised learning with proper diagrams.

3. Discuss the different activation functions used for training artificial neural networks.

4. Indicate different parameters of classical GA.



5. Define the single layer perceptron net and its linear separability.

6. Consider two fuzzy sets A = { 0·2/1 + 0·3/2 + 0·4/3 + 0·5/4 } and B = { 0·1/1 + 0·2/2 + 0·2/3 + 1/4 }. Find the algebraic sum and algebraic product of the given fuzzy sets.
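
A minimal Python sketch for question 6, assuming the standard definitions (algebraic sum μA(x) + μB(x) − μA(x)·μB(x), algebraic product μA(x)·μB(x)) ; each dictionary maps an element of the universe to its membership grade :

A = {1: 0.2, 2: 0.3, 3: 0.4, 4: 0.5}
B = {1: 0.1, 2: 0.2, 3: 0.2, 4: 1.0}

alg_sum  = {x: round(A[x] + B[x] - A[x] * B[x], 2) for x in A}   # algebraic sum
alg_prod = {x: round(A[x] * B[x], 2) for x in A}                 # algebraic product
print(alg_sum)    # {1: 0.28, 2: 0.44, 3: 0.52, 4: 1.0}
print(alg_prod)   # {1: 0.02, 2: 0.06, 3: 0.08, 4: 0.5}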



GROUP – C
( Long Answer Type Questions )
Answer any three of the following. 3 × 15 = 45

7. a) Using Hebb Network, implement AND function with


bipolar inputs and targets.

b) Write down the training algorithm and testing algorithm


for the single layer perceptron network using perceptron

learning rule.
c) What is 'delta learning rule' ? 6+6+3
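
A minimal Python sketch of Hebb-rule training for part 7 a), assuming zero initial weights and bias and the usual updates Δwi = xi·t, Δb = t for each bipolar training pair :

# bipolar AND : target is +1 only when both inputs are +1
patterns = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

w = [0.0, 0.0]
b = 0.0
for (x1, x2), t in patterns:           # one Hebb update per pattern
    w[0] += x1 * t
    w[1] += x2 * t
    b    += t
print(w, b)                            # [2.0, 2.0] -2.0

# check : sign(w1*x1 + w2*x2 + b) reproduces the AND targets
for (x1, x2), t in patterns:
    net = w[0] * x1 + w[1] * x2 + b
    assert (1 if net > 0 else -1) == t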

8. a) Find the new weights of the back propagation network shown in the following figure after the first iteration, using a suitable activation function :


[ Figure : a 2-2-1 back propagation network ]

Assume the given vectors : Input pattern I = [ 0·4, – 0·7 ], Target t = [ 0·1 ], Weight matrix from input layer to hidden layer

V = [  0·1    0·4
      – 0·2   0·2 ] ,

Weight matrix from hidden layer to output layer

W = [  0·2
      – 0·5 ] .
p:/
Show the new weight vectors W and V after the first iteration. State the proposed scheme of convergence of the above net.

b) Using Perceptron Network, implement XOR function


with bipolar inputs and targets.

c) What are the two algorithms used in a Madaline net ? Differentiate them. 7+4+4
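
A minimal Python sketch of one back-propagation iteration for part 8 a), assuming a 2-2-1 network, binary sigmoid activation and learning rate 0·25 (the activation function and learning rate are illustrative assumptions, not given in the paper) :

import math

def f(x):                                   # binary sigmoid
    return 1.0 / (1.0 + math.exp(-x))

I = [0.4, -0.7]                             # input pattern
t = 0.1                                     # target
V = [[0.1, 0.4], [-0.2, 0.2]]               # V[i][j] : input i -> hidden j
W = [0.2, -0.5]                             # W[j]    : hidden j -> output
alpha = 0.25                                # assumed learning rate

# forward pass
z_in = [sum(I[i] * V[i][j] for i in range(2)) for j in range(2)]
z = [f(v) for v in z_in]
y = f(sum(z[j] * W[j] for j in range(2)))

# backward pass, using f'(net) = f(net) * (1 - f(net))
d_out = (t - y) * y * (1 - y)
d_hid = [d_out * W[j] * z[j] * (1 - z[j]) for j in range(2)]

# weight updates
W = [W[j] + alpha * d_out * z[j] for j in range(2)]
V = [[V[i][j] + alpha * d_hid[j] * I[i] for j in range(2)] for i in range(2)]
print("W after one iteration :", W)
print("V after one iteration :", V)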



9. a) Discuss the fuzzy Cartesian product with an example.

b) Two fuzzy relations are given by

         y1    y2
R = x1 [ 0·6   0·3 ]
    x2 [ 0·2   0·9 ]

and
         z1    z2    z3
S = y1 [ 1     0·5   0·3 ]
    y2 [ 0·8   0·4   0·7 ] .

Obtain fuzzy relation T as a composition between the fuzzy relations.

c) Describe the importance of fuzzy sets. 5+6+4
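
A minimal Python sketch of the max-min composition T = R ∘ S for part 9 b), assuming the standard definition μT(x, z) = max over y of min( μR(x, y), μS(y, z) ) :

R = [[0.6, 0.3],             # rows x1, x2 ; columns y1, y2
     [0.2, 0.9]]
S = [[1.0, 0.5, 0.3],        # rows y1, y2 ; columns z1, z2, z3
     [0.8, 0.4, 0.7]]

T = [[max(min(R[i][k], S[k][j]) for k in range(2)) for j in range(3)]
     for i in range(2)]
print(T)                     # [[0.6, 0.5, 0.3], [0.8, 0.4, 0.7]]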



10. a) Describe how genetic algorithms differ from other

optimization and search procedures.



b) What are crossover rate and mutation rate ?


c) Use a genetic algorithm to maximize the function f(x) = x² for 0 ≤ x ≤ 31. The initial population of four chromosomes, each of size 5, is selected as 01101, 11000, 01000, 10011. Show the first two iterations and justify the improvement of the optimal solution of x. 5+4+6
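
A minimal Python sketch of the first two generations for part 10 c), assuming roulette-wheel selection, single-point crossover and no mutation ; the random choices are illustrative rather than a prescribed run :

import random

def fitness(chrom):
    return int(chrom, 2) ** 2              # decode 5-bit string, f(x) = x^2

def roulette(pop):                          # fitness-proportionate selection
    total = sum(fitness(c) for c in pop)
    r = random.uniform(0, total)
    acc = 0.0
    for c in pop:
        acc += fitness(c)
        if acc >= r:
            return c
    return pop[-1]

def crossover(p1, p2):                      # single-point crossover
    point = random.randint(1, 4)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

pop = ["01101", "11000", "01000", "10011"]
for gen in range(2):                        # first two iterations
    parents = [roulette(pop) for _ in range(4)]
    pop = []
    for i in (0, 2):
        pop.extend(crossover(parents[i], parents[i + 1]))
    print("generation", gen + 1, pop, "best f(x) =", max(fitness(c) for c in pop))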


11. Write short notes on any three of the following : 3×5

a) Biological neuron vs ANN


b) ART network

c) Defuzzification methods

d) Self organizing map.

=============