
Artificial Neural Networks Exam

Name :- ------------------------------------------------------------------------------------------------------------------
Time allowed :- 3 hours
Please read the following instructions carefully before beginning
1-Answers must be written outside the question paper.
2-It is not allowed to consult any source for the solutions; rely only on yourself.
3-Questions must be answered with a pen, not a pencil.
4-Unattempted questions will carry zero marks.
5-Read the questions carefully before beginning.
6-Attempt to solve all questions.
7-This question paper contains (3) questions distributed over (11) pages.
Question (1):-
Answer the following questions:-
1-What do you mean by Artificial Neural Networks?
2-What are the similarities and differences between biological neural networks (BNN) and artificial neural networks (ANN)?
3-Why do we need Artificial Neural Networks?
4-Draw the structure of a biological neuron?
5-What are the various characteristics of an Artificial Neural Network?
6-Define activation functions?
7-Define bias and threshold in context of ANN?
8-What are the basic Building Blocks of an ANN?
9-Classify the nets based on their architecture?
10-Define learning in context of ANN?
11-What are the types of learning? Explain in detail.
12-Explain in detail the architecture of the neural networks?
13-What are the main requirements of the McCulloch-Pitts neurons?
14-Differentiate between excitatory and inhibitory connections?
15-What is the activation used in the McCulloch-Pitts net?
16-Derive the Hebbian and the perceptron learning rules? (A sketch of both update rules follows this list.)
17-What is the importance of the delta learning rule? The delta rule is also called the error-correction rule; justify this.
18-Explain the development of Hebb net?
19-Define Linear Separability?
20-How is the boundary region (decision boundary) determined using the linear separability concept?
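For items 16-18, the following is a minimal illustrative sketch (in Python) of the Hebbian and perceptron weight-update rules; the training pairs and learning rate used at the end are arbitrary examples, not part of the exam:

    # Illustrative sketch of the Hebbian and perceptron weight updates.
    # The training pairs (x, t) and the learning rate are arbitrary examples.

    def hebb_update(w, b, x, t):
        # Hebb rule: delta_w_i = x_i * t and delta_b = t (bipolar targets).
        w = [wi + xi * t for wi, xi in zip(w, x)]
        b = b + t
        return w, b

    def perceptron_update(w, b, x, t, lr=1.0):
        # Perceptron rule: update only when the prediction is wrong,
        # delta_w_i = lr * t * x_i and delta_b = lr * t.
        net = b + sum(wi * xi for wi, xi in zip(w, x))
        y = 1 if net > 0 else -1
        if y != t:
            w = [wi + lr * t * xi for wi, xi in zip(w, x)]
            b = b + lr * t
        return w, b

    w, b = [0.0, 0.0], 0.0
    w, b = hebb_update(w, b, x=[1, 1], t=1)
    w, b = perceptron_update(w, b, x=[1, -1], t=-1)
    print(w, b)

The Hebb rule always adds x·t to the weights, whereas the perceptron rule adjusts them only when the current prediction disagrees with the target.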
Question (2):-
Put a circle around the correct answer:-
1-From the figure shown below:-

(1) What is the name of the model in the figure?


a) Rosenblatt perceptron model
b) McCulloch-Pitts model
c) Widrow’s Adaline model
d) None of the mentioned
(2) What is the nature of the function F(x) in the figure?
a) linear
b) non-linear
c) can be either linear or non-linear
d) none of the mentioned
(3) What does the character ‘b’ represent in the above diagram?
a) bias
b) any constant value
c) a variable value
d) none of the mentioned

2-From the figure shown below:-

(a) If ‘b’ in the figure above is the bias, then what logic circuit does it represent?
a) or gate
b) and gate
c) nor gate
d) nand gate
(b) When both inputs are 1, what will be the output of the above figure?
a) 0
b) 1
c) either 0 or 1
d) z
(c) When both inputs are different, what will be the output of the above figure?
a) 0
b) 1
c) either 0 or 1
d) z
3-Does the McCulloch-Pitts model have the ability to learn?
a) yes
b) no
4-What is competitive learning?
a) learning laws which modulate difference between synaptic weight &
output signal
b) learning laws which modulate difference between synaptic weight &
activation value
c) learning laws which modulate difference between actual output &
desired output
d) none of the mentioned
5- A 4-input neuron has weights 1, 2, 3 and 4. The transfer function is linear with
the constant of proportionality being equal to 2. The inputs are 4, 10, 5 and 20
respectively. The output will be:
a) 238
b) 76
c) 119
d) 123
6-Which of the following is true?
(i) On average, neural networks have higher computational rates than
conventional computers.
(ii) Neural networks learn by example.
(iii) Neural networks mimic the way the human brain works.
a) All of the mentioned are true
b) (ii) and (iii) are true
c) (i), (ii) and (iii) are true
d) None of the mentioned
7-Which of the following is true for neural networks?
(i) The training time depends on the size of the network.
(ii) Neural networks can be simulated on a conventional computer.
(iii) Artificial neurons are identical in operation to biological ones.
a) All of the mentioned
b) (ii) is true
c) (i) and (ii) are true
d) None of the mentioned
8-Which of the following is true?
Single layer associative neural networks do not have the ability to:
(i) perform pattern recognition
(ii) find the parity of a picture
(iii) determine whether two or more shapes in a picture are connected or not
a) (ii) and (iii) are true
b) (ii) is true
c) All of the mentioned
d) None of the mentioned
9-Which is true for neural networks?
a) It has a set of nodes and connections
b) Each node computes its weighted input
c) Node could be in excited state or non-excited state
d) All of the mentioned
10-Why are linearly separable problems of interest to neural network researchers?
a) Because they are the only class of problem that network can solve successfully
b) Because they are the only class of problem that Perceptron can solve
successfully
c) Because they are the only mathematical functions that are continuous
d) Because they are the only mathematical functions you can draw
11-Neural Networks are complex ______________ with many parameters.
a) Linear Functions
b) Nonlinear Functions
c) Discrete Functions
d) Exponential Functions
12-Which of the following is an application of NN (Neural Network)?
a) Sales forecasting
b) Data validation
c) Risk management
d) All of the mentioned
13-Which of the following is not an application of learning?
a) Data mining
b) WWW
c) Speech recognition
d) None of the mentioned
14-Which of the following is a component of a learning system?
a) Goal
b) Model
c) Learning rules
d) All of the mentioned
Question (3):-
Solve the following problems:-
1-The logic networks shown in the figure use the McCulloch-Pitts model neuron from the figure. Find the truth tables and the logic functions that are implemented by networks (a), (b), (c), and (d).

2-Consider the following neural network.


3-Give the output of the network in the figure for the input [1 1 1]^T.

4-Create the letters (u, v) and (s, t) using a 5×5 matrix. Form training and testing data by yourself and classify the patterns using a Hebb net.
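A minimal sketch of Hebb-net training and classification, assuming bipolar (+1/-1) encodings of the 5×5 letter patterns; the two placeholder patterns and targets below are illustrative only and should be replaced by the actual letter matrices:

    # Hebb-net classifier for 5x5 bipolar patterns (illustrative sketch).
    # Patterns and targets here are placeholders; real letters would be
    # drawn as 25-element vectors of +1 (pixel on) and -1 (pixel off).

    def train_hebb(patterns, targets):
        n = len(patterns[0])
        w = [0.0] * n
        b = 0.0
        for x, t in zip(patterns, targets):
            w = [wi + xi * t for wi, xi in zip(w, x)]
            b += t
        return w, b

    def classify(w, b, x):
        net = b + sum(wi * xi for wi, xi in zip(w, x))
        return 1 if net >= 0 else -1

    # Placeholder 25-element patterns (replace with the actual letter matrices).
    pattern_a = [1] * 13 + [-1] * 12
    pattern_b = [-1] * 13 + [1] * 12
    w, b = train_hebb([pattern_a, pattern_b], targets=[1, -1])
    print(classify(w, b, pattern_a), classify(w, b, pattern_b))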
5-A neuron with 4 inputs has the weight vector w = [1, 2, 3, 4]^T and a bias θ = 0 (zero). The activation function is linear, where the constant of proportionality equals 2; that is, the activation function is given by f(net) = 2 × net. If the input vector is x = [4, 8, 5, 6]^T, then what will be the output of the neuron?
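A minimal sketch of the computation described above (the same kind of calculation appears as a multiple-choice item in Question 2): the net input is the weighted sum of the inputs plus the bias, and the linear activation scales it by the constant of proportionality 2.

    # Linear neuron: output = f(net) = 2 * net, where net = w . x + bias.
    w = [1, 2, 3, 4]
    x = [4, 8, 5, 6]
    bias = 0
    net = bias + sum(wi * xi for wi, xi in zip(w, x))
    output = 2 * net
    print(output)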
6-In the perceptron below, what will the output be when the input is (0, 0)?
What about inputs (0, 1), (1, 1) and (1, 0)? What if we change the bias weight
to -0.5? (original b=1)

7-A neuron j receives inputs from four other neurons whose activity levels are 10, -20, -1.0, and -0.9. Calculate the output of neuron j for the following two situations:-
(a) The neuron is linear
(b) The neuron is represented by a McCulloch-Pitts model
Assume that the bias applied to the neuron is zero.
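Since the synaptic weights for this problem are given in a figure that is not reproduced here, the sketch below uses placeholder weights purely to illustrate the two cases: a linear neuron outputs its net input directly, while a McCulloch-Pitts neuron applies a hard threshold (assumed here to be zero).

    # Illustrative only: the weights below are placeholders, not the ones
    # from the exam figure.
    inputs = [10, -20, -1.0, -0.9]
    weights = [1, 1, 1, 1]      # placeholder weights
    bias = 0                    # bias is zero, as stated in the problem
    threshold = 0               # assumed threshold for the McCulloch-Pitts case

    net = bias + sum(w * x for w, x in zip(weights, inputs))
    linear_output = net                          # (a) linear neuron
    mcp_output = 1 if net >= threshold else 0    # (b) McCulloch-Pitts neuron
    print(linear_output, mcp_output)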
8-Below is a diagram of a single artificial neuron (unit):

The node has three inputs x = (x1, x2, x3) that receive only binary signals (either 0 or 1). How many different input patterns can this node receive? What if the node had four inputs? Five? Can you give a formula that computes the number of binary input patterns for a given number of inputs?
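A small sketch that enumerates the binary input patterns for a given number of inputs and can be used to check the counting argument asked for above:

    from itertools import product

    # Count the distinct binary input patterns for n inputs by listing them.
    for n in (3, 4, 5):
        patterns = list(product([0, 1], repeat=n))
        print(n, len(patterns))   # the count grows as 2**n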
9-Logical operators (e.g. NOT, AND, OR, XOR) are the building blocks of any computational device. Logical functions return only two possible values, true or false, based on the truth values of their arguments. For example, the operator AND returns true only when all its arguments are true; otherwise (if any argument is false) it returns false. If we denote true by 1 and false by 0, then the logical function AND can be represented by the following table:

This function can be implemented by a single unit with two inputs:

if the weights are w1 = 1 and w2 = 1 and the activation function is the hard threshold

φ(v) = 1 if v ≥ 2, and 0 otherwise.

Note that the threshold level is 2 (v ≥ 2).
a) Test how the neural AND function works.
b) Suggest how to change either the weights or the threshold level of this single unit in order to implement the logical OR function (true when at least one of the arguments is true):

c) The XOR function (exclusive or) returns true only when one of the arguments is true and the other is false. Otherwise, it always returns false. This can be represented by the following table:

Do you think it is possible to implement this function using a single unit? A network of several units?
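A minimal sketch of the single threshold unit described in this problem (w1 = w2 = 1 with threshold 2 for AND), together with one possible threshold change that yields OR; whether XOR can be realised by a single unit or by a network is left to the reader as asked.

    # Threshold unit from problem 9: fires (outputs 1) when the weighted
    # sum v = w1*x1 + w2*x2 reaches the threshold.

    def threshold_unit(x1, x2, w1=1, w2=1, threshold=2):
        v = w1 * x1 + w2 * x2
        return 1 if v >= threshold else 0

    # (a) AND with threshold 2, as given in the problem.
    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, threshold_unit(x1, x2, threshold=2))

    # (b) Lowering the threshold to 1 gives OR (one possible change).
    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, threshold_unit(x1, x2, threshold=1))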
10-The following diagram represents a feed-forward neural network with one
hidden layer:

A weight on the connection between nodes i and j is denoted by wij; for example, w13 is the weight on the connection between nodes 1 and 3. The following table lists all the weights in the network:
Each of the nodes 3, 4, 5 and 6 uses the following activation function:

where v denotes the weighted sum of a node. Each of the input nodes (1 and 2)
can only receive binary values (either 0 or 1). Calculate the output of the
network (y5 and y6) for each of the input patterns:
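Because the weight table and the activation function for this problem appear in a table and figure not reproduced here, the sketch below only illustrates the shape of the computation, a 2-input, 2-hidden, 2-output forward pass with an assumed hard-threshold activation; every numeric weight in it is a placeholder, not a value from the exam.

    # Illustrative forward pass for a 2-2-2 feed-forward network.
    # All weights below are placeholders, not the values from the exam table.
    w = {
        (1, 3): 0.5, (1, 4): -0.5, (2, 3): 0.5, (2, 4): 0.5,   # input -> hidden
        (3, 5): 1.0, (3, 6): -1.0, (4, 5): 1.0, (4, 6): 1.0,   # hidden -> output
    }

    def phi(v, threshold=0.0):
        # Assumed hard-threshold activation; the exam defines its own phi(v).
        return 1 if v >= threshold else 0

    def forward(x1, x2):
        y3 = phi(w[(1, 3)] * x1 + w[(2, 3)] * x2)
        y4 = phi(w[(1, 4)] * x1 + w[(2, 4)] * x2)
        y5 = phi(w[(3, 5)] * y3 + w[(4, 5)] * y4)
        y6 = phi(w[(3, 6)] * y3 + w[(4, 6)] * y4)
        return y5, y6

    for x1 in (0, 1):
        for x2 in (0, 1):
            print((x1, x2), forward(x1, x2))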
