Final Report
Submitted by: Vaibhav Singh
Submitted to: Mrs. Nishtha Phutela
TABLE OF CONTENTS
About Deep Learning
Linear Regression
R2
Code
Challenge
Code
Use of Linear Regression
Neural Network
Process
Code
Use of Neural Network
TensorFlow for Classification
About Deep Learning
Traditionally, programming has been about defining every single step a program must take to reach an outcome. Machine learning flips that approach: we define the outcome, and the program learns the steps to get there. If we build an app that can recognize street signs, instead of writing code to recognize hundreds of different features of street signs, like the shapes of letters and the colors, we just say, "Here are some examples of street signs; learn what you need to recognize them." Sometimes we have no idea what the steps could possibly be. Machine learning is already everywhere on the internet. Every major service uses it in some way. YouTube uses it to decide which other videos we might like as we watch, and its uses will only grow over time.
There are a lot of machine learning models out there, and one of them is the neural network. When we use a neural network that is not just one or two but many layers deep to make a prediction, we call that deep learning. It is a subset of machine learning that has outperformed every other type of model almost every time on a huge range of tasks.
Linear Regression
Simple linear regression is a statistical method that allows us to summarize and study relationships between two continuous (quantitative) variables:
One variable, denoted x, is regarded as the predictor, explanatory, or independent variable.
The other variable, denoted y, is regarded as the response, outcome, or dependent variable.
With simple linear regression we want to model our data as follows:
y = B0 + B1 * x (similar to y = mx + c)
This is a line where y is the output variable we want to predict, x is the input variable we know, and B0 and B1 are coefficients that we need to estimate; they move the line around.
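As a quick illustration (not taken from the report, and using made-up data), the two coefficients have a well-known closed-form least-squares estimate that can be sketched with NumPy:

```python
import numpy as np

# Hypothetical data: x is the known input, y is the output we want to predict.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Closed-form least-squares estimates for y = B0 + B1 * x:
#   B1 = cov(x, y) / var(x)
#   B0 = mean(y) - B1 * mean(x)
B1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
B0 = y.mean() - B1 * x.mean()
print(B0, B1)  # the fitted line is y = B0 + B1 * x
```

This is the same fit that scikit-learn's LinearRegression performs internally for a single input variable.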
R2
The R2 term is the coefficient of determination, and it reflects how well the model fits the observed data. The coefficient of determination is usually given by
R2 = 1 - SS_res / SS_tot = 1 - (sum of (y_i - yhat_i)^2) / (sum of (y_i - ybar)^2)
where yhat_i are the model's predicted values and ybar is the mean of the observed values.
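A minimal sketch of this formula (the helper function and the numbers below are illustrative, not from the report):

```python
import numpy as np

def r_squared(actual, predicted):
    # R^2 = 1 - SS_res / SS_tot
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((actual - predicted) ** 2)      # residual sum of squares
    ss_tot = np.sum((actual - actual.mean()) ** 2)  # total sum of squares
    return 1.0 - ss_res / ss_tot

# Perfect predictions give R^2 = 1.0; worse fits give smaller values.
print(r_squared([1, 2, 3, 4], [1, 2, 3, 4]))  # -> 1.0
print(r_squared([1, 2, 3, 4], [1, 2, 3, 5]))
```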
Code:
import pandas as pd
import matplotlib.pyplot as plt
from sklearn import linear_model

# Read data
dataframe = pd.read_fwf('brain_body.txt')
print(dataframe)
x_values = dataframe[['Brain']]
y_values = dataframe[['Body']]

# Train the model on the data
body_regression = linear_model.LinearRegression()
body_regression.fit(x_values, y_values)

# Visualize results
plt.scatter(x_values, y_values)
plt.plot(x_values, body_regression.predict(x_values))
plt.show()
Challenge
The challenge is to use scikit-learn to create a line of best fit for the included 'challenge_dataset'. Then, make a prediction for an existing data point and see how closely it matches the actual value. Print out the error you get. You can use scikit-learn's documentation for more help.
Code
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from sklearn import linear_model

# Read dataset
dataframe = pd.read_csv('challenge_dataset.txt', names=['X_data', 'Y_data'])
x_values = dataframe[['X_data']]
y_values = dataframe[['Y_data']]

# Fit the line of best fit
model = linear_model.LinearRegression()
model.fit(x_values, y_values)
predicted = model.predict(x_values)

def squared_error(actual, predicted):
    # Squared error between the actual and predicted values
    res = (np.array(actual) - np.array(predicted)) ** 2
    return res

residuals = squared_error(y_values, predicted)
print('Error:', residuals.sum())

# Plotting on matplotlib
plt.figure(1)
plt.scatter(x_values, y_values)
plt.plot(x_values, predicted)
plt.savefig("Scatter.png")
plt.figure(2)
plt.plot(residuals)
plt.savefig("residual.png")
plt.show()
Use of Linear Regression
1. In environmental science it is used to try to establish how much one quantity, say atmospheric greenhouse gasses, influences another, say global surface temperature.
2. In quantitative finance linear regression is core to everything, as practitioners use something called a linear factor model. Long story short, it estimates how much the price of one asset will move when another thing (e.g. oil prices) moves.
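As a rough sketch of point 2 (illustrative only; the return series below are made up), a one-factor model regresses an asset's returns on a factor's returns, and the slope ("beta") says how much the asset moves when the factor moves:

```python
import numpy as np

# Hypothetical daily returns: a factor (say, oil) and an asset.
oil_returns = np.array([0.01, -0.02, 0.015, 0.005, -0.01])
asset_returns = 0.5 * oil_returns + 0.001  # constructed: asset moves half as much as oil

# One-factor linear model: asset = alpha + beta * factor.
# beta is the least-squares slope, alpha the intercept.
oil_dev = oil_returns - oil_returns.mean()
asset_dev = asset_returns - asset_returns.mean()
beta = np.sum(oil_dev * asset_dev) / np.sum(oil_dev ** 2)
alpha = asset_returns.mean() - beta * oil_returns.mean()
print(alpha, beta)
```

Because the data was constructed with a slope of 0.5, the regression recovers beta = 0.5 and alpha = 0.001.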
Neural Network
An Artificial Neural Network (ANN) is an information processing paradigm that is inspired by the way biological nervous systems, such as the brain, process information. The key element of this paradigm is the novel structure of the information processing system. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. ANNs, like people, learn by example. An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process. Learning in biological systems involves adjustments to the synaptic connections that exist between the neurons. This is true of ANNs as well.
Process
We will give each input a weight, which can be a positive or negative number. An input with a large positive weight or a large negative weight will have a strong effect on the neuron's output.
1. Take the inputs from a training set example, adjust them by the weights, and pass them through a special formula to calculate the neuron's output.
2. Calculate the error, which is the difference between the neuron's output and the desired output in the training set example.
3. Depending on the direction of the error, adjust the weights slightly.
4. Repeat this process 10,000 times.
Code
from numpy import array, dot, exp, random

class NeuralNetwork():
    def __init__(self):
        # Seed the random number generator so results are reproducible
        random.seed(1)
        # Model a single neuron with 3 inputs and 1 output: random weights
        # in the range of -1 to 1, with a mean of 0
        self.synaptic_weights = 2 * random.random((3, 1)) - 1

    # Activation function: the sigmoid, an S-shaped curve
    def __sigmoid(self, x):
        return 1 / (1 + exp(-x))

    # Gradient of the sigmoid curve
    def __sigmoid_derivative(self, x):
        return x * (1 - x)

    def train(self, training_set_inputs, training_set_outputs, iterations):
        for _ in range(iterations):
            output = self.predict(training_set_inputs)
            error = training_set_outputs - output
            # Adjust by the error scaled by the gradient of the sigmoid curve
            adjustments = dot(training_set_inputs.T,
                              error * self.__sigmoid_derivative(output))
            self.synaptic_weights += adjustments

    def predict(self, inputs):
        # Pass the inputs through our single neuron
        return self.__sigmoid(dot(inputs, self.synaptic_weights))

if __name__ == '__main__':
    neural_network = NeuralNetwork()
    print('Random starting synaptic weights:')
    print(neural_network.synaptic_weights)
    training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
    training_set_outputs = array([[0, 1, 1, 0]]).T
    neural_network.train(training_set_inputs, training_set_outputs, 10000)
    print('New synaptic weights after training:')
    print(neural_network.synaptic_weights)
    print('predicting...')
    print(neural_network.predict(array([1, 0, 0])))
The challenge is to create a 3-layer feedforward neural network using only numpy as a dependency.
Challenge.py
from numpy import array, dot, exp, random
import matplotlib.pyplot as plt

class NeuralNetwork():
    def __init__(self):
        # Seed the generator so it produces the same numbers every time
        random.seed(1)
        # Nodes in the hidden layers; the more nodes in a hidden layer,
        # the more confident the network can be (up to a point)
        l2 = 5
        l3 = 6
        # Weights for three layers: 3 inputs -> l2 -> l3 -> 1 output
        self.synaptic_weights1 = 2 * random.random((3, l2)) - 1
        self.synaptic_weights2 = 2 * random.random((l2, l3)) - 1
        self.synaptic_weights3 = 2 * random.random((l3, 1)) - 1

    def __sigmoid(self, x):
        return 1 / (1 + exp(-x))

    # Gradient of the sigmoid curve
    def __sigmoid_derivative(self, x):
        return x * (1 - x)

    def train(self, training_set_inputs, training_set_outputs, iterations):
        for _ in range(iterations):
            # Forward pass through all three layers
            a2 = self.__sigmoid(dot(training_set_inputs, self.synaptic_weights1))
            a3 = self.__sigmoid(dot(a2, self.synaptic_weights2))
            output = self.__sigmoid(dot(a3, self.synaptic_weights3))
            # Calculating the error, then backpropagating it layer by layer
            delta4 = (training_set_outputs - output) * self.__sigmoid_derivative(output)
            delta3 = dot(delta4, self.synaptic_weights3.T) * self.__sigmoid_derivative(a3)
            delta2 = dot(delta3, self.synaptic_weights2.T) * self.__sigmoid_derivative(a2)
            # Weight adjustments for each layer
            adjustment3 = dot(a3.T, delta4)
            adjustment2 = dot(a2.T, delta3)
            adjustment1 = dot(training_set_inputs.T, delta2)
            self.synaptic_weights1 += adjustment1
            self.synaptic_weights2 += adjustment2
            self.synaptic_weights3 += adjustment3

    def forward_pass(self, inputs):
        a2 = self.__sigmoid(dot(inputs, self.synaptic_weights1))
        a3 = self.__sigmoid(dot(a2, self.synaptic_weights2))
        output = self.__sigmoid(dot(a3, self.synaptic_weights3))
        return output

if __name__ == '__main__':
    neural_network = NeuralNetwork()
    print('Random starting synaptic weights:')
    print(neural_network.synaptic_weights1)
    print(neural_network.synaptic_weights2)
    print(neural_network.synaptic_weights3)
    training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
    training_set_outputs = array([[0, 1, 1, 0]]).T
    # Visualize the training inputs. One thing to note: when you resize an
    # image, the number of pixels changes, but you want the same information.
    # Since pixels are discrete, there's missing information, so matplotlib
    # interpolates; 'nearest' shows the raw discrete values instead.
    plt.imshow(training_set_inputs, interpolation='nearest')
    plt.show()
    neural_network.train(training_set_inputs, training_set_outputs, 10000)
    print('New synaptic weights after training:')
    print(neural_network.synaptic_weights1)
    print(neural_network.synaptic_weights2)
    print(neural_network.synaptic_weights3)
    print("Predicting...")
    print(neural_network.forward_pass(array([1, 0, 0])))
Use of Neural Network
Character Recognition - The idea of character recognition has become very important as handheld devices like the Palm Pilot are becoming increasingly popular. Neural networks can be used to recognize handwritten characters.
Image Compression - Neural networks can receive and process vast amounts of information at once, making them useful in image compression. With the Internet explosion and more sites using more images, using neural networks for image compression is worth a look.
Stock Market Prediction - The day-to-day business of the stock market is extremely complicated. Many factors weigh in whether a given stock will go up or down on any given day. Since neural networks can examine a lot of information quickly and sort it all out, they can be used to predict stock prices.
Traveling Salesman's Problem - Interestingly enough, neural networks can solve the traveling salesman problem, but only to a certain degree of approximation.
Medicine, Electronic Nose, Security, and Loan Applications - These are some applications that are in their proof-of-concept stage, with the exception of a neural network that will decide whether or not to grant a loan, something that has already been used more successfully than many humans.
Miscellaneous Applications - These are some very interesting (albeit at times a little absurd) applications of neural networks.
TensorFlow for Classification
Classification is one of the most important parts of machine learning, as most of people's communication is done via text. We write blog articles, emails, tweets, notes, and comments. All this information is there, but it is really hard to use compared to a form or data collected from some sensor.
There have been classic NLP techniques for dealing with this, mostly using words as symbols and running linear models. These techniques worked but were very brittle. The recent adoption of embeddings and deep learning has opened up new ways of handling text.
"""Supervised problem."""
import tensorflow as tf
dataframe = dataframe[0:10]
print(dataframe)
# print(inputX)
# print(inputY)
learning_rate = 0.000001
training__epochs = 2000
display_steps = 50
n_samples = inputY.size
# create weights
# 2x2 float matrix, that We'll keep updating through our training process
w = tf.Variable(tf.zeros([2, 2]))
b = tf.Variable(tf.zeros([2]))
y = tf.nn.softmax(y_values)
# perform training
# Gradient descent
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)
# training loop
for i in range(training__epochs):
# That's all! The rest of the cell just outputs debug messages.
if (i) % display_steps == 0:
print("Optimization Finished!")
# So It's guessing they're all good houses. That makes it get 7/10 correct
sess.run(tf.nn.softmax([1., 2.]))