More importantly, LSTM reduces the computation time of the network by guaranteeing that the computational complexity per time step and per weight is O(1).

In the genetic algorithm, recombination is modeled in an idealized manner, and the most important reorganization operation is the cross-exchange of genetic material, analogous to that in biologically mature germ cells; that is, the introduction of crossover. From the initial population P(t), the two parents can be selected heuristically or randomly [14], and crossover is then performed between them, as shown in Figure 3:
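As an illustration, the selection and crossover steps described above can be sketched in Python. The weight-vector chromosome encoding, roulette-wheel selection, and crossover rate below are assumptions made for the sketch, not details given in this paper:

```python
import random

def select_parents(population, fitness):
    """Fitness-proportional (roulette-wheel) selection of two parents
    from the current population P(t); the text notes selection could
    equally be heuristic or uniformly random."""
    return random.choices(population, weights=fitness, k=2)

def crossover(parent_a, parent_b, rate=0.7):
    """Single-point crossover: exchange gene segments between the two
    parents with probability `rate` (an assumed value)."""
    if random.random() > rate or len(parent_a) < 2:
        return parent_a[:], parent_b[:]
    point = random.randint(1, len(parent_a) - 1)
    return (parent_a[:point] + parent_b[point:],
            parent_b[:point] + parent_a[point:])

# Toy population: four chromosomes, each a vector of six weights.
random.seed(0)
P = [[random.uniform(-1.0, 1.0) for _ in range(6)] for _ in range(4)]
fit = [0.1, 0.2, 0.3, 0.4]   # hypothetical fitness scores
pa, pb = select_parents(P, fit)
child_a, child_b = crossover(pa, pb)
```

Each child is the same length as its parents and contains only genes drawn from the two selected parents, which is the defining property of the crossover operator.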
and the basic model of the LSTM neuron optimized by GA is provided.

Parameter Ranges
5. Summary of issues
This experiment takes GA optimization of LSTM neurons as its core goal. Based on the model proposed by Jian Yi [21], the model is further modified. Through a series of experiments, it is shown that GA optimization of the complex internal calculations of LSTM neurons can reduce the running time of the model to a certain degree, which has wide application in financial forecasting and other commercial fields. As shown in Table 3 and Figure 6 above, the feasibility of GA optimization of LSTM neurons was compared against the model of David M. Q. Nelson [2].

Running time differs across hardware devices. The main configuration of this experimental environment is an Intel(R) Core(TM) i7 8700K CPU @ 2.40 GHz, and the programming language is Python.
This paper proposes a GA optimization method for LSTM neurons. Comparison of running times with predecessor models shows that optimizing the internal neuron weights through GA updates can overcome the slow operation of LSTM to a certain extent.

In addition, this article does not discuss the generalization ability of the network or its other parameters, so the research scope is narrow. However, for short-term prediction of financial markets, the timeliness and responsiveness of the model are far more important than the generality of the network.

Figure 6. Accuracy across different numbers of generations.

As the figure shows, 22 GA iterations and 23 GA-optimized neurons achieve the highest accuracy and the smallest difference from the original LSTM.

b) Study of the effect of GA on neuron speed. Based on the given parameter ranges and the data obtained in experiment a), we selected the set of parameters with the highest accuracy score to test the effect of GA on neuron operating speed. The parameters are shown in Table 3 below.
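The three-run timing protocol used for the speed comparison in this section can be sketched as follows. The two model functions are hypothetical stand-ins for illustration only; a real comparison would run the traditional LSTM and the GA-optimized LSTM on the same data:

```python
import time

def average_running_time(run_model, repeats=3):
    """Average wall-clock time over several runs, mirroring the
    three-repetition protocol used in the experiment."""
    elapsed = []
    for _ in range(repeats):
        start = time.perf_counter()
        run_model()
        elapsed.append(time.perf_counter() - start)
    return sum(elapsed) / len(elapsed)

# Hypothetical stand-ins: these only simulate differing workloads,
# they are not the paper's actual networks.
def traditional_lstm():
    sum(i * i for i in range(200_000))

def ga_optimized_lstm():
    sum(i * i for i in range(100_000))

t_traditional = average_running_time(traditional_lstm)
t_ga = average_running_time(ga_optimized_lstm)
```

Averaging over repeated runs with `time.perf_counter` reduces the influence of transient system load on the measured times.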
Table 3. Determined parameter list.

Parameter                            Value
LSTM hidden layers                   2
Single-layer neurons                 150
Genetic algorithm iteration number   22
Network iterations                   100

The confirmed parameters were passed into the two networks, which were run on the same data. To guarantee the accuracy of the experiment, we ran it three times; the resulting model running times are shown in Figure 7.

Figure 7. Running time comparison chart.

As can be seen from the figure above, the LSTM model based on GA-optimized neurons achieves better operating efficiency than the traditional LSTM.

6. REFERENCES
[1] B. G. Malkiel. A Random Walk Down Wall Street[M]. Norton, 1973.
[2] Nelson D M Q, Pereira A C M, Oliveira R A D. Stock market's price movement prediction with LSTM neural networks[C]//International Joint Conference on Neural Networks. IEEE, 2017: 1419-1426.
[3] Hochreiter S, Schmidhuber J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780.
[4] Graves A. Supervised Sequence Labelling with Recurrent Neural Networks[J]. Studies in Computational Intelligence, 2008, 385.
[5] Akerkar R, Sajja P S. Intelligent Techniques for Data Science[M]. Springer International Publishing, 2016.
[6] SUN Rui-qi. Research on Forecast Model of Stock Market Price Trend Based on LSTM Neural Network[D]. Capital University of Economics and Business, 2015.
[7] Hochreiter S, Schmidhuber J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780.
[8] Ren Zhihui, Xu Haoyu, Feng Songlin, et al. Chinese segmentation method based on LSTM network for sequence labeling[J]. Journal of Computer Applications, 2017, 34(5): 1321-1324.
[9] Selvin S, Vinayakumar R, et al. Stock price prediction using LSTM, RNN and CNN-sliding window model[C]. Centre for Computational Engineering and Networking, IEEE, 2017.
[10] Lin Y F, Huang C F, Tseng V S. A Novel Methodology for Stock Investment using High Utility Episode Mining and Genetic Algorithm[J]. Applied Soft Computing, 2017.
[11] Aguilar-Rivera R, Valenzuela-Rendón M, Rodríguez-Ortiz J J. Genetic algorithms and Darwinian approaches in financial applications: A survey[J]. Expert Systems with Applications, 2015, 42(21): 7684-7697.
[12] Kirkpatrick S, Gelatt C D Jr, Vecchi M P. Optimization by simulated annealing[J]. Science, 1983, 220(4598): 671-680.
[13] Zhao Y, Zhao H, Huo X, et al. Angular Rate Sensing with GyroWheel Using Genetic Algorithm Optimized Neural Networks[J]. Sensors, 2017, 17(7): 1692.
[14] Grefenstette J J. Optimization of Control Parameters for Genetic Algorithms[J]. IEEE Transactions on Systems, Man & Cybernetics, 1986, 16(1): 122-128.
[15] Liang Jun, Chai Yumei, Yuan Huibin, et al. Sentiment Analysis Based on Polarity Transfer and LSTM Recursive Networks[J]. Chinese Journal of Information, 2015, 29(5): 152-159.
[16] Wu Di, Nie Y, Huang J. Parametric Dropout in RNN[J]. Advanced Science and Industry Research Center, 2017.
[17] Cuprak T, Wage K E. Efficient Doppler Compensated Reiterative Minimum Mean Squared Error Processing[J]. IEEE Transactions on Aerospace & Electronic Systems, 2017, PP(99): 1-1.
[18] Ruiz G R, Bandera C F, Temes G A, et al. Genetic algorithm for building envelope calibration[J]. Applied Energy, 2016, 168: 691-705.
[19] Zhang Zhihua, Xu Yuanhong. Application of GRNN Neural Network Based on Particle Swarm Optimization in Stock Forecasting[J]. Journal of Mathematical Learning and Research, 2017(14): 13.
[20] Hu Zhaoju, Liang Ning. Special subject sentiment analysis based on deep attention LSTM[J/OL]. Journal of Computer Applications, 2019(05): 1-3 [2018-03-23].
[21] Jian Yi, Lu Wei, Pu Yongji, Li Kunhe. Face Recognition Algorithm Based on Genetic Optimization and GRNN Neural Network[J]. Ordnance Equipment Engineering, 2018, 39(02): 131-135.