Academic Documents
Professional Documents
Culture Documents
A common LSTM unit is composed of a cell, an input gate, an output gate, and a forget
gate. The cell remembers values over arbitrary time intervals, and the three gates regulate the
flow of information into and out of the cell.
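As a minimal sketch of how the three gates and the cell interact, the single-step update can be written in plain NumPy. The parameter shapes and the stacked-gate layout below are an illustrative convention, not taken from the source:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    W (4H, D), U (4H, H), b (4H,) stack the parameters for the
    input, forget, and output gates and the candidate cell update.
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate: how much new information enters
    f = sigmoid(z[H:2*H])      # forget gate: how much of the old cell survives
    o = sigmoid(z[2*H:3*H])    # output gate: how much of the cell is exposed
    g = np.tanh(z[3*H:4*H])    # candidate cell update
    c = f * c_prev + i * g     # the cell remembers values across time steps
    h = o * np.tanh(c)         # gated hidden output
    return h, c
```

Because the forget gate multiplies the previous cell state directly, the gradient along the cell path avoids the repeated squashing that causes vanishing gradients in plain RNNs.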
LSTM networks are well-suited to classifying, processing and making predictions based
on time series data, since there can be lags of unknown duration between important events in a
time series. LSTMs were developed to deal with the exploding and vanishing gradient problems
that can be encountered when training traditional RNNs. Relative insensitivity to gap length is an
advantage of LSTM over RNNs, hidden Markov models and other sequence learning methods in
numerous applications.
CNN:
Text classification is a classic task in the field of natural language processing. However,
existing text classification methods still need improvement because of the complex
abstraction of text semantic information and the strong relevance of context. In this paper, we
combine the advantages of two traditional neural network models, Long Short-Term
Memory (LSTM) and the Convolutional Neural Network (CNN). LSTM can effectively preserve the
characteristics of historical information in long text sequences, while the CNN structure can
extract local features of the text. We propose a hybrid model of LSTM and CNN: a CNN is
constructed on top of the LSTM, so that the text feature vectors output by the LSTM are
further processed by the CNN structure. The performance of the hybrid model is compared
with that of other models in the experiment. The experimental results show that the hybrid
model can effectively improve the accuracy of text classification.
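The pipeline described above (LSTM hidden states fed into a convolution-plus-pooling stage, then a classifier) can be sketched self-containedly in NumPy. All sizes and the random weights are hypothetical stand-ins for trained parameters; this illustrates the data flow, not the paper's exact architecture:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_features(X, W, U, b):
    """Run an LSTM over the sequence X (T, D); return all hidden states (T, H)."""
    H = U.shape[1]
    h, c = np.zeros(H), np.zeros(H)
    out = []
    for x in X:
        z = W @ x + U @ h + b
        i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
        g = np.tanh(z[3*H:])
        c = f * c + i * g
        h = o * np.tanh(c)
        out.append(h)
    return np.stack(out)

def conv1d_maxpool(Hseq, filters):
    """Slide each filter (K, H) over the hidden-state sequence, then global max-pool."""
    F, K, _ = filters.shape
    T = Hseq.shape[0]
    windows = np.stack([Hseq[t:t + K].ravel() for t in range(T - K + 1)])  # (T-K+1, K*H)
    feats = windows @ filters.reshape(F, -1).T                             # (T-K+1, F)
    return feats.max(axis=0)                                               # (F,)

# Hypothetical sizes: sequence length, embedding dim, hidden dim,
# conv kernel width, number of filters, number of classes.
rng = np.random.default_rng(0)
T, D, Hd, K, F, C = 10, 8, 16, 3, 6, 4
X = rng.normal(size=(T, D))                      # embedded input text
W = rng.normal(0, 0.1, (4 * Hd, D))
U = rng.normal(0, 0.1, (4 * Hd, Hd))
b = np.zeros(4 * Hd)
filters = rng.normal(0, 0.1, (F, K, Hd))         # CNN stage on top of the LSTM
Wout = rng.normal(0, 0.1, (C, F))                # linear classification head

feats = conv1d_maxpool(lstm_features(X, W, U, b), filters)
logits = Wout @ feats
probs = np.exp(logits - logits.max())
probs /= probs.sum()                             # class probabilities via softmax
```

The design point the hybrid exploits: the LSTM summarizes long-range history into each hidden state, and the convolution then picks out local n-gram-like patterns over that summarized sequence before pooling to a fixed-size feature vector.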