Long short-term memory

A variation of the recurrent neural network that can be backpropagated through time and layers

All edits

Edits on 28 Feb, 2020
Golden AI: "Attach Wikidata entity ID"
Golden AI edited on 28 Feb, 2020
Edits made to:
Infobox (+1 property)
Infobox
Wikidata entity ID
Q6673524
Edits on 22 Feb, 2020
Axel Fehr
Axel Fehr edited on 22 Feb, 2020
Edits made to:
Article (+34/-34 characters)
Further reading (+34/-33 characters)
Article

Long short-term memory network (LSTM) is a variation of the recurrent neural network (RNN). It was proposed by the German researchers Sepp Hochreiter and Juergen Schmidhuber as a solution to the vanishing gradient problem.
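As a brief aside (added here for context, not text from the article): in a plain RNN trained by backpropagation through time, the gradient that reaches an early step $k$ from a loss at step $t$ is a product of per-step Jacobians,

$$\frac{\partial \mathcal{L}_t}{\partial h_k} = \frac{\partial \mathcal{L}_t}{\partial h_t} \prod_{i=k+1}^{t} \frac{\partial h_i}{\partial h_{i-1}},$$

which shrinks (or explodes) exponentially as $t - k$ grows. The LSTM's additively updated cell state gives gradients a path along which this product stays close to the identity.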

Further reading

Title
Author
Link
Type
Date

LONG SHORT-TERM MEMORY

Sepp Hochreiter and Juergen Schmidhuber

Academic paper

Edits on 17 Jun, 2019
Carla Faraguna
Carla Faraguna edited on 17 Jun, 2019
Edits made to:
Categories (+1 topic)
Categories
Edits on 15 Jun, 2019
Axel Fehr
Axel Fehr edited on 15 Jun, 2019
Edits made to:
Categories (-2 topics)
Categories
Axel Fehr
Axel Fehr edited on 15 Jun, 2019
Edits made to:
Categories (+3 topics)
Related Topics (+1 topic)
Categories
Related Topics
Edits on 7 Aug, 2018
Golden AI: "Linkify text links in standard tables"
Golden AI edited on 7 Aug, 2018
Edits made to:
Further reading (+106/-106 characters)
Further reading

Author
Title
Link
Type

Yuzhen Lu and Fathi M. Salem

Simplified Gating in Long Short-term Memory (LSTM) Recurrent Neural Networks

https://arxiv.org/ftp/arxiv/papers/1701/1701.03441.pdf

Academic paper

Sepp Hochreiter and Jurgen Schmidhuber

LONG SHORT-TERM MEMORY

http://www.bioinf.jku.at/publications/older/2604.pdf

Academic paper

Edits on 5 Jun, 2018
Golden AI: "Corrections"
Golden AI edited on 5 Jun, 2018
Edits made to:
Further reading (+10/-10 characters)
Further reading

Author
Title
Link
Type

Yuzhen Lu and Fathi M. Salem

Simplified Gating in Long Short-term Memory (LSTM) Recurrent Neural Networks

https://arxiv.org/ftp/arxiv/papers/1701/1701.03441.pdf

Academic paper

Sepp Hochreiter and Jurgen Schmidhuber

LONG SHORT-TERM MEMORY

http://www.bioinf.jku.at/publications/older/2604.pdf

Academic paper

Edits on 1 Jun, 2018
Golden AI: "Merging standard tables"
Golden AI edited on 1 Jun, 2018
Edits made to:
Academic papers (-2 rows) (-6 cells) (-273 characters)
Further reading (+2 rows) (+8 cells) (+301 characters)
Academic papers

Author
Title
Link

Yuzhen Lu and Fathi M. Salem

Simplified Gating in Long Short-term Memory (LSTM) Recurrent Neural Networks

https://arxiv.org/ftp/arxiv/papers/1701/1701.03441.pdf

Sepp Hochreiter and Jurgen Schmidhuber

LONG SHORT-TERM MEMORY

http://www.bioinf.jku.at/publications/older/2604.pdf

Further reading

Author
Title
Link
Type

Yuzhen Lu and Fathi M. Salem

Simplified Gating in Long Short-term Memory (LSTM) Recurrent Neural Networks

https://arxiv.org/ftp/arxiv/papers/1701/1701.03441.pdf

Academic Paper

Sepp Hochreiter and Jurgen Schmidhuber

LONG SHORT-TERM MEMORY

http://www.bioinf.jku.at/publications/older/2604.pdf

Academic Paper

Edits on 24 Jan, 2018
Melanie Manipula
Melanie Manipula edited on 24 Jan, 2018
Edits made to:
Article (+727/-697 characters)
Academic papers (+1 row) (+3 cells) (+161 characters)
Article

LSTM networks contain internal memory cells that can retain information over long or short time spans, and the network's output is modulated by the state of these cells. This makes LSTMs suited to prediction tasks in which the output depends on the historical context of the inputs, not only on the very last input.

LSTMs hold information outside the normal flow of the recurrent network in memory blocks, or cells. Information can be written to, stored in, or read from a cell, much as data is handled in a computer's memory. What each block remembers, and how its contents are manipulated, is regulated by structures called gates. The gating mechanism contains three non-linear gates: an input gate, an output gate, and a forget gate; the standard update equations are sketched below.
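As a sketch of the mechanism just described (added here for illustration; the notation is standard, not taken from the article): with $\sigma$ the logistic sigmoid, $\odot$ element-wise multiplication, and $W_*$, $U_*$, $b_*$ learned parameters, one LSTM step computes

$$\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate)}\\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate)}\\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate)}\\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{(candidate cell)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(cell update)}\\
h_t &= o_t \odot \tanh(c_t) && \text{(output)}
\end{aligned}$$

The gates $f_t$, $i_t$, and $o_t$ each produce values in $(0, 1)$ that scale the cell state and output element-wise.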

...

LSTMs hold information outside the normal flow of the recurrent network in memory blocks, or cells. Information can be written to, stored in, or read from a cell, much as data is handled in a computer's memory. The gates are implemented with element-wise multiplication by sigmoid layers, which has the advantage of being differentiable and therefore suited to backpropagation.

The gates are implemented as element-wise multiplication by sigmoid layers whose outputs lie between zero and one: a value of zero blocks a signal entirely, while a value of one lets it pass through unchanged. Because the sigmoid is differentiable, the gating mechanism is suited to training by backpropagation; a minimal code sketch follows.
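To make the element-wise gating concrete, here is a minimal NumPy sketch of a single LSTM step (an illustration added here, not code from the article; the stacked-weight layout and variable names are assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, params):
    """One LSTM time step: gates are sigmoid outputs in (0, 1) that
    multiply signals element-wise, so the whole step is differentiable."""
    W, U, b = params          # W: (4H, D), U: (4H, H), b: (4H,)
    z = W @ x_t + U @ h_prev + b
    H = h_prev.shape[0]
    f = sigmoid(z[0*H:1*H])   # forget gate: how much of the old cell to keep
    i = sigmoid(z[1*H:2*H])   # input gate: how much new candidate to write
    o = sigmoid(z[2*H:3*H])   # output gate: how much of the cell to expose
    g = np.tanh(z[3*H:4*H])   # candidate cell contents
    c_t = f * c_prev + i * g  # element-wise gating of the memory cell
    h_t = o * np.tanh(c_t)    # gated output / new hidden state
    return h_t, c_t

# Tiny usage example: run a random sequence through the cell.
rng = np.random.default_rng(0)
D, H = 3, 4
params = (rng.normal(size=(4*H, D)), rng.normal(size=(4*H, H)), np.zeros(4*H))
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):   # the final h reflects the whole history
    h, c = lstm_step(x, h, c, params)
print(h)
```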

LSTMs are used in text generation, handwriting recognition, handwriting generation, music generation, language translation and image captioning.

Academic papers

Author
Title
Link

Yuzhen Lu and Fathi M. Salem

Simplified Gating in Long Short-term Memory (LSTM) Recurrent Neural Networks

https://arxiv.org/ftp/arxiv/papers/1701/1701.03441.pdf

Melanie Manipula
Melanie Manipula edited on 24 Jan, 2018
Edits made to:
Description (+90/-24 characters)
Article (+910 characters)
Academic papers (+1 row)
Topic thumbnail

Long short-term memory

Recurrent neural network

A variation of the recurrent neural network that can be backpropagated through time and layers

Article

Long short-term memory network (LSTM) is a variation of the recurrent neural network (RNN). It was proposed by the German researchers Sepp Hochreiter and Juergen Schmidhuber as a solution to the vanishing gradient problem.

...

LSTM networks contain internal memory cells that can retain information over long or short time spans, and the network's output is modulated by the state of these cells. This makes LSTMs suited to prediction tasks in which the output depends on the historical context of the inputs, not only on the very last input.

LSTMs hold information outside the normal flow of the recurrent network in memory blocks, or cells. Information can be written to, stored in, or read from a cell, much as data is handled in a computer's memory. The gates are implemented with element-wise multiplication by sigmoid layers, which has the advantage of being differentiable and therefore suited to backpropagation.

Academic papers

Author
Title
Link

Sepp Hochreiter and Jurgen Schmidhuber

LONG SHORT-TERM MEMORY

http://www.bioinf.jku.at/publications/older/2604.pdf

Edits on 1 Jan, 2017
Golden AI: "Initial topic creation"
Golden AI created this topic on 1 Jan, 2017
Edits made to:
Description (+24 characters)
Article
Topic thumbnail

Long short-term memory

Recurrent neural network
