Viet-Anh on Software

AI & Software Engineering

What is: Recurrent Dropout?

Source: Recurrent Dropout without Memory Loss
Year: 2016
Data Source: CC BY-SA - https://paperswithcode.com

Recurrent Dropout is a regularization method for recurrent neural networks. Instead of dropping hidden states, it applies dropout to the updates to LSTM memory cells (or GRU states), i.e. it drops out the candidate update produced by the input/update gate in an LSTM/GRU. Because only the update is dropped, information already stored in the memory cell is never erased, hence "without memory loss".
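The idea can be sketched in a single LSTM step. This is a minimal NumPy illustration, not the paper's reference implementation; the stacked weight layout (`W`, `U`, `b` holding all four gates) and the function name are assumptions made for the example. The key line is that the dropout mask multiplies only the candidate update `g`, while the additive path `f * c_prev` stays intact:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step_recurrent_dropout(x, h_prev, c_prev, W, U, b,
                                drop_p=0.25, train=True, rng=None):
    """One LSTM step with recurrent dropout applied to the cell update.

    Hypothetical weight layout: W is (4H, D), U is (4H, H), b is (4H,),
    with rows ordered as [input gate, forget gate, output gate, candidate].
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2 * H])      # forget gate
    o = sigmoid(z[2 * H:3 * H])  # output gate
    g = np.tanh(z[3 * H:4 * H])  # candidate memory-cell update

    if train and drop_p > 0.0:
        rng = rng if rng is not None else np.random.default_rng(0)
        # Inverted dropout on the update ONLY; h_prev and c_prev are untouched.
        mask = (rng.random(H) >= drop_p) / (1.0 - drop_p)
        g = g * mask

    c = f * c_prev + i * g       # memory persists through f * c_prev
    h = o * np.tanh(c)
    return h, c
```

Dropping `h` or `c` directly would zero out accumulated memory at every step; masking only `g` regularizes what gets written while preserving what was already stored.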

Collections

Regularization

© 2025 Viet-Anh Nguyen. All rights reserved.