
Viet-Anh on Software

AI & Software Engineering


What is: ReLU6?

Source: MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Year: 2017
Data Source: CC BY-SA - https://paperswithcode.com

ReLU6 is a modification of the rectified linear unit (ReLU) that caps the activation at a maximum value of 6: ReLU6(x) = min(max(0, x), 6). The cap makes the activation more robust when the network is run with low-precision computation.
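The definition above can be sketched in plain Python (a minimal illustration of the formula, not the actual PyTorch implementation):

```python
def relu6(x: float) -> float:
    """ReLU6: clamp x to the range [0, 6]."""
    return min(max(0.0, x), 6.0)

# Negative inputs map to 0, inputs above 6 are capped at 6,
# and values in between pass through unchanged.
print(relu6(-3.0))  # 0.0
print(relu6(4.0))   # 4.0
print(relu6(10.0))  # 6.0
```

In frameworks such as PyTorch, the same operation is available as `torch.nn.ReLU6`.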

[Plot of the ReLU6 activation function. Image Credit: PyTorch]

Collections

Activation-Functions

© 2025 Viet-Anh Nguyen. All rights reserved.