
What is: Shifted Rectified Linear Unit?

Source: Trainable Activations for Image Classification
Year: 2000
Data Source: CC BY-SA, https://paperswithcode.com

The Shifted Rectified Linear Unit, or ShiLU, is a modification of the ReLU activation function with two trainable parameters: a scale α and a shift β, both learned jointly with the rest of the network during training.

$$\text{ShiLU}(x) = \alpha \, \text{ReLU}(x) + \beta$$
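To make the definition concrete, here is a minimal sketch of ShiLU as a PyTorch module. The scalar parameters, their initial values (α = 1, β = 0), and the class name are illustrative assumptions rather than details taken from the source paper, which may use per-channel parameters or a different initialization.

```python
import torch
import torch.nn as nn

class ShiLU(nn.Module):
    """Shifted ReLU: alpha * ReLU(x) + beta, with trainable alpha and beta."""

    def __init__(self, alpha: float = 1.0, beta: float = 0.0):
        super().__init__()
        # Registering alpha and beta as Parameters lets the optimizer
        # update them by gradient descent along with the other weights.
        self.alpha = nn.Parameter(torch.tensor(alpha))
        self.beta = nn.Parameter(torch.tensor(beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.alpha * torch.relu(x) + self.beta
```

With the default initialization above, ShiLU starts out identical to plain ReLU and only diverges from it as α and β are updated during training:

```python
x = torch.linspace(-2.0, 2.0, 5)
act = ShiLU()
print(act(x))  # matches torch.relu(x) at initialization (alpha=1, beta=0)
```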

Collections: Activation-Functions
