What is: Hard Swish?

Source: Searching for MobileNetV3
Year: 2019
Data Source: CC BY-SA - https://paperswithcode.com

Hard Swish is an activation function based on Swish that replaces the computationally expensive sigmoid with a piecewise linear analogue:

\text{h-swish}(x) = x \cdot \frac{\text{ReLU6}(x+3)}{6}
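For reference, here is a minimal NumPy sketch of the formula above; the `relu6` and `hard_swish` names are illustrative and not taken from the source.

```python
import numpy as np

def relu6(x: np.ndarray) -> np.ndarray:
    """ReLU6 clamps its input to the range [0, 6]."""
    return np.minimum(np.maximum(x, 0.0), 6.0)

def hard_swish(x: np.ndarray) -> np.ndarray:
    """Hard Swish: x * ReLU6(x + 3) / 6, a piecewise linear analogue of Swish."""
    return x * relu6(x + 3.0) / 6.0

# For inputs <= -3 the output is 0; for inputs >= 3 it equals x.
x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(hard_swish(x))  # [ 0.     -0.3333  0.      0.6667  4.    ]
```

Deep learning frameworks also provide this activation directly, e.g. `torch.nn.Hardswish` in PyTorch.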

Collections

Activation-Functions
