
SMU_pytorch

A PyTorch implementation of "SMU: Smooth Activation Function for Deep Networks Using Smoothing Maximum Technique".

arXiv

https://arxiv.org/abs/2111.04682

Requirements

PyTorch 1.7
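Usage

Below is a minimal sketch of the SMU activation as defined in the paper: a smooth approximation of max(x, αx) built from the identity max(a, b) = ((a + b) + |a − b|) / 2, with |y| smoothed as y·erf(μy). The class name, the α = 0.25 default, and the μ initialization follow the paper and the reference TensorFlow implementation; they are assumptions here, so check this repository's source for the exact settings it uses.

```python
import torch
import torch.nn as nn


class SMU(nn.Module):
    """Smooth Maximum Unit (erf-based variant from the paper).

    Smoothly approximates max(x, alpha * x) as
        f(x) = ((1 + alpha) * x + (1 - alpha) * x * erf(mu * (1 - alpha) * x)) / 2
    where mu controls the sharpness of the approximation.
    """

    def __init__(self, alpha: float = 0.25, mu: float = 1_000_000.0):
        super().__init__()
        self.alpha = alpha
        # Trainable smoothing parameter; this initial value follows the
        # reference TensorFlow implementation (an assumption, not verified
        # against this repo's code).
        self.mu = nn.Parameter(torch.tensor(mu))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return ((1 + self.alpha) * x
                + (1 - self.alpha) * x
                * torch.erf(self.mu * (1 - self.alpha) * x)) / 2


if __name__ == "__main__":
    act = SMU()
    x = torch.randn(2, 3)
    # Output has the same shape as x; for large mu this behaves like
    # the leaky-ReLU-style max(x, 0.25 * x).
    print(act(x))
```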

TensorFlow version of SMU activation

Please check https://github.com/iFe1er/SMU for the TensorFlow 2.x implementation.

Reference

```bibtex
@ARTICLE{2021arXiv211104682B,
  author        = {{Biswas}, Koushik and {Kumar}, Sandeep and {Banerjee}, Shilpak and {Pandey}, Ashish Kumar},
  title         = "{SMU: smooth activation function for deep networks using smoothing maximum technique}",
  journal       = {arXiv e-prints},
  keywords      = {Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Computer Vision and Pattern Recognition, Computer Science - Neural and Evolutionary Computing},
  year          = 2021,
  month         = nov,
  eid           = {arXiv:2111.04682},
  pages         = {arXiv:2111.04682},
  archivePrefix = {arXiv},
  eprint        = {2111.04682},
  primaryClass  = {cs.LG},
  adsurl        = {https://ui.adsabs.harvard.edu/abs/2021arXiv211104682B},
  adsnote       = {Provided by the SAO/NASA Astrophysics Data System}
}
```
