Softplus is an activation function defined as $f\left(x\right) = \log\left(1+\exp\left(x\right)\right)$. It can be viewed as a smooth approximation of ReLU: it approaches $0$ for large negative inputs and the identity for large positive inputs, and its derivative is the logistic sigmoid $\sigma\left(x\right) = 1/\left(1+\exp\left(-x\right)\right)$.
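Evaluating $\log(1+\exp(x))$ directly overflows for large $x$, so implementations typically use the identity $\log(1+\exp(x)) = \max(x, 0) + \log(1+\exp(-|x|))$. A minimal sketch of a numerically stable softplus (the function name and structure here are illustrative, not taken from any particular library):

```python
import math

def softplus(x: float) -> float:
    """Numerically stable softplus: log(1 + exp(x)).

    Direct evaluation of exp(x) overflows for large x, so we use
    log(1 + exp(x)) = max(x, 0) + log1p(exp(-|x|)), where the
    exp argument is always <= 0 and therefore safe.
    """
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

print(softplus(0.0))     # log(2) ≈ 0.6931
print(softplus(100.0))   # ≈ 100.0: softplus approaches the identity for large x
print(softplus(-100.0))  # ≈ 0.0: softplus approaches zero for very negative x
```

The same trick appears in most deep learning frameworks; for example, PyTorch's `torch.nn.Softplus` additionally switches to the linear function beyond a threshold for stability.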
Task | Papers | Share |
---|---|---|
Object Detection | 67 | 17.09% |
Image Classification | 25 | 6.38% |
General Classification | 11 | 2.81% |
Real-Time Object Detection | 10 | 2.55% |
Semantic Segmentation | 10 | 2.55% |
Medical Diagnosis | 10 | 2.55% |
Classification | 9 | 2.30% |
Image Generation | 8 | 2.04% |
Autonomous Driving | 7 | 1.79% |