Attention mechanisms are components used in neural networks to model long-range interactions, for example across a text in NLP. The key idea is to build shortcut connections between a context vector and the input, allowing the model to attend to different parts of the input. Below you can find a continuously updating list of attention mechanisms.
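The core idea can be sketched in a few lines. Below is a minimal pure-Python illustration of scaled dot-product attention (the variant popularized by the Transformer): each query scores every key, the scores are normalized with a softmax, and the context vector is the resulting weighted sum of the values. The function names and the toy embeddings are illustrative, not from any particular library.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(queries, keys, values):
    """Scaled dot-product attention over lists of vectors.

    Each query attends to every key; the attention weights decide
    how much of each value vector flows into the context vector.
    """
    d_k = len(keys[0])
    contexts, all_weights = [], []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        weights = softmax(scores)  # weights sum to 1
        # Context vector: weighted sum of the value vectors.
        context = [sum(w * v[j] for w, v in zip(weights, values))
                   for j in range(len(values[0]))]
        contexts.append(context)
        all_weights.append(weights)
    return contexts, all_weights

# Self-attention: queries, keys, and values all come from the
# same three toy 2-dimensional token embeddings.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
contexts, weights = attention(X, X, X)
```

Setting `queries`, `keys`, and `values` to the same sequence gives self-attention; using an encoder's outputs as `keys`/`values` and a decoder's states as `queries` gives the cross-attention used in sequence-to-sequence models.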
Subcategories
*(Table of individual methods — method name, year of introduction, and paper count — not recoverable: the method-name column was lost in extraction.)*