@@ -30,44 +30,55 @@ Installation process is simple, just::
Supported Optimizers
====================

- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | AccSGD      | https://arxiv.org/abs/1803.05591                                              |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | AdaBound    | https://arxiv.org/abs/1902.09843                                              |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | AdaMod      | https://arxiv.org/abs/1910.12249                                              |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | DiffGrad    | https://arxiv.org/abs/1909.11015                                              |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | Lamb        | https://arxiv.org/abs/1904.00962                                              |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | NovoGrad    | https://arxiv.org/abs/1905.11286                                              |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | PID         | https://www4.comp.polyu.edu.hk/~cslzhang/paper/CVPR18_PID.pdf                 |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | QHAdam      | https://arxiv.org/abs/1810.06801                                              |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | QHM         | https://arxiv.org/abs/1810.06801                                              |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | RAdam       | https://arxiv.org/abs/1908.03265                                              |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | SGDW        | https://arxiv.org/abs/1608.03983                                              |
- +-------------+-------------------------------------------------------------------------------+
- |             |                                                                               |
- | Yogi        | https://papers.nips.cc/paper/8186-adaptive-methods-for-nonconvex-optimization |
- +-------------+-------------------------------------------------------------------------------+
-
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`AccSGD`   | https://arxiv.org/abs/1803.05591                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`AdaBound` | https://arxiv.org/abs/1902.09843                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`AdaMod`   | https://arxiv.org/abs/1910.12249                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`DiffGrad` | https://arxiv.org/abs/1909.11015                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`Lamb`     | https://arxiv.org/abs/1904.00962                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`NovoGrad` | https://arxiv.org/abs/1905.11286                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`PID`      | https://www4.comp.polyu.edu.hk/~cslzhang/paper/CVPR18_PID.pdf                 |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`QHAdam`   | https://arxiv.org/abs/1810.06801                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`QHM`      | https://arxiv.org/abs/1810.06801                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`RAdam`    | https://arxiv.org/abs/1908.03265                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`Ranger`   | https://arxiv.org/abs/1908.00700v2                                            |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`RangerQH` | https://arxiv.org/abs/1908.00700v2                                            |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`RangerVA` | https://arxiv.org/abs/1908.00700v2                                            |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`SGDW`     | https://arxiv.org/abs/1608.03983                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`Shampoo`  | https://arxiv.org/abs/1802.09568                                              |
+ +-----------------+-------------------------------------------------------------------------------+
+ |                 |                                                                               |
+ | :ref:`Yogi`     | https://papers.nips.cc/paper/8186-adaptive-methods-for-nonconvex-optimization |
+ +-----------------+-------------------------------------------------------------------------------+

.. toctree::
   :maxdepth: 2
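For reference, a minimal usage sketch for the optimizers listed in the
updated table, assuming the ``torch_optimizer`` package that this
documentation describes; ``DiffGrad`` stands in for any entry in the table,
and the toy model, batch, and loss are illustrative only::

    import torch
    import torch_optimizer as optim

    model = torch.nn.Linear(10, 1)                           # toy model
    optimizer = optim.DiffGrad(model.parameters(), lr=1e-3)

    x = torch.randn(8, 10)                                   # dummy batch
    loss = model(x).pow(2).mean()                            # dummy loss

    optimizer.zero_grad()
    loss.backward()                                          # compute gradients
    optimizer.step()                                         # apply one update

The other optimizers in the table follow the same ``torch.optim``-style
interface; only the class name and its paper-specific hyperparameters change.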