Commit b3adae7

Refresh docs (#122)
1 parent bfa8358 commit b3adae7

2 files changed, 80 additions and 38 deletions

docs/api.rst

Lines changed: 31 additions & 0 deletions

@@ -1,73 +1,104 @@
 Available Optimizers
 ====================

+.. _AccSGD:

 AccSGD
 ------

 .. autoclass:: torch_optimizer.AccSGD
     :members:

+.. _AdaBound:
+
 AdaBound
 --------

 .. autoclass:: torch_optimizer.AdaBound
     :members:

+.. _AdaMod:
+
 AdaMod
 ------

 .. autoclass:: torch_optimizer.AdaMod
     :members:

+.. _DiffGrad:
+
 DiffGrad
 --------

 .. autoclass:: torch_optimizer.DiffGrad
     :members:

+.. _Lamb:
+
 Lamb
 ----

 .. autoclass:: torch_optimizer.Lamb
     :members:

+.. _NovoGrad:
+
 NovoGrad
 --------

 .. autoclass:: torch_optimizer.NovoGrad
     :members:

+.. _PID:
+
 PID
 ---

 .. autoclass:: torch_optimizer.PID
     :members:

+.. _QHAdam:
+
 QHAdam
 ------

 .. autoclass:: torch_optimizer.QHAdam
     :members:

+.. _QHM:
+
 QHM
 ---

 .. autoclass:: torch_optimizer.QHM
     :members:

+.. _RAdam:
+
 RAdam
 -----

 .. autoclass:: torch_optimizer.RAdam
     :members:

+.. _SGDW:
+
 SGDW
 ----

 .. autoclass:: torch_optimizer.SGDW
     :members:

+.. _Shampoo:
+
+Shampoo
+-------
+
+.. autoclass:: torch_optimizer.Shampoo
+    :members:
+
+.. _Yogi:
+
 Yogi
 ----

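All of the classes documented in api.rst expose the standard PyTorch optimizer interface, so the newly documented Shampoo is constructed and stepped like any other optimizer. A minimal sketch, assuming torch and torch-optimizer are installed; the toy model, the dummy batch, and the lr value are illustrative assumptions, not recommended settings:

import torch
import torch_optimizer as optim

# Tiny model and dummy batch, purely for illustration.
model = torch.nn.Linear(10, 1)
inputs = torch.randn(8, 10)
target = torch.randn(8, 1)

# Shampoo is one of the classes documented above; the learning rate here
# is an assumed placeholder value.
optimizer = optim.Shampoo(model.parameters(), lr=0.1)

# One standard optimization step.
loss = torch.nn.functional.mse_loss(model(inputs), target)
loss.backward()
optimizer.step()
optimizer.zero_grad()

The same pattern applies to any other class listed above; only the constructor hyperparameters differ.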
docs/index.rst

Lines changed: 49 additions & 38 deletions

@@ -30,44 +30,55 @@ Installation process is simple, just::
 Supported Optimizers
 ====================

-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| AccSGD      | https://arxiv.org/abs/1803.05591                                              |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| AdaBound    | https://arxiv.org/abs/1902.09843                                              |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| AdaMod      | https://arxiv.org/abs/1910.12249                                              |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| DiffGrad    | https://arxiv.org/abs/1909.11015                                              |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| Lamb        | https://arxiv.org/abs/1904.00962                                              |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| NovoGrad    | https://arxiv.org/abs/1905.11286                                              |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| PID         | https://www4.comp.polyu.edu.hk/~cslzhang/paper/CVPR18_PID.pdf                 |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| QHAdam      | https://arxiv.org/abs/1810.06801                                              |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| QHM         | https://arxiv.org/abs/1810.06801                                              |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| RAdam       | https://arxiv.org/abs/1908.03265                                              |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| SGDW        | https://arxiv.org/abs/1608.03983                                              |
-+-------------+-------------------------------------------------------------------------------+
-|             |                                                                               |
-| Yogi        | https://papers.nips.cc/paper/8186-adaptive-methods-for-nonconvex-optimization |
-+-------------+-------------------------------------------------------------------------------+
-
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`AccSGD`   | https://arxiv.org/abs/1803.05591                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`AdaBound` | https://arxiv.org/abs/1902.09843                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`AdaMod`   | https://arxiv.org/abs/1910.12249                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`DiffGrad` | https://arxiv.org/abs/1909.11015                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`Lamb`     | https://arxiv.org/abs/1904.00962                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`NovoGrad` | https://arxiv.org/abs/1905.11286                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`PID`      | https://www4.comp.polyu.edu.hk/~cslzhang/paper/CVPR18_PID.pdf                 |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`QHAdam`   | https://arxiv.org/abs/1810.06801                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`QHM`      | https://arxiv.org/abs/1810.06801                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`RAdam`    | https://arxiv.org/abs/1908.03265                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`Ranger`   | https://arxiv.org/abs/1908.00700v2                                            |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`RangerQH` | https://arxiv.org/abs/1908.00700v2                                            |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`RangerVA` | https://arxiv.org/abs/1908.00700v2                                            |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`SGDW`     | https://arxiv.org/abs/1608.03983                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`Shampoo`  | https://arxiv.org/abs/1802.09568                                              |
++-----------------+-------------------------------------------------------------------------------+
+|                 |                                                                               |
+| :ref:`Yogi`     | https://papers.nips.cc/paper/8186-adaptive-methods-for-nonconvex-optimization |
++-----------------+-------------------------------------------------------------------------------+

 .. toctree::
    :maxdepth: 2
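Each entry in the refreshed table corresponds to an optimizer class exported by the torch_optimizer package, so an optimizer can also be selected by the name shown in the first column. A small sketch under that assumption; the "RAdam" choice and the learning rate are placeholders:

import torch
import torch_optimizer

# Resolve an optimizer class by the name listed in the table; "RAdam"
# and the lr value below are illustrative, not recommendations.
name = "RAdam"
optimizer_cls = getattr(torch_optimizer, name)

model = torch.nn.Linear(4, 2)
optimizer = optimizer_cls(model.parameters(), lr=1e-3)
print(optimizer)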
