86491ad841  add Adamax optimizer  (2019-03-22 12:57:05 +01:00)
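
For reference, the Adamax update from Kingma & Ba (2015), which replaces Adam's second moment with an exponentially weighted infinity norm. A minimal NumPy sketch; the class name and step() interface are assumptions, not the repo's actual API:

    import numpy as np

    class Adamax:
        """Minimal Adamax sketch (Kingma & Ba 2015, sec. 7.1)."""
        def __init__(self, lr=0.002, b1=0.9, b2=0.999, eps=1e-8):
            self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
            self.m = self.u = None
            self.t = 0

        def step(self, w, g):
            if self.m is None:
                self.m = np.zeros_like(w)
                self.u = np.zeros_like(w)
            self.t += 1
            self.m = self.b1 * self.m + (1 - self.b1) * g      # first moment
            self.u = np.maximum(self.b2 * self.u, np.abs(g))   # infinity norm
            # bias correction applies only to the first moment
            return w - self.lr / (1 - self.b1 ** self.t) * self.m / (self.u + self.eps)
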
2a4f92154d  rewrite momentum optimizer  (2019-03-22 12:55:13 +01:00)
  The original version wasn't very useful, as it would typically diverge.
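
The commit doesn't include the new update rule; for reference, the classical (Polyak) heavy-ball formulation, the usual stable choice for mu < 1 with a suitably small learning rate:

    import numpy as np

    class Momentum:
        """Classical (Polyak) momentum sketch; interface assumed."""
        def __init__(self, lr=0.01, mu=0.9):
            self.lr, self.mu = lr, mu
            self.v = None

        def step(self, w, g):
            if self.v is None:
                self.v = np.zeros_like(w)
            self.v = self.mu * self.v - self.lr * g   # decayed velocity
            return w + self.v
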
2b5798332d  allow configuration of Neumann hyperparameters  (2019-02-17 07:47:53 +01:00)
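
The Neumann optimizer (Krishnan et al. 2017) has several knobs: a learning rate, momentum-schedule bounds, and regularizer strengths. A hypothetical constructor exposing them as arguments instead of hard-coded constants; every parameter name here is a guess, not the repo's actual API:

    class Neumann:
        """Hypothetical constructor only; parameter names are assumptions."""
        def __init__(self, lr=0.01, mu_min=0.5, mu_max=0.9,
                     cubic_reg=1e-3, repulsive_reg=1e-2):
            self.lr = lr
            self.mu_min, self.mu_max = mu_min, mu_max   # momentum schedule bounds
            self.cubic_reg = cubic_reg                  # cubic regularizer strength
            self.repulsive_reg = repulsive_reg          # repulsive regularizer strength
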
c92082e07a  fix momentum quantity in Neumann optimizer  (2019-02-17 07:47:05 +01:00)
2c921d34c2  add Yogi optimizer  (2019-02-05 04:19:48 +01:00)
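
For reference, the Yogi update from Zaheer et al. (2018): the second moment is adjusted additively rather than multiplicatively, so it can shrink when gradients do, unlike Adam's. A minimal NumPy sketch (interface assumed):

    import numpy as np

    class Yogi:
        """Minimal Yogi sketch (Zaheer et al. 2018)."""
        def __init__(self, lr=0.01, b1=0.9, b2=0.999, eps=1e-3):
            self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
            self.m = self.v = None

        def step(self, w, g):
            if self.m is None:
                self.m = np.zeros_like(w)
                self.v = np.zeros_like(w)
            g2 = g * g
            self.m = self.b1 * self.m + (1 - self.b1) * g
            # additive update: v moves toward g2 by a fixed fraction of g2
            self.v = self.v - (1 - self.b2) * np.sign(self.v - g2) * g2
            return w - self.lr * self.m / (np.sqrt(self.v) + self.eps)
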
7deaa3c3f6  reword some comments  (2019-02-05 04:19:14 +01:00)
7227559912  reset learning rates in optimizers  (2019-02-05 04:15:28 +01:00)
54ea41711b  refactor gradient filtering  (2019-02-03 15:10:43 +01:00)
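
The message doesn't show the refactored code; one plausible shape is a shared, bias-corrected exponential moving average that the Adam-like optimizers all reuse for their moment estimates. A sketch under that assumption (class and method names are made up):

    import numpy as np

    class EMAFilter:
        """Hypothetical shared gradient filter: a bias-corrected
        exponential moving average."""
        def __init__(self, decay):
            self.decay = decay
            self.value = None
            self.t = 0

        def update(self, x):
            if self.value is None:
                self.value = np.zeros_like(x)
            self.t += 1
            self.value = self.decay * self.value + (1 - self.decay) * x
            return self.value / (1 - self.decay ** self.t)   # bias-corrected
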
0d28882ef0  remove YellowFin because it's not worth maintaining  (2019-02-03 15:03:03 +01:00)
5fd2b7b546  remove old versions of optimizers  (2019-02-03 14:43:04 +01:00)
b8c40d2e2f  rewrite some comments  (2019-02-03 14:30:58 +01:00)
94f27d6f2a  add Adadelta optimizer  (2019-02-03 14:30:47 +01:00)
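
For reference, the Adadelta update from Zeiler (2012), which needs no global learning rate: the step size is derived from running averages of squared gradients and squared updates. A minimal NumPy sketch (interface assumed):

    import numpy as np

    class Adadelta:
        """Minimal Adadelta sketch (Zeiler 2012)."""
        def __init__(self, rho=0.95, eps=1e-6):
            self.rho, self.eps = rho, eps
            self.g2 = self.dx2 = None

        def step(self, w, g):
            if self.g2 is None:
                self.g2 = np.zeros_like(w)
                self.dx2 = np.zeros_like(w)
            self.g2 = self.rho * self.g2 + (1 - self.rho) * g * g
            # RMS of past updates over RMS of past gradients sets the scale
            dx = -np.sqrt(self.dx2 + self.eps) / np.sqrt(self.g2 + self.eps) * g
            self.dx2 = self.rho * self.dx2 + (1 - self.rho) * dx * dx
            return w + dx
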
f60535aa01  generalize Adam-like optimizers  (2019-02-03 14:30:03 +01:00)
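
One common way to generalize the Adam family is to share the first-moment and step logic and let subclasses override only the second-moment accumulator. A sketch under that assumption; the class names and the bias-correction details are assumptions, not necessarily what this commit did:

    import numpy as np

    class AdamLike:
        """Hypothetical base class: subclasses override second_moment()."""
        def __init__(self, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
            self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
            self.m = self.v = None
            self.t = 0

        def second_moment(self, v, g):
            return self.b2 * v + (1 - self.b2) * g * g   # Adam's default

        def step(self, w, g):
            if self.m is None:
                self.m = np.zeros_like(w)
                self.v = np.zeros_like(w)
            self.t += 1
            self.m = self.b1 * self.m + (1 - self.b1) * g
            self.v = self.second_moment(self.v, g)
            mhat = self.m / (1 - self.b1 ** self.t)
            vhat = self.v / (1 - self.b2 ** self.t)
            return w - self.lr * mhat / (np.sqrt(vhat) + self.eps)

    class YogiLike(AdamLike):
        def second_moment(self, v, g):
            return v - (1 - self.b2) * np.sign(v - g * g) * g * g
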
7161f983ab  fix __name__ being incorrect due to extra __all__  (2018-03-17 14:09:15 +01:00)
  This fixes tracebacks and checks for __main__, among other things.
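
A sketch of the failure mode the message describes, assuming the bug was an __all__ that exported dunders (the file names here are hypothetical):

    # util.py
    __all__ = ['helper', '__name__']   # the stray '__name__' entry is the bug

    def helper():
        print('helping')

    # main.py
    from util import *   # rebinds __name__ to 'util' in this module

    if __name__ == '__main__':   # now False even when run as a script
        helper()
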
3aa3b70a9f  add AMSgrad optimizer  (2018-03-07 01:30:04 +01:00)
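
For reference, the AMSgrad update from Reddi et al. (2018): Adam plus a running maximum of the second moment, so the effective per-coordinate step size never grows. A minimal NumPy sketch (interface assumed; bias correction omitted, as in the paper's statement of the algorithm):

    import numpy as np

    class AMSgrad:
        """Minimal AMSgrad sketch (Reddi et al. 2018)."""
        def __init__(self, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
            self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
            self.m = self.v = self.vmax = None

        def step(self, w, g):
            if self.m is None:
                self.m = np.zeros_like(w)
                self.v = np.zeros_like(w)
                self.vmax = np.zeros_like(w)
            self.m = self.b1 * self.m + (1 - self.b1) * g
            self.v = self.b2 * self.v + (1 - self.b2) * g * g
            self.vmax = np.maximum(self.vmax, self.v)   # never decreases
            return w - self.lr * self.m / (np.sqrt(self.vmax) + self.eps)
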
169303813d  basic PEP 8 compliance  (2018-01-22 19:40:36 +00:00)
  rip readability
c81ce0afbb  rename stuff and add a couple of missing imports  (2018-01-21 22:16:36 +00:00)
bbdb91fcb1  merge and split modules into a package  (2018-01-21 22:07:57 +00:00)
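
A hypothetical layout for such a split; the actual module and class names in the repo are assumptions:

    optim/
        __init__.py    # re-exports the public optimizers
        adam.py
        momentum.py

    # optim/__init__.py
    from .adam import Adam
    from .momentum import Momentum

    __all__ = ['Adam', 'Momentum']
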