f28e8d3a54
add/remove comments and fix code style
2017-08-02 03:59:15 +00:00
5d9efa71c1
move SquaredHalved to core
2017-07-25 22:14:17 +00:00
f43063928e
rename Linear activation to Identity layer
2017-07-25 22:12:27 +00:00
2cf38d4ece
finally fix learning rate scheduling for real
okay, this is a disaster, but i think i've got it under control now.
the way batch-based learners now work is:
the epoch we're working towards is the truncated part of the epoch variable,
and how far we are into the epoch is the fractional part.
epoch starts at 1, so subtract 1 when doing periodic operations.
2017-07-25 04:25:35 +00:00
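a minimal sketch of the scheme described in the commit above; the class and attribute names here (Learner, batches_per_epoch, etc.) are illustrative, not the repo's actual API:

```python
import math

class Learner:
    def __init__(self, batches_per_epoch):
        self.batches_per_epoch = batches_per_epoch
        self.epoch = 1.0  # starts at 1

    def batch_step(self):
        # each batch advances the fractional part by 1/batches_per_epoch.
        self.epoch += 1.0 / self.batches_per_epoch

    @property
    def current_epoch(self):
        return math.trunc(self.epoch)  # the epoch we're working towards

    @property
    def progress(self):
        return self.epoch - math.trunc(self.epoch)  # how far into it we are

    def period_index(self, period):
        # epoch is 1-based, so subtract 1 before periodic operations.
        return (self.current_epoch - 1) % period
```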
93547b1974
add a linear (identity) activation for good measure
2017-07-25 04:24:32 +00:00
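for reference, a linear (identity) activation passes values and gradients through unchanged; the forward/backward interface shown here is assumed, not the repo's actual one:

```python
class Identity:
    # pass-through in both directions.
    def forward(self, X):
        return X

    def backward(self, dY):
        return dY
```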
be1795f6ed
use in-place (additive) form of filters
2017-07-21 21:02:47 +00:00
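the log doesn't show the change itself; assuming "filters" refers to the exponential-moving-average accumulators used by the optimizers, the in-place additive form replaces the two-multiply blend with a single fused update:

```python
import numpy as np

def filter_update(accum, x, rate):
    # out-of-place form: accum = (1 - rate) * accum + rate * x
    # in-place additive form, mutating accum directly:
    accum += rate * (x - accum)
```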
7c4ef4ad05
fix Softplus derivative
2017-07-21 21:02:04 +00:00
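the correct derivative of Softplus is the logistic sigmoid; a sketch of the fixed forward/backward pair in numpy (the layer interface is assumed):

```python
import numpy as np

class Softplus:
    def forward(self, X):
        self.X = X
        return np.log1p(np.exp(X))  # softplus(x) = ln(1 + e^x)

    def backward(self, dY):
        # d/dx softplus(x) = 1 / (1 + e^-x), the logistic sigmoid.
        return dY / (1 + np.exp(-self.X))
```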
c2bb2cfcd5
add centered variant of RMS Prop
2017-07-21 20:20:42 +00:00
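a sketch of the centered variant (Graves 2013): alongside the running mean of squared gradients, it keeps a running mean of the gradients themselves and subtracts its square to estimate variance. interface and hyperparameter names are assumed:

```python
import numpy as np

class CenteredRMSprop:
    def __init__(self, lr=1e-3, mu=0.99, eps=1e-8):
        self.lr, self.mu, self.eps = lr, mu, eps
        self.g = None  # running mean of gradients
        self.n = None  # running mean of squared gradients

    def compute(self, dW):
        if self.g is None:
            self.g = np.zeros_like(dW)
            self.n = np.zeros_like(dW)
        self.g += (1 - self.mu) * (dW - self.g)
        self.n += (1 - self.mu) * (dW * dW - self.n)
        # centering: n - g**2 estimates the variance of the gradient.
        return -self.lr * dW / np.sqrt(self.n - self.g * self.g + self.eps)
```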
fb22f64716
tweak semantics etc.
2017-07-21 19:45:58 +00:00
928850c2a8
lower process priority
2017-07-11 12:44:26 +00:00
112e263056
fix code i forgot to test, plus some tweaks
2017-07-11 11:36:11 +00:00
7bd5518650
note to self on how to handle generators
2017-07-11 11:23:27 +00:00
436f45fbb0
rewrite Ritual to reduce code duplication
2017-07-03 11:54:37 +00:00
6a3f047ddc
rename alpha to lr where applicable
2017-07-02 05:39:51 +00:00
1b1184480a
allow optimizers to adjust their own learning rate
2017-07-02 02:52:07 +00:00
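the mechanism isn't visible from the log; one plausible shape, with every name hypothetical: an optimizer that owns its learning rate and decays it as it steps.

```python
class Optimizer:
    def __init__(self, lr=0.01):
        self.lr = lr

    def compute(self, dW):
        return -self.lr * dW

class HalvingSGD(Optimizer):
    # hypothetical: halves its own learning rate every `period` updates.
    def __init__(self, lr=0.01, period=1000):
        super().__init__(lr)
        self.period = period
        self.t = 0

    def compute(self, dW):
        self.t += 1
        if self.t % self.period == 0:
            self.lr *= 0.5
        return super().compute(dW)
```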
22dc651cce
move lament into core
2017-07-01 02:22:34 +00:00
7da93e93a8
move graph printing into Model class
2017-07-01 02:17:46 +00:00
69786b40a1
begin work on multiple input/output nodes
2017-07-01 00:44:56 +00:00
a7c4bdaa2e
remove dead line and punctuate comment
2017-06-30 21:13:37 +00:00
c02fba01e2
various
use updated filenames.
don't use emnist by default.
tweak expando integer handling.
add some comments.
2017-06-26 00:16:51 +00:00
a770444199
shorten names
2017-06-25 22:08:07 +00:00