random optimization code

not unlike my dsp repo, this is also a bunch of half-baked python code that's kinda handy. i give no guarantee that anything provided here is correct.

don't expect commits, docs, or comments to be at all verbose.

heads up

this was formerly a gist. i might rewrite the git history at some point to add meaningful commit messages.

other stuff

if you're coming here from Google: sorry, keep searching. i know Google sometimes likes to give random repositories a high search ranking. maybe consider one of the following:

  • keras for easy tensor-optimized networks. strong tensorflow integration as of version 2.0. also check out the keras-contrib library for more components based on recent papers.
  • theano's source code contains pure numpy test methods to reference against.
  • minpy for tensor-powered numpy routines and automatic differentiation.
  • autograd for automatic differentiation without tensors (see the short sketch below).
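
for a sense of what "automatic differentiation without tensors" means, here's a minimal autograd sketch (illustrative only, not code from this repo):

```python
# autograd differentiates plain-numpy code; no tensor graph involved.
import autograd.numpy as np   # thinly wrapped numpy
from autograd import grad

def tanh_loss(w):
    # any scalar-valued function of numpy arrays works
    return np.sum(np.tanh(w) ** 2)

grad_loss = grad(tanh_loss)   # returns a function computing d(loss)/dw
print(grad_loss(np.array([0.5, -1.0])))
```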

dependencies

python 3.5+

numpy scipy h5py sklearn dotmap
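
a quick sanity check that the listed dependencies are importable (not part of the repo; note the sklearn module is provided by the scikit-learn package on pypi):

```python
# just verifies the listed dependencies are installed and importable
import numpy, scipy, h5py, sklearn, dotmap
print("dependencies look good")
```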

contributing

i'm just throwing this code out there, so i don't actually expect anyone to contribute, but if you do find a blatant issue, maybe yell at me on twitter.