personal repo for experimenting with neural networks

# random optimization code

not unlike my dsp repo, this is a bunch of half-baked python code that's kinda handy. i give no guarantee that anything provided here is correct.

don't expect commits, docs, or comments to be very verbose.

## heads up

this was formerly a gist. i might rewrite the git history at some point to add meaningful commit messages.

## other stuff

if you're coming here from Google: sorry, keep searching. i know Google sometimes likes to give random repositories a high search ranking. maybe consider one of the following:

- keras for easy tensor-optimized networks. strong tensorflow integration as of version 2.0. also check out the keras-contrib library for more components based on recent papers.
- theano's source code contains pure numpy test methods to reference against.
- minpy for tensor-powered numpy routines and automatic differentiation.
- autograd for automatic differentiation without tensors.

## dependencies

python 3.5+

`numpy` `scipy` `h5py` `sklearn` `dotmap`
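for reference, a minimal install sketch with pip (note: `sklearn` is the import name; the package on PyPI is `scikit-learn`):

```shell
# install the runtime dependencies listed above
# sklearn is published on PyPI as scikit-learn
pip install numpy scipy h5py scikit-learn dotmap
```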

## contributing

i'm just throwing this code out there, so i don't actually expect anyone to contribute, but if you do find a blatant issue, maybe yell at me on twitter.