diff --git a/README.md b/README.md
new file mode 100644
index 0000000..232bd5c
--- /dev/null
+++ b/README.md
@@ -0,0 +1,45 @@
+# random optimization code
+
+not unlike [my dsp repo,](https://github.com/notwa/dsp)
+it's also a bunch of half-baked python code that's kinda handy.
+i give no guarantee anything provided here is correct.
+
+don't expect commits, docs, or comments to be very verbose.
+
+### heads up
+
+this was formerly a gist.
+**i might rewrite the git history**
+at some point to add meaningful commit messages.
+
+## other stuff
+
+if you're coming here from Google: sorry, keep searching.
+i know Google sometimes likes to give random repositories a high search ranking.
+maybe consider one of the following:
+
+* [keras](https://github.com/fchollet/keras)
+  for easy tensor-optimized networks.
+  strong [tensorflow](http://tensorflow.org) integration as of version 2.0.
+  also check out the
+  [keras-contrib](https://github.com/farizrahman4u/keras-contrib)
+  library for more components based on recent papers.
+* [theano's source code](https://github.com/Theano/theano/blob/master/theano/tensor/nnet/nnet.py)
+  contains pure numpy test methods to reference against.
+* [minpy](https://github.com/dmlc/minpy)
+  for tensor-powered numpy routines and automatic differentiation.
+* [autograd](https://github.com/HIPS/autograd)
+  for automatic differentiation without tensors.
+
+## dependencies
+
+python 3.5+
+
+numpy scipy h5py sklearn dotmap
+
+## contributing
+
+i'm just throwing this code out there,
+so i don't actually expect anyone to contribute,
+*but* if you do find a blatant issue,
+maybe [yell at me on twitter.](https://twitter.com/antiformant)
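
since the dependencies section above lists only bare package names, here is a minimal sanity-check sketch, not part of the repo: it assumes `sklearn` comes from the `scikit-learn` pip package and that the other names double as both pip and import names.

```python
# sanity-check the packages listed in the README's "dependencies" section.
# a sketch only: assumes the import names below match what the code expects.
import importlib
import sys

assert sys.version_info >= (3, 5), "python 3.5+ is required"

missing = []
for name in ("numpy", "scipy", "h5py", "sklearn", "dotmap"):
    try:
        importlib.import_module(name)
    except ImportError:
        missing.append(name)

if missing:
    print("missing dependencies:", ", ".join(missing))
else:
    print("all dependencies found")
```

running it before using the repo's modules gives a quick list of anything still left to install.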