From e2b179b2e6934507565b6d3a5f7698e977b6e892 Mon Sep 17 00:00:00 2001
From: Connor
Date: Tue, 14 Mar 2017 03:10:47 -0700
Subject: [PATCH] Create README.md

---
 README.md | 45 +++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 45 insertions(+)
 create mode 100644 README.md

diff --git a/README.md b/README.md
new file mode 100644
index 0000000..232bd5c
--- /dev/null
+++ b/README.md
@@ -0,0 +1,45 @@
+# random optimization code
+
+not unlike [my dsp repo](https://github.com/notwa/dsp),
+it's also a bunch of half-baked python code that's kinda handy.
+i give no guarantee that anything provided here is correct.
+
+don't expect commits, docs, or comments to be very verbose.
+
+### heads up
+
+this was formerly a gist.
+**i might rewrite the git history**
+at some point to add meaningful commit messages.
+
+## other stuff
+
+if you're coming here from Google: sorry, keep searching.
+i know Google sometimes likes to give random repositories a high search ranking.
+maybe consider one of the following:
+
+* [keras](https://github.com/fchollet/keras)
+  for easy tensor-optimized networks.
+  strong [tensorflow](http://tensorflow.org) integration as of version 2.0.
+  also check out the
+  [keras-contrib](https://github.com/farizrahman4u/keras-contrib)
+  library for more components based on recent papers.
+* [theano's source code](https://github.com/Theano/theano/blob/master/theano/tensor/nnet/nnet.py)
+  contains pure numpy test methods to reference against.
+* [minpy](https://github.com/dmlc/minpy)
+  for tensor-powered numpy routines and automatic differentiation.
+* [autograd](https://github.com/HIPS/autograd)
+  for automatic differentiation without tensors.
+
+## dependencies
+
+python 3.5+
+
+numpy scipy h5py sklearn dotmap
+
+## contributing
+
+i'm just throwing this code out there,
+so i don't actually expect anyone to contribute,
+*but* if you do find a blatant issue,
+maybe [yell at me on twitter](https://twitter.com/antiformant).
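the dependency list in the README above can be grabbed with pip; a minimal sketch, assuming the PyPI package names match the README's list (note the `sklearn` entry is published on PyPI as `scikit-learn`):

```shell
# sketch of a setup under python 3.5+; package names assumed from the
# README's dependency list ("sklearn" lives on PyPI as scikit-learn)
pip install numpy scipy h5py scikit-learn dotmap

# quick sanity check that every dependency imports
python3 -c "import numpy, scipy, h5py, sklearn, dotmap"
```

the README doesn't pin versions, so whatever pip resolves at install time is what you get.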