An interactive tool for creating directed graphs, built with d3.js.
Operation:
- drag/scroll to translate/zoom the graph
- shift-click on the graph to create a node
- shift-click on a node and then drag to another node to connect them with a directed edge
| """ | |
| Possibly correct implementation of an all conv neural network using a single residual module | |
| This code was written for instruction purposes and no attempt to get the best results were made. | |
| References: | |
| Deep Residual Learning for Image Recognition: http://arxiv.org/pdf/1512.03385v1.pdf | |
| STRIVING FOR SIMPLICITY, THE ALL CONVOLUTIONAL NET: http://arxiv.org/pdf/1412.6806v3.pdf | |
| A video walking through the code and main ideas: https://youtu.be/-N_zlfKo4Ec |
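As a rough illustration of the idea named in the docstring, here is a minimal sketch of a single residual module built only from 3x3 convolutions, assuming a TensorFlow 1.x setup; the function name residual_module, the layer widths, and the placeholder shape are illustrative assumptions rather than the original code.

# Minimal illustrative sketch (assumptions noted above), not the original implementation.
import tensorflow as tf

def residual_module(x, filters):
    # F(x) is two 'same'-padded 3x3 convolutions; the module outputs relu(x + F(x)).
    # Assumes x already has `filters` channels so x and F(x) have matching shapes.
    y = tf.layers.conv2d(x, filters, 3, padding="same", activation=tf.nn.relu)
    y = tf.layers.conv2d(y, filters, 3, padding="same", activation=None)
    return tf.nn.relu(x + y)

# Example: a 32x32 feature map with 32 channels passed through one module.
inputs = tf.placeholder(tf.float32, [None, 32, 32, 32])
outputs = residual_module(inputs, filters=32)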
# Typical setup to include TensorFlow.
import tensorflow as tf

# Make a queue of file names including all the JPEG image files in the relative
# image directory.
filename_queue = tf.train.string_input_producer(
    tf.train.match_filenames_once("./images/*.jpg"))

# Read an entire image file, which is required since they're JPEGs; if the images
# are too large they could be split in advance into smaller files, or use the Fixed
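A rough sketch of how such a filename queue might be consumed under the same TensorFlow 1.x queue API; the WholeFileReader choice and the session boilerplate are assumptions, not part of the original snippet.

# Assumed continuation (not the original code): read whole files from the queue
# and decode each one as a JPEG tensor.
reader = tf.WholeFileReader()
_, file_contents = reader.read(filename_queue)
image = tf.image.decode_jpeg(file_contents, channels=3)

with tf.Session() as sess:
    # match_filenames_once stores the matched file list in a local variable.
    sess.run(tf.local_variables_initializer())
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(coord=coord)
    one_image = sess.run(image)  # a single decoded image as a NumPy array
    coord.request_stop()
    coord.join(threads)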
The following recipes are sampled from a trained neural net. You can find the repo to train your own neural net here: https://github.com/karpathy/char-rnn Thanks to Andrej Karpathy for the great code; it's really easy to set up.
The recipes I used for training the char-rnn come from a recipe collection called ffts.com, and here is the actual zipped data (uncompressed, ~35 MB) I used for training. The ZIP is also archived at archive.org in case the original links become invalid in the future.
import numpy as np
import pdb
from sklearn.datasets import make_classification
from sklearn.mixture import GaussianMixture as GMM

def fisher_vector(xx, gmm):
    """Computes the Fisher vector on a set of descriptors.