└── keras_tutorial
    ├── README.md
    ├── create_vector.py
    ├── imdb_cnn.py
    ├── imdb_lstm.py
    ├── install_script.sh
    ├── ironic.txt
    ├── keras_deck.pdf
    ├── keras_tutorial.pdf
    ├── mnist_cnn.py
    ├── regular.txt
    ├── sarcasm_net.py
    ├── sarcasm_net_cnn.py
    └── sentiment_analyzer.py

/keras_tutorial/README.md:
--------------------------------------------------------------------------------
1 | ## Synopsis
2 | 
3 | The purpose of this project was two-fold: first, to serve as a quick and simple introduction to Keras; second, to act as a proof of concept for the idea that sarcasm or verbal irony (I'm aware there's a subtle difference between the two, but let's move on...) can be detected through drastic or frequent changes in sentiment between the sentences of a paragraph.
4 | 
5 | To test this, a corpus of sarcastic and regular reviews from amazon.com was used. Using the Stanford Sentiment Analysis Recursive Network (part of Stanford's CoreNLP toolkit, see below), each sentence was classified as Very Negative, Negative, Neutral, Positive or Very Positive. From this, each paragraph's sentences were coded into a vector of numbers, with Very Negative as -2, Negative as -1, Neutral as 0, Positive as +1 and Very Positive as +2. The final vector is one training/testing sample. For more information on Keras, please check out their site (https://keras.io/) or look at a tutorial I did some time ago (https://www.youtube.com/watch?v=Tp3SaRbql4k).
6 | 
7 | ## Install Keras
8 | 
9 | To install Keras, first update your repos:
10 | 
11 | ~$ sudo apt-get update
12 | 
13 | Keras needs numerous extra libraries to run properly, so run this line to obtain these packages:
14 | 
15 | ~$ sudo apt-get install python-numpy python-scipy python-dev python-pip python-nose g++ libopenblas-dev git libhdf5-serial-dev python-matplotlib python-sklearn
16 | 
17 | Installing Theano and Keras is easiest using the pip package manager. Update it first:
18 | 
19 | ~$ sudo pip install --upgrade pip
20 | 
21 | Then use pip to install the following packages. NOTE: the newer git version of Theano is needed for "ReLU" activations to work properly:
22 | 
23 | ~$ sudo pip install pyyaml h5py
24 | ~$ sudo pip install git+git://github.com/Theano/Theano.git
25 | ~$ sudo pip install keras
26 | 
27 | ## Install Stanford CoreNLP toolkit for sentiment analysis
28 | 
29 | The vectors used to train the networks are based on sentiment analysis performed by the excellent Stanford Sentiment Analysis Recursive Network project (terminal commands and more info here: http://nlp.stanford.edu/sentiment/code.html). This project is part of the Stanford CoreNLP toolkit, which can be downloaded from http://stanfordnlp.github.io/CoreNLP/. Once these tools are installed, analysis on a desired corpus is possible.
30 | 
31 | ## Amazon Sarcasm Corpus
32 | 
33 | The code here uses the corpus downloaded from the Sarcasm Corpus (http://storm.cis.fordham.edu/~filatova/SarcasmCorpus.html). In theory, the sentiment_analyzer.py and create_vector.py scripts can be altered and used on other corpora, so long as the Stanford Sentiment Analyser is installed.
34 | 
35 | ## Using the Scripts
36 | 
37 | To generate vectors to feed the learning algorithms, first set up your corpus (NOTE: these scripts were written to run with the Sarcasm Corpus mentioned above; they won't work for another corpus without modification). Run sentiment_analyzer.py on your corpus first; it will insert the perceived sentiment of each sentence on a new line below that sentence. This is done for every sentence in a file.
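38 | 
39 | For example, after sentiment_analyzer.py has run, an analysed review file alternates sentences with inserted label lines, roughly like this (the review sentences below are invented purely for illustration; the label lines, with their leading space, are exactly what create_vector.py matches on):
40 | 
41 |     I was thrilled when this blender arrived.
42 |      Positive
43 |     It lasted two whole days before the motor died.
44 |      Negative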
45 | 
46 | To codify these inserted sentiments, run create_vector.py to generate one large .txt file of vectors (values range between -2 and +2). In the final .txt file, each line corresponds to one paragraph/file, i.e. one review vector. Examples of these files can be seen in the ironic.txt and regular.txt files in this project.
47 | 
48 | The Keras-based neural networks are configured to take in these text files and perform learning and prediction.
49 | 
--------------------------------------------------------------------------------
/keras_tutorial/create_vector.py:
--------------------------------------------------------------------------------
1 | import glob
2 | import numpy as np
3 | 
4 | # After the Stanford CoreNLP toolkit has analysed the corpus, use this script
5 | # to codify the inserted sentiment labels as integers in [-2, 2], writing one
6 | # review per line of a large text file that can be passed to a learning
7 | # algorithm.
8 | 
9 | # Examples of the output vectors can be seen in ironic.txt and regular.txt
10 | 
11 | pathname1 = '/path_to_sentiment_analysed_files/Sentiment1/*.txt'
12 | pathname2 = '/path_to_sentiment_analysed_files/Sentiment2/*.txt'
13 | 
14 | # Labels inserted by sentiment_analyzer.py, mapped to their integer codes
15 | sentiment_codes = {' Very negative\n': -2,
16 |                    ' Negative\n': -1,
17 |                    ' Neutral\n': 0,
18 |                    ' Positive\n': 1,
19 |                    ' Very positive\n': 2}
20 | 
21 | 
22 | def codify_reviews(pathname, out_name):
23 |     # Every second line of an analysed file is a sentiment label; collect
24 |     # each label's integer code and save the review as one row of out_name.
25 |     out = open(out_name, 'a')
26 |     for fname in glob.glob(pathname):
27 |         cur_review = open(fname, 'r')
28 |         sent_arr = np.array([])
29 |         i = 1
30 |         for line in cur_review:
31 |             if i % 2 == 0 and line in sentiment_codes:
32 |                 sent_arr = np.append(sent_arr, sentiment_codes[line])
33 |             i += 1
34 |         cur_review.close()
35 |         np.savetxt(out, [sent_arr], fmt='%i', delimiter=' ')
36 |     out.close()
37 | 
38 | # Ironic reviews
39 | codify_reviews(pathname1, 'ironic.txt')
40 | 
41 | # Regular reviews
42 | codify_reviews(pathname2, 'regular.txt')
43 | 
--------------------------------------------------------------------------------
/keras_tutorial/imdb_cnn.py:
--------------------------------------------------------------------------------
1 | from __future__ import absolute_import
2 | from __future__ import print_function
3 | import numpy as np
4 | np.random.seed(1337)  # for reproducibility
5 | 
6 | from keras.preprocessing import sequence
7 | from keras.models import Sequential
8 | from keras.layers.core import Dense, Dropout, Activation, Flatten
9 | from keras.layers.embeddings import Embedding
10 | from keras.layers.convolutional import Convolution1D, MaxPooling1D
11 | from keras.datasets import imdb
12 | 
13 | '''
14 | This example demonstrates the use of Convolution1D
15 | for text classification.
16 | Run on GPU: THEANO_FLAGS=mode=FAST_RUN,device=gpu,floatX=float32 python imdb_cnn.py 17 | Get to 0.835 test accuracy after 2 epochs. 100s/epoch on K520 GPU. 18 | ''' 19 | 20 | # set parameters: 21 | max_features = 5000 22 | maxlen = 100 23 | batch_size = 32 24 | embedding_dims = 100 25 | nb_filter = 250 26 | filter_length = 3 27 | hidden_dims = 250 28 | nb_epoch = 2 29 | 30 | print("Loading data...") 31 | (X_train, y_train), (X_test, y_test) = imdb.load_data(nb_words=max_features, 32 | test_split=0.2) 33 | print(len(X_train), 'train sequences') 34 | print(len(X_test), 'test sequences') 35 | 36 | print("Pad sequences (samples x time)") 37 | X_train = sequence.pad_sequences(X_train, maxlen=maxlen) 38 | X_test = sequence.pad_sequences(X_test, maxlen=maxlen) 39 | print('X_train shape:', X_train.shape) 40 | print('X_test shape:', X_test.shape) 41 | 42 | print('Build model...') 43 | model = Sequential() 44 | 45 | # we start off with an efficient embedding layer which maps 46 | # our vocab indices into embedding_dims dimensions 47 | model.add(Embedding(max_features, embedding_dims, input_length=maxlen)) 48 | model.add(Dropout(0.25)) 49 | 50 | # we add a Convolution1D, which will learn nb_filter 51 | # word group filters of size filter_length: 52 | model.add(Convolution1D(nb_filter=nb_filter, 53 | filter_length=filter_length, 54 | border_mode="valid", 55 | activation="relu", 56 | subsample_length=1)) 57 | # we use standard max pooling (halving the output of the previous layer): 58 | model.add(MaxPooling1D(pool_length=2)) 59 | 60 | # We flatten the output of the conv layer, 61 | # so that we can add a vanilla dense layer: 62 | model.add(Flatten()) 63 | 64 | # We add a vanilla hidden layer: 65 | model.add(Dense(hidden_dims)) 66 | model.add(Dropout(0.25)) 67 | model.add(Activation('relu')) 68 | 69 | # We project onto a single unit output layer, and squash it with a sigmoid: 70 | model.add(Dense(1)) 71 | model.add(Activation('sigmoid')) 72 | 73 | model.compile(loss='binary_crossentropy', 74 | optimizer='rmsprop', 75 | class_mode="binary") 76 | model.fit(X_train, y_train, batch_size=batch_size, 77 | nb_epoch=nb_epoch, show_accuracy=True, 78 | validation_data=(X_test, y_test)) 79 | -------------------------------------------------------------------------------- /keras_tutorial/imdb_lstm.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | from __future__ import print_function 3 | import numpy as np 4 | np.random.seed(1337) # for reproducibility 5 | 6 | from keras.preprocessing import sequence 7 | from keras.utils import np_utils 8 | from keras.models import Sequential 9 | from keras.layers.core import Dense, Dropout, Activation 10 | from keras.layers.embeddings import Embedding 11 | from keras.layers.recurrent import LSTM 12 | from keras.datasets import imdb 13 | 14 | ''' 15 | Train a LSTM on the IMDB sentiment classification task. 16 | The dataset is actually too small for LSTM to be of any advantage 17 | compared to simpler, much faster methods such as TF-IDF+LogReg. 18 | Notes: 19 | - RNNs are tricky. Choice of batch size is important, 20 | choice of loss and optimizer is critical, etc. 21 | Some configurations won't converge. 22 | - LSTM loss decrease patterns during training can be quite different 23 | from what you see with CNNs/MLPs/etc. 
24 | GPU command: 25 | THEANO_FLAGS=mode=FAST_RUN,device=gpu,floatX=float32 python imdb_lstm.py 26 | ''' 27 | 28 | max_features = 20000 29 | maxlen = 100 # cut texts after this number of words (among top max_features most common words) 30 | batch_size = 32 31 | 32 | print("Loading data...") 33 | (X_train, y_train), (X_test, y_test) = imdb.load_data(nb_words=max_features, 34 | test_split=0.2) 35 | print(len(X_train), 'train sequences') 36 | print(len(X_test), 'test sequences') 37 | 38 | print("Pad sequences (samples x time)") 39 | X_train = sequence.pad_sequences(X_train, maxlen=maxlen) 40 | X_test = sequence.pad_sequences(X_test, maxlen=maxlen) 41 | print('X_train shape:', X_train.shape) 42 | print('X_test shape:', X_test.shape) 43 | 44 | print('Build model...') 45 | model = Sequential() 46 | model.add(Embedding(max_features, 128, input_length=maxlen)) 47 | model.add(LSTM(128)) # try using a GRU instead, for fun 48 | model.add(Dropout(0.5)) 49 | model.add(Dense(1)) 50 | model.add(Activation('sigmoid')) 51 | 52 | # try using different optimizers and different optimizer configs 53 | model.compile(loss='binary_crossentropy', 54 | optimizer='adam', 55 | class_mode="binary") 56 | 57 | print("Train...") 58 | model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=3, 59 | validation_data=(X_test, y_test), show_accuracy=True) 60 | score, acc = model.evaluate(X_test, y_test, 61 | batch_size=batch_size, 62 | show_accuracy=True) 63 | print('Test score:', score) 64 | print('Test accuracy:', acc) 65 | -------------------------------------------------------------------------------- /keras_tutorial/install_script.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | sudo apt-get update 4 | 5 | #Install all the python tools and fun other bits 6 | sudo apt-get install python-numpy python-scipy python-dev python-pip python-nose g++ libopenblas-dev git libhdf5-serial-dev python-pip python-matplotlib python-sklearn 7 | 8 | #Update pip 9 | sudo pip install --upgrade pip 10 | 11 | #Use pip to install theano, keras and other things I guess 12 | sudo pip install pyyaml h5py 13 | sudo pip install git+git://github.com/Theano/Theano.git 14 | sudo pip install keras 15 | -------------------------------------------------------------------------------- /keras_tutorial/ironic.txt: -------------------------------------------------------------------------------- 1 | 1 1 -2 -1 1 0 -1 -1 -1 2 -1 0 -1 -1 -1 1 1 0 -1 -1 -1 -2 1 2 | -1 0 -1 -1 -1 3 | -1 -1 -1 -1 -1 1 -1 -1 1 -1 -1 -2 -1 -1 -1 -1 1 -2 1 -1 4 | -1 0 -1 5 | 1 -1 0 -1 -1 6 | -1 0 7 | -1 -1 -1 -1 -1 1 -1 -2 -1 -1 -1 8 | -1 -1 -1 -1 -1 -1 -1 -1 -1 1 0 -1 0 -1 -1 2 9 | -1 -1 -1 -1 -1 -1 0 -1 1 1 -1 1 -1 10 | 1 -1 -1 0 -1 1 -1 -1 -1 0 -1 0 -1 -1 1 1 1 -1 1 -1 -1 -1 -1 -2 -2 0 -1 -1 1 11 | -1 0 -1 -1 -1 -1 -1 12 | -1 0 0 -1 1 -1 1 -1 0 -1 -1 -2 -1 -1 -1 1 -1 13 | -2 14 | -1 1 -2 -2 -1 1 0 1 -1 0 0 0 -1 -1 15 | -1 1 -1 1 -1 0 0 0 0 -1 -1 1 -1 -1 1 -1 -1 16 | -1 1 1 1 0 -1 1 2 1 -1 0 0 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 1 -1 -1 -1 -1 0 0 -1 0 -1 1 -1 -1 0 -1 17 | 1 -1 0 -2 -1 -1 -1 -1 18 | -2 0 0 -1 -1 -1 0 19 | 0 0 0 -1 -1 -1 -2 1 0 0 0 0 -1 -1 1 0 -1 -1 -1 1 -1 -1 20 | -1 -1 -1 -1 -1 -2 -1 -1 0 -1 -1 1 1 -1 -1 -1 0 21 | -1 -1 0 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 22 | 2 -1 -1 1 23 | -1 -1 -1 0 -1 0 -1 -1 -1 -1 -1 24 | -1 1 -1 -1 1 0 -1 25 | 1 -1 1 -1 0 -1 -2 1 26 | -1 1 -1 1 -1 1 -1 -1 -1 -1 0 27 | -1 1 -1 -1 -1 -1 -1 -1 28 | -1 1 -1 29 | -1 -1 -1 -1 -1 -1 -1 1 -1 -1 -1 0 -1 0 -1 -1 -1 -1 -1 -1 -1 -1 30 | 0 1 -1 -1 -1 -1 
-1 0 0 -1 -1 -1 -1 -1 -1 -1 -1 -2 -1 31 | -1 -1 -1 0 -1 -1 1 -1 1 32 | -1 1 -1 2 33 | -1 -1 -2 -1 0 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 -1 0 -1 -1 0 -1 -1 -1 -1 -1 -2 -1 -1 34 | -1 1 -1 -1 -1 -1 1 -1 0 -1 1 35 | -1 -1 -1 -1 -2 -1 -1 -1 -1 1 -1 -1 -1 -1 -1 -2 -1 -1 -1 1 36 | 0 1 -1 1 0 -1 1 0 0 0 -1 0 -1 0 0 -1 -1 0 0 -1 -1 1 -2 -1 -1 0 -1 -1 1 -1 0 0 0 -1 1 -1 -1 1 -1 0 0 -1 -1 -2 -1 -1 -1 37 | -1 -1 0 -1 1 0 1 -1 0 38 | -1 1 -1 0 -1 -1 -1 -1 -1 -1 -1 -1 39 | -1 -1 -1 -1 0 40 | -1 -1 -1 -1 -1 1 -1 41 | 1 -1 -1 -1 -1 -1 -1 -1 1 0 0 42 | -1 -1 43 | -1 -1 -2 -2 -1 -1 -2 44 | 1 1 -2 -1 1 1 -1 0 -1 -1 -1 -1 1 -1 -1 2 -1 -1 -1 -1 -1 0 -1 0 -1 -1 1 1 0 -1 0 -1 -1 -1 45 | -1 2 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 0 -1 0 -1 -1 -1 -1 -1 46 | -1 -1 -1 -1 -1 -1 1 -1 -1 -1 -1 47 | -1 -1 -1 -1 -1 0 -1 0 -1 -1 -1 -1 48 | -1 -1 -1 1 1 -1 -1 0 -1 -1 -1 -1 0 -1 -1 -1 1 -1 0 0 0 49 | -1 0 0 -1 50 | 1 -1 -1 -1 -1 -1 -1 1 -1 -1 1 -1 -1 51 | -1 -1 1 -1 0 -1 0 52 | 0 -1 0 -1 0 -1 -1 -1 -1 -1 0 1 -1 -1 -1 -1 1 0 0 -1 0 -1 -1 0 -1 -1 -1 -1 -1 -1 0 2 0 0 0 -1 -1 -1 -1 -1 0 0 -1 0 -1 0 -1 -1 -1 0 -1 -1 1 0 -1 0 1 -1 0 -1 1 1 -1 1 0 53 | 1 -1 -1 -1 1 1 0 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 0 1 0 -1 54 | 0 -1 0 0 -1 -1 -1 -1 0 55 | 0 1 0 0 -1 -1 0 2 56 | 1 1 -2 0 1 -1 1 -1 0 -1 -1 1 -1 -1 1 -1 -1 -1 1 1 -1 57 | -1 -1 -1 -1 -1 -2 -1 0 1 58 | -1 -1 -1 0 1 -2 0 -1 -1 -1 -1 0 0 0 -1 59 | -1 1 -1 60 | -1 -1 0 -1 -1 -1 -1 -1 -1 61 | -1 -1 -1 62 | 0 1 1 0 63 | 1 -1 -1 64 | -1 -1 -1 1 1 -1 0 0 -1 0 0 1 0 0 1 -1 0 65 | 1 -1 -1 -1 0 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 0 0 -1 -1 0 -1 0 -1 0 0 -2 -1 -1 -1 0 0 0 -1 0 -1 0 0 1 0 0 -1 1 -1 0 66 | -1 -1 -2 -1 67 | -1 -1 -1 -1 68 | 2 1 1 69 | 1 -1 0 -1 1 -1 -1 1 0 -1 -1 -1 -1 -1 70 | -1 1 71 | 1 -1 72 | -1 -1 -1 1 -1 73 | 2 -1 74 | 1 1 -1 1 75 | -1 -1 76 | -1 0 -1 -1 0 -1 -1 1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 77 | 1 0 -1 -1 -1 0 0 -1 -1 1 78 | -1 1 -1 1 79 | -1 80 | -1 -1 -1 0 -1 -1 -1 -1 0 -1 81 | 1 -1 82 | -1 -1 1 0 0 1 83 | 0 -1 1 -1 1 -1 2 1 2 1 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 1 -1 0 0 0 -1 -1 1 0 -1 -1 -1 -1 -1 -1 -1 84 | -1 -1 1 0 1 -1 0 -1 85 | -1 -2 -1 1 -1 -1 -1 -1 -1 1 -1 1 0 -1 -1 -1 1 0 0 -1 -1 -1 -1 0 -1 0 0 -1 -1 0 0 -1 0 0 -1 0 -1 0 0 -1 0 0 -1 -1 -1 -1 -2 86 | -1 0 0 87 | -1 -1 -1 1 88 | 2 89 | -1 -1 2 1 90 | 1 1 -1 0 -1 -1 1 1 -1 0 0 0 0 -1 0 -1 91 | -1 -1 -1 1 -1 -1 -1 0 0 92 | -1 1 1 93 | -1 -1 -1 -1 1 -1 0 -1 -1 -1 0 1 94 | 0 0 -1 -1 -1 -1 -1 0 1 1 1 -1 0 95 | 1 -1 -1 1 -1 -1 0 -1 -1 -1 96 | 0 -1 0 -1 0 97 | -1 -1 -1 -1 1 98 | -1 0 1 1 -1 -1 0 0 0 1 -1 1 -1 99 | -1 0 0 -1 1 -1 -1 -1 -1 -1 0 0 -1 -1 -1 -1 -1 1 1 -1 1 100 | 0 -1 -1 -1 -1 0 0 101 | -1 -1 -1 0 -1 -1 1 1 -1 102 | -2 -1 -1 -1 1 0 0 -1 1 0 -1 0 -1 0 -1 0 0 -1 1 1 0 -1 -1 -1 0 -1 0 -1 -1 103 | 1 -1 -1 -2 -1 104 | 1 0 -1 0 0 -1 0 -1 -2 1 1 0 0 0 1 0 -1 105 | 1 0 -1 -1 -1 -1 1 1 106 | 1 -1 -1 -2 -1 -1 -1 107 | 1 -1 1 0 1 0 0 1 1 1 1 108 | -1 0 0 -1 1 1 -1 109 | -1 -1 -1 0 0 -1 110 | 0 0 0 0 -1 -1 -1 -1 -1 -1 1 0 0 1 111 | -1 -2 0 -1 -1 0 -1 -1 1 -1 112 | -1 -1 -1 -1 -1 -1 -1 -1 1 113 | -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -2 -1 -1 -1 1 0 -1 114 | 1 1 1 2 2 -1 0 115 | -1 0 -1 0 1 1 116 | -1 1 -1 -1 0 -1 0 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 0 -1 1 1 -1 0 -2 0 0 -1 -1 0 0 -1 1 0 1 1 -1 1 1 -1 1 -1 -1 -1 -1 1 -1 -2 -1 -1 0 117 | 1 0 -1 1 -1 -2 -1 -1 -1 -1 -1 1 0 -1 0 0 0 118 | -1 1 1 -1 -1 -1 119 | -1 -1 0 -1 -1 120 | -1 -1 -1 1 1 1 1 121 | 1 1 1 -1 -1 -1 -1 -1 1 122 | -2 1 -1 -1 -1 1 123 | -1 1 -1 -1 124 | -1 -1 -1 -1 -1 -1 125 | -1 -1 -1 -1 126 | 0 -1 -1 
1 -1 0 -1 -1 -1 1 1 -1 -1 1 0 127 | -1 -2 128 | 0 1 -2 -2 1 0 0 -1 -1 0 -1 -1 -1 129 | 1 -2 -1 0 130 | -1 0 -1 -1 0 0 0 0 1 1 -1 0 1 1 -1 0 -1 1 1 -1 131 | -1 1 -1 1 0 0 1 1 -1 -1 132 | 1 -1 -1 -1 -1 -1 1 1 133 | 1 -2 0 -1 0 0 1 -2 -1 1 -2 134 | 1 0 -2 -1 -1 1 -1 0 135 | 2 -1 2 -1 0 1 1 136 | 0 0 1 -1 -1 -1 -1 0 -1 1 1 -1 1 1 -2 0 137 | -1 -1 -1 -1 -1 -1 138 | -1 -1 139 | 1 0 0 1 1 0 -1 -1 1 -1 -1 -1 140 | 0 -1 -1 0 1 -1 0 -1 -1 -1 -1 -1 -1 0 -1 -1 0 -2 -1 -1 -1 0 -1 -1 1 -1 0 1 0 2 1 1 141 | -1 -1 1 -1 -1 0 142 | 1 -1 -1 -1 0 -1 -2 0 1 -1 1 1 -1 -1 0 -1 -1 -1 -1 -1 -1 1 -1 -1 -1 1 -1 -1 -1 0 -1 0 -1 -1 -1 -1 -1 143 | 0 -1 -1 144 | -1 -1 1 -1 -1 -1 -1 -1 -1 1 -1 -1 0 -1 -1 145 | 1 -1 2 -1 -1 0 146 | -2 0 -1 147 | -1 -1 1 1 -1 -1 -1 1 0 -1 -2 148 | -1 2 1 -1 -1 -1 -1 1 -1 -1 -1 -1 -1 -1 0 -1 1 -1 0 -1 1 0 -1 0 0 0 0 0 -1 0 0 149 | -2 -1 -1 -1 -1 -2 -1 -1 1 -2 -1 150 | -1 -1 -1 -1 -1 1 -1 -1 0 -1 -1 -1 0 -1 1 1 0 -1 1 0 2 -1 -1 -1 0 -1 0 -1 0 151 | -1 -1 0 0 0 -1 -1 0 -1 0 -1 -1 1 -1 -1 -1 1 -1 -1 -1 1 -1 1 1 0 -2 -1 -2 -1 -1 -1 1 -1 -1 -1 -1 0 1 -1 -1 -1 -1 1 0 1 -1 -1 152 | 1 153 | 1 -1 0 -1 -1 0 -1 0 1 1 -1 0 0 154 | -1 0 -1 -1 -1 -1 -1 -1 -2 -1 -1 -2 -2 -1 0 -1 155 | -2 0 -1 -1 -1 -1 0 -2 -1 -1 -1 -1 0 -1 0 -1 -1 -1 -1 -1 -1 0 -1 0 -1 -1 0 -2 -1 -2 -1 0 -1 -1 -1 1 0 -1 -2 0 -1 -1 -1 -1 0 -1 -1 0 -1 0 -2 -1 156 | 0 1 1 -1 0 2 -1 1 1 -1 0 0 0 0 -1 0 157 | -1 0 0 0 1 -1 0 -1 -2 158 | -1 -1 1 0 -1 -1 -1 1 -1 159 | -1 -1 0 1 1 160 | 0 0 1 -1 1 -1 0 0 -1 1 -1 0 -2 -1 0 -1 0 -1 -1 -1 1 -1 -1 0 -1 -1 -1 -1 -1 161 | -1 0 1 0 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 -2 1 162 | -1 1 -1 -1 -1 -1 -1 -1 163 | -1 -1 -1 -1 164 | -1 -1 -1 -1 -2 -1 -2 -1 1 1 -2 -1 -1 -1 -1 165 | 1 -1 -1 -1 -1 -1 -1 0 1 0 0 166 | -1 1 0 1 -1 -1 1 1 1 -1 1 -1 -1 1 1 -1 0 167 | -1 0 -1 -1 1 0 -1 -1 0 -1 -1 168 | -1 -1 169 | 1 1 -1 -1 -1 -1 0 1 170 | -1 1 0 -1 1 -1 0 -1 171 | -1 0 -1 -1 -1 1 -2 -1 1 -1 -1 -1 -1 1 -1 -1 -1 0 -1 -1 -1 0 -1 1 -1 0 -1 -1 -1 0 0 -1 0 1 0 -1 1 172 | -1 0 -2 -1 -1 -1 1 -1 -1 173 | -1 1 0 -1 -1 174 | 1 1 -1 -1 175 | 0 1 -1 -1 0 -1 1 -1 0 -1 -1 0 0 0 176 | -1 0 -1 -1 -1 -1 -1 0 0 -1 -1 0 0 -1 1 0 -1 -1 -1 1 -1 1 177 | -1 -1 -2 1 0 1 2 -1 -1 -1 -1 -1 0 -1 -1 0 0 0 0 0 0 0 0 1 -1 -1 1 178 | 0 1 -1 179 | 1 0 -1 -1 180 | -1 -1 -1 -1 -2 -1 -1 0 -1 1 181 | -1 -2 182 | -1 -1 1 1 1 1 0 0 0 -2 183 | -1 -1 -1 0 0 -1 1 -1 -1 -1 -1 -1 0 -1 -1 -2 0 -1 0 184 | -1 -1 1 185 | -1 -1 -1 -1 -1 0 -1 0 1 -1 0 186 | 0 -1 -1 -1 187 | -1 -1 188 | -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 1 1 1 1 189 | -1 1 1 -1 0 1 -1 1 1 2 -1 -1 -1 1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 1 190 | 1 1 -2 1 -1 0 -1 -1 191 | -1 -1 -1 0 -1 192 | -2 -1 -1 0 0 -1 0 0 -1 -2 -1 2 193 | -1 -1 -1 1 0 -2 0 -1 0 -1 0 0 1 0 -1 -1 1 -1 0 0 0 -1 -1 -1 0 0 -1 -1 194 | 1 1 1 -1 -1 -1 1 -1 -1 -1 0 0 -1 -1 -1 -1 -1 -2 0 195 | -1 -1 1 1 1 -1 -2 -1 -2 -1 -1 -1 -1 -1 196 | -1 1 -1 0 197 | -2 -1 0 0 198 | 0 0 1 -1 0 -1 -1 0 -1 1 1 0 0 0 199 | 1 -1 1 0 -1 -1 1 0 0 -1 0 -1 -1 -1 -1 0 0 200 | 1 1 -2 0 1 -1 1 -1 0 -1 -1 1 -1 -1 1 -1 -1 -1 1 1 -1 0 0 201 | 0 1 1 -1 -1 -1 0 0 -1 0 0 -1 1 202 | 0 -1 -1 1 0 -1 -1 0 -1 -1 -1 -1 -1 -1 203 | -1 1 -1 -1 -1 -1 0 -1 0 -1 0 -1 0 204 | -1 -2 -1 -1 -2 -1 1 -1 205 | -1 1 -1 -1 206 | 0 -1 -1 -1 2 1 1 -1 1 207 | -1 0 -1 1 -1 1 0 208 | -1 -1 0 -1 -1 0 -1 -1 -1 209 | 0 -1 1 2 0 -1 -1 -1 0 -1 -1 1 0 -1 1 -1 -1 -1 -1 -1 1 1 210 | -1 -1 1 211 | 0 -1 -1 0 -1 1 212 | -1 1 -1 -1 -1 -1 1 1 -1 1 -1 -1 0 0 213 | -1 -1 -1 -1 1 -1 -2 1 -1 -1 -1 0 214 | -1 -1 -1 1 -1 1 -1 215 | -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 216 | 1 -1 0 0 1 -1 -1 0 1 -1 -1 217 | 
1 1 0 -1 0 0 0 -1 0 0 -1 0 0 0 -1 0 -1 -1 -1 0 1 -1 218 | -1 219 | -1 0 2 -1 -1 0 -1 -1 1 -1 -1 0 -1 220 | -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 0 221 | -1 0 -1 -1 1 -1 -1 0 0 2 1 -1 -1 -2 -1 -1 -1 -1 222 | 1 0 -1 -1 1 -1 -1 -1 0 223 | 0 1 -1 224 | 1 1 -1 -1 -1 0 1 225 | -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 0 226 | 1 227 | -1 -2 0 -1 1 1 228 | -1 -1 -1 -1 -1 -1 0 229 | -1 -2 -1 -1 1 230 | -1 -1 -1 -1 0 231 | -1 0 -2 -1 -1 -1 -1 -2 -1 -1 -1 0 -1 0 -1 -1 232 | -1 -1 -1 1 0 1 0 -1 233 | -1 -1 0 0 -1 1 -1 0 0 -1 -1 1 1 -1 -1 1 -1 -1 -1 -1 -1 0 0 -1 -1 0 -1 -1 -1 -1 0 -1 -1 1 -1 -1 -1 -1 -1 0 1 -1 0 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 -1 -2 -1 -2 1 -1 234 | 0 0 -1 -1 -1 -1 -1 1 1 -1 -1 -1 1 -1 0 -1 -1 -1 1 0 1 -1 -1 -1 235 | -1 -1 0 236 | -1 0 1 -1 0 0 1 0 0 1 1 -1 -1 -1 1 237 | 2 0 0 -1 1 -1 2 1 -1 2 238 | -1 -1 -1 -1 0 -1 -1 0 0 0 -1 -1 0 239 | -1 -1 -1 1 1 -1 -1 -1 240 | 1 0 -1 1 -1 0 0 0 241 | -2 -1 -1 0 -1 0 -1 1 -1 -1 0 1 -1 -1 242 | -1 0 1 0 -1 -1 -1 -1 -1 0 -1 -1 0 -1 243 | -1 1 -1 -1 -1 -1 0 244 | 0 -1 -1 0 -1 1 0 -1 -1 -1 -1 -1 0 -1 245 | -2 246 | 2 1 -2 1 -1 1 247 | -1 -1 248 | 0 1 -1 -1 -1 -1 -1 -1 1 -2 1 0 -1 -1 0 1 -1 -1 -1 -1 0 -1 -1 -1 -1 0 0 -1 1 -1 0 1 2 0 -1 -2 -1 1 -2 0 -2 -1 -1 -1 1 -1 0 249 | 0 -1 -1 -1 -1 -1 250 | -1 1 -1 -1 -1 -1 1 0 251 | -1 -1 -2 -1 -2 1 -1 -1 -1 -1 1 -1 0 0 -1 -1 -2 -1 252 | 2 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 253 | 0 -1 -1 -1 -1 -1 0 -1 -1 1 254 | 0 0 0 -1 0 0 -1 255 | -1 -1 -1 -1 -1 -1 -1 0 -1 -1 0 1 -1 1 0 -1 1 -1 1 1 0 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 1 -1 -2 1 1 -1 -1 -1 -1 1 -1 -1 256 | 0 -1 -1 -1 -1 -1 1 -1 -1 0 -1 0 0 -1 0 -1 -1 257 | -1 -1 1 1 2 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 1 -1 -1 -2 258 | -1 -1 -1 -1 0 259 | -1 -1 -1 -1 0 -1 -1 260 | 1 -1 0 -1 1 261 | 0 -1 1 -1 -1 -1 -1 262 | 0 0 1 -1 -1 -1 -1 -1 -1 0 0 0 -1 1 1 -1 0 0 0 -1 1 -1 -1 1 -1 0 1 0 0 1 0 -1 -1 -1 -1 -1 0 1 -1 1 -1 263 | -1 1 -1 -1 -1 -1 -1 -1 -2 0 -1 -1 -1 -1 1 264 | -1 -1 265 | 1 -1 -1 -1 -1 266 | 2 1 1 0 0 0 1 1 1 -1 -1 -2 -1 267 | -1 0 0 -1 -1 0 1 2 -1 268 | -1 0 -1 1 -1 -1 -1 0 269 | -1 1 -1 270 | -1 -1 -1 -1 -1 2 -1 -1 -1 1 271 | -1 -1 -1 272 | -1 -1 -1 -1 0 -1 -1 -1 0 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 1 -1 -1 1 -1 -1 -1 -1 -1 -1 0 0 0 0 -1 -1 -1 1 1 1 -1 1 -1 0 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 1 0 273 | -1 -2 -1 -1 -1 -1 0 0 -1 274 | -1 275 | 1 276 | 0 1 277 | -1 278 | 0 -1 -1 -1 -1 1 0 0 -1 -1 -1 0 -1 279 | -1 1 -1 1 -1 0 -1 -1 1 280 | -1 -1 -1 -1 -1 -1 -1 -1 -1 1 1 -1 -1 -1 -1 281 | 0 -1 0 -1 -1 -1 -1 -1 -1 -1 1 -1 0 1 282 | -1 1 283 | 1 -2 -1 0 -1 -1 0 -1 -1 -1 1 2 -1 284 | -1 -1 -1 285 | 2 -1 1 -2 -1 0 -1 0 -1 0 0 1 -1 1 286 | 1 -1 -2 -1 -1 -1 -1 -1 -1 0 1 0 287 | 0 -1 -1 -1 -1 -1 1 288 | 0 0 -1 289 | -1 -1 -1 -1 0 -1 -1 290 | 0 -1 -1 0 -2 291 | 2 -1 1 0 1 0 0 -1 -1 -1 -1 0 0 -1 -1 0 292 | -1 -1 1 -1 293 | -1 1 -1 -1 -1 -1 0 0 1 294 | 0 -1 -1 -1 -1 -1 -1 -1 295 | 0 0 -1 0 -1 -1 296 | -1 -1 1 -1 297 | 1 -1 -1 298 | 0 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 1 -1 -1 -2 -1 1 -1 -2 0 -1 1 -1 -1 -1 0 0 299 | -1 -1 -1 0 -1 0 0 0 -1 1 1 -1 -1 0 -1 1 300 | 1 1 -1 -1 1 1 301 | 1 -1 -2 -1 -1 -1 0 0 0 -1 0 0 0 -1 0 -1 1 1 1 302 | 0 -1 -1 -1 -1 -1 1 -1 0 -1 -1 0 -1 1 303 | -1 -1 -1 -2 -1 -1 -1 304 | -2 1 0 305 | -1 -1 1 -1 -1 -1 0 0 1 1 -1 0 0 0 0 0 0 1 306 | 0 0 -1 0 -1 0 -1 0 307 | 0 0 308 | -1 -1 -2 -1 -2 -1 309 | -1 -1 310 | -1 1 -1 -1 -1 -1 -1 -1 0 -1 -1 0 -1 0 0 0 0 311 | 1 -1 -1 -1 -1 -1 0 0 -1 -1 -1 1 1 -1 -1 1 -2 -1 -1 1 1 312 | -1 -1 -1 -1 -1 313 | -2 -1 -1 314 | -1 -1 0 -1 -1 1 315 | -1 0 0 0 0 0 0 0 -1 316 | -1 -1 0 1 -1 -1 -1 1 317 | -1 -1 -1 -2 -2 -1 -2 -1 1 -1 -1 318 | 0 0 -1 0 1 -1 -1 0 -1 -1 0 0 
-1 0 0 -1 -1 1 -1 -1 -1 1 1 0 0 -1 -1 -1 0 1 -1 0 0 0 -1 1 0 -1 0 -1 -1 -1 0 -1 -1 -1 319 | 0 1 0 -1 0 -1 0 -1 0 -1 0 -1 -2 320 | -1 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 -1 -1 321 | -1 -1 1 -1 -1 -1 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 1 -1 322 | -1 1 -1 -1 -1 -1 -1 1 -1 -1 -1 -1 -1 0 -1 -1 -2 323 | -1 -1 2 324 | -1 -1 1 -2 -2 -1 -1 -1 -1 -1 1 -1 1 0 -1 -1 0 -1 -2 1 -1 1 325 | -1 -1 0 -1 1 -1 -1 326 | -1 -2 -1 0 -1 -1 -1 0 0 0 1 -1 -1 -1 0 0 -1 0 -1 -1 -1 1 1 327 | 1 -1 -1 0 -1 -1 1 1 -1 0 1 -1 -1 -1 328 | -1 -1 -1 0 329 | 2 1 330 | 1 0 -1 -1 -2 1 331 | -1 0 -1 332 | -1 1 0 -1 -1 -1 -1 0 -1 -1 333 | -1 -1 -1 -1 1 -1 0 0 0 1 -1 -1 334 | -1 0 0 -1 1 0 335 | -1 -1 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 0 -1 -1 336 | -1 -1 1 0 -1 -1 337 | -1 -1 1 -1 338 | -1 0 -1 0 -2 -1 1 -1 -1 0 -1 -1 -1 -1 -1 0 -1 -1 0 0 -1 -1 -1 1 1 -1 0 -1 0 -1 -1 -1 -1 -1 -1 0 -1 1 0 -1 1 1 0 -1 -1 1 1 -1 -1 -1 0 -1 0 1 -1 -1 -1 -1 -1 0 339 | 0 -1 -1 -1 -1 -1 -1 -1 0 -1 340 | -1 -1 -2 -1 -1 -2 341 | -1 -1 1 342 | -1 1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 343 | -1 0 -1 344 | 0 -1 345 | -1 -1 0 -1 -1 -1 0 -1 -1 0 346 | -1 -1 -2 0 1 -1 -1 -1 347 | 0 -1 -1 0 -2 -1 -1 -1 -1 -1 -1 -1 0 0 348 | 0 -1 -1 -1 -1 -1 1 -1 349 | 1 -1 350 | -1 -1 0 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -2 -1 -1 -1 0 0 -1 -1 0 0 -1 0 1 1 0 0 0 -1 -1 -1 -1 0 -2 0 -1 -1 -1 0 1 0 -1 351 | 0 1 -1 -1 -1 -2 -1 352 | 1 2 1 0 -1 -1 -1 -1 -1 -1 -1 1 -1 -1 -1 -1 0 -1 -1 0 -1 -1 1 -2 -1 -1 1 -1 -2 1 -1 -1 -1 1 -1 1 -2 -1 0 -1 1 0 -1 -1 0 -1 353 | -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 -1 -1 0 0 0 -1 0 -1 -1 -1 -1 -1 -1 -1 -1 354 | -1 0 -1 -1 1 2 -1 -1 -1 -1 -1 -2 1 0 -1 -1 1 -1 -1 355 | 0 0 -1 -1 -1 0 1 -1 1 -1 1 356 | 0 -1 0 0 -1 2 357 | -1 -1 1 358 | 0 0 -1 -1 -1 359 | -1 0 1 -1 0 -1 2 0 0 0 0 -2 -1 -1 0 0 360 | 1 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 1 -1 -1 -1 -1 0 361 | 1 -1 -1 -1 -1 0 0 -1 -1 -1 -1 0 1 362 | -1 -1 0 363 | -1 -1 -1 -1 -2 0 -1 0 0 364 | 1 0 -1 1 -1 -1 0 0 1 365 | -1 -1 0 1 -1 -1 -1 1 1 -1 0 -1 -1 -1 -1 -1 0 -1 -1 -1 0 -1 0 -1 -2 -1 0 -1 -1 -1 -1 -1 -2 -1 0 -1 -1 -1 0 1 0 0 -1 0 0 0 2 -1 -1 366 | -1 -1 0 -1 -1 -1 -1 -1 0 367 | -1 1 -1 -1 -1 -1 -1 0 -1 -1 1 -1 0 1 -1 0 0 0 368 | 1 -1 1 -1 -1 -1 0 369 | -1 -1 -1 -1 -1 -1 370 | -1 -1 -1 371 | -1 -1 1 -1 -1 0 372 | -1 -1 -1 -1 -1 -1 -2 -1 -1 -1 -2 1 1 -1 373 | -1 -1 -1 1 -1 374 | -1 -1 -1 -1 -1 -1 -1 -1 0 1 -1 375 | -1 -1 0 376 | 0 -1 -1 0 0 -1 -1 -2 -1 -1 -1 0 -1 -1 1 377 | -1 -1 -1 378 | -1 1 1 -1 -1 0 -1 -1 0 1 -1 -1 -1 -1 -1 -1 1 0 1 -1 -1 -1 -1 0 -1 -1 0 -1 -1 1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 1 1 0 -2 -1 -1 2 -1 -1 0 379 | -1 0 -1 -1 -1 1 0 380 | -1 -1 -1 0 -1 381 | -1 -1 -2 -1 0 1 -1 -1 1 0 -2 -1 -1 0 0 -1 -1 1 -1 -1 -1 0 0 1 -1 -1 2 0 -1 -1 -1 382 | -1 -1 1 -1 -1 -1 1 1 -1 0 -1 -1 -1 0 1 383 | -1 -1 2 384 | -1 -1 -1 -1 -1 -1 -1 -1 1 0 -1 -1 -1 -1 1 -1 -1 0 -1 1 1 385 | 0 -1 0 -1 -1 -2 1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 386 | -1 -1 1 -1 -1 -1 -1 1 -1 387 | 1 0 -2 -1 -1 -1 -1 -1 -1 -1 1 388 | -1 -1 -1 -1 -1 0 -1 389 | 1 1 -1 -1 -1 -1 1 1 -1 390 | -1 -1 -1 -1 -1 -1 -1 0 -1 -1 391 | 0 -1 0 -1 1 -1 0 -1 -1 -1 -1 0 392 | -1 -1 0 -1 -1 393 | 1 -1 0 -1 -1 -1 -1 -1 -1 -1 394 | -1 -1 0 -1 -1 395 | 0 0 -1 1 1 -1 -1 0 -1 -1 -1 -1 1 -1 0 1 0 -1 1 0 -1 0 1 1 0 -1 -1 -1 -1 -1 -1 0 0 0 -1 -1 -1 0 -1 0 0 -1 1 -1 -1 0 -1 -1 0 -1 0 0 0 -1 -1 -1 0 -1 0 -1 0 -1 1 -1 0 1 -1 396 | 1 -1 0 0 0 -1 -2 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 0 -1 -1 397 | 1 0 -1 1 0 -1 -1 398 | -1 -1 1 -1 -1 0 -1 0 1 0 -1 1 1 0 -1 -1 399 | -1 0 0 -1 1 -1 0 -1 -1 -1 400 | -1 -2 -1 -1 -1 -1 0 1 -1 -1 -1 0 0 401 | -1 -1 -1 0 -1 -1 -1 -1 -1 -1 -1 -1 -2 0 1 
0 -1 1 -1 0 -1 1 -1 0 -1 -1 -2 0 -1 2 -1 -1 1 402 | 0 0 0 -1 -1 -1 -1 -1 -1 0 0 0 0 -1 -1 -1 -1 -1 -1 -1 -1 0 0 0 -1 403 | 0 -1 -1 0 1 404 | -1 -1 -1 405 | -1 -1 406 | 1 0 0 -1 -1 -1 -1 0 -1 1 407 | -1 -1 1 -1 1 0 2 -1 -1 -1 -1 -1 1 0 -1 -1 -1 -1 1 0 1 1 0 -1 0 0 -1 -1 -1 0 408 | -1 0 -2 -1 1 0 -1 -1 -2 -1 0 0 0 0 -1 1 0 -1 -1 -1 0 -2 -2 -1 1 -1 409 | -1 -1 0 1 -1 -1 -1 -1 0 -1 -1 -1 0 -1 -1 0 -1 -1 0 0 -1 -1 -1 -1 1 1 1 -1 1 1 -2 -1 -1 410 | -1 -1 0 -1 -1 -1 -1 1 -1 -1 411 | 0 -1 0 0 -1 -1 0 412 | -1 -1 413 | 1 0 -1 -1 -1 -1 -1 -1 -1 -1 -1 414 | -1 1 -1 -1 -1 -2 -1 -1 415 | 1 0 -1 -1 -1 -1 1 -1 0 0 -1 0 -1 0 -1 -1 -1 -1 -1 -1 -1 -1 0 1 -1 -1 416 | 0 0 0 0 -1 -1 -1 -1 -1 -2 0 417 | -1 -1 1 -1 0 -1 -1 1 418 | -1 0 -1 -1 -1 -1 -1 0 1 419 | -1 -1 420 | -1 -1 0 -2 421 | -1 0 -1 -1 1 1 -1 -1 -1 -1 -1 -1 0 -1 -1 0 -1 -1 -1 -1 0 -1 1 -1 -1 -1 -1 -1 -1 -1 0 0 1 0 0 -1 -1 -1 -1 -1 -1 2 -2 1 -1 422 | 0 -1 -1 -1 -1 -1 0 -1 -1 1 0 0 -1 -1 -1 -2 -2 -1 -1 -1 -2 -1 -1 0 0 423 | 0 -1 -1 -1 -1 0 0 -1 0 -1 -1 424 | 0 -1 -1 1 0 -1 1 -1 425 | 1 1 -1 -1 0 -1 -1 -1 1 -1 -1 -1 1 -1 426 | 1 -2 427 | 1 -1 -1 1 428 | -1 -1 1 -1 -1 -1 0 -1 -1 -1 -2 0 -2 0 -1 -1 -1 429 | 1 -1 1 430 | -------------------------------------------------------------------------------- /keras_tutorial/keras_deck.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dylandrover/keras_tutorial/027b7a95968ea5046d22c6fc44029e441f554f1e/keras_tutorial/keras_deck.pdf -------------------------------------------------------------------------------- /keras_tutorial/keras_tutorial.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dylandrover/keras_tutorial/027b7a95968ea5046d22c6fc44029e441f554f1e/keras_tutorial/keras_tutorial.pdf -------------------------------------------------------------------------------- /keras_tutorial/mnist_cnn.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | from __future__ import print_function 3 | import numpy as np 4 | np.random.seed(1337) # for reproducibility 5 | 6 | from keras.datasets import mnist 7 | from keras.models import Sequential 8 | from keras.layers.core import Dense, Dropout, Activation, Flatten 9 | from keras.layers.convolutional import Convolution2D, MaxPooling2D 10 | from keras.utils import np_utils 11 | 12 | ''' 13 | Train a simple convnet on the MNIST dataset. 14 | Run on GPU: THEANO_FLAGS=mode=FAST_RUN,device=gpu,floatX=float32 python mnist_cnn.py 15 | Get to 99.25% test accuracy after 12 epochs (there is still a lot of margin for parameter tuning). 16 | 16 seconds per epoch on a GRID K520 GPU. 
17 | ''' 18 | 19 | batch_size = 128 20 | nb_classes = 10 21 | nb_epoch = 12 22 | 23 | # input image dimensions 24 | img_rows, img_cols = 28, 28 25 | # number of convolutional filters to use 26 | nb_filters = 32 27 | # size of pooling area for max pooling 28 | nb_pool = 2 29 | # convolution kernel size 30 | nb_conv = 3 31 | 32 | # the data, shuffled and split between tran and test sets 33 | (X_train, y_train), (X_test, y_test) = mnist.load_data() 34 | 35 | X_train = X_train.reshape(X_train.shape[0], 1, img_rows, img_cols) 36 | X_test = X_test.reshape(X_test.shape[0], 1, img_rows, img_cols) 37 | X_train = X_train.astype("float32") 38 | X_test = X_test.astype("float32") 39 | X_train /= 255 40 | X_test /= 255 41 | print('X_train shape:', X_train.shape) 42 | print(X_train.shape[0], 'train samples') 43 | print(X_test.shape[0], 'test samples') 44 | 45 | # convert class vectors to binary class matrices 46 | Y_train = np_utils.to_categorical(y_train, nb_classes) 47 | Y_test = np_utils.to_categorical(y_test, nb_classes) 48 | 49 | model = Sequential() 50 | 51 | model.add(Convolution2D(nb_filters, nb_conv, nb_conv, 52 | border_mode='same', 53 | input_shape=(1, img_rows, img_cols))) 54 | model.add(Activation('relu')) 55 | model.add(Convolution2D(nb_filters, nb_conv, nb_conv)) 56 | model.add(Activation('relu')) 57 | model.add(MaxPooling2D(pool_size=(nb_pool, nb_pool))) 58 | model.add(Dropout(0.25)) 59 | 60 | model.add(Flatten()) 61 | model.add(Dense(128)) 62 | model.add(Activation('relu')) 63 | model.add(Dropout(0.5)) 64 | model.add(Dense(nb_classes)) 65 | model.add(Activation('softmax')) 66 | 67 | model.compile(loss='categorical_crossentropy', optimizer='adadelta') 68 | 69 | model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch, show_accuracy=True, verbose=1, validation_data=(X_test, Y_test)) 70 | score = model.evaluate(X_test, Y_test, show_accuracy=True, verbose=0) 71 | print('Test score:', score[0]) 72 | print('Test accuracy:', score[1]) 73 | -------------------------------------------------------------------------------- /keras_tutorial/regular.txt: -------------------------------------------------------------------------------- 1 | -1 -1 -1 1 1 -1 -1 2 | 1 -1 -1 3 | 1 -1 -1 -1 1 -1 -1 -1 -1 -1 2 -1 -1 1 1 4 | 2 2 1 1 0 1 0 1 1 0 -1 2 0 1 0 -2 1 0 0 -1 -1 1 5 | -1 -1 -1 1 6 | 1 1 1 1 7 | -1 1 0 8 | 0 1 1 1 9 | 1 -1 1 -1 -1 1 -1 0 1 -1 -1 -2 -1 -1 0 -1 -1 10 | 1 -1 1 2 1 2 -1 1 0 2 1 1 1 1 1 -1 -1 -1 11 | 1 -1 -1 -1 -1 -1 12 | -1 -1 -1 -1 1 -1 -1 1 -1 -1 1 13 | 1 -1 1 1 1 2 1 -1 -2 -1 -1 -1 0 1 -1 0 0 14 | 1 -1 15 | -1 0 1 -1 1 16 | 0 -1 -1 0 -1 -1 -1 1 1 -1 1 17 | -1 -1 -1 -1 1 -1 1 -1 -1 -1 -1 -1 -1 1 1 -1 -1 1 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 18 | 1 1 1 -1 -1 1 19 | 1 2 1 20 | -1 21 | 1 -1 -1 1 22 | -1 -1 1 1 -1 -1 -1 0 0 -1 1 -1 1 23 | 0 0 1 24 | -1 -1 -1 -1 1 -1 -1 0 2 1 -1 -1 -1 1 -1 -1 -1 1 1 1 2 1 -1 2 -1 1 0 1 -1 0 1 25 | 0 -1 -1 -1 2 26 | 0 1 0 1 -1 -1 27 | -1 -1 -1 -1 0 28 | -1 -1 -1 1 -1 -1 -2 -1 1 -1 1 1 1 -1 1 1 0 1 29 | 2 1 -1 0 -1 -1 1 -1 1 0 30 | 1 0 -1 31 | 0 0 0 32 | -1 -1 1 -1 -1 1 -1 -1 -1 -1 1 -1 1 2 1 -1 1 33 | -1 1 -1 -1 -1 1 1 -1 -1 -1 -1 -1 -1 -1 -1 0 1 34 | -2 -1 1 1 0 -1 -1 -1 -1 35 | 2 0 1 -1 1 -1 -1 36 | 1 -1 -1 1 -1 1 0 -1 -1 -1 1 37 | 1 -1 -1 -1 -1 -1 -1 38 | -1 -1 1 1 -1 0 0 1 -1 0 39 | 1 -1 -1 -1 40 | 1 1 1 -1 -1 -1 -1 -1 -1 1 41 | -1 1 -1 42 | 1 1 -1 0 1 0 0 43 | -1 0 1 -1 -1 1 1 -1 -1 0 -1 1 1 -1 2 0 -1 1 2 44 | -1 1 -1 0 -1 1 -1 1 1 0 2 45 | 1 -1 -1 2 -1 -1 1 46 | -1 -1 0 0 -1 0 0 0 1 -1 -1 1 -1 0 1 1 -1 -1 -1 -1 1 1 -1 2 -1 -1 -1 0 47 | -1 -1 -1 1 0 
-1 -2 1 -1 -1 -1 0 1 -1 -1 -1 -1 -1 1 -1 48 | -1 1 -1 -1 0 -1 -1 -1 -1 -1 0 -1 -1 -1 0 49 | 1 0 1 -1 -2 -1 -1 -1 -1 50 | 1 1 1 1 -1 1 -1 0 51 | 0 -1 1 -1 1 0 1 1 0 52 | -1 -1 1 1 53 | -1 -1 1 -1 1 0 -1 -1 -1 1 -1 -1 -1 54 | 1 -1 1 -1 -1 0 0 -1 1 -1 -1 -1 -1 55 | -1 -1 -1 -1 1 0 -1 -1 0 -1 1 1 -1 -1 -1 -1 -1 0 1 -1 56 | 1 0 1 57 | -1 -2 0 -1 1 -1 -1 0 58 | -1 0 1 1 1 -1 -1 1 0 0 -1 -1 -1 -1 -1 2 -1 -1 59 | -1 -1 2 -1 60 | 1 1 -1 0 1 -1 1 0 1 1 1 1 -1 1 -1 1 0 2 -1 1 1 1 -1 -1 61 | 1 -1 1 -1 1 1 -1 0 -1 1 -1 -1 0 0 -1 -1 -1 1 -2 0 1 1 1 1 1 -1 -1 0 -1 62 | -1 1 63 | 1 1 0 0 0 0 1 0 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 64 | 1 -1 -1 -1 0 -2 1 -1 2 1 -1 -2 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 2 1 1 2 -1 65 | -1 -1 -1 -1 -1 1 -1 1 1 -1 -1 0 66 | -1 -1 1 -1 -1 -1 1 -1 -1 -1 1 -1 2 -1 67 | -1 -1 1 -1 -1 1 1 -1 -1 -1 1 -1 -1 1 -1 -1 1 1 -1 0 0 -1 -1 -1 0 68 | 1 1 1 -1 -1 -1 1 1 69 | -2 -1 -1 -2 -1 -1 70 | -1 -1 -1 1 0 1 1 -1 -1 -1 -1 -1 1 0 2 -1 -1 1 -1 1 1 1 -1 71 | -1 1 -1 -1 1 -1 -2 -1 0 72 | -1 -1 1 -1 -1 -1 -1 1 73 | -1 1 -1 -1 -1 74 | 1 -1 0 1 -1 -1 -1 -2 1 75 | -1 -1 1 1 -1 0 76 | 1 -1 1 77 | -1 1 -1 -1 -1 1 78 | 1 -1 2 -1 0 -1 0 0 1 -1 0 1 0 -1 -1 -1 -1 1 1 -1 0 0 1 -1 -1 1 2 -1 -1 0 1 0 -1 2 1 79 | 1 1 -1 2 2 -1 0 0 1 -1 -1 -1 -1 80 | 1 1 2 81 | -1 -1 1 -1 1 -1 -1 1 1 -1 -1 1 -1 -1 0 -1 -1 82 | -1 -1 -1 -1 1 1 -1 1 1 1 -1 -1 -1 1 1 -1 -1 1 1 1 -1 -1 -1 1 -2 2 -1 -1 -1 83 | 1 -1 -1 -1 0 -1 1 0 -1 1 -1 -1 1 -1 1 0 -1 1 -1 0 0 1 -1 0 -1 -1 -1 0 0 1 -1 84 | -1 -1 0 1 -1 -1 0 1 -1 -1 1 2 -1 0 85 | 0 -1 2 0 -1 -1 -1 1 0 0 -1 0 -1 -1 -1 86 | 1 -1 -1 -1 -1 -1 87 | -1 2 1 -1 0 -1 -1 1 -1 88 | 0 1 -1 -1 0 -1 1 89 | 1 1 -1 1 -1 1 0 1 1 -1 2 1 1 -1 2 -1 90 | 1 1 -1 -1 0 1 -1 0 -1 0 -1 1 -1 2 1 1 -1 1 1 91 | -1 -1 -1 -1 -1 -1 0 0 0 0 1 0 92 | -1 93 | -1 -1 1 1 1 94 | 0 -1 -1 -1 -1 -1 -1 1 1 1 95 | 0 -1 0 -1 -1 -1 96 | 1 1 1 0 97 | -1 0 1 98 | -1 2 1 1 1 99 | -1 -1 1 1 -1 -1 1 -1 1 -1 -1 -1 1 1 -1 -1 1 100 | -1 -1 0 -1 -1 -1 -1 -1 0 1 1 -1 1 -1 -1 -1 -1 -1 -1 101 | -1 -1 -1 -1 1 102 | -1 1 -1 1 -1 1 -1 2 103 | -1 1 -1 104 | 1 1 -1 1 105 | 1 1 -1 0 -1 0 0 1 1 106 | -1 -1 0 0 1 107 | -1 0 1 -1 -1 0 0 -1 -1 -1 -1 -1 0 -1 -1 -1 0 -1 0 -1 1 -1 -1 -1 108 | 1 2 0 1 0 -1 -1 -1 -1 -1 -1 109 | 1 1 -1 1 -1 1 1 -1 -1 110 | 1 0 1 1 1 0 -1 1 0 -1 111 | 1 112 | 1 2 -2 -1 -2 113 | -1 1 114 | -1 -1 -1 -1 -1 -1 115 | 1 -1 1 1 1 116 | 0 -1 -1 1 1 -1 -1 -1 1 117 | -1 -1 1 -1 118 | 1 1 -1 119 | 1 -1 1 -1 120 | 1 0 1 -1 -1 -1 1 -1 -1 0 1 -1 121 | 1 -1 0 0 0 -1 0 122 | 0 1 -1 -1 -1 123 | -1 -1 124 | -2 -1 1 -1 1 125 | 1 1 -1 -1 1 -1 0 1 -1 2 -1 1 1 -1 126 | 1 1 1 0 127 | 1 -1 -1 1 0 0 -1 -1 -1 -1 -1 1 -1 -1 1 1 -1 0 -1 -1 1 -1 -1 1 128 | -1 1 0 -1 -1 -1 1 1 0 0 0 1 -1 -1 1 -1 1 -1 0 -1 -1 1 0 1 0 -1 0 1 1 1 -1 1 -1 -1 0 0 -1 0 0 -1 0 0 1 -1 -1 1 -1 129 | -1 -1 -1 -1 -1 -1 1 1 130 | -2 1 1 0 1 1 2 1 -1 1 1 131 | 1 0 -1 -2 -1 -2 -2 -1 -1 1 -1 1 -1 -2 0 132 | -1 0 -1 -1 1 -1 0 0 -1 1 -1 -1 1 -2 -1 -1 1 -1 1 1 1 0 1 -1 -1 -1 -1 2 1 1 1 1 -1 2 -1 1 133 | -1 1 -1 -1 -1 -1 1 1 -1 -1 1 -1 134 | -1 -1 1 1 0 -1 1 -1 1 0 2 -1 -1 2 -1 0 -1 1 -1 1 1 1 135 | -1 1 -1 136 | -1 -1 0 0 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 -1 -1 1 -1 1 -1 -1 -1 -1 -1 1 0 137 | -1 1 138 | 0 1 1 0 1 -1 -1 -1 0 -1 0 -1 -1 -1 139 | 1 -1 1 -1 140 | 1 1 -1 1 -1 -1 -1 -1 -1 -1 -1 0 1 2 141 | -1 -1 0 -1 -1 -1 -1 -1 -1 -1 -1 -2 -1 -1 -1 -1 -1 -2 -1 -1 0 1 -1 1 1 0 -1 -1 -1 0 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 1 1 -1 1 0 142 | -2 -1 -1 -1 -1 -2 -1 1 1 -1 -1 1 143 | 2 1 1 -1 2 1 0 1 -1 -1 1 2 144 | -1 -1 1 -1 0 -1 -1 0 -1 1 -1 1 -1 -1 -1 -1 0 -1 1 1 0 -1 1 -1 0 -1 -1 1 0 -1 
-1 0 -1 0 1 0 145 | -1 1 -1 1 -1 -1 -1 -1 -1 -1 146 | -1 -1 -1 -1 -1 1 -1 0 1 -1 -1 -1 -1 1 0 -1 -1 0 0 -1 1 -1 -1 -1 -1 0 1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 1 -1 -1 1 0 -1 0 -1 -1 1 -1 1 1 -1 -1 -1 2 1 -1 -1 -1 -1 -1 147 | 0 -1 -1 -1 -1 1 148 | -1 -1 -1 149 | 1 -1 1 0 0 1 -1 -1 1 -1 -1 -1 1 150 | 1 1 -1 0 151 | -1 -1 -1 -1 -1 1 0 1 -1 -1 -1 -1 2 -1 0 1 -1 -1 1 0 -1 1 -1 1 -1 -1 -1 -1 -1 -1 -1 1 -2 -2 1 1 -1 1 -1 1 -1 0 0 1 -1 0 1 -1 -1 -2 1 -1 0 0 -1 -1 -1 -1 0 -1 152 | -1 -1 1 1 -1 -1 1 -1 -1 153 | 1 1 1 154 | -1 -1 0 2 1 -1 -2 -1 155 | 1 1 156 | -1 0 -1 -1 0 -1 -1 -1 -1 0 1 -1 -1 -1 1 1 1 1 157 | 2 1 1 1 1 1 -1 0 158 | 1 1 1 -1 1 159 | -1 1 1 160 | 1 2 -1 1 1 1 -1 -1 -1 -1 0 1 1 0 -1 -1 1 -1 -1 -1 1 161 | -1 1 1 1 1 162 | -1 1 -1 -1 0 1 163 | 1 1 0 -1 -1 -1 -1 -1 0 1 -1 -1 0 -1 -1 1 0 1 -1 1 -1 0 -1 1 -1 -1 1 -1 -1 -1 1 164 | 1 -1 1 2 1 165 | -1 -1 1 -1 -1 -1 0 166 | 0 -1 167 | 1 -1 1 2 2 -1 168 | 1 0 -1 -1 -1 -1 -1 -1 -1 -1 1 1 1 1 -1 -1 -1 -1 -1 0 -1 169 | -2 -1 0 -1 0 -1 0 -1 -1 0 0 1 0 -1 -1 0 -1 -1 -1 -1 0 2 0 -1 170 | -1 -1 1 -1 1 -1 1 1 -1 171 | 0 -1 172 | -1 -1 0 -1 1 173 | 2 -1 1 -1 -1 -1 -1 -1 174 | 1 1 1 1 2 175 | 1 1 1 1 -1 176 | 1 2 177 | 1 1 0 1 2 178 | 1 2 1 1 1 -1 1 -1 1 0 1 -1 0 0 1 2 1 1 179 | -1 0 0 0 1 -1 1 1 1 -1 -1 0 -1 0 180 | 2 -1 0 181 | 0 0 0 -1 -1 0 -1 -1 -1 -1 0 1 -1 -1 0 0 -1 0 0 0 -1 1 -1 0 0 0 -1 0 -1 0 1 0 1 -1 1 0 0 -1 0 -1 -1 0 -1 182 | 0 -1 -1 1 -1 1 -1 1 -1 183 | 1 0 184 | 1 1 1 0 -1 2 -1 -1 1 2 185 | -1 -1 -1 -1 -1 1 1 1 186 | -1 -1 -1 1 -1 -1 1 -1 1 0 -1 -1 -1 0 187 | 1 0 -1 -1 -1 0 -1 -1 -1 -1 0 -1 1 -1 -1 0 1 0 -1 -1 1 -1 -1 188 | 1 -1 -2 1 -1 1 -1 -1 1 -1 -1 0 -1 0 -1 0 0 0 0 189 | 0 0 0 -1 -1 0 1 0 1 -1 1 0 -1 2 1 1 0 -1 1 1 1 1 -1 -1 -1 1 0 -1 -1 0 1 -1 0 -1 1 0 -1 1 1 1 -1 -1 -1 -1 -1 0 -1 1 1 -1 0 1 -1 1 -1 -1 0 -1 -1 0 1 1 -1 1 -1 -1 190 | 0 -1 1 191 | 2 1 -1 1 2 192 | -1 -1 1 -1 -1 -1 -1 1 -1 0 -1 -1 -1 -1 -1 -1 0 -1 -1 0 -1 0 1 -1 -1 -1 0 -1 1 -1 193 | -1 -1 -1 -1 -1 -1 -1 -1 0 1 -1 1 1 1 -1 -1 -1 -1 -1 0 1 1 -1 1 -1 -1 1 0 -1 -1 -1 -1 0 1 194 | 1 1 -1 1 1 195 | 2 1 -1 -1 196 | 1 0 0 1 -1 0 197 | -1 -1 198 | -1 -1 -1 0 1 0 1 -1 -1 -1 1 1 199 | -1 -1 -1 1 200 | 0 1 1 1 0 1 -1 0 -1 -1 -1 -1 -1 1 -1 0 -1 0 -1 1 1 1 0 0 -1 -1 201 | -1 1 -1 1 -1 1 1 202 | -1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 0 -1 1 203 | 0 1 -1 0 1 2 1 -1 1 -1 0 204 | -1 -1 0 205 | -1 1 -1 -1 -1 -1 -1 1 206 | 2 -1 1 -1 -1 1 0 0 207 | 1 -1 -1 -1 -1 208 | -1 1 1 -1 1 209 | -1 -1 -1 -1 210 | 1 -1 1 -1 -1 1 -1 -1 -1 -1 211 | -1 1 -1 0 1 212 | -1 1 1 -1 -1 -1 -1 213 | -1 -1 -1 -1 -1 -1 -1 1 -1 -1 -1 0 -1 -1 -1 1 -1 1 0 -1 1 -1 -1 214 | -1 -1 215 | -1 -1 -1 1 -1 1 -1 216 | 0 1 -1 1 2 1 217 | 1 -1 0 -1 218 | -1 -1 -1 1 -1 219 | -1 0 1 -1 -1 1 1 0 1 -1 1 2 -1 0 0 -1 0 1 1 -1 0 220 | -1 -1 1 1 -1 -1 -1 -1 0 -1 -1 -1 0 -1 0 0 -1 1 1 -1 0 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 -1 -1 -1 0 0 -1 -1 0 0 2 1 221 | 1 222 | 1 -1 -1 223 | -1 -1 1 -1 0 1 -1 -1 -2 -1 -1 -1 -1 0 -1 -1 1 -1 -1 0 -1 0 0 -1 -1 224 | 1 0 225 | -2 1 1 -1 1 2 1 1 1 1 1 2 1 -1 -1 0 226 | 2 1 0 1 1 1 227 | -1 0 1 228 | -1 0 229 | 1 0 -1 -1 -1 -1 -1 2 1 230 | -1 -1 -2 -1 1 -1 0 1 -1 0 -1 0 -1 -1 -1 1 1 -1 0 -1 -1 -1 -1 1 1 2 0 -1 0 231 | -1 0 1 -1 0 -1 1 -1 0 0 1 -1 1 0 0 -1 0 -1 -1 232 | 1 -1 1 233 | 1 0 0 0 1 1 2 -1 -1 -1 0 0 -1 -1 234 | 0 -1 235 | 1 -1 -1 1 1 1 -1 1 -1 -1 -1 -2 236 | 1 -1 -1 1 0 1 -1 0 -1 -1 0 -1 -1 -1 -1 -1 0 -1 1 237 | 2 1 -1 1 -1 1 2 -1 0 1 -1 -1 -1 238 | 1 -1 -1 2 -1 -1 -1 1 -1 -1 1 1 1 239 | 0 0 -1 1 1 -1 240 | -1 -1 -1 -1 -1 0 241 | 1 0 1 -1 1 0 1 1 2 242 | -1 -1 1 -2 -1 0 1 1 1 0 -1 1 -1 1 1 0 -1 
-1 -1 1 -1 0 1 -1 0 0 -1 -1 -1 -1 1 243 | 1 -1 2 -1 -1 1 -1 -1 -1 2 1 244 | 1 -1 -1 1 1 -1 1 -1 -1 1 1 -1 245 | 1 -1 1 0 0 -1 246 | 1 1 247 | 1 1 0 248 | -1 249 | 1 1 0 1 1 1 -1 -1 -1 -1 -1 -1 1 -1 1 -1 1 1 250 | 2 -1 -1 1 0 -1 -1 -1 1 -1 0 251 | -1 -1 -1 -1 1 1 1 -1 0 1 1 -1 -1 1 -1 -1 1 -1 -1 1 -1 1 1 1 1 1 -1 1 -1 -1 2 1 1 -1 -1 1 1 252 | -1 1 -1 1 -1 -1 -1 -1 1 1 1 0 1 1 -1 1 -1 -1 0 0 0 -1 2 1 -1 -1 1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 1 -1 -1 -1 2 253 | 1 1 -1 -1 254 | 1 1 1 1 -1 255 | -1 -1 1 -1 -1 0 -1 1 -1 -1 -1 -1 -1 1 -1 0 -1 -1 -1 -1 -1 0 1 -1 1 -1 1 -1 2 -1 0 -1 -1 1 -1 -1 1 0 256 | -1 -1 -1 -1 -1 -1 -1 -1 -1 0 0 1 1 0 -1 257 | -1 -1 -1 -1 -1 -1 -1 1 1 -1 1 1 -1 1 258 | -1 -1 -1 0 0 1 1 259 | -1 0 -1 -1 -1 0 1 1 -1 -1 1 -2 -1 -2 1 -1 -1 0 -1 -2 -1 -1 0 1 260 | -1 2 -1 1 1 -1 2 1 2 261 | -1 -1 -1 1 1 -1 1 -1 1 -1 0 1 -1 -1 1 1 1 1 262 | 1 -1 0 -1 -1 -1 -1 -1 -1 1 1 -1 -2 -1 -1 1 1 2 -1 -1 -1 -1 -1 -1 0 -1 1 1 -2 -1 1 -1 0 1 0 1 -1 1 1 1 1 1 0 1 -2 -2 -1 1 1 0 263 | 1 1 1 264 | 1 -1 -1 0 -1 -1 0 1 0 0 265 | -1 -1 1 0 266 | -1 1 267 | -1 -1 1 -1 -1 -1 -2 1 -1 268 | 1 -1 -1 -1 -1 269 | 1 1 -1 1 0 270 | -1 -1 0 -1 -1 -1 1 -1 -1 -1 -1 1 -1 1 271 | 1 1 272 | 1 1 -1 -1 1 -1 1 0 1 273 | 1 -1 1 274 | 1 -1 1 2 -1 275 | -1 -1 -1 -1 0 0 1 -1 1 -2 -1 -1 0 1 1 2 0 276 | -1 1 1 277 | -1 -1 -1 -1 -1 1 0 1 1 -1 0 1 0 -1 -1 -1 -1 -1 1 0 -1 -1 -1 0 278 | 1 2 1 -1 1 -1 -1 -1 -1 1 1 1 -1 0 -2 1 1 1 1 1 2 1 279 | -1 -1 -1 1 -1 1 -2 280 | -1 0 2 -1 281 | -1 -1 -1 2 1 2 -1 1 -1 1 1 0 -1 282 | 2 -1 -1 -1 -1 -1 1 1 -1 -1 1 -1 2 283 | -1 1 0 -1 1 -1 0 -1 -1 0 1 0 -1 1 -1 0 1 -1 -1 1 -1 1 -1 1 -1 0 1 2 -1 1 -1 284 | -1 0 285 | -1 -1 -1 1 1 1 1 1 -1 0 1 -1 1 286 | -1 1 0 1 1 -1 0 -1 0 -1 -1 -1 -1 287 | 1 0 -1 -1 -1 1 -1 288 | 1 1 -2 1 289 | 0 1 -1 1 2 290 | 1 2 2 -1 2 291 | 1 0 292 | 1 -1 2 1 293 | -1 1 1 1 1 1 0 1 -1 -1 -1 1 294 | 1 295 | 0 1 1 2 -2 1 -1 2 2 -1 1 -1 1 -1 0 -1 1 0 296 | -1 -1 -1 -1 0 -1 -1 -1 -1 1 -1 2 -1 -1 1 297 | -2 -1 1 -1 0 -1 -1 -1 -1 -1 -1 -2 -1 1 1 -1 0 -2 -1 -2 298 | -1 -1 -1 -1 0 0 0 2 -1 -1 1 1 -1 1 0 -1 -1 0 1 -1 0 1 0 299 | -1 -1 -2 1 1 -1 -1 0 -1 0 -1 0 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 1 -1 -1 1 0 -1 0 1 -1 -1 -1 1 1 -1 -1 0 -1 -1 2 -1 -1 0 -1 -1 -1 0 0 300 | -1 1 1 0 1 1 -1 -1 2 301 | 1 -1 1 -1 -1 -1 -1 1 -1 302 | -1 0 -1 -1 1 -1 -1 -1 303 | 2 1 1 1 -1 -1 -1 -1 304 | -1 -1 -1 -1 -1 1 -1 0 0 -1 305 | 1 0 -1 306 | 1 1 -1 -1 -1 0 -1 -1 1 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 1 -1 2 2 -1 307 | -1 -2 1 -1 1 1 -1 -1 -1 1 -1 -1 -1 -1 0 -1 -1 -1 -1 1 -2 -1 1 -1 1 308 | 2 1 1 0 309 | 1 1 0 1 2 2 0 -1 0 -1 -1 -1 0 -1 1 0 0 1 0 -1 -1 1 0 310 | -1 1 1 0 -1 1 311 | 0 -1 1 1 -1 -1 1 -1 -1 1 -1 -1 0 1 2 1 2 1 312 | -1 -1 -1 -1 -1 -1 -1 -1 313 | -1 1 1 -1 1 -1 -1 -1 -1 -1 314 | 1 1 -1 1 1 315 | -1 -1 -1 0 -1 2 1 1 -1 -1 -1 1 -1 -1 -1 -1 1 -1 -1 -2 1 -2 0 -2 -1 -1 -1 -1 1 -1 1 -1 0 -1 -1 1 -1 -1 316 | -1 -1 -2 0 -1 1 1 -1 -1 317 | -1 -1 1 -1 1 1 1 2 0 -1 1 2 1 1 1 0 1 -1 1 1 2 1 318 | 0 1 2 2 319 | -1 -1 1 -1 -1 -1 -1 -1 0 1 1 -2 -1 1 1 -1 1 0 -1 1 -1 -1 -1 0 0 -1 0 1 0 320 | 1 2 1 1 2 1 2 321 | 1 -1 0 -1 -1 1 -1 1 0 -2 -1 -1 -1 -1 0 -1 -1 -1 -1 -1 -1 0 1 -1 -1 0 2 -1 1 -1 0 322 | 1 -1 1 -1 -1 -1 -1 -1 0 -1 0 -1 -1 -1 0 -1 1 323 | 0 -1 0 0 0 324 | 1 0 0 -1 1 1 -1 325 | -1 1 0 -1 -1 0 -1 0 1 1 1 1 -1 1 -1 -1 -1 -1 0 -1 -1 1 -1 -1 -1 1 326 | 1 1 1 1 1 2 -1 -1 -1 -1 -1 327 | 1 1 1 1 328 | -1 -1 -1 -1 1 1 1 -1 -1 -1 -1 -1 -1 -1 1 -1 -1 1 2 -1 0 -1 -1 -1 1 -1 1 1 1 1 -1 1 1 1 329 | 1 -1 330 | 1 -1 0 -1 -1 -1 331 | -1 1 -1 1 1 -1 -1 0 2 0 -1 -1 1 1 -1 -1 -1 -1 -1 -1 -1 0 1 1 -1 -1 332 | -1 2 -1 1 -1 0 -1 2 
-1 1 0 1 1 -1 333 | -1 -1 1 1 -1 -1 0 0 -1 0 334 | -1 2 1 -1 -1 -1 -1 -1 -2 0 -1 -1 335 | 0 1 1 -1 1 1 1 2 -1 1 1 0 1 2 1 1 -1 1 -1 336 | 1 1 0 1 -1 0 -1 -1 0 0 -1 1 0 1 1 337 | -1 -1 0 -1 -1 -1 -2 0 -1 338 | 1 -1 -1 0 -1 -1 1 -2 1 339 | 2 0 340 | 0 -1 1 -1 -1 0 -1 341 | 1 -2 -1 1 -2 0 -1 -1 -1 -1 0 0 1 1 -1 342 | -1 0 -1 -1 0 -1 1 1 -1 -1 -1 0 -1 1 1 -1 -1 1 -1 -1 0 -1 1 1 1 1 -1 -1 -1 1 1 -1 -1 -1 -1 1 343 | -1 -1 0 0 -1 0 344 | 1 -1 345 | 1 -1 -1 -2 -1 -2 1 -1 346 | 1 -1 -1 1 347 | -1 -1 -1 1 -1 1 -1 1 0 -1 -1 -1 -1 -1 -1 1 348 | 1 1 1 1 1 1 -1 1 -1 -1 1 -1 349 | 1 2 -1 -1 1 0 350 | -1 1 -1 -1 0 -1 -1 1 2 -1 -1 -1 -1 -1 351 | -1 1 -1 352 | -1 1 -1 -1 -1 -1 -1 0 1 -1 -1 1 353 | 0 -1 1 -1 1 354 | -1 1 1 0 -1 -1 -1 1 1 1 -1 -1 1 -2 -1 0 -1 -1 -1 0 1 -1 0 355 | -1 1 -1 -1 -1 -2 -1 -1 0 -1 -1 -1 -1 -1 1 1 0 -1 -2 -1 -1 -1 -1 -1 1 -1 -1 1 -1 1 -1 -1 -1 -1 1 0 356 | 1 -1 -1 -1 1 1 1 357 | 2 1 0 2 -1 2 358 | -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 1 1 1 -1 -1 0 -1 1 -1 -1 0 -1 -1 -1 359 | -1 1 360 | 1 1 1 1 -1 2 2 0 -1 -1 1 361 | -1 -1 1 -1 -1 -1 -1 1 1 -1 -1 2 1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 362 | 1 1 -1 1 0 1 2 363 | 2 2 1 -1 2 364 | -1 1 0 365 | -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 366 | -1 1 -1 -1 -1 -1 1 -1 -1 1 -1 -1 -1 -1 -1 1 -1 -1 -1 -2 -1 -1 1 -1 1 0 1 1 1 -1 2 2 -1 -1 1 1 1 -1 367 | 1 -1 0 -1 -1 -2 0 -1 -2 -1 368 | -1 -1 -1 -1 -1 -1 -1 -1 369 | -1 0 1 -1 0 1 -1 -1 -1 -1 -1 1 -1 1 0 0 -1 -1 -1 -1 -1 -1 -1 -1 -1 2 1 0 -1 -1 -1 0 -1 1 2 370 | 1 -1 371 | 2 1 1 -1 1 -1 0 -1 -1 1 1 372 | 1 -1 1 0 -1 -1 -1 373 | -1 0 -1 1 -1 374 | -1 -1 -1 -1 1 -1 -1 -1 1 -1 -1 -1 0 -1 -1 -1 0 -1 0 -1 -1 -1 -1 -1 -1 1 -1 -1 375 | -1 -1 1 -1 -2 -1 1 -1 -1 376 | 1 1 0 1 -1 0 -1 -1 1 -1 0 1 1 1 1 -1 1 -1 1 377 | 1 -1 -1 -1 0 0 -1 -1 1 1 -1 2 1 -1 1 -1 -1 1 0 -1 1 -1 -1 -1 -1 -1 2 1 378 | -1 0 -1 1 -1 379 | 1 1 -1 -1 -1 -1 0 -1 0 -1 380 | 1 -2 1 -1 1 1 -1 -1 0 1 1 381 | -1 1 -2 -1 -1 1 -1 -1 -1 382 | 1 -1 -1 1 -1 1 -1 0 -1 383 | 1 1 0 -1 -1 -1 1 -1 -1 1 -1 0 -1 0 1 384 | 0 1 -1 385 | -1 -1 1 1 1 1 0 -1 0 1 0 1 0 1 1 -1 1 0 0 -1 -1 -1 1 386 | 1 1 0 -1 1 387 | -1 -1 1 388 | -1 1 1 -1 0 -1 -1 -1 -1 1 0 0 389 | 2 -1 -1 1 1 -1 1 -1 1 1 1 1 390 | -1 -1 1 1 -1 0 391 | -1 -1 -1 1 -1 -1 -2 -1 1 -1 -1 -1 -1 1 1 1 -1 0 0 1 1 -1 0 392 | -1 -1 -1 393 | -1 -1 -1 -1 2 -1 0 -1 -1 -1 394 | -1 0 1 -1 0 -1 1 -1 -1 395 | -1 1 -1 1 1 -1 1 1 1 1 -1 -1 -1 2 396 | -1 -1 0 -2 -1 -1 -1 -2 -1 1 397 | -1 -1 1 -1 1 -1 -1 -1 1 1 -1 0 0 0 -1 -1 398 | 1 1 -1 0 1 -1 0 -1 1 1 0 -1 1 1 2 399 | 1 1 -1 0 -1 -1 400 | -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 1 -1 -1 -1 1 -1 -1 1 1 -1 2 -1 401 | 1 -1 1 1 402 | 1 0 -1 403 | 1 -1 404 | -1 -1 1 -1 1 -1 -1 -1 -1 405 | -1 -1 -1 1 -1 -1 -1 -1 1 0 406 | 0 -1 -1 -1 -1 0 1 -1 -1 1 1 -1 1 0 1 -1 1 407 | 0 -1 1 0 1 0 -1 -1 0 0 -1 1 0 1 -1 0 0 -1 0 1 0 0 1 2 -1 -1 -1 0 -1 1 1 0 -1 1 1 1 0 1 1 1 0 0 -1 0 1 1 1 1 408 | 1 -1 -1 1 409 | 1 1 1 -1 410 | 1 1 1 2 -1 2 411 | 0 1 -1 -1 -1 -1 -1 -1 -2 412 | 1 -2 -1 -1 0 1 1 1 -1 -1 -1 0 -1 -1 -1 -1 -1 1 -1 -1 413 | 1 -1 -1 1 -1 0 -1 -1 1 -1 -2 -1 1 -1 0 -1 1 -1 2 -1 -1 -1 -1 -1 -1 1 -1 1 -1 -1 -1 -1 1 1 -1 0 1 -1 1 414 | -1 -1 -1 1 -1 2 -1 415 | 1 -1 -1 0 416 | 2 -1 -1 1 1 -1 -1 0 -1 -1 -1 417 | -1 2 -1 1 -1 -1 -1 418 | -1 -1 1 1 -1 -1 -1 1 419 | -1 -1 -2 -1 -1 -1 -1 -1 0 -1 1 -1 0 0 1 0 1 -1 -1 1 -1 -1 -1 0 -1 -1 -1 1 1 -1 420 | -1 0 421 | 1 1 422 | 0 1 -1 0 1 -1 0 1 423 | -1 1 -1 1 -1 -1 1 1 424 | -1 1 -1 -1 -1 -1 -1 -1 0 1 -1 -1 1 -1 0 1 2 1 1 1 -1 0 425 | -1 0 -1 0 -1 0 -1 -1 -1 -1 -1 426 | 1 1 0 2 -1 427 | 2 1 -2 -1 -1 -1 -1 -1 1 -1 -1 -1 428 | -1 1 
1 -1 1 -1 429 | 2 -1 0 1 -1 -1 -1 0 -1 -1 1 430 | -1 -1 1 -1 -1 -1 -1 1 1 431 | -1 1 0 -1 -1 -1 -1 -1 -1 -1 0 -1 432 | -1 -1 -1 1 -1 -1 2 1 433 | -1 2 1 1 434 | 1 0 -1 -1 -1 1 -1 -1 435 | -1 1 436 | 2 -1 -1 -1 437 | -1 438 | 1 0 1 1 439 | -1 -1 -1 1 2 0 -1 -1 1 -2 -1 -1 -1 0 -1 -1 1 1 0 -1 -1 1 -1 -2 0 -1 -1 -1 0 0 0 1 -1 -1 -1 1 -1 -1 -1 0 -1 440 | 0 -1 -1 0 1 -1 -1 1 -1 -1 0 0 441 | 0 0 0 -1 0 442 | -1 -1 1 -1 -1 1 -1 0 1 443 | 1 1 -1 -1 0 0 444 | 1 -1 -1 1 0 -1 1 1 -1 0 -1 2 0 1 -2 0 1 1 445 | -1 -1 0 1 -1 -1 0 1 -1 446 | -1 1 -1 447 | -1 1 -1 1 1 1 1 -1 -1 448 | 0 -1 -1 0 0 449 | -1 -1 1 1 450 | -1 1 451 | -1 2 2 452 | -1 -1 1 453 | -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 1 -1 1 -1 -1 2 0 2 454 | -1 -1 1 -1 -1 1 0 -2 1 1 0 -1 0 1 0 455 | 0 0 -1 456 | -2 457 | -1 0 -1 0 2 0 -1 0 -1 0 458 | 1 2 -1 1 1 1 -1 -1 459 | -1 2 -1 1 460 | -1 0 1 -1 -1 -1 -1 -1 0 1 461 | 2 1 462 | 1 1 -1 -1 -1 1 -1 -1 -1 -1 0 1 -1 463 | -1 1 1 1 -1 -1 -1 1 1 0 -1 1 -1 2 2 1 1 1 -1 0 -1 1 0 464 | -1 465 | -1 1 1 0 -1 -1 2 466 | -1 -1 -2 -2 1 1 -1 1 0 1 467 | -1 -1 1 1 -1 -1 1 -1 -1 -1 -1 -1 -1 -2 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 1 468 | 1 1 -1 -1 1 -1 1 1 1 1 -1 -1 -1 -1 1 1 -1 1 1 0 469 | 1 -1 1 470 | -1 1 0 -1 1 0 471 | -1 -1 0 -1 1 1 -1 1 2 1 1 -1 -2 -1 -1 -1 -1 -1 1 -1 0 0 1 0 -1 -1 -1 -1 -2 -1 0 -1 1 472 | -1 1 -1 0 -1 -1 0 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 1 -1 1 1 1 -1 1 1 1 -1 -1 -1 -1 -1 1 1 -1 0 -1 -1 -1 1 0 1 1 -1 1 -1 1 -1 1 -1 -1 -1 -1 2 2 1 0 -1 1 1 -1 473 | 2 1 1 2 1 -1 0 -1 -1 1 474 | -1 -1 1 -1 1 -1 1 475 | 1 -1 1 1 -1 -1 -1 -1 -1 0 -1 -1 -1 0 1 -1 -1 0 -1 -1 -1 0 476 | 1 -1 -1 1 477 | -1 -1 -1 1 0 1 -1 1 0 -1 -1 1 1 0 -1 -1 1 1 1 -1 1 -1 -1 -1 -1 478 | 1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 1 0 1 -1 -1 479 | -1 1 -1 1 -1 1 1 -1 -1 -1 0 -1 -1 1 0 1 1 -1 -1 -1 480 | -1 -1 -1 1 1 -1 -1 1 1 -1 0 -1 0 -1 1 -1 0 2 481 | 1 1 -1 -1 -1 482 | -1 1 -1 -1 1 -1 -1 -1 -1 1 -1 483 | -1 -1 -1 -1 -1 -1 -1 0 1 0 484 | -1 -1 1 -1 -1 -1 -1 0 1 -1 -1 1 -1 1 2 -1 0 1 -1 0 -1 -1 0 485 | 2 -1 1 1 1 1 1 -1 1 1 -1 -1 1 1 486 | -1 -1 -1 -1 -1 -1 1 -1 1 487 | 0 1 -1 -1 1 -1 -1 -1 1 0 -1 -1 1 1 -1 488 | -1 -1 1 -1 1 1 489 | 1 -1 490 | 1 1 -1 491 | -1 -1 -1 -1 1 -2 0 -1 0 -1 0 -1 -1 0 -1 492 | 1 1 -1 -1 1 1 -1 1 1 0 0 1 0 0 -1 -1 -1 0 -1 1 1 493 | -1 0 1 0 494 | 1 -1 -1 -1 -1 0 -1 1 495 | -1 0 1 -1 -2 0 -1 -1 -1 -1 1 -1 -1 -1 -1 -1 1 496 | -1 0 -2 497 | 0 -1 1 -1 -1 -1 1 -1 498 | 1 1 -1 -1 1 0 0 -1 -1 1 1 499 | 2 -1 1 0 -1 2 500 | -1 -1 -1 1 -1 1 1 0 1 1 -1 0 -1 -1 0 -2 -1 0 1 0 1 -1 -1 1 -1 -1 -1 1 -1 1 -1 501 | 1 1 502 | 1 0 -1 1 -1 1 0 503 | -1 -1 -1 0 504 | -1 -1 -1 -1 -1 -1 -1 -1 1 1 -1 -1 -2 -1 -1 -1 -1 -1 -1 -1 1 -1 1 0 -1 505 | -1 -1 1 1 -1 2 -1 1 -1 1 -1 506 | 0 1 -1 -1 1 1 507 | -1 -1 -1 -1 0 -1 508 | 0 0 1 -1 2 1 0 0 0 2 1 -1 1 0 509 | -1 1 1 510 | -1 -1 -1 -1 -1 0 511 | 1 1 512 | -1 -1 513 | 1 0 1 0 0 0 1 1 1 -1 0 0 514 | -1 1 0 515 | -1 -1 -1 0 0 -1 1 2 1 0 1 1 -1 -1 -2 1 0 -1 0 -1 -1 1 -1 0 -1 -1 -1 516 | -1 -1 0 1 -1 0 -1 -1 -1 -1 1 517 | 1 2 1 1 -1 1 -1 -1 0 0 1 0 -1 -1 -1 1 2 -1 -1 1 -1 2 -1 2 1 2 -1 -1 2 1 1 518 | -1 0 -1 1 -1 2 -1 -1 0 -1 1 -1 1 2 519 | 1 -1 -1 -1 -1 -1 0 -1 520 | 1 -1 -1 -1 -1 1 -1 -1 -1 -1 -1 1 1 521 | 1 1 1 1 522 | 1 1 1 523 | -1 0 1 0 -1 1 0 0 1 -1 524 | 2 2 1 525 | -1 -1 1 -1 2 2 -1 -1 -1 526 | 1 527 | 1 1 1 2 1 1 -1 1 1 1 1 0 1 0 528 | 1 2 -1 -1 1 1 0 0 -1 -1 -1 -1 1 -1 529 | -1 0 -1 -2 -1 -1 -1 -1 -1 -1 -1 -1 -1 530 | 1 1 -1 0 -1 1 0 -2 -1 1 -1 531 | 1 0 532 | -1 -1 0 -1 -1 -1 533 | -1 1 -1 -1 -1 -1 -1 -1 0 534 | 0 -1 1 -1 -1 -1 -1 -1 0 0 1 -1 -1 -1 535 | 1 1 -1 1 1 -1 1 1 1 536 | -1 -1 -1 1 1 
1 -1 1 537 | 1 1 1 538 | 0 1 -1 -1 -1 -1 -1 1 0 1 1 -1 -1 -1 -1 -1 0 -1 0 1 0 -1 -1 0 -1 -1 0 1 -1 -1 -1 -1 -1 1 1 -1 0 -1 539 | 1 1 540 | 2 1 1 0 0 -1 2 541 | -1 -2 0 1 542 | -1 -1 1 1 0 2 543 | 0 1 -1 -1 -1 -1 -1 -1 -1 -2 -1 1 0 0 0 2 544 | 1 1 1 1 1 545 | 1 -1 -1 -1 -1 -1 -1 2 1 -1 1 -1 0 2 546 | 1 2 1 1 547 | -1 0 0 0 -1 -1 1 -1 0 0 -1 -1 0 -1 1 2 -1 1 0 -1 0 1 -1 0 1 -1 1 1 -1 -1 -1 -1 1 -1 1 -1 -1 -1 -1 0 -1 0 -1 -1 -1 1 2 1 1 548 | 2 -2 -1 2 -1 -1 -1 1 -1 1 1 1 549 | -1 1 -1 1 1 1 0 1 1 1 0 -1 1 -1 1 -1 -1 -1 1 2 -1 1 550 | -1 1 1 -1 1 551 | -1 -1 552 | -1 -1 -1 1 -1 553 | 1 -1 -1 1 -1 1 554 | -1 -1 0 0 -1 1 -1 0 -1 1 -1 2 0 -1 555 | -1 -1 -1 -1 1 -1 556 | 1 -1 557 | -1 -1 -1 0 1 -1 -2 1 1 1 -1 -1 0 -1 -1 -2 -1 -1 -1 -1 -1 1 -1 -1 1 -1 -1 1 -1 -1 -1 1 1 -1 1 1 558 | 0 -1 -1 1 1 -1 -1 0 559 | 1 1 -1 1 1 1 1 -1 1 -1 0 -1 1 1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 1 560 | -1 -1 -1 1 1 0 -1 1 -1 0 -1 1 -1 0 561 | 1 -1 -1 -1 -1 1 -1 -1 -2 -1 1 -1 562 | 0 -1 -1 -1 -2 563 | 1 1 -1 0 564 | 1 1 1 1 1 565 | -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 1 1 1 -1 -1 -1 -1 1 -1 -1 -1 -1 -1 1 -1 -1 0 -1 0 1 -1 -1 566 | 0 -1 0 -1 -1 -1 -1 0 567 | -1 1 1 -1 0 1 0 568 | 0 -1 -1 -1 2 1 -1 1 0 569 | 1 1 -1 -1 -1 1 1 570 | -1 -1 -1 -1 0 -1 0 -1 -1 571 | 1 1 -1 -1 -1 -1 -1 1 -1 0 0 572 | 0 0 1 -1 1 1 -1 -1 1 0 -1 573 | 1 1 -1 -2 0 -1 1 1 574 | -1 0 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 575 | 1 0 -1 2 1 -1 -1 1 1 -1 1 1 0 1 2 1 576 | 1 1 577 | 0 -2 578 | -1 0 -1 -2 0 -1 -1 -1 -1 1 579 | 1 1 -1 1 -1 -1 1 0 -1 0 580 | -1 0 2 0 -1 -1 -1 -1 1 0 1 -1 1 1 -1 -1 1 581 | 2 582 | -2 -1 -1 -2 1 583 | -1 1 1 2 -1 584 | -1 -1 -1 -1 0 0 585 | -1 -1 1 586 | -1 1 -1 -1 -2 -1 0 -1 587 | 2 0 -1 -1 0 -1 -1 0 1 0 1 588 | 1 0 1 -1 -1 -1 -1 -1 -2 -1 -1 1 0 589 | -1 2 2 590 | -1 0 -1 1 -1 1 0 1 0 2 -1 2 0 -1 -1 -1 -1 591 | -1 -1 -1 -1 -1 -1 -1 -1 2 -1 0 -1 -1 1 2 -1 2 -1 -1 1 0 -1 1 -1 -1 -1 -1 -1 -1 -1 1 -1 1 0 592 | 2 -1 -1 593 | 1 1 0 -1 594 | -1 -1 -1 1 -1 0 -1 1 -1 -1 1 -1 -1 -1 2 595 | 1 1 -1 -1 -1 596 | 2 1 1 0 1 -1 -1 1 -1 1 597 | 1 -1 -1 1 -1 1 1 -1 1 0 598 | -1 -1 1 -1 1 1 -1 -1 -1 -1 -1 -1 -1 -1 1 1 1 -1 -1 1 -1 0 599 | 1 -2 -1 0 1 -1 0 0 600 | -1 1 1 -1 0 2 -1 -1 601 | 1 1 1 1 1 1 1 -1 602 | 1 2 1 0 1 -1 603 | 0 -1 0 -1 1 -1 -1 -1 -1 0 604 | 1 -1 -1 -1 -1 -1 -1 -1 1 1 605 | 1 1 -1 1 606 | 1 1 1 2 607 | -1 -1 0 1 1 -1 -1 -1 -1 -1 -1 0 -1 0 1 -1 1 -1 -1 1 1 1 2 608 | 0 0 0 -1 -1 -1 -1 1 1 609 | -1 -2 -1 610 | 0 -1 -1 1 -1 -1 2 2 0 2 0 1 -1 1 1 1 1 1 1 -1 -1 -1 0 1 -1 1 1 0 2 0 -2 611 | 1 0 -1 1 0 1 1 612 | -1 -1 -1 1 1 0 613 | 2 -1 1 1 0 -1 -1 -1 -1 2 -1 -1 -1 1 614 | 1 1 1 1 1 615 | 0 0 616 | -1 -1 1 617 | -1 -1 -1 -1 -1 618 | -1 -1 -1 0 1 -1 0 619 | 2 1 -1 1 620 | 1 -1 0 -1 -1 0 -1 -1 0 1 -1 -1 1 -1 -1 1 621 | 1 -1 -1 0 -1 -1 -1 2 1 1 1 1 -1 2 1 1 1 1 622 | 1 -1 -1 1 623 | -1 2 1 -1 1 624 | -1 1 -1 -2 -1 0 625 | -1 -1 0 1 626 | 0 2 -2 -1 1 1 627 | 1 0 -1 2 -1 -1 -1 -1 -1 -1 1 1 1 -1 1 -1 -1 1 0 0 1 1 -1 -1 -1 -1 -1 0 -1 1 1 -1 -1 -1 1 -2 -1 -1 -1 1 -1 -2 -1 -1 0 -1 1 1 -1 -1 -1 -1 -1 -1 -1 -1 1 -1 0 1 1 1 1 628 | -1 -1 1 -1 -2 -1 0 -1 -1 -1 1 -1 -1 -1 1 0 0 -1 -1 -1 -1 -1 -1 -1 -1 1 629 | 1 -1 -1 -1 1 -1 1 -1 1 0 0 0 -1 1 1 1 630 | 1 -1 1 -1 -1 -1 -1 0 631 | 1 0 -1 -1 1 0 1 2 2 1 1 632 | 1 -1 0 1 -1 -1 -1 -1 633 | 1 -1 -2 -1 1 1 1 1 -1 -1 -1 -1 1 634 | 1 1 1 -1 0 1 635 | -1 -1 -1 2 -1 1 1 -1 636 | 1 1 -1 1 1 -1 2 637 | -1 1 1 0 0 -1 0 0 1 1 -1 -1 -1 1 638 | 1 0 1 1 1 1 639 | 1 1 -1 -1 -1 0 1 -1 -1 640 | -1 -2 0 1 641 | 1 1 1 1 1 -1 -1 -1 -1 0 0 -1 -1 -1 0 0 642 | -1 1 643 | 1 -1 1 1 -1 -1 2 644 | 2 -1 -1 1 2 -1 1 2 2 -1 1 2 1 1 645 | -2 -1 
0 0 1 -1 1 -1 646 | -1 1 -1 1 1 647 | -1 -1 -1 -1 -1 -1 -1 0 0 -1 1 0 -1 -1 -1 -1 1 1 1 648 | 0 1 -1 1 -1 649 | -1 -1 0 -1 -1 -1 -1 1 2 650 | 2 -1 651 | -1 -1 -1 -1 2 -1 -1 -1 0 0 0 -1 -1 -2 1 0 1 1 0 -1 -1 -2 -1 0 0 -1 0 1 1 0 -1 -1 1 -1 -1 0 -1 -1 0 -1 -1 2 1 0 0 0 -1 1 -1 1 0 -1 -2 -1 -1 -1 0 652 | 0 -1 -1 -1 653 | -1 654 | 1 1 655 | -1 -1 -1 -1 1 0 -1 0 -1 -1 -1 -1 0 0 -1 -1 -1 -1 0 656 | 0 0 2 -1 -1 -1 -1 -1 0 -1 0 0 657 | 1 2 1 -1 -1 -1 1 -1 0 1 1 1 0 658 | -1 -1 1 -1 1 -1 0 1 0 -1 -1 1 1 -1 659 | -1 1 -1 -1 1 0 1 -1 660 | 1 1 1 -1 -1 -1 2 -1 1 0 2 2 661 | 1 662 | -1 0 -1 1 -1 -1 1 663 | -1 1 664 | -1 1 -1 1 0 -1 1 -1 -1 1 -1 -1 0 -1 -1 1 -1 1 665 | 0 1 1 -1 0 -1 666 | 2 2 1 2 -1 2 667 | 1 1 1 0 1 2 1 -1 1 1 -1 0 0 1 -1 668 | 1 -1 -1 669 | 1 -1 0 -1 -1 1 -1 -1 -1 0 -1 1 1 1 0 670 | 0 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 671 | 1 -1 1 0 -1 2 672 | -1 1 -1 0 1 -1 1 -1 -1 -1 1 -1 673 | 1 1 -1 -2 0 -1 674 | -1 1 -1 1 675 | 1 676 | -1 -1 -1 -1 -1 -1 1 1 -1 -1 -1 -1 -1 -1 -1 677 | 1 -1 0 0 0 0 1 -1 -1 0 -1 0 1 0 -1 -1 -1 -1 0 -1 0 -1 -1 -1 0 0 1 1 0 0 0 0 0 -1 0 -1 -1 0 1 -1 0 0 -1 -1 1 0 -1 678 | 0 2 -1 -1 -1 -1 679 | -1 1 0 1 -1 -1 0 0 1 -1 -1 0 -1 -1 -1 -1 -1 0 1 -1 1 680 | 1 0 2 1 1 1 1 -1 1 2 -1 1 0 2 2 0 681 | -1 -1 0 0 -1 2 682 | -1 -1 0 -1 -1 683 | 1 -1 -1 1 0 0 0 1 1 1 0 -1 1 1 0 1 1 684 | -1 -1 -1 -1 1 -1 0 -1 -1 1 -2 1 -1 1 -1 1 -1 1 685 | -1 1 2 686 | 2 687 | -1 -1 1 -1 -1 -1 -1 1 1 688 | -1 -1 -1 2 1 2 -1 1 -1 -1 2 1 689 | 1 1 1 1 690 | -1 1 1 -1 -1 -1 -1 1 -1 -1 -1 -1 1 -1 1 691 | -1 -1 1 1 692 | 0 -1 -1 693 | 1 0 -1 1 1 0 694 | -1 1 2 0 -1 1 0 1 695 | 1 1 -1 1 1 -1 1 1 -1 696 | 1 1 1 -1 1 1 1 1 1 1 -1 0 -1 1 2 -1 1 1 697 | 0 1 -1 1 1 -1 -1 -1 698 | 1 -1 -1 1 -1 0 -1 0 -1 -1 -1 0 -1 -1 0 -1 -1 0 1 1 -1 0 1 1 -1 0 699 | 0 1 0 1 1 2 1 1 1 -1 -1 -1 -1 -1 -1 -1 -1 -2 -1 -1 -2 -1 -1 -1 -1 -1 1 1 -1 -1 1 1 1 -1 -1 -1 0 -1 -1 -1 -2 0 -2 -1 -1 -1 -1 -1 1 700 | -1 -1 1 1 2 1 1 -1 -1 -1 -1 -1 -1 -1 1 701 | 1 1 702 | 1 -1 0 -1 1 0 -1 0 0 1 -1 0 703 | 1 -1 1 -1 1 1 704 | -1 -1 -1 -1 705 | -1 -1 -1 -1 -1 0 -1 -1 0 0 1 -1 0 -1 2 -1 0 -1 1 -1 0 -1 -1 -1 0 0 0 1 1 0 1 0 -1 -1 -1 1 1 0 -1 2 -1 0 0 1 0 -1 -1 1 0 -1 -1 1 1 0 0 -1 -1 1 -1 -1 -1 706 | -1 -1 -1 1 707 | 1 -1 1 1 1 -1 -1 0 1 -1 -1 0 708 | 1 1 -1 -1 -1 1 1 709 | -1 -1 -1 1 0 -1 1 1 1 1 1 0 710 | -1 -1 -1 -1 1 0 -1 -1 -1 -1 1 711 | 0 1 1 1 0 -1 1 1 712 | 1 1 -1 -1 1 -1 1 -1 1 0 -1 -1 1 713 | 1 1 -1 -1 0 -1 1 1 714 | 0 -1 -1 -1 715 | 1 1 0 0 1 1 0 0 1 1 -1 -1 -1 0 1 1 -1 -1 1 1 1 -1 -1 2 -2 -1 2 716 | 0 -1 0 0 0 -1 0 -1 -1 -1 -1 -1 0 0 0 -1 717 | 0 2 0 -1 1 1 -1 -2 1 718 | 1 1 2 1 -1 -1 -1 -1 -1 -1 1 -1 -1 1 -1 1 1 -1 1 719 | 1 1 -1 1 0 0 0 720 | 1 -1 1 -1 2 -1 721 | 1 1 1 1 -1 2 722 | 0 -1 -1 1 -1 0 -1 -1 1 -1 723 | -1 0 -1 0 0 1 0 -1 -1 0 -1 1 -1 -1 0 724 | 1 725 | 1 1 1 -1 1 1 1 726 | 1 1 -1 -1 -1 -1 727 | 0 -1 1 1 -1 1 1 1 1 -1 1 1 0 -1 1 728 | -1 1 1 -1 0 1 729 | -1 -1 -1 730 | 1 -1 1 1 -1 1 731 | 1 1 -1 -1 -1 -1 -1 0 -1 0 1 0 1 1 -1 1 -1 0 -1 -1 2 -1 -1 1 -1 -1 0 732 | -1 -1 -2 -1 -1 -1 1 -1 0 -1 0 -1 -1 1 1 1 0 733 | -1 0 1 -1 2 1 -1 0 1 -1 0 -2 1 -1 1 -1 2 1 1 734 | -1 0 0 -1 0 0 0 -1 -1 -1 2 1 735 | 0 1 -1 -1 -1 -1 -1 -1 -1 -1 2 -1 1 2 1 1 736 | -1 737 | -1 -1 1 1 -1 -1 1 0 0 -1 -1 -1 -1 -1 1 2 738 | -1 -1 0 0 -1 -1 -1 -1 -1 0 -1 -1 0 739 | -1 0 -1 740 | 1 -1 1 -1 741 | 1 1 0 0 0 1 -1 -1 -1 1 -1 742 | -1 1 743 | 1 -1 -2 -1 -1 1 1 -1 0 -1 -1 -1 -1 -1 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 0 -1 -1 -1 -1 -2 1 0 -1 744 | 1 1 0 1 1 1 1 -1 -1 -1 0 -1 -1 -2 -1 -1 0 1 1 745 | -1 -1 1 1 -1 746 | 0 -1 0 -1 1 1 1 -1 -1 -1 -1 -1 -1 1 0 0 1 1 
-1 -1 -1
2 2
0 -1 0 1 -1 1 1 -1 -1
-1 -1 1 -1
1 -1 -1 1 -1 0 1 -1
1 1 0 -1
1 1 -1 2
-1 0
1 1 -1 0 -1 1 0
-1 -2 -1 -1
1 -1
1 -1 -1 -1 0 1 -1 1 -1 -1 -1 -1 -1 -1 -1
1 0 1 0 -1 -1 -1 -1 1 -1 0 -1 -1 -1 1 -1 -1 0 0
-1 -1 -1 1 -1 -2 1 0 0 0 -1 0 -1 -1 -1 0 -1 -1 -1 -1 -1 -1 -1 0 1 0 -1 -1 1 -1 0 0
-1 0
1 -1 2 0 -1 -1
0 -1 -1 -1 -1 -1
0 -1 -1 -1 -1 0
-1 0 1 -1 -1 1 -1 0
-1 1 -1 -1 1 -1 1 -1 -1 1 1 2 -1 1 1 -1 -1 -1 1 2
1 -1 1 0 1 1 1 2 1 0
-1 0 -1 1 1 -2 -1 -1 -1 1 -1 1 0 -1 -1 -1 1 1
0 -1 -1 2 -1 -1 0
1 0 -1 1 -1
-1 0 -1 -1 -1 1
0 -1 -1
-1 1 -1 1 1 1 1 -1 1 -1 1 1 0 1 -1 1 -1 1
1 -1 1 2
-1 -1 0 -1 0 1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 0 0 1 -1 1 1 1 -1 1 0 -1 -1 0 -1 -1
-1 1 -1 0 -1 -1 1 1 -1 -2 1
1 1 -1 0 0 0 -1 -1 -1
-1 0 2 0 -1 -1 0 -1 -1 1
1 0 1 1 0 -1 -1 1 -1 1 0 0 0 -1 -1 -1 -1 1
0 1 -1 -1 -1 -1 -1 1 -1 -1
-1 -1 -1 2 1
0 -1 0 -1 -1 -1 1 1 0 -1 0 1 0
-1 -1 1 2 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 2 -1 0 0 1 1 1 -1 -1 0 -1
-1 -1 -1 -2 -1 1 -1 -1
-1 1 -1 -1 1 2
1 -1 -1 -2 -1 -1
-1 -1 -1
1 1 1 -1 1 1 -1 -1 -1 1 -1 1 2
-1 1 -1 -1 -1 1 -1 -1 1
0 -1 -1 1 -1 0 -1 -1
-1 -1 -1 1 1 1
1 1 -1 -1 1 1 1 1 -1 1 -1 -1 -1 -1 -1 -1 -1 1 1 -1 -1 1 -1 -1 -1 -1 1 1 -1 -1 1 1 -1 -1 -1 -1 -1 1 2 2 -1 -1 -1 -1 1 -1 -1 0 -1 -1 0
-1 -1 -1 -1 -1 -1 1 -1 -1 0 -1 -1 -1 1 -1 1 2 -1 -1 -1
-1 1 0 0
-1 -1 -1 1 -1 -1 1 -1 0 1 -1 -1
1 1 2 -1 2 -1 1 2 -1 -1 -1 -1 -1 -1
-2 -1 -1 1 -1 -1 -2 1 0 -1 0 -1 -1 -1 -1 1 1 -1 -1 -1 1 0
1 1 -1 1 -1 1 1 1
-1 -1 -1 1 -1 -1 1 -1 -2 -1 -1 -1 1
0 -1 1 -1 -1 1
0 -1 0 0 -1 -1 0 1 0 0 0 -1 -1 0 -1 -1 -1 0 1 0 -1 1 1 1 -1 -1 0 1
1
-1 -1 1 -1
1 1 -1 -1 0 -2 -1 -1 0 -1 -1 -1 -1 0 1 -1 -1 -1 -1 -1 0 1 0 1 1 0 1 -1 -1 0 -1 -1 1 1 -1 -1 -1 1 1 -1 0 0 -1
-1 -1 -1 0 -1 -1 -1 -1
-1 0 0 -1 1 1 -1 -1 -1
1 1 0 1 1 1 1
1 -1 -1 -1 -1 1 -1 1

--------------------------------------------------------------------------------
/keras_tutorial/sarcasm_net.py:
--------------------------------------------------------------------------------
from __future__ import print_function
import numpy as np

from sklearn import metrics
from sklearn import cross_validation as c_v
from keras.models import Sequential
from keras.layers.core import Dense, Dropout
from keras.layers.recurrent import GRU

ironic = 'ironic.txt'
regular = 'regular.txt'

ironic_file = open(ironic, 'r')
regular_file = open(regular, 'r')

# First pass: count the samples in each class and find the longest review.
# max_len counts sentiment scores (tokens), not characters in the raw line.
max_len = 0
samples = 0

while True:
    curr_line = ironic_file.readline()
    if not curr_line:
        break

    samples = samples + 1
    tokens = curr_line.split()
    if len(tokens) > max_len:
        max_len = len(tokens)

ironic_num = samples

while True:
    curr_line = regular_file.readline()
    if not curr_line:
        break

    samples = samples + 1
    tokens = curr_line.split()
    if len(tokens) > max_len:
        max_len = len(tokens)

regular_num = samples - ironic_num

ironic_file.close()
regular_file.close()

ironic_file = open(ironic, 'r')
regular_file = open(regular, 'r')

print(max_len)
print(ironic_num, regular_num)

# Second pass: zero-pad every vector to max_len and stack into one array.
data = np.empty([samples, max_len])
index = 0
while True:
    curr_line = ironic_file.readline().replace('\n', '')
    if not curr_line:
        break

    curr_line = list(map(int, curr_line.split(' ')))
    diff = max_len - len(curr_line)
    data[index, :] = np.concatenate((curr_line, np.zeros(diff,)))
    index = index + 1

while True:
    curr_line = regular_file.readline().replace('\n', '')
    if not curr_line:
        break

    curr_line = list(map(int, curr_line.split(' ')))
    diff = max_len - len(curr_line)
    data[index, :] = np.concatenate((curr_line, np.zeros(diff,)))
    index = index + 1

ironic_file.close()
regular_file.close()

# Ironic reviews are the positive class.
targets = np.concatenate((np.ones(ironic_num,), np.zeros(regular_num,)))

# The recurrent layers expect input shaped (samples, timesteps, features).
data = data.reshape(data.shape[0], data.shape[1], 1)

print(targets.shape)
print(data.shape)

skf = c_v.StratifiedKFold(targets, n_folds=5, shuffle=True)

for train_index, test_index in skf:
    #print("TRAIN:", train_index, "TEST:", test_index)
    X_train, X_test = data[train_index], data[test_index]
    y_train, y_test = targets[train_index], targets[test_index]

    model = Sequential()

    model.add(GRU(80, activation='relu', inner_activation='hard_sigmoid',
                  input_dim=1, return_sequences=True))
    model.add(GRU(80, activation='relu'))
    model.add(Dense(64, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(32, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(1, activation='sigmoid'))

    model.compile(loss='binary_crossentropy', optimizer='adam', class_mode='binary')
    history = model.fit(X_train, y_train, batch_size=32, nb_epoch=10,
                        validation_split=0.2, show_accuracy=True)

    score, acc = model.evaluate(X_test, y_test, batch_size=8, show_accuracy=True)
    print('Test score:', score)
    print('Test accuracy:', acc)

    classes = model.predict_classes(X_test, batch_size=8, verbose=1)
    print(y_test)
    print(classes)

    print(metrics.confusion_matrix(y_test.astype(int), classes))
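Both sarcasm networks share the counting-and-padding preamble above. For reference, the same loading logic fits in one small helper; this is a minimal sketch, assuming one space-separated vector per line as in ironic.txt and regular.txt (load_padded is a hypothetical name, not a function in this repo):

import numpy as np

def load_padded(path):
    # One review per line; each token is a sentiment score in [-2, 2].
    rows = [[int(tok) for tok in line.split()]
            for line in open(path) if line.strip()]
    max_len = max(len(row) for row in rows)
    data = np.zeros((len(rows), max_len))
    for i, row in enumerate(rows):
        data[i, :len(row)] = row  # zero-pad on the right
    return data

ironic_data = load_padded('ironic.txt')  # shape: (n_ironic_reviews, max_len)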
--------------------------------------------------------------------------------
/keras_tutorial/sarcasm_net_cnn.py:
--------------------------------------------------------------------------------
from __future__ import print_function
import numpy as np
import matplotlib.pyplot as plt

from sklearn import metrics
from sklearn import cross_validation as c_v
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Flatten
from keras.layers.convolutional import Convolution1D, MaxPooling1D
from keras.callbacks import EarlyStopping

# Stop training once the validation loss has not improved for 4 epochs.
earlystopping = EarlyStopping(monitor='val_loss', patience=4, verbose=1)

ironic = 'ironic.txt'
regular = 'regular.txt'

ironic_file = open(ironic, 'r')
regular_file = open(regular, 'r')

# First pass: count the samples in each class and find the longest review.
# max_len counts sentiment scores (tokens), not characters in the raw line.
max_len = 0
samples = 0

while True:
    curr_line = ironic_file.readline()
    if not curr_line:
        break

    samples = samples + 1
    tokens = curr_line.split()
    if len(tokens) > max_len:
        max_len = len(tokens)

ironic_num = samples

while True:
    curr_line = regular_file.readline()
    if not curr_line:
        break

    samples = samples + 1
    tokens = curr_line.split()
    if len(tokens) > max_len:
        max_len = len(tokens)

regular_num = samples - ironic_num

ironic_file.close()
regular_file.close()

ironic_file = open(ironic, 'r')
regular_file = open(regular, 'r')

print(max_len)
print(ironic_num, regular_num)

# Second pass: zero-pad every vector to max_len and stack into one array.
data = np.empty([samples, max_len])
index = 0
while True:
    curr_line = ironic_file.readline().replace('\n', '')
    if not curr_line:
        break

    curr_line = list(map(int, curr_line.split(' ')))
    diff = max_len - len(curr_line)
    data[index, :] = np.concatenate((curr_line, np.zeros(diff,)))
    index = index + 1

while True:
    curr_line = regular_file.readline().replace('\n', '')
    if not curr_line:
        break

    curr_line = list(map(int, curr_line.split(' ')))
    diff = max_len - len(curr_line)
    data[index, :] = np.concatenate((curr_line, np.zeros(diff,)))
    index = index + 1

ironic_file.close()
regular_file.close()

# Ironic reviews are the positive class.
targets = np.concatenate((np.ones(ironic_num,), np.zeros(regular_num,)))

# Convolution1D expects input shaped (samples, timesteps, features).
data = data.reshape(data.shape[0], data.shape[1], 1)

print(targets.shape)
print(data.shape)

skf = c_v.StratifiedKFold(targets, n_folds=5, shuffle=True)

for train_index, test_index in skf:
    #print("TRAIN:", train_index, "TEST:", test_index)
    X_train, X_test = data[train_index], data[test_index]
    y_train, y_test = targets[train_index], targets[test_index]

    model = Sequential()

    model.add(Convolution1D(nb_filter=75, filter_length=3, activation='relu',
                            input_dim=1, input_length=max_len))
    model.add(Dropout(0.5))
    model.add(MaxPooling1D(pool_length=8))

    model.add(Flatten())
    model.add(Dense(100, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(30, activation='relu'))
    model.add(Dropout(0.5))

    model.add(Dense(1, activation='sigmoid'))

    model.compile(loss='binary_crossentropy', optimizer='adam', class_mode="binary")

    # Pass the early-stopping callback defined at the top of the script.
    history = model.fit(X_train, y_train, batch_size=25, nb_epoch=50,
                        validation_split=0.2, callbacks=[earlystopping])

    loss = history.history['loss']
    plt.plot(loss)
    plt.show()

    score, acc = model.evaluate(X_test, y_test, batch_size=8, show_accuracy=True)

    print('Test score:', score)
    print('Test accuracy:', acc)

    # Raw sigmoid probabilities; predict_classes below thresholds them at 0.5.
    out = model.predict(X_test, batch_size=8, verbose=1)

    classes = model.predict_classes(X_test, batch_size=8, verbose=1)
    print(metrics.confusion_matrix(y_test.astype(int), classes))
    print(metrics.classification_report(y_test.astype(int), classes))
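Both scripts target the pre-1.0 Keras API (Convolution1D, nb_epoch, show_accuracy, class_mode), which matches the install instructions in the README. If you are on Keras 2 the layer and argument names have changed; a rough sketch of the equivalent CNN stack, assuming max_len, X_train and y_train are built as in the script above:

from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, Flatten, Dense, Dropout

model = Sequential()
model.add(Conv1D(filters=75, kernel_size=3, activation='relu',
                 input_shape=(max_len, 1)))
model.add(Dropout(0.5))
model.add(MaxPooling1D(pool_size=8))
model.add(Flatten())
model.add(Dense(100, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(30, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))

# metrics=['accuracy'] replaces the old show_accuracy/class_mode flags
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X_train, y_train, batch_size=25, epochs=50, validation_split=0.2)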
--------------------------------------------------------------------------------
/keras_tutorial/sentiment_analyzer.py:
--------------------------------------------------------------------------------
import os
import glob

# Requires installation of the Stanford CoreNLP toolkit to work.
# Running this script will insert the perceived sentiment of each sentence on a
# new line below that sentence. Further processing (turning the sentence
# sentiments into useable vectors) is done by 'create_vector.py'.

command = 'java -cp "*" -mx5g edu.stanford.nlp.sentiment.SentimentPipeline -file '
pathname1 = '/point_towards_original_paragraphs/Ironic/*.txt'
pathname2 = '/point_towards_original_paragraphs/Regular/*.txt'

# The pipeline output goes into results/; make sure the directory exists.
if not os.path.exists('results'):
    os.makedirs('results')

# For every .txt file in the corpus, take the review text between the <REVIEW>
# tags (everything after the first blank line) and analyse it.

for fname in glob.glob(pathname1):
    temp = open('results/temp.txt', 'w+')
    cur_review = open(fname, 'r')
    # Skip the file header: everything up to the first blank line.
    for line in cur_review:
        if line == '\n':
            # The inner loop continues from the same iterator, so only the
            # review body that follows the blank line gets written out.
            for line in cur_review:
                if line != '':
                    temp.write(line)
    cur_review.close()
    temp.close()
    os.system(command + 'results/temp.txt' + ' > results/' + os.path.basename(fname))
    os.remove('results/temp.txt')

for fname in glob.glob(pathname2):
    temp = open('results/temp.txt', 'w+')
    cur_review = open(fname, 'r')
    for line in cur_review:
        if line == '\n':
            for line in cur_review:
                if line != '':
                    temp.write(line)
    cur_review.close()
    temp.close()
    os.system(command + 'results/temp.txt' + ' > results/' + os.path.basename(fname))
    os.remove('results/temp.txt')
--------------------------------------------------------------------------------
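The script above shells out to Java with os.system and a shell redirect. The same call can be made without a shell via subprocess; this is a minimal sketch (run_pipeline is a hypothetical name; like the os.system version, it assumes you launch the script from the CoreNLP directory so the classpath wildcard picks up the jars, which the java launcher expands itself):

import subprocess

def run_pipeline(src, dst):
    # Equivalent to:
    #   java -cp "*" -mx5g edu.stanford.nlp.sentiment.SentimentPipeline -file src > dst
    with open(dst, 'w') as out:
        subprocess.call(
            ['java', '-cp', '*', '-mx5g',
             'edu.stanford.nlp.sentiment.SentimentPipeline', '-file', src],
            stdout=out)

run_pipeline('results/temp.txt', 'results/review1.txt')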