├── LICENSE
├── README.md
├── annotated_softmax.py
├── decision_boundary_plots
│   ├── linear_softmax.png
│   ├── moon_hidden.png
│   ├── moon_softmax.png
│   ├── saturn_hidden.png
│   └── saturn_softmax.png
├── hidden.py
├── plot_boundary_on_data.py
├── simdata
│   ├── .RData
│   ├── generate_linear_data.R
│   ├── generate_moon_data.py
│   ├── generate_saturn_data.R
│   ├── linear_data_eval.csv
│   ├── linear_data_train.csv
│   ├── linear_data_train.jpg
│   ├── moon_data_eval.csv
│   ├── moon_data_train.csv
│   ├── moon_data_train.jpg
│   ├── output_curve_hidden_nodes.txt
│   ├── plot_data.R
│   ├── plot_hidden_curve.R
│   ├── plot_hyperplane.R
│   ├── saturn_data_eval.csv
│   ├── saturn_data_train.csv
│   └── saturn_data_train.jpg
├── softmax-python3.py
├── softmax.py
├── softmax_checkpoints.py
└── truncnorm_hidden.py
/LICENSE:
--------------------------------------------------------------------------------
1 | Apache License
2 | Version 2.0, January 2004
3 | http://www.apache.org/licenses/
4 |
5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
6 |
7 | 1. Definitions.
8 |
9 | "License" shall mean the terms and conditions for use, reproduction,
10 | and distribution as defined by Sections 1 through 9 of this document.
11 |
12 | "Licensor" shall mean the copyright owner or entity authorized by
13 | the copyright owner that is granting the License.
14 |
15 | "Legal Entity" shall mean the union of the acting entity and all
16 | other entities that control, are controlled by, or are under common
17 | control with that entity. For the purposes of this definition,
18 | "control" means (i) the power, direct or indirect, to cause the
19 | direction or management of such entity, whether by contract or
20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the
21 | outstanding shares, or (iii) beneficial ownership of such entity.
22 |
23 | "You" (or "Your") shall mean an individual or Legal Entity
24 | exercising permissions granted by this License.
25 |
26 | "Source" form shall mean the preferred form for making modifications,
27 | including but not limited to software source code, documentation
28 | source, and configuration files.
29 |
30 | "Object" form shall mean any form resulting from mechanical
31 | transformation or translation of a Source form, including but
32 | not limited to compiled object code, generated documentation,
33 | and conversions to other media types.
34 |
35 | "Work" shall mean the work of authorship, whether in Source or
36 | Object form, made available under the License, as indicated by a
37 | copyright notice that is included in or attached to the work
38 | (an example is provided in the Appendix below).
39 |
40 | "Derivative Works" shall mean any work, whether in Source or Object
41 | form, that is based on (or derived from) the Work and for which the
42 | editorial revisions, annotations, elaborations, or other modifications
43 | represent, as a whole, an original work of authorship. For the purposes
44 | of this License, Derivative Works shall not include works that remain
45 | separable from, or merely link (or bind by name) to the interfaces of,
46 | the Work and Derivative Works thereof.
47 |
48 | "Contribution" shall mean any work of authorship, including
49 | the original version of the Work and any modifications or additions
50 | to that Work or Derivative Works thereof, that is intentionally
51 | submitted to Licensor for inclusion in the Work by the copyright owner
52 | or by an individual or Legal Entity authorized to submit on behalf of
53 | the copyright owner. For the purposes of this definition, "submitted"
54 | means any form of electronic, verbal, or written communication sent
55 | to the Licensor or its representatives, including but not limited to
56 | communication on electronic mailing lists, source code control systems,
57 | and issue tracking systems that are managed by, or on behalf of, the
58 | Licensor for the purpose of discussing and improving the Work, but
59 | excluding communication that is conspicuously marked or otherwise
60 | designated in writing by the copyright owner as "Not a Contribution."
61 |
62 | "Contributor" shall mean Licensor and any individual or Legal Entity
63 | on behalf of whom a Contribution has been received by Licensor and
64 | subsequently incorporated within the Work.
65 |
66 | 2. Grant of Copyright License. Subject to the terms and conditions of
67 | this License, each Contributor hereby grants to You a perpetual,
68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
69 | copyright license to reproduce, prepare Derivative Works of,
70 | publicly display, publicly perform, sublicense, and distribute the
71 | Work and such Derivative Works in Source or Object form.
72 |
73 | 3. Grant of Patent License. Subject to the terms and conditions of
74 | this License, each Contributor hereby grants to You a perpetual,
75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
76 | (except as stated in this section) patent license to make, have made,
77 | use, offer to sell, sell, import, and otherwise transfer the Work,
78 | where such license applies only to those patent claims licensable
79 | by such Contributor that are necessarily infringed by their
80 | Contribution(s) alone or by combination of their Contribution(s)
81 | with the Work to which such Contribution(s) was submitted. If You
82 | institute patent litigation against any entity (including a
83 | cross-claim or counterclaim in a lawsuit) alleging that the Work
84 | or a Contribution incorporated within the Work constitutes direct
85 | or contributory patent infringement, then any patent licenses
86 | granted to You under this License for that Work shall terminate
87 | as of the date such litigation is filed.
88 |
89 | 4. Redistribution. You may reproduce and distribute copies of the
90 | Work or Derivative Works thereof in any medium, with or without
91 | modifications, and in Source or Object form, provided that You
92 | meet the following conditions:
93 |
94 | (a) You must give any other recipients of the Work or
95 | Derivative Works a copy of this License; and
96 |
97 | (b) You must cause any modified files to carry prominent notices
98 | stating that You changed the files; and
99 |
100 | (c) You must retain, in the Source form of any Derivative Works
101 | that You distribute, all copyright, patent, trademark, and
102 | attribution notices from the Source form of the Work,
103 | excluding those notices that do not pertain to any part of
104 | the Derivative Works; and
105 |
106 | (d) If the Work includes a "NOTICE" text file as part of its
107 | distribution, then any Derivative Works that You distribute must
108 | include a readable copy of the attribution notices contained
109 | within such NOTICE file, excluding those notices that do not
110 | pertain to any part of the Derivative Works, in at least one
111 | of the following places: within a NOTICE text file distributed
112 | as part of the Derivative Works; within the Source form or
113 | documentation, if provided along with the Derivative Works; or,
114 | within a display generated by the Derivative Works, if and
115 | wherever such third-party notices normally appear. The contents
116 | of the NOTICE file are for informational purposes only and
117 | do not modify the License. You may add Your own attribution
118 | notices within Derivative Works that You distribute, alongside
119 | or as an addendum to the NOTICE text from the Work, provided
120 | that such additional attribution notices cannot be construed
121 | as modifying the License.
122 |
123 | You may add Your own copyright statement to Your modifications and
124 | may provide additional or different license terms and conditions
125 | for use, reproduction, or distribution of Your modifications, or
126 | for any such Derivative Works as a whole, provided Your use,
127 | reproduction, and distribution of the Work otherwise complies with
128 | the conditions stated in this License.
129 |
130 | 5. Submission of Contributions. Unless You explicitly state otherwise,
131 | any Contribution intentionally submitted for inclusion in the Work
132 | by You to the Licensor shall be under the terms and conditions of
133 | this License, without any additional terms or conditions.
134 | Notwithstanding the above, nothing herein shall supersede or modify
135 | the terms of any separate license agreement you may have executed
136 | with Licensor regarding such Contributions.
137 |
138 | 6. Trademarks. This License does not grant permission to use the trade
139 | names, trademarks, service marks, or product names of the Licensor,
140 | except as required for reasonable and customary use in describing the
141 | origin of the Work and reproducing the content of the NOTICE file.
142 |
143 | 7. Disclaimer of Warranty. Unless required by applicable law or
144 | agreed to in writing, Licensor provides the Work (and each
145 | Contributor provides its Contributions) on an "AS IS" BASIS,
146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
147 | implied, including, without limitation, any warranties or conditions
148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
149 | PARTICULAR PURPOSE. You are solely responsible for determining the
150 | appropriateness of using or redistributing the Work and assume any
151 | risks associated with Your exercise of permissions under this License.
152 |
153 | 8. Limitation of Liability. In no event and under no legal theory,
154 | whether in tort (including negligence), contract, or otherwise,
155 | unless required by applicable law (such as deliberate and grossly
156 | negligent acts) or agreed to in writing, shall any Contributor be
157 | liable to You for damages, including any direct, indirect, special,
158 | incidental, or consequential damages of any character arising as a
159 | result of this License or out of the use or inability to use the
160 | Work (including but not limited to damages for loss of goodwill,
161 | work stoppage, computer failure or malfunction, or any and all
162 | other commercial damages or losses), even if such Contributor
163 | has been advised of the possibility of such damages.
164 |
165 | 9. Accepting Warranty or Additional Liability. While redistributing
166 | the Work or Derivative Works thereof, You may choose to offer,
167 | and charge a fee for, acceptance of support, warranty, indemnity,
168 | or other liability obligations and/or rights consistent with this
169 | License. However, in accepting such obligations, You may act only
170 | on Your own behalf and on Your sole responsibility, not on behalf
171 | of any other Contributor, and only if You agree to indemnify,
172 | defend, and hold each Contributor harmless for any liability
173 | incurred by, or claims asserted against, such Contributor by reason
174 | of your accepting any such warranty or additional liability.
175 |
176 | END OF TERMS AND CONDITIONS
177 |
178 |
179 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | Try TensorFlow
2 | ====
3 |
4 | Example code to try out [TensorFlow](http://www.tensorflow.org/). See the blog post [Simple end-to-end TensorFlow examples](https://bcomposes.wordpress.com/2015/11/26/simple-end-to-end-tensorflow-examples/) for more discussion and context.
5 |
6 | You need to have [TensorFlow installed](http://www.tensorflow.org/get_started/os_setup.md).
7 |
8 | ## Instructions for simulated data
9 |
10 | The subdirectory `try-tf/simdata` contains training and evaluation data sets for three simulated data types: linear, moon, and saturn. It also contains some simple R and Python scripts for generating and viewing the data.
11 |
12 | ### Linearly separable data
13 |
14 | The data:
15 |
16 | * `try-tf/simdata/linear_data_train.csv`
17 | * `try-tf/simdata/linear_data_eval.csv`
18 |
19 | The training data set looks like this.
20 |
21 | 
22 |
23 | Softmax regression is perfectly capable of handling this data. If you run the command below, you should see output similar to the following.
24 |
25 | ```
26 | $ python softmax.py --train simdata/linear_data_train.csv --test simdata/linear_data_eval.csv --num_epochs 5 --verbose True
27 | Initialized!
28 |
29 | Training.
30 | 0 1 2 3 4 5 6 7 8 9
31 | 10 11 12 13 14 15 16 17 18 19
32 | 20 21 22 23 24 25 26 27 28 29
33 | 30 31 32 33 34 35 36 37 38 39
34 | 40 41 42 43 44 45 46 47 48 49
35 |
36 | Weight matrix.
37 | [[-1.87038445 1.87038457]
38 | [-2.23716712 2.23716712]]
39 |
40 | Bias vector.
41 | [ 1.57296884 -1.57296848]
42 |
43 | Applying model to first test instance.
44 | Point = [[ 0.14756215 0.24351828]]
45 | Wx+b = [[ 0.7521798 -0.75217938]]
46 | softmax(Wx+b) = [[ 0.81822371 0.18177626]]
47 |
48 | Accuracy: 1.0
49 | ```
50 |
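You can check that last step by hand. Below is a small NumPy sketch (not part of the repository) that recomputes `Wx+b` and the softmax from the values printed above:

```
import numpy as np

# Weight matrix, bias vector, and first test point from the verbose output above.
W = np.array([[-1.87038445, 1.87038457],
              [-2.23716712, 2.23716712]])
b = np.array([1.57296884, -1.57296848])
point = np.array([0.14756215, 0.24351828])

logits = point.dot(W) + b
probs = np.exp(logits) / np.exp(logits).sum()
print(logits)  # approximately [ 0.752 -0.752]
print(probs)   # approximately [ 0.818  0.182]
```
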
51 | The plot of the decision boundary:
52 |
53 | 
54 |
55 | ### Moon data
56 |
57 | The data:
58 |
59 | * `try-tf/simdata/moon_data_train.csv`
60 | * `try-tf/simdata/moon_data_eval.csv`
61 |
62 | The training data set looks like this.
63 |
64 | 
65 |
66 | The softmax network performs poorly, but a network with a five-node hidden layer works great.
67 |
68 | ```
69 | $ python softmax.py --train simdata/moon_data_train.csv --test simdata/moon_data_eval.csv --num_epochs 100
70 | Accuracy: 0.861
71 |
72 | $ python hidden.py --train simdata/moon_data_train.csv --test simdata/moon_data_eval.csv --num_epochs 100 --num_hidden 5
73 | Accuracy: 0.971
74 | ```
75 |
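For intuition, the hidden-layer network in `hidden.py` computes `softmax(tanh(x*W_hidden + b_hidden)*W_out + b_out)`. Here's a minimal NumPy sketch of that forward pass; the weights are random stand-ins, not trained values:

```
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Shapes match hidden.py with num_hidden=5; the values are illustrative only.
rng = np.random.RandomState(0)
w_hidden, b_hidden = rng.randn(2, 5), np.zeros(5)
w_out, b_out = rng.randn(5, 2), np.zeros(2)

x = np.array([1.0, 0.5])                      # a single 2-d input point
hidden = np.tanh(x.dot(w_hidden) + b_hidden)  # the 5-node hidden layer
y = softmax(hidden.dot(w_out) + b_out)        # output class probabilities
print(y)
```
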
76 | The plot of the decision boundaries produced by the above calls:
77 |
78 | 
79 |
80 | 
81 |
82 | ### Saturn data
83 |
84 | The data:
85 |
86 | * `try-tf/simdata/saturn_data_train.csv`
87 | * `try-tf/simdata/saturn_data_eval.csv`
88 |
89 | The training data set looks like this.
90 |
91 | 
92 |
93 | Again, a softmax network performs poorly, but a network with a fifteen-node hidden layer works great.
94 |
95 | ```
96 | $ python softmax.py --train simdata/saturn_data_train.csv --test simdata/saturn_data_eval.csv --num_epochs 100
97 | Accuracy: 0.43
98 |
99 | $ python hidden.py --train simdata/saturn_data_train.csv --test simdata/saturn_data_eval.csv --num_epochs 100 --num_hidden 15
100 | Accuracy: 1.0
101 | ```
102 |
103 | The plot of the decision boundaries produced by the above calls:
104 |
105 | 
106 |
107 | 
108 |
109 |
110 | ## Generating simulated data.
111 |
112 | Feel free to play around with the data-generation code to make the problems harder, add more dimensions, etc. You can then generate new data as follows (from within the `simdata` directory):
113 |
114 | ```
115 | $ Rscript generate_linear_data.R
116 | $ python generate_moon_data.py
117 | $ Rscript generate_saturn_data.R
118 | ```
119 |
120 | The R scripts generate both train and test sets. For the moon data, you'll need to split the output into train and eval files using the Unix `head` and `tail` commands.
121 |
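If you'd rather do the split in Python than with `head` and `tail`, something like this works (the filename and the 1500/500 split are just examples):

```
# Assumes the generator's output was first redirected to a file:
#   python generate_moon_data.py > moon_data.csv
with open("moon_data.csv") as f:
    rows = f.readlines()

with open("moon_data_train.csv", "w") as f:
    f.writelines(rows[:1500])   # training portion
with open("moon_data_eval.csv", "w") as f:
    f.writelines(rows[1500:])   # evaluation portion
```
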
122 | ## Creating plots of the data.
123 |
124 | To prepare the blog post for this repository, I created a few R scripts to plot data. They are simple, but I figured I'd include them in case they are useful starting points for others to adapt or to use for plotting related data.
125 |
126 | Go into the `simdata` directory.
127 |
128 | Open `plot_data.R` in an editor and uncomment the data set you'd like to plot, save it, and then run:
129 |
130 | ```
131 | $ Rscript plot_data.R
132 | ```
133 |
134 | For plotting the image with the hyperplane, start up R and run `source("plot_hyperplane.R")`.
135 |
136 | For plotting the graph relating the number of hidden nodes to accuracy, start up R and run `source("plot_hidden_curve.R")`.
137 |
138 |
139 | ## Visualising the model and training using TensorBoard.
140 |
141 | TensorFlow includes a visualisation tool called [TensorBoard](https://www.tensorflow.org/versions/r0.7/how_tos/summaries_and_tensorboard/index.html) that allows you to visualise the directed graph of your model.
142 |
143 | The script `annotated_softmax.py` reproduces the simple softmax regression, but also includes annotations that will write logging information to a folder called `try_tf_logs`.
144 |
145 | After running `annotated_softmax.py`, you can point TensorBoard at the `try_tf_logs` directory to launch a browser-based visualisation of the model and training.
146 |
147 |
148 | ```
149 | $ python annotated_softmax.py --train simdata/linear_data_train.csv --test simdata/linear_data_eval.csv --num_epochs 5 --verbose False
150 | Accuracy: 1.0
151 |
152 | $ tensorboard --logdir=try_tf_logs/
153 | Starting TensorBoard on port 6006
154 | (You can navigate to http://0.0.0.0:6006)
155 | ```
156 |
157 |
158 |
--------------------------------------------------------------------------------
/annotated_softmax.py:
--------------------------------------------------------------------------------
1 | import tensorflow.python.platform
2 |
3 | import numpy as np
4 | import tensorflow as tf
5 |
6 | # Global variables.
7 | NUM_LABELS = 2 # The number of labels.
8 | BATCH_SIZE = 100 # The number of training examples to use per training step.
9 |
10 | # Define the flags usable from the command line.
11 | tf.app.flags.DEFINE_string('train', None,
12 | 'File containing the training data (labels & features).')
13 | tf.app.flags.DEFINE_string('test', None,
14 | 'File containing the test data (labels & features).')
15 | tf.app.flags.DEFINE_integer('num_epochs', 1,
16 |                             'Number of passes over the '
17 |                             'training data.')
18 | tf.app.flags.DEFINE_boolean('verbose', False, 'Produce verbose output.')
19 | FLAGS = tf.app.flags.FLAGS
20 |
21 | # Extract numpy representations of the labels and features given rows consisting of:
22 | # label, feat_0, feat_1, ..., feat_n
23 | def extract_data(filename):
24 |
25 | # Arrays to hold the labels and feature vectors.
26 | labels = []
27 | fvecs = []
28 |
29 | # Iterate over the rows, splitting the label from the features. Convert labels
30 | # to integers and features to floats.
31 | for line in file(filename):
32 | row = line.split(",")
33 | labels.append(int(row[0]))
34 | fvecs.append([float(x) for x in row[1:]])
35 |
36 | # Convert the array of float arrays into a numpy float matrix.
37 | fvecs_np = np.matrix(fvecs).astype(np.float32)
38 |
39 | # Convert the array of int labels into a numpy array.
40 | labels_np = np.array(labels).astype(dtype=np.uint8)
41 |
42 | # Convert the int numpy array into a one-hot matrix.
43 | labels_onehot = (np.arange(NUM_LABELS) == labels_np[:, None]).astype(np.float32)
44 |
45 | # Return a pair of the feature matrix and the one-hot label matrix.
46 | return fvecs_np,labels_onehot
47 |
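# For example, two input rows "1,0.7,0.5" and "0,0.3,0.1" would yield
#   fvecs_np      = [[0.7, 0.5], [0.3, 0.1]]
#   labels_onehot = [[0., 1.], [1., 0.]]
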
48 | def main(argv=None):
49 | # Be verbose?
50 | verbose = FLAGS.verbose
51 |
52 | # Get the data.
53 | train_data_filename = FLAGS.train
54 | test_data_filename = FLAGS.test
55 |
56 | # Extract it into numpy matrices.
57 | train_data,train_labels = extract_data(train_data_filename)
58 | test_data, test_labels = extract_data(test_data_filename)
59 |
60 | # Get the shape of the training data.
61 | train_size,num_features = train_data.shape
62 |
63 | # Get the number of epochs for training.
64 | num_epochs = FLAGS.num_epochs
65 |
66 | # This is where training samples and labels are fed to the graph.
67 | # These placeholder nodes will be fed a batch of training data at each
68 | # training step using the {feed_dict} argument to the Run() call below.
69 | # Add a name to include it in the TensorBoard visualisation.
70 | x = tf.placeholder("float", shape=[None, num_features], name="x_input")
71 | y_ = tf.placeholder("float", shape=[None, NUM_LABELS], name="labels")
72 |
73 | # These are the weights that inform how much each feature contributes to
74 | # the classification.
75 | W = tf.Variable(tf.zeros([num_features,NUM_LABELS]), name="weights")
76 | b = tf.Variable(tf.zeros([NUM_LABELS]), name="bias")
77 | with tf.name_scope("Wx_b") as scope:
78 | y = tf.nn.softmax(tf.matmul(x,W) + b)
79 |
80 |
81 | # Optimization.
82 | with tf.name_scope("xent") as scope:
83 | cross_entropy = -tf.reduce_sum(y_*tf.log(y))
84 | ce_summ = tf.scalar_summary("cross entropy", cross_entropy)
85 |
86 | with tf.name_scope("train") as scope:
87 | train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)
88 |
89 | # Add summary ops to collect data
90 | w_hist = tf.histogram_summary("weights", W)
91 | b_hist = tf.histogram_summary("biases", b)
92 | y_hist = tf.histogram_summary("y", y)
93 |
94 |
95 |
96 | # For the test data, hold the entire dataset in one constant node.
97 | test_data_node = tf.constant(test_data)
98 |
99 |
100 | # Evaluation.
101 | with tf.name_scope("test") as scope:
102 | correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))
103 | accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
104 | accuracy_summary = tf.scalar_summary("accuracy", accuracy)
105 |
106 |
107 | # Create a local session to run this computation.
108 | with tf.Session() as s:
109 |
110 | # Merge all the summaries and write them out to try_tf_logs
111 | merged = tf.merge_all_summaries()
112 | writer = tf.train.SummaryWriter("try_tf_logs", s.graph_def)
113 |
114 | # Run all the initializers to prepare the trainable parameters.
115 | tf.initialize_all_variables().run()
116 | if verbose:
117 | print 'Initialized!'
118 | print
119 | print 'Training.'
120 |
121 | # Iterate and train.
122 | for step in xrange(num_epochs * train_size // BATCH_SIZE):
123 | if verbose:
124 | print step,
125 |
126 | offset = (step * BATCH_SIZE) % train_size
127 | batch_data = train_data[offset:(offset + BATCH_SIZE), :]
128 | batch_labels = train_labels[offset:(offset + BATCH_SIZE)]
129 | train_step.run(feed_dict={x: batch_data, y_: batch_labels})
130 |
131 | if verbose and offset >= train_size-BATCH_SIZE:
132 | print
133 |
134 | if step % num_epochs == 0:
135 | feed = {x: test_data, y_: test_labels}
136 | result = s.run([merged, accuracy], feed_dict=feed)
137 | summary_str = result[0]
138 | acc = result[1]
139 | writer.add_summary(summary_str, step)
140 |
141 | # Give very detailed output.
142 | if verbose:
143 | print
144 | print 'Weight matrix.'
145 | print s.run(W)
146 | print
147 | print 'Bias vector.'
148 | print s.run(b)
149 | print
150 | print "Applying model to first test instance."
151 | first = test_data[:1]
152 | print "Point =", first
153 | print "Wx+b = ", s.run(tf.matmul(first,W)+b)
154 | print "softmax(Wx+b) = ", s.run(tf.nn.softmax(tf.matmul(first,W)+b))
155 | print
156 |
157 | writer.flush()
158 | writer.close()
159 | print "Accuracy:", accuracy.eval(feed_dict={x: test_data, y_: test_labels})
160 |
161 | if __name__ == '__main__':
162 | tf.app.run()
163 |
--------------------------------------------------------------------------------
/decision_boundary_plots/linear_softmax.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jasonbaldridge/try-tf/b9c24949c26c6f36bfec0466bfa951e1e903d3c3/decision_boundary_plots/linear_softmax.png
--------------------------------------------------------------------------------
/decision_boundary_plots/moon_hidden.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jasonbaldridge/try-tf/b9c24949c26c6f36bfec0466bfa951e1e903d3c3/decision_boundary_plots/moon_hidden.png
--------------------------------------------------------------------------------
/decision_boundary_plots/moon_softmax.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jasonbaldridge/try-tf/b9c24949c26c6f36bfec0466bfa951e1e903d3c3/decision_boundary_plots/moon_softmax.png
--------------------------------------------------------------------------------
/decision_boundary_plots/saturn_hidden.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jasonbaldridge/try-tf/b9c24949c26c6f36bfec0466bfa951e1e903d3c3/decision_boundary_plots/saturn_hidden.png
--------------------------------------------------------------------------------
/decision_boundary_plots/saturn_softmax.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jasonbaldridge/try-tf/b9c24949c26c6f36bfec0466bfa951e1e903d3c3/decision_boundary_plots/saturn_softmax.png
--------------------------------------------------------------------------------
/hidden.py:
--------------------------------------------------------------------------------
1 | import tensorflow.python.platform
2 |
3 | import numpy as np
4 | import tensorflow as tf
5 |
6 | import plot_boundary_on_data
7 |
8 | # Global variables.
9 | NUM_LABELS = 2 # The number of labels.
10 | BATCH_SIZE = 100 # The number of training examples to use per training step.
11 |
12 | tf.app.flags.DEFINE_string('train', None,
13 | 'File containing the training data (labels & features).')
14 | tf.app.flags.DEFINE_string('test', None,
15 | 'File containing the test data (labels & features).')
16 | tf.app.flags.DEFINE_integer('num_epochs', 1,
17 | 'Number of passes over the training data.')
18 | tf.app.flags.DEFINE_integer('num_hidden', 1,
19 | 'Number of nodes in the hidden layer.')
20 | tf.app.flags.DEFINE_boolean('verbose', False, 'Produce verbose output.')
21 | tf.app.flags.DEFINE_boolean('plot', True, 'Plot the final decision boundary on the data.')
22 | FLAGS = tf.app.flags.FLAGS
23 |
24 | # Extract numpy representations of the labels and features given rows consisting of:
25 | # label, feat_0, feat_1, ..., feat_n
26 | def extract_data(filename):
27 |
28 | # Arrays to hold the labels and feature vectors.
29 | labels = []
30 | fvecs = []
31 |
32 | # Iterate over the rows, splitting the label from the features. Convert labels
33 | # to integers and features to floats.
34 | for line in file(filename):
35 | row = line.split(",")
36 | labels.append(int(row[0]))
37 | fvecs.append([float(x) for x in row[1:]])
38 |
39 | # Convert the array of float arrays into a numpy float matrix.
40 | fvecs_np = np.matrix(fvecs).astype(np.float32)
41 |
42 | # Convert the array of int labels into a numpy array.
43 | labels_np = np.array(labels).astype(dtype=np.uint8)
44 |
45 | # Convert the int numpy array into a one-hot matrix.
46 | labels_onehot = (np.arange(NUM_LABELS) == labels_np[:, None]).astype(np.float32)
47 |
48 | # Return a pair of the feature matrix and the one-hot label matrix.
49 | return fvecs_np,labels_onehot
50 |
51 | # Init weights method. (Lifted from Delip Rao: http://deliprao.com/archives/100)
52 | def init_weights(shape, init_method='xavier', xavier_params = (None, None)):
53 | if init_method == 'zeros':
54 | return tf.Variable(tf.zeros(shape, dtype=tf.float32))
55 | elif init_method == 'uniform':
56 | return tf.Variable(tf.random_normal(shape, stddev=0.01, dtype=tf.float32))
57 | else: #xavier
58 | (fan_in, fan_out) = xavier_params
59 | low = -4*np.sqrt(6.0/(fan_in + fan_out)) # {sigmoid:4, tanh:1}
60 | high = 4*np.sqrt(6.0/(fan_in + fan_out))
61 | return tf.Variable(tf.random_uniform(shape, minval=low, maxval=high, dtype=tf.float32))
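
# Note: the uniform bound above is the Glorot/Xavier heuristic
# sqrt(6/(fan_in + fan_out)); the extra factor of 4 is the variant usually
# recommended for sigmoid units (a factor of 1 is the common choice for the
# tanh units this network actually uses).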
62 |
63 | def main(argv=None):
64 | # Be verbose?
65 | verbose = FLAGS.verbose
66 |
67 | # Plot?
68 | plot = FLAGS.plot
69 |
70 | # Get the data.
71 | train_data_filename = FLAGS.train
72 | test_data_filename = FLAGS.test
73 |
74 | # Extract it into numpy arrays.
75 | train_data,train_labels = extract_data(train_data_filename)
76 | test_data, test_labels = extract_data(test_data_filename)
77 |
78 | # Get the shape of the training data.
79 | train_size,num_features = train_data.shape
80 |
81 | # Get the number of epochs for training.
82 | num_epochs = FLAGS.num_epochs
83 |
84 | # Get the size of layer one.
85 | num_hidden = FLAGS.num_hidden
86 |
87 | # This is where training samples and labels are fed to the graph.
88 | # These placeholder nodes will be fed a batch of training data at each
89 | # training step using the {feed_dict} argument to the Run() call below.
90 | x = tf.placeholder("float", shape=[None, num_features])
91 | y_ = tf.placeholder("float", shape=[None, NUM_LABELS])
92 |
93 | # For the test data, hold the entire dataset in one constant node.
94 | test_data_node = tf.constant(test_data)
95 |
96 | # Define and initialize the network.
97 |
98 | # Initialize the hidden weights and biases.
99 | w_hidden = init_weights(
100 | [num_features, num_hidden],
101 | 'xavier',
102 | xavier_params=(num_features, num_hidden))
103 |
104 | b_hidden = init_weights([1,num_hidden],'zeros')
105 |
106 | # The hidden layer.
107 | hidden = tf.nn.tanh(tf.matmul(x,w_hidden) + b_hidden)
108 |
109 | # Initialize the output weights and biases.
110 | w_out = init_weights(
111 | [num_hidden, NUM_LABELS],
112 | 'xavier',
113 | xavier_params=(num_hidden, NUM_LABELS))
114 |
115 | b_out = init_weights([1,NUM_LABELS],'zeros')
116 |
117 | # The output layer.
118 | y = tf.nn.softmax(tf.matmul(hidden, w_out) + b_out)
119 |
120 | # Optimization.
121 | cross_entropy = -tf.reduce_sum(y_*tf.log(y))
122 | train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)
123 |
124 | # Evaluation.
125 |     predicted_class = tf.argmax(y,1)
126 | correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))
127 | accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
128 |
129 | # Create a local session to run this computation.
130 | with tf.Session() as s:
131 | # Run all the initializers to prepare the trainable parameters.
132 | tf.initialize_all_variables().run()
133 | if verbose:
134 | print 'Initialized!'
135 | print
136 | print 'Training.'
137 |
138 | # Iterate and train.
139 | for step in xrange(num_epochs * train_size // BATCH_SIZE):
140 | if verbose:
141 | print step,
142 |
143 | offset = (step * BATCH_SIZE) % train_size
144 | batch_data = train_data[offset:(offset + BATCH_SIZE), :]
145 | batch_labels = train_labels[offset:(offset + BATCH_SIZE)]
146 | train_step.run(feed_dict={x: batch_data, y_: batch_labels})
147 | if verbose and offset >= train_size-BATCH_SIZE:
148 | print
149 | print "Accuracy:", accuracy.eval(feed_dict={x: test_data, y_: test_labels})
150 |
151 | if plot:
152 |             eval_fun = lambda X: predicted_class.eval(feed_dict={x: X})
153 | plot_boundary_on_data.plot(test_data, test_labels, eval_fun)
154 |
155 | if __name__ == '__main__':
156 | tf.app.run()
157 |
--------------------------------------------------------------------------------
/plot_boundary_on_data.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from matplotlib import pyplot as plt
3 | from matplotlib import colors
4 |
5 |
6 | def plot(X, Y, pred_func):
7 |     # Determine canvas borders, padding 10% beyond the data range.
8 |     mins = np.amin(X, 0)
9 |     mins = mins - 0.1*np.abs(mins)
10 |     maxs = np.amax(X, 0)
11 |     maxs = maxs + 0.1*np.abs(maxs)
12 |
13 |     # Generate a dense 300x300 grid covering the canvas.
14 |     xs, ys = np.meshgrid(np.linspace(mins[0,0], maxs[0,0], 300),
15 |                          np.linspace(mins[0,1], maxs[0,1], 300))
16 |
17 |
18 |     # Evaluate the model on the dense grid.
19 |     Z = pred_func(np.c_[xs.flatten(), ys.flatten()])
20 | Z = Z.reshape(xs.shape)
21 |
22 | # Plot the contour and training examples
23 | plt.contourf(xs, ys, Z, cmap=plt.cm.Spectral)
24 | plt.scatter(X[:, 0], X[:, 1], c=Y[:,1], s=50,
25 | cmap=colors.ListedColormap(['orange', 'blue']))
26 | plt.show()
27 |
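# Example usage, mirroring hidden.py:
#   eval_fun = lambda X: predicted_class.eval(feed_dict={x: X})
#   plot_boundary_on_data.plot(test_data, test_labels, eval_fun)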
--------------------------------------------------------------------------------
/simdata/.RData:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jasonbaldridge/try-tf/b9c24949c26c6f36bfec0466bfa951e1e903d3c3/simdata/.RData
--------------------------------------------------------------------------------
/simdata/generate_linear_data.R:
--------------------------------------------------------------------------------
1 | ### 10/29/2015
2 | ### Simulate a two-class linearly separable classification problem.
3 | ### Label 0 is the "negative" class.
4 | ### Label 1 is the "positive" class.
5 | ### Author: Jason Baldridge
6 |
7 | # Create a matrix given a label, the class means of some number of
8 | # dimensions, the number of items, and the standard deviation. Values
9 | # are sampled normally according to the mean and stdev for each
10 | # column.
11 | create_matrix = function(label, mu, n, dev=.1) {
12 | d = length(mu)
13 | x = t(matrix(rnorm(n*d, mu, dev), ncol=n))
14 | cbind(rep(label,n),x)
15 | }
16 |
17 | # Num input dimensions (the "features").
18 | numDimensions = 2
19 |
20 | # Sample the means for the dimensions for a positive class.
21 | #pos = runif(numDimensions,min=0,max=1)
22 | pos = c(.7,.5) # Use a fixed 2-dimensional center.
23 |
24 | # Sample the means for the dimensions for a negative class.
25 | #neg = runif(numDimensions,min=0,max=1)
26 | neg = c(.3,.1) # Use a fixed 2-dimensional center.
27 |
28 | # Create training data.
29 | numTraining = 500
30 | trainDev = .1
31 | training_data = as.matrix(rbind(create_matrix(1,pos,numTraining,trainDev),create_matrix(0,neg,numTraining,trainDev)))
32 | shuffled_training_data = training_data[sample(nrow(training_data)),]
33 | write.table(shuffled_training_data,file="linear_data_train.csv",row.names=FALSE,col.names=FALSE,quote=FALSE,sep=",")
34 |
35 | # Create eval data. Possibly make the stdev bigger to make it a bit more interesting.
36 | numEval = 100
37 | evalDev = .1
38 | eval_data = as.matrix(rbind(create_matrix(1,pos,numEval,evalDev),create_matrix(0,neg,numEval,evalDev)))
39 | shuffled_eval_data = eval_data[sample(nrow(eval_data)),]
40 | write.table(shuffled_eval_data,file="linear_data_eval.csv",row.names=FALSE,col.names=FALSE,quote=FALSE,sep=",")
41 |
42 | #Plot the training items, if desired.
43 | #colnames(training_data) = c("label","x","y")
44 | #plot(training_data[,c("x","y")],pch=21,bg=c("orange","blue")[training_data[,"label"]+1])
45 |
46 |
--------------------------------------------------------------------------------
/simdata/generate_moon_data.py:
--------------------------------------------------------------------------------
1 | from sklearn import datasets
2 |
3 | X, y = datasets.make_moons(2000, noise=0.20)
4 |
5 | # Can't believe I'm doing it this way, but join doesn't work
6 | # on numpy strings and I'm on a plane, unable to look up the
7 | # right way to join a column to a matrix and output as CSV.
8 | for x_i,y_i in zip(X,y):
9 | output = ''
10 | output += str(y_i)
11 | for j in range(0,len(x_i)):
12 | output += ','
13 | output += str(x_i[j])
14 | print output
15 |
16 |
17 |
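# For the record, a join-free way to do this is np.savetxt, e.g. (filename
# illustrative; the loop above prints to stdout instead):
#   import numpy as np
#   np.savetxt("moon_data.csv", np.column_stack([y, X]), delimiter=",", fmt="%g")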
--------------------------------------------------------------------------------
/simdata/generate_saturn_data.R:
--------------------------------------------------------------------------------
1 | ### 10/29/2015
2 | ### Simulate a two-class "Saturn" classification problem
3 | ### Label 0 is the planet
4 | ### Label 1 is the ring
5 | ### Author: James Scott
6 |
7 | # @n: number of points
8 | # @frac: fraction of points to simulate from class 1
9 | # @d: Euclidean dimension
10 | # @radius: a 2-vector of radii for class 0 and class 1
11 | sim_saturn_data = function(n, d, radius, sigma, frac = 0.5) {
12 |
13 | # Argument checking
14 | stopifnot(d >= 2, length(radius) == 2)
15 |
16 | # We work in radial coordinates.
17 | # Uniformly sample d-1 angular coordinates for each point
18 | phi = matrix(runif(n*(d-1), 0, 2*pi), nrow=n, ncol=d-1)
19 |
20 | # Sample a class indicator for each simulated data point
21 | gamma = rbinom(n, 1, frac)
22 | n1 = sum(gamma)
23 |
24 | # Simulate a radial distance for each point
25 | r = rep(0, n)
26 | r[gamma==0] = runif(n-n1, 0, radius[1])
27 | r[gamma==1] = rnorm(n1, radius[2], sigma)
28 |
29 | # convert to Euclidean coordinates
30 | x = matrix(0, nrow=n, ncol=d)
31 | x[,1] = r*cos(phi[,1])
32 | x[,d] = r*apply(sin(phi), 1, prod)
33 | if(d >= 3) {
34 | for(j in 2:(d-1)) {
35 | prod_of_sines = apply(matrix(sin(phi[,1:(j-1)]), nrow=n), 1, prod)
36 | x[,j] = r*prod_of_sines*cos(phi[,j])
37 | }
38 | }
39 |
40 | list(labels = gamma, features = x)
41 | }
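# (The conversion above is the standard hyperspherical-to-Cartesian map:
#  x_1 = r*cos(phi_1), x_j = r*prod(sin(phi_1)...sin(phi_(j-1)))*cos(phi_j),
#  and x_d = r times the product of all d-1 sines.)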
42 |
43 | ### Testing: simulate some data and plot it.
44 | mycols = c('blue','orange')
45 |
46 | # 2d example
47 | #out = sim_saturn_data(1000, 2, c(3, 10), sigma = 1)
48 | #plot(out$features,pch=21,bg=mycols[out$labels+1],xlab="x",ylab="y")
49 |
50 | # 3d example (need rgl installed for the visualization)
51 | #out = sim_saturn_data(1000, 3, c(3, 10), sigma = 1.0)
52 | #rgl::plot3d(out$features, col=mycols[out$labels+1],xlab="x",ylab="y",zlab="z")
53 |
54 | ### Actually create simulated data.
55 | numDimensions = 2
56 |
57 | # Create training data.
58 | numTraining = 500
59 | training_out = sim_saturn_data(numTraining, numDimensions, c(5, 10), sigma = 1.0)
60 | training_data = cbind(training_out$labels,training_out$features)
61 | write.table(training_data,file="saturn_data_train.csv",row.names=FALSE,col.names=FALSE,quote=FALSE,sep=",")
62 |
63 | # Create eval data. Perhaps make sigma bigger to make it a bit more interesting.
64 | numEval = 100
65 | eval_out = sim_saturn_data(numEval, numDimensions, c(5, 10), sigma = 1.0)
66 | eval_data = cbind(eval_out$labels,eval_out$features)
67 | write.table(eval_data,file="saturn_data_eval.csv",row.names=FALSE,col.names=FALSE,quote=FALSE,sep=",")
68 |
69 |
--------------------------------------------------------------------------------
/simdata/linear_data_eval.csv:
--------------------------------------------------------------------------------
1 | 0,0.147562141324833,0.243518270820358
2 | 0,0.179868989766322,0.0922537025547999
3 | 1,0.754244045840797,0.52387485552728
4 | 0,0.248663780798734,0.175587276306351
5 | 0,0.397217489998824,0.0342134948381493
6 | 0,0.45098160780959,0.0328858982745571
7 | 0,0.335532917252522,0.16654442982869
8 | 0,0.371372049777255,0.167201755443297
9 | 0,0.280985655458144,0.214982885821991
10 | 0,0.313304894476342,0.00521976760984659
11 | 1,0.638465839375771,0.59044662666132
12 | 0,0.431591142988843,0.0470830726734468
13 | 0,0.207774186228136,0.0819718701225306
14 | 1,0.74160948940248,0.48471276227826
15 | 1,0.953514363329195,0.639625829881579
16 | 1,0.642944532413742,0.561453314573865
17 | 0,0.226494744243105,0.054182457565834
18 | 0,0.303693007460946,0.0231174890601685
19 | 0,0.442909334900624,0.26458738107626
20 | 1,0.689641361293936,0.441606457994801
21 | 1,0.808516873219165,0.460988843699898
22 | 0,0.265847526966108,-0.0217612302359918
23 | 0,0.277512671350018,-0.0351946196596575
24 | 1,0.650996504600044,0.281211163745124
25 | 1,0.604564151274163,0.581607217589511
26 | 0,0.361767990760365,0.08894162662105
27 | 1,0.666911463766987,0.658382180098643
28 | 0,0.23063312840159,-0.018832997639963
29 | 0,0.137952214405966,0.11362939867057
30 | 1,0.736452219294899,0.491761590839456
31 | 1,0.59710827462338,0.550563581924152
32 | 1,0.602609504754365,0.4272558830412
33 | 0,0.259748077463861,0.0830864567718866
34 | 1,0.65936769953152,0.398195251750205
35 | 1,0.68433814756797,0.243306417023253
36 | 1,0.855013781033494,0.552123366487809
37 | 1,0.960752241879643,0.589326478652242
38 | 0,0.218532689151841,0.149970476606352
39 | 0,0.340460187615256,0.163897051668133
40 | 0,0.355834203480917,0.336747248250862
41 | 1,0.692211516077662,0.577457901954487
42 | 1,0.662154960387187,0.570323144740557
43 | 0,0.372511985939819,0.0857198602571986
44 | 0,0.22900535800054,0.0267345028505001
45 | 0,0.309531718124159,0.144331015797212
46 | 0,0.263227253895867,0.127925334758116
47 | 0,0.308466277418238,0.0910351363893737
48 | 0,0.0930774599145935,0.113987148810076
49 | 1,0.6229369231208,0.456717036684569
50 | 1,0.706148123157656,0.490803325682998
51 | 1,0.758732482141232,0.456626655986977
52 | 1,0.694577645162027,0.622832560922402
53 | 0,0.369820198735285,0.0550497386655174
54 | 0,0.239404252953096,0.132757785197318
55 | 0,0.218428111377801,-0.219178158477306
56 | 1,0.692946434725903,0.544563975400229
57 | 0,0.245371538967234,0.101855416754447
58 | 1,0.56913442403047,0.294387382973145
59 | 1,0.714809619410276,0.538163512680377
60 | 0,0.358067954360732,0.17044538896009
61 | 0,0.320547428963535,0.251711615665619
62 | 0,0.331588068345858,0.107835653776275
63 | 0,0.304030747179208,0.0547750174185641
64 | 0,0.321995718652107,0.00579217815345798
65 | 1,0.865360178690971,0.428046918332584
66 | 1,0.671798231854105,0.344725091990453
67 | 1,0.828788452478336,0.71100719973842
68 | 1,0.781619318840399,0.632570613805373
69 | 1,0.731475532721693,0.257914929544682
70 | 1,0.629915796829575,0.284014988011287
71 | 0,0.422354335685084,0.114636655496169
72 | 1,0.673380826362505,0.577899254008376
73 | 0,0.411902241550325,-0.0328305542767028
74 | 1,0.819974434826139,0.549032982583942
75 | 1,0.749879048884502,0.521106457290703
76 | 1,0.865129108041695,0.505063149340243
77 | 1,0.802865060766911,0.534926888417566
78 | 1,0.757857478161499,0.353704731938664
79 | 1,0.795630162325943,0.351825595995113
80 | 0,0.462310717991569,0.101365623468382
81 | 0,0.269750572403886,-0.149506765860705
82 | 1,0.832120799383439,0.579782802852318
83 | 0,0.389685944569249,0.163719794768627
84 | 0,0.300495737942904,0.0420527462802183
85 | 0,0.301934594176741,0.0839837366231943
86 | 0,0.150962586848513,0.233882915802165
87 | 0,0.25452948081291,0.0859799579364082
88 | 1,0.758924966011327,0.51452726603244
89 | 1,0.688094757573297,0.451071027859505
90 | 0,0.308508646709991,0.124020803218039
91 | 1,0.55488440319187,0.712912973753406
92 | 1,0.734968534679093,0.291312226056919
93 | 0,0.424844828588299,0.0185442193071348
94 | 0,0.307133843847762,-0.0280343474796379
95 | 0,0.398740975001251,-0.0767590748067974
96 | 0,0.273728056742917,0.251849536602207
97 | 1,0.740350011751019,0.531733530441795
98 | 0,0.286412335506547,0.0834310455481071
99 | 1,0.844565148785342,0.556675571552102
100 | 1,0.570621351148686,0.405651266329464
101 | 1,0.780228461639442,0.682045157407263
102 | 1,0.808610830575502,0.48055917875417
103 | 0,0.404736393317852,0.0534461559608683
104 | 0,0.0182219815354657,-0.0890842632482053
105 | 1,0.580213837410484,0.434088367477381
106 | 0,0.439028571589774,-0.00891622428961927
107 | 0,0.169013026906772,0.139040747231028
108 | 1,0.721813199806076,0.453310307451834
109 | 1,0.579448726121783,0.601098430630891
110 | 1,0.59757790717587,0.582346229610322
111 | 1,0.706844037534213,0.444478339486168
112 | 0,0.311812492223678,-0.0245609488864325
113 | 1,0.696270344001157,0.559918986782424
114 | 0,0.517693836106899,0.136882581439463
115 | 1,0.711330813645891,0.368148477498463
116 | 1,0.651106025565724,0.645855201514964
117 | 0,0.312846178583358,0.167410567066292
118 | 0,0.245212502207789,0.0788996016352575
119 | 0,0.281584872796863,0.16330117675989
120 | 0,0.224177537883604,0.198729749727135
121 | 0,0.21360730861153,0.0659637945032196
122 | 0,0.236451258508006,0.0652936599183583
123 | 1,0.769116120629594,0.529177074205597
124 | 0,0.221575998142235,0.130222505896903
125 | 1,0.624409461092592,0.379450781522128
126 | 1,0.547121964828769,0.273564135415305
127 | 0,0.17592618523832,0.160037295166501
128 | 1,0.736121699674151,0.506024281122913
129 | 0,0.425480843904003,-0.0479291016901599
130 | 1,0.618590118156231,0.562390207752306
131 | 0,0.534384693355842,-0.00458315713932619
132 | 0,0.470829423375905,0.0567177119203065
133 | 0,0.375738399369332,0.10675402978279
134 | 0,0.319635320093047,0.129350314422456
135 | 0,0.243349669666206,0.116546106571295
136 | 0,0.34357354135486,-0.00338666050798213
137 | 0,0.186259256050742,0.049140403307173
138 | 1,0.646103731473393,0.476865433197868
139 | 1,0.742216880358217,0.390083115275135
140 | 1,0.839287534868438,0.550027764170609
141 | 1,0.800012129202795,0.466861127031535
142 | 0,0.216202082309659,0.176368365536743
143 | 1,0.780494991593676,0.375525772107199
144 | 1,0.726960940964144,0.580849892908468
145 | 1,0.743612641049324,0.552738236722576
146 | 1,0.597827269636531,0.314015371971029
147 | 1,0.753161899049477,0.612163367781747
148 | 1,0.672106155440207,0.541402426363875
149 | 0,0.611648647701236,-0.176614445962288
150 | 1,0.643311341435752,0.53917536733336
151 | 1,0.652427513768223,0.386155037786121
152 | 0,0.301520618087258,0.00779725506341844
153 | 1,0.580917034140212,0.318993261238895
154 | 1,0.658638758875195,0.58533531721232
155 | 1,0.522655469981122,0.526482285184902
156 | 1,0.619107709006306,0.395568496047808
157 | 0,0.342854385955556,0.00779057938184129
158 | 1,0.86999733382526,0.574001573496694
159 | 0,0.318543715852861,0.114391118336483
160 | 0,0.228396072599371,0.0106065027681785
161 | 1,0.727737130943151,0.38382553707226
162 | 0,0.242654358004674,0.229605566380433
163 | 1,0.849147311410968,0.485542675609488
164 | 1,0.688785680721008,0.568723327471366
165 | 1,0.601072641696223,0.472949199587446
166 | 0,0.386597757697056,-0.0462553172550934
167 | 0,0.173427736038098,0.0661032211070728
168 | 0,0.212619485253217,0.226220588353548
169 | 1,0.676379387810006,0.478200807508695
170 | 1,0.814570220992963,0.343974473335934
171 | 0,0.385176265775491,0.0321607979792773
172 | 1,0.749057204428228,0.460541975098555
173 | 0,0.249098133601278,0.135190023877858
174 | 1,0.755616419656037,0.499437877798512
175 | 1,0.605289891647501,0.576382527441655
176 | 0,0.220124707599109,0.100802583684052
177 | 1,0.611660353946956,0.582417425243852
178 | 1,0.638657314638631,0.567476782208443
179 | 1,0.707364959529237,0.553460099560961
180 | 1,0.750636527933954,0.418064859633507
181 | 0,0.258629634330844,0.205122708778633
182 | 1,0.561659196951965,0.302892925040985
183 | 0,0.261830597304793,0.0675225213946725
184 | 1,0.614308570486587,0.6059587413353
185 | 0,0.414439463469767,0.00965425791140319
186 | 0,0.252322879548744,0.0506559735947343
187 | 0,0.290671271152809,0.00155405446676163
188 | 0,0.235453515294468,0.110972028714464
189 | 1,0.732335416418508,0.448978572463179
190 | 1,0.810313994896262,0.495563824762476
191 | 0,0.421663017868487,-0.0356477152171603
192 | 1,0.696456152042301,0.492043579296949
193 | 1,0.648005874898818,0.474735589412081
194 | 1,0.586856365928848,0.486846359431101
195 | 0,0.311153185052651,0.0667137376136715
196 | 0,0.238197827120521,0.329800929790695
197 | 0,0.152745736144961,-0.0169226770608605
198 | 0,0.383676691365755,-0.0613888293971788
199 | 1,0.58794957455425,0.759074050955085
200 | 0,0.298771398500675,0.181005648077388
201 |
--------------------------------------------------------------------------------
/simdata/linear_data_train.csv:
--------------------------------------------------------------------------------
1 | 0,0.272702493273322,0.0201936700818061
2 | 1,0.867855051798607,0.597829895646543
3 | 1,0.686252591276707,0.425683114984294
4 | 0,0.186145515985916,0.0412716941344573
5 | 1,0.645338925515076,0.580301641707133
6 | 1,0.674375674955563,0.427871763030954
7 | 1,0.699879937946578,0.404807703421517
8 | 1,0.802841564069345,0.416099191374431
9 | 1,0.585520610660174,0.614734443224055
10 | 0,0.159362690865938,0.0438791845717219
11 | 1,0.819060934526088,0.537459473474092
12 | 1,0.87894034372429,0.498259089084295
13 | 1,0.717795050577066,0.4744618920087
14 | 1,0.566033677291232,0.387396342863612
15 | 0,0.247429729835316,0.0879995726331724
16 | 1,0.431249648468584,0.50812116678762
17 | 1,0.445587715698477,0.495087107318878
18 | 1,0.76039174183016,0.444561107196163
19 | 1,0.76051061546582,0.515525121630246
20 | 0,0.275494818915078,0.140478719411677
21 | 1,0.753968841901219,0.524162771626072
22 | 0,0.294497740315007,-0.050388399952233
23 | 1,0.804717983782732,0.398515224230574
24 | 0,0.300418039279848,0.351027135627807
25 | 1,0.688360546074351,0.373526723462879
26 | 1,0.852345401249631,0.406303935627579
27 | 1,0.668099899289195,0.531841347660949
28 | 0,0.304800740891564,0.150389928827743
29 | 1,0.619367500214818,0.480216866285488
30 | 1,0.686703089228757,0.361979344372439
31 | 1,0.718589858837761,0.416212572766102
32 | 1,0.826085146421109,0.575286895749816
33 | 0,0.420448410287454,-0.0546403055195056
34 | 1,0.75141532125873,0.325021724291347
35 | 1,0.730592409613609,0.541059117596772
36 | 1,0.813904839898948,0.495235612652197
37 | 0,0.339090628854521,0.0309991380475252
38 | 0,0.387754495519495,0.111295769160654
39 | 0,0.338508363051928,0.183845232273478
40 | 0,0.252169642133232,-0.0277920767152015
41 | 0,0.192477983760573,0.248539504317799
42 | 1,0.579132815667903,0.596674752538209
43 | 1,0.709753564340006,0.627776265158473
44 | 0,0.385744505772042,0.173471897067801
45 | 0,0.378661420604518,0.247146938725877
46 | 1,0.646574260872838,0.322675381473884
47 | 1,0.609118023604304,0.647831712350825
48 | 1,0.758635069664429,0.597297179207463
49 | 0,0.112918893797697,0.0399196636324755
50 | 0,0.0757196797168712,0.212549149431339
51 | 0,0.262526611883118,0.153510775651215
52 | 0,0.400776042635224,0.0118928062320549
53 | 0,0.249868972870086,0.087510966473546
54 | 0,0.300843498603656,-0.125478800703799
55 | 1,0.68732959206275,0.585402380977657
56 | 0,0.21604950004571,0.107752455797129
57 | 1,0.727973307482895,0.445424058558705
58 | 1,0.611661189054556,0.443567247275501
59 | 1,0.659200378458334,0.438264242694697
60 | 0,0.316661223102756,0.00512723267572414
61 | 0,0.335511102137539,0.0220287094796348
62 | 1,0.568790247259437,0.605633120746626
63 | 1,0.759945347225826,0.530666204478723
64 | 0,0.254527443910965,0.171592939261188
65 | 1,0.766837548998774,0.486441995062381
66 | 0,0.332894760145352,-0.0112936854155695
67 | 0,0.377466773756814,0.155504538357614
68 | 0,0.399326026982261,0.153662508038569
69 | 1,0.836999237521849,0.564460047717315
70 | 0,0.171119962569083,0.127536860104091
71 | 1,0.506373282256241,0.417952369585369
72 | 1,0.560722898029418,0.720223842301507
73 | 0,0.348845475773127,0.225755862785672
74 | 0,0.251503092582984,-0.0437657692433004
75 | 0,0.177615608556207,0.0753154885756146
76 | 0,0.478794516530508,0.205951267283078
77 | 1,0.829635534744383,0.272892546065268
78 | 1,0.83777849837459,0.453830064938213
79 | 0,0.314005750924866,-0.036418264249222
80 | 1,0.777874637421659,0.658965983018213
81 | 1,0.667422914295883,0.450202737899244
82 | 0,0.413187584863836,-0.0382542681011459
83 | 0,0.313650896375576,-0.052069716414482
84 | 0,0.337713941691823,0.14231450721611
85 | 0,0.208535455875277,0.0365610978079326
86 | 1,0.751917522898454,0.397763749461091
87 | 0,0.704285128566686,0.0173529747875296
88 | 1,0.660159168124926,0.686879190712555
89 | 0,0.239882597095324,0.039489328310596
90 | 0,0.261667488176466,0.120468313797353
91 | 0,0.221322822342822,-0.0147960345048331
92 | 0,0.302200894788074,0.0773367505182494
93 | 0,0.464646005426031,0.0624406129250455
94 | 1,0.72353467112824,0.561586283981498
95 | 1,0.779848416299403,0.40335122146947
96 | 0,0.29619539738445,-0.0889825980170734
97 | 0,0.310118813111254,-0.0470255968780949
98 | 0,0.280329283647515,0.209183926330223
99 | 0,0.495449785691372,0.0943897292081698
100 | 0,0.323181642395557,0.214347078498927
101 | 1,0.608712387395747,0.464674597744798
102 | 0,0.333967005925061,0.108858809713995
103 | 1,0.670725446177098,0.562078812324667
104 | 0,0.157682515511279,0.114425938519343
105 | 0,0.263527985253847,0.0779341919535255
106 | 1,0.768842729112313,0.607910862471588
107 | 1,0.782968217248463,0.684563065693586
108 | 1,0.559552053226898,0.525739246953288
109 | 0,0.218011674132951,0.235791501380809
110 | 1,0.889413715027983,0.272981766226483
111 | 1,0.707683165108239,0.43858100159458
112 | 1,0.626040631828399,0.573452439942206
113 | 0,0.319897190940781,0.0644771584718417
114 | 0,0.414126766081263,0.299837127790236
115 | 0,0.321648041917345,0.110466634556545
116 | 0,0.302937416869991,0.0800642826815047
117 | 0,0.229048040649578,-0.184479427661952
118 | 0,0.0838523271376479,0.167658402689939
119 | 0,0.191948768561072,-0.0498624393529019
120 | 1,0.710093053081494,0.511290768221669
121 | 0,0.249260920804791,0.084829401727985
122 | 0,0.344202229217507,0.0310801977342976
123 | 0,0.394779068995597,0.0639967934529216
124 | 0,0.296291514179376,0.0940554702528169
125 | 1,0.812874595218026,0.469660618071793
126 | 0,0.312020940488456,0.338829143466612
127 | 1,0.663441004626609,0.46359918851702
128 | 1,0.516471550102168,0.661466565814409
129 | 0,0.301371308850716,0.0753255969087134
130 | 0,0.273261668487338,0.0633099583357949
131 | 0,0.191452120986692,-0.143733307029889
132 | 0,0.180617862650367,0.154215474907655
133 | 0,0.229543176130791,0.163137050894395
134 | 0,0.245315233544734,0.019128303476096
135 | 1,0.748493032268826,0.435325673692232
136 | 1,0.772508782267438,0.588852647990746
137 | 1,0.636477940285981,0.390966711809906
138 | 0,0.285085569705584,-0.00531731530741626
139 | 0,0.163379626486227,-0.117361545679424
140 | 0,0.468617183825494,0.207884744319045
141 | 1,0.716267375815614,0.58055995163604
142 | 1,0.691690404380763,0.455468188511155
143 | 1,0.832068210563701,0.498498173611383
144 | 1,0.798097754407747,0.422749113552243
145 | 0,0.31681926493838,-0.0047561585254903
146 | 1,0.827857867668203,0.631884677407406
147 | 1,0.824940709980804,0.451138822676032
148 | 1,0.749289181326563,0.532328990098315
149 | 1,0.624983995029484,0.351158851969964
150 | 0,0.246409770038098,0.150676967160461
151 | 0,0.223024354285995,-0.151750472867883
152 | 1,0.76253393134356,0.586377574234753
153 | 0,0.256898604102665,0.0438509756239602
154 | 0,0.253907298425618,-0.039504453708371
155 | 1,0.734372750482043,0.663550372431311
156 | 1,0.814369495793697,0.468059076508072
157 | 0,0.458348241169566,0.245312288102812
158 | 0,0.439622581101274,0.0632159802013028
159 | 1,0.640924211952676,0.430795319116029
160 | 0,0.388509099154434,0.197161148794277
161 | 0,0.181132949519206,0.126716633249988
162 | 0,0.376802675521996,0.159627299983832
163 | 1,0.613992378167526,0.448724106891588
164 | 1,0.831026553261926,0.484913716227477
165 | 1,0.556045972239925,0.587379268102036
166 | 0,0.395878015676788,0.10323120916278
167 | 1,0.591953582092163,0.417286878320068
168 | 1,0.46725112256271,0.520724901872325
169 | 0,0.346922580998637,0.0944775645412709
170 | 1,0.696503870903246,0.437940147112552
171 | 1,0.637613213827238,0.471962377250537
172 | 0,0.335162981545741,-0.027939761534094
173 | 0,0.143582808117823,0.167860616383039
174 | 1,0.723751517994281,0.478458346818457
175 | 1,0.660177641452219,0.483452537209393
176 | 1,0.76426800556873,0.535206233408446
177 | 1,0.720740791695091,0.389326271169869
178 | 0,0.174056768627934,0.154238589976693
179 | 0,0.198788760104965,0.00682620020478004
180 | 1,0.910390774424439,0.566187876343115
181 | 1,0.776646384853128,0.56813908923461
182 | 1,0.587905469226655,0.436903266509323
183 | 1,0.669457159706513,0.614707436203888
184 | 1,0.810283279996055,0.545987496271549
185 | 0,0.344774208069323,0.0881550516552802
186 | 0,0.303495334257899,0.0089383403243439
187 | 0,0.269511611959495,0.21121955827661
188 | 0,0.299334935856193,0.101146959731773
189 | 1,0.506022527536241,0.578908197947377
190 | 0,0.394109801948955,0.0336476048018214
191 | 0,0.327404729016729,0.0431007508392108
192 | 0,0.365495572999311,0.0482730093488889
193 | 0,0.374130299862337,0.0359913916271899
194 | 1,0.85526938076446,0.45332572345042
195 | 0,0.18341342668473,0.0272027593280061
196 | 1,0.733614866593426,0.526643787401623
197 | 0,0.318182021030057,0.103656647700912
198 | 0,0.240920265830654,0.0508657765459108
199 | 1,0.513047324442947,0.615283053940208
200 | 0,0.151334880808228,0.315431997368872
201 | 0,0.394462481958267,0.101655086031096
202 | 1,0.515170779931783,0.58795511851458
203 | 1,0.87822902495986,0.383079543553225
204 | 0,0.209125274713165,0.0405155085830344
205 | 1,0.741073710053074,0.543223635403209
206 | 1,0.717691808404203,0.46886594362597
207 | 0,0.292397881802737,-0.233389260751618
208 | 1,0.702115243155436,0.572679676100641
209 | 0,0.375094870497223,0.150809692407089
210 | 1,0.785725652776951,0.657917019182177
211 | 1,0.654858803321836,0.594007046555065
212 | 0,-0.015536607372292,0.118473570384402
213 | 1,0.823689849537966,0.517843286242661
214 | 1,0.737024212776215,0.516770848076513
215 | 1,0.547824942019359,0.542864387419221
216 | 0,0.00300243591783783,0.0669321614320229
217 | 0,0.273884392760745,0.171475705806533
218 | 0,0.103536256254438,0.0598674814024887
219 | 1,0.773675819047256,0.453959112207333
220 | 1,0.578217531502335,0.43667171410766
221 | 0,0.232639642579056,0.240924534449787
222 | 1,0.727694558223229,0.642259843020284
223 | 0,0.290605671376913,0.164229334828911
224 | 1,0.692293722721867,0.496939590106284
225 | 1,0.615593461878773,0.449406011572758
226 | 1,0.627821905109084,0.619436042725204
227 | 0,0.238329429591959,0.286375999752565
228 | 1,0.792581075649811,0.344853674219274
229 | 0,0.350738565735997,0.22558224695422
230 | 0,0.319592422848907,0.0867513089001646
231 | 1,0.643171034805571,0.563205996550792
232 | 0,0.374723362007506,0.00837700696183299
233 | 0,0.254324259112646,0.00144734006907962
234 | 1,0.578663882214374,0.529705467976943
235 | 1,0.882743803416507,0.561587200664543
236 | 1,0.815857167548696,0.527439454078851
237 | 1,0.726982121399193,0.492554058609603
238 | 1,0.552847329036692,0.577227366448834
239 | 1,0.637439391832819,0.515361216802003
240 | 1,0.835461222089426,0.35051988465915
241 | 1,0.652410315557341,0.509471949208526
242 | 0,0.488358973871447,0.00779776032373296
243 | 1,0.785369845724894,0.570234703493426
244 | 1,0.777757685635365,0.45173070684212
245 | 1,0.795701597689754,0.524204837455563
246 | 1,0.549169099565117,0.384759752701807
247 | 0,0.234862682863927,0.129931624201085
248 | 0,0.193303345986147,0.102307826779251
249 | 1,0.742839831135753,0.682694301793083
250 | 1,0.743079207517149,0.437048647358291
251 | 0,0.195641253875026,0.142516326553134
252 | 0,0.194197685991943,0.0711303494887362
253 | 0,0.397289909365214,0.147242119427428
254 | 0,0.280735382867741,0.2741197609794
255 | 0,0.271616309865042,0.130158033572375
256 | 0,0.226743613931422,0.031337905344956
257 | 0,0.323762524070031,0.23784457012538
258 | 0,0.433031580638902,0.0882404371773291
259 | 1,0.634980622804064,0.521478938641404
260 | 0,0.248833759760726,0.220434482749925
261 | 1,0.631488997469027,0.507633087769453
262 | 1,0.58169884772006,0.516668109773911
263 | 1,0.615425233806811,0.350418391510562
264 | 0,0.195709693569621,0.0534423509285914
265 | 0,0.258753403680014,0.0846353866955831
266 | 1,0.679225606911221,0.48019792150124
267 | 0,0.356454895453918,0.00599612727378884
268 | 0,0.157243082918555,0.00198288708382963
269 | 0,0.403489012486068,0.176443486229387
270 | 0,0.406157733738504,0.148474644739375
271 | 1,0.557526378882686,0.469763927684993
272 | 0,0.409614776290418,0.127695190474676
273 | 0,0.221130957798106,0.00246251644256472
274 | 0,0.307149139147413,0.0351128399564661
275 | 1,0.880521324305735,0.437411475204538
276 | 0,0.209054910179452,0.365368139829218
277 | 1,0.744834054436914,0.328366131280262
278 | 1,0.619738988151147,0.498248349424278
279 | 1,0.732680162526291,0.356227527560738
280 | 0,0.235409738360354,0.155565967884249
281 | 1,0.708282472843026,0.466522588821841
282 | 1,0.653624611750762,0.545726726383717
283 | 0,0.244974323858804,0.0233317035889604
284 | 1,0.687006405386532,0.537180417249272
285 | 0,0.270027214927163,0.145099856731118
286 | 1,0.529880648219717,0.373601704431567
287 | 0,0.153435200903184,0.0827142600527631
288 | 0,0.196907982479139,0.0110335793546832
289 | 0,0.183488581842081,0.178008251937411
290 | 0,0.103729266597934,0.226377008526147
291 | 1,0.497275351166873,0.557524687583008
292 | 0,0.234936388491344,0.153641322229989
293 | 1,0.638117952580239,0.502399163015757
294 | 0,0.265919558672097,-0.0175444065462989
295 | 1,0.972739429139427,0.579478912092279
296 | 1,0.771707055886069,0.432483214910362
297 | 0,0.319847915030195,-0.0182933707045556
298 | 1,0.640152218640739,0.446549485194609
299 | 0,0.309726418649981,0.0180711719824336
300 | 1,0.59634217809301,0.465539422207531
301 | 1,0.700837299015692,0.56459983382551
302 | 1,0.627143376122354,0.566889375676663
303 | 1,0.678457616699935,0.46206888081276
304 | 0,0.335575549111729,0.236793685308073
305 | 1,0.552340913408488,0.454735797393809
306 | 1,0.775272923590388,0.570571050115034
307 | 0,0.389002732801433,0.0844440149829364
308 | 1,0.774075574991445,0.644286970259073
309 | 0,0.436576258767381,0.113144386193333
310 | 0,0.131806366354122,0.135492054483679
311 | 1,0.865305609911491,0.393004029700549
312 | 0,0.383896546872063,0.136931931327836
313 | 0,0.52876851092401,0.18694206198509
314 | 1,0.872037927046165,0.524298979669219
315 | 0,0.319398680338347,0.082050069292556
316 | 0,0.306019228765853,0.171146090516605
317 | 1,0.568028897019929,0.397139632050195
318 | 1,0.687004258165262,0.657174773644742
319 | 0,0.264884892997342,0.152177916190216
320 | 0,0.324477457435399,0.130638598187011
321 | 1,0.680743646079928,0.618979277469957
322 | 1,0.621895527664642,0.57337598375699
323 | 1,0.712168082997765,0.53550873606463
324 | 1,0.61978701745368,0.609027977048943
325 | 0,0.365233331106601,-0.0613362243238503
326 | 0,0.397735986696886,0.112311028414076
327 | 1,0.606235254004225,0.400447895775286
328 | 0,0.321941398084019,0.21374858825093
329 | 0,0.461363308549641,0.118628064693912
330 | 1,0.701084109913346,0.432315413377117
331 | 1,0.599379648268497,0.557589211628395
332 | 1,0.765671832669,0.522974365432396
333 | 0,0.418904308051734,0.141407526973326
334 | 0,0.0928236534128022,0.208792514080872
335 | 1,0.773308591622014,0.384933544483661
336 | 0,0.310011149079205,-0.113025500604217
337 | 0,0.330476297927931,0.0562833566800701
338 | 1,0.719222956560448,0.599601234367575
339 | 0,0.0948268113584294,0.155328008227369
340 | 0,0.266557090739179,0.0281396471150643
341 | 1,0.645157933764616,0.504226979507884
342 | 1,0.962330528460206,0.452805671774283
343 | 1,0.729786811932308,0.567362214851711
344 | 0,0.42822079264878,-0.116947049145102
345 | 0,0.28292031311975,0.210396468088107
346 | 1,0.818960733047804,0.471478708169644
347 | 0,0.301000395650939,0.275350638737648
348 | 0,0.248563052222786,0.210458573470741
349 | 1,0.624567745681724,0.463364770378235
350 | 1,0.721131916759737,0.33170180449007
351 | 0,0.444882506681454,0.0612027055428679
352 | 1,0.638377283819513,0.445003526421249
353 | 1,0.696976438121931,0.360248232352897
354 | 0,0.332136597424114,-0.109847076823912
355 | 1,0.699361384618667,0.432264272568347
356 | 1,0.831993970135725,0.539947466404146
357 | 1,0.561925910683751,0.424648173491636
358 | 0,0.156189816118722,0.0729055751558616
359 | 1,0.693031841480399,0.405688203837692
360 | 0,0.284499114880476,0.169492667486919
361 | 1,0.382645621013668,0.693017458125835
362 | 1,0.787366045398516,0.56245748664724
363 | 0,0.357534849007698,0.026936192217413
364 | 1,0.731630968684622,0.553183988567175
365 | 0,0.196234400016264,-0.183104296203216
366 | 0,0.360587443227833,0.0459158003103007
367 | 0,0.20757915464528,0.153368408854095
368 | 1,0.779538870581914,0.31652036018281
369 | 1,0.603413970007394,0.565691097923056
370 | 1,0.703962936778714,0.515774251438579
371 | 0,0.262673571663612,0.0650658239322596
372 | 0,0.282079330100814,0.111953207211367
373 | 0,0.297324925180568,-0.0391228322398546
374 | 1,0.763180071191603,0.490652821893112
375 | 0,0.315184964005646,0.253006254684889
376 | 1,0.734171640777751,0.519167158795141
377 | 0,0.184180985030549,0.00393161644942733
378 | 0,0.223298865161568,-0.026327091947429
379 | 0,0.316478145129054,0.0575035712393981
380 | 0,0.378857137825344,0.22082964898691
381 | 0,0.397656833742266,0.083190442427166
382 | 1,0.759538686196077,0.437127672521589
383 | 1,0.616664639947993,0.617391680132461
384 | 0,0.145722235542289,0.108084199810254
385 | 0,0.212343117912545,0.15016795466069
386 | 1,0.73503309376852,0.570909233439749
387 | 1,0.77543710347383,0.3712569376399
388 | 1,0.713781263822946,0.627104550338368
389 | 1,0.657921394808149,0.661998316396182
390 | 1,0.761264669790777,0.46397686813566
391 | 1,0.713206276225525,0.744066073525418
392 | 1,0.496790564523307,0.346152051582078
393 | 0,0.229258131675837,0.22391922262195
394 | 1,0.629957821024016,0.598895836847523
395 | 1,0.463104017343866,0.583455771572864
396 | 1,0.740197079770519,0.511804494746958
397 | 1,0.79300764090308,0.645624889279153
398 | 1,0.670301301929275,0.505798660521333
399 | 1,0.642616536317977,0.528714803503423
400 | 1,0.788762644621946,0.616529533399391
401 | 1,0.857464939639688,0.51031489411801
402 | 0,0.336846954546065,-0.142491179183837
403 | 1,0.633643110908775,0.560760074681661
404 | 1,0.808547386300575,0.483391730709343
405 | 0,0.171527188406912,0.181674625895587
406 | 1,0.785541616406081,0.584142957013564
407 | 1,0.679314778984817,0.588366113333832
408 | 0,0.361113481215867,0.0535724803311333
409 | 0,0.397064291394628,0.214166442900235
410 | 1,0.59289850815176,0.597423108259622
411 | 1,0.762216937589137,0.685826995668519
412 | 1,0.598470889624612,0.512232600439455
413 | 1,0.847631108985384,0.469639451803623
414 | 0,0.352206404961858,0.0658130319379766
415 | 0,0.302196572373195,0.0657407190121178
416 | 1,0.635779981355708,0.528047686177884
417 | 1,0.578992955418749,0.514896918280291
418 | 0,0.212999520548702,0.0541441484560792
419 | 1,0.891804459254137,0.398665659557486
420 | 1,0.654340337267037,0.451580501947295
421 | 0,0.247719577705494,0.0799734001071803
422 | 0,0.273897421994848,0.203501937678282
423 | 1,0.567001182521013,0.597298736166695
424 | 0,0.467798139709093,0.229337851914864
425 | 1,0.570353132688965,0.478803832996153
426 | 0,0.443542502048693,0.0928170957175811
427 | 1,0.833742702690899,0.433885148967803
428 | 0,0.207152652348099,0.160692107355562
429 | 1,0.596946688183879,0.504932395223731
430 | 0,0.237987468873185,0.111890466523783
431 | 0,0.122120423311518,0.0664443604938182
432 | 1,0.625397213019283,0.429400535274749
433 | 1,0.778207480642215,0.503531292602428
434 | 0,0.436212859109153,0.0122862274912339
435 | 1,0.625001694810475,0.549123827462865
436 | 0,0.136545300743945,0.126831485205996
437 | 1,0.597731579576769,0.556662445462397
438 | 0,0.408344125500783,0.253336636246755
439 | 1,0.964226974503666,0.320675977950571
440 | 0,0.324997356440017,0.048644424257146
441 | 1,0.652230816894289,0.531950363671602
442 | 0,0.0389828168912392,0.137251854890295
443 | 1,0.736664215489305,0.404307513508069
444 | 1,0.744811760763821,0.575886916382629
445 | 0,0.470458604705599,0.289305226993897
446 | 0,0.38623549650401,0.129688951832793
447 | 0,0.287511322902999,0.128228003988729
448 | 0,0.214845705418748,0.155132679897794
449 | 1,0.549283007772904,0.502960660190269
450 | 0,0.310174445838025,0.0941481076322118
451 | 0,0.195454566633443,-0.234607607847963
452 | 0,0.298249802393511,0.112851002967217
453 | 1,0.690237004664318,0.378108840643105
454 | 0,0.189398549132626,0.0890272156639169
455 | 1,0.614365823452273,0.456521991264265
456 | 1,0.518062634355388,0.561211028653153
457 | 0,0.372343420847563,-0.0673792950230403
458 | 1,0.605582550542063,0.719570587546506
459 | 1,0.558400398717601,0.553650998537768
460 | 1,0.619303706502876,0.525020408389046
461 | 0,0.416806925989891,0.127498072446934
462 | 1,0.668371871343868,0.439494297274923
463 | 0,0.227864005666367,0.249375945070259
464 | 0,0.388964303923619,0.0512118591290209
465 | 0,0.523396016444082,0.0793919490086411
466 | 1,0.696502700171182,0.319249568970975
467 | 1,0.694920878442282,0.611577830966688
468 | 0,0.18560722345213,0.141410362044617
469 | 1,0.639350524013375,0.531104368031335
470 | 0,0.294566651242279,0.0764817614760867
471 | 1,0.725640168924614,0.49358305389429
472 | 1,0.877070872119292,0.391289726240275
473 | 1,0.587562805028845,0.729819251882595
474 | 0,0.340721421395989,0.16849774837481
475 | 0,0.430236481117462,0.130542181925397
476 | 1,0.785999865781316,0.539671413885152
477 | 1,0.758827793555516,0.503597044078879
478 | 0,0.590143103205461,0.151490159471587
479 | 1,0.578174749789001,0.267200797992692
480 | 1,0.657664683723653,0.410509106904018
481 | 1,0.839918271521176,0.427395442341733
482 | 0,0.39200354022638,0.236253719076249
483 | 1,0.788051288258796,0.292994344630151
484 | 1,0.739784991601855,0.354434076820377
485 | 0,0.4551704629299,-0.130902505903346
486 | 0,0.244443670308969,0.200144611747777
487 | 1,0.636376132613296,0.45775638731752
488 | 1,0.796637839131232,0.580737820037626
489 | 1,0.708801577457193,0.545352360978482
490 | 1,0.550265645136274,0.484494061509674
491 | 0,0.234541465103561,0.233991672818969
492 | 0,0.194670134511449,0.126177918457203
493 | 1,0.766889064563754,0.466402612868279
494 | 1,0.80071884792835,0.404593258613153
495 | 0,0.381284472737006,0.17687356357681
496 | 1,0.599522552395624,0.543136172846105
497 | 0,0.240574345353352,0.0253513928215877
498 | 0,0.410017041126765,-0.0100812540539413
499 | 0,0.361974435284997,0.0881434501053607
500 | 0,0.207727524944803,-0.0367640612710508
501 | 0,0.1594489254994,0.201522377891547
502 | 0,0.256651273497901,-0.132347852626953
503 | 0,0.197554559087531,0.177957295136275
504 | 0,0.398728683880572,0.16728015949292
505 | 0,0.360811993622786,-0.04824866638835
506 | 1,0.733941290645016,0.433150455186029
507 | 1,0.618828631204708,0.486055269309517
508 | 0,0.165115647436616,0.166373378499186
509 | 0,0.329986469377093,-0.145319876090286
510 | 1,0.655945618215475,0.565458427664124
511 | 1,0.63592101405517,0.611493955328119
512 | 0,0.187670062838029,0.258563366423394
513 | 0,0.349829851646627,0.084957807867815
514 | 0,0.378321216347947,0.214635756849205
515 | 1,0.707728669366132,0.48114770792878
516 | 0,0.355190716822167,0.120052746763546
517 | 0,0.466632125933254,0.0342335031628444
518 | 1,0.660195421430894,0.535792199622042
519 | 0,0.372382587512372,0.178211756645497
520 | 0,0.28955963461689,0.232916243764839
521 | 0,0.275691767045573,0.104233261047742
522 | 0,0.457583346733422,0.0113568064427055
523 | 0,0.30207949013981,0.153308802533798
524 | 0,0.342152315929637,0.223798246892968
525 | 0,0.273887320638952,0.206809480968687
526 | 1,0.685612837176582,0.664078399046569
527 | 1,0.764881455038035,0.621398000348663
528 | 1,0.52238020940587,0.358551794456546
529 | 1,0.593308566223648,0.451690105976225
530 | 0,0.291803725430566,0.127856814131483
531 | 1,0.726693013263402,0.46135257365006
532 | 1,0.7372293568951,0.505908624984738
533 | 1,0.565792951278671,0.63327022806756
534 | 0,-0.0429918005727185,0.134210110662662
535 | 1,0.831211182597751,0.52623590173448
536 | 1,0.650675907968231,0.513559986676506
537 | 1,0.696222490735325,0.55761580245871
538 | 1,0.650270082721844,0.432031433143288
539 | 0,0.211309305874204,0.185172747070275
540 | 1,0.880032898032521,0.471690943221445
541 | 0,0.242159521808166,0.124004647757489
542 | 1,0.738575812297853,0.560734748425541
543 | 0,0.218745425642789,-0.0280331971720218
544 | 0,0.157586196082378,0.11668032192546
545 | 1,0.746213040875643,0.479935580740508
546 | 1,0.737379178012575,0.524749497313761
547 | 1,0.676603305786792,0.574394608417219
548 | 0,0.234500311201054,0.133431799436931
549 | 1,0.74042749986537,0.342508162770225
550 | 1,0.746154363720681,0.611291853873489
551 | 0,0.130483539577805,0.270796873571717
552 | 0,0.363394337502584,0.0324431998285745
553 | 1,0.699695087182931,0.377325657848333
554 | 0,0.215982659005145,0.100276639841196
555 | 0,0.322170812569563,0.0286951852530272
556 | 0,0.467815313162574,0.166257700276355
557 | 0,0.344496907667261,-0.0787891568310318
558 | 0,0.269913803207164,0.180110737296386
559 | 1,0.607093838078526,0.42899287568583
560 | 1,0.702841598939775,0.528984272889046
561 | 0,0.433699692174703,0.0401236778817728
562 | 1,0.642206235256222,0.166347496883072
563 | 0,0.280306383994694,0.0515086109237062
564 | 0,0.199028975587147,0.157396279671003
565 | 1,0.685560135054544,0.351734395271269
566 | 0,0.158536018164786,0.145683091490734
567 | 0,0.160030887790979,0.172756912807044
568 | 1,0.564790816886263,0.477457505394322
569 | 1,0.552832873258963,0.642784726756741
570 | 0,0.194718231937484,0.156165824848463
571 | 0,0.426026428855053,0.164696485653757
572 | 1,0.597887804377164,0.594844293548714
573 | 0,0.295528266788754,0.0747183403083028
574 | 0,0.35263366740229,0.123923521156832
575 | 0,0.116067262405864,0.105276933488227
576 | 0,0.189970215522424,0.131514007418794
577 | 0,0.358483581296374,0.158239696171057
578 | 1,0.527275122386961,0.466595976196316
579 | 1,0.766176616897604,0.378363494395416
580 | 0,0.318242758552735,0.0247720606042103
581 | 0,0.349499567667261,0.0689912186755568
582 | 1,0.646156331064499,0.612417041419526
583 | 1,0.846563761969743,0.546017215326986
584 | 1,0.746112882858578,0.385233322062056
585 | 1,0.572935439371457,0.404890792506923
586 | 0,0.147158625146855,0.0460825340590317
587 | 0,0.336616456247489,0.0669092618933998
588 | 0,0.310793720925482,0.27135928077744
589 | 1,0.728453223685108,0.322437669015729
590 | 0,0.25385354823197,-0.084252650626292
591 | 1,0.709475389351328,0.522560179699657
592 | 1,0.800320632714859,0.480947297822752
593 | 0,0.455797243394366,0.0554957479690124
594 | 1,0.624757344438195,0.601581397859871
595 | 0,0.412517797005096,0.183547427624048
596 | 1,0.754312959030315,0.542308257169538
597 | 1,0.774425471173311,0.559642830124951
598 | 1,0.785912061279285,0.607153576392285
599 | 1,0.550647569861904,0.521253980103191
600 | 1,0.781194627864989,0.437515084277226
601 | 1,0.769721404472552,0.489638616811605
602 | 0,0.345851331281325,-0.0835423955202882
603 | 0,0.274205452800902,0.0369729562045203
604 | 1,0.742730066689469,0.448510139288652
605 | 1,0.669008356649984,0.556092799188644
606 | 1,0.521279742678478,0.533259746345554
607 | 0,0.368961393478092,0.0620565712312246
608 | 0,0.418343959806105,-0.00148766366146191
609 | 1,0.678772181087111,0.297239507972293
610 | 1,0.816917949827907,0.350846547816628
611 | 1,0.576715848038144,0.510633959111448
612 | 1,0.609197071600881,0.696525280668464
613 | 0,0.290185006852936,0.0950991271521423
614 | 0,0.220627942907053,0.172388573953883
615 | 0,0.323604068726149,0.321929878579241
616 | 1,0.636452067094477,0.378647099783854
617 | 0,0.261772082828085,0.125482954085528
618 | 0,0.199142614419153,0.24837010252674
619 | 0,0.359692272546054,0.133475454756772
620 | 1,0.656923779669395,0.399015673625338
621 | 1,0.765466551302201,0.486622732524117
622 | 0,0.363038943749185,0.190146711949949
623 | 1,0.573728527479804,0.708506134537767
624 | 0,0.227434505032207,0.132005928531751
625 | 0,0.362261102269695,0.182091406424617
626 | 1,0.552767591574914,0.640231347572874
627 | 1,0.745008921758967,0.476138905282749
628 | 0,0.235016094982593,0.030821328831152
629 | 1,0.569927185769143,0.533599512946027
630 | 1,0.540919226433496,0.445965587532701
631 | 1,0.537303628601098,0.442510392206699
632 | 1,0.614599312376511,0.293594212647256
633 | 1,0.606013974663877,0.544264840906945
634 | 1,0.454528809834834,0.499476423679876
635 | 1,0.737317161552518,0.455849265676542
636 | 0,0.226877643649583,0.0778257767447274
637 | 0,0.235769090654382,-0.0541148381745464
638 | 1,0.629191245712809,0.412962868228981
639 | 0,0.288703334680669,0.0509839710513972
640 | 0,-0.0553610449220905,0.112077098399482
641 | 0,0.308572147660474,0.178866373774703
642 | 0,0.280267718134234,0.168879328616616
643 | 0,0.290595179766584,0.085496431509059
644 | 1,0.87979592429344,0.517910778998036
645 | 1,0.647119063616976,0.487188900738255
646 | 0,0.187124094699949,0.121866784395191
647 | 0,0.422112816496255,0.209956176604592
648 | 1,0.732559070962285,0.440294557978558
649 | 0,0.388661232418864,0.148966053456064
650 | 0,0.423245072075889,0.0867387632012575
651 | 0,0.282849124264832,0.148769986793792
652 | 0,0.424768340495513,0.0556317808256201
653 | 0,0.174129959981651,0.00807187578895145
654 | 0,0.501843353134547,0.133031451175741
655 | 0,0.120615734783942,0.0376933600645564
656 | 0,0.378693368144518,-0.0438914269791444
657 | 0,0.170081040130433,0.182335149838812
658 | 0,0.352413327115188,0.0144398079790181
659 | 1,0.729827688438959,0.593487551439833
660 | 0,0.378109776388778,0.289359718594457
661 | 0,0.31249013506134,0.0693060078898178
662 | 0,0.231522642975173,0.047844674381033
663 | 0,0.399658150733606,-0.0853742813911763
664 | 1,0.469529396378605,0.551583233394723
665 | 0,0.183216611939466,0.124901308938973
666 | 0,0.500774519409122,0.103434288267971
667 | 0,0.407243066732757,0.112239607034534
668 | 1,0.741062198918369,0.386530629989029
669 | 0,0.299881602923698,0.20512589925838
670 | 1,0.548105379482974,0.463687353737308
671 | 0,0.349747823856433,0.0767352421283166
672 | 0,0.528560887313413,0.233681140107927
673 | 1,0.697685065083971,0.421634161117949
674 | 1,0.683603527024515,0.529769635911744
675 | 0,0.224155125982703,0.130060041166071
676 | 1,0.84206414514415,0.389687568689523
677 | 0,0.349116725186401,0.0931364450169638
678 | 1,0.717425799742033,0.569001564152182
679 | 0,0.303835388568343,-0.0808600312562248
680 | 0,0.385282841600747,0.176629671043404
681 | 1,0.728233194892712,0.437566561716223
682 | 0,0.275560822841845,0.067759535376445
683 | 1,0.752526631380138,0.437676927621947
684 | 1,0.77594551463372,0.647335356401004
685 | 0,0.289443821834818,0.129655963824409
686 | 0,0.457031043554695,0.110360318511478
687 | 1,0.670168642297959,0.550344589329263
688 | 1,0.704943163416788,0.523602746131117
689 | 0,0.317110572357443,0.244019105850541
690 | 0,0.340153310682465,0.161452198272563
691 | 1,0.677736499067133,0.529667449885923
692 | 0,0.263745005034983,-0.0244753757620519
693 | 0,0.316498859817741,-0.0606874549183752
694 | 1,0.831297688454378,0.42940165448941
695 | 0,0.456705300158062,0.0624238498026648
696 | 0,0.359645481362399,0.0448678597671015
697 | 1,0.685459002073857,0.484052444289094
698 | 0,0.318081688805948,0.00509799198151692
699 | 1,0.85843942825533,0.463609514867966
700 | 1,0.73009055780931,0.50508575074328
701 | 1,0.657915724124217,0.626157950965602
702 | 0,0.463987853561012,0.114426246147913
703 | 0,0.553660197410232,0.0842710327103933
704 | 0,0.460782987659597,0.297823063863037
705 | 0,0.264636371729503,-0.0733221751824376
706 | 0,0.318521878216774,0.156081826169686
707 | 0,0.176521557194302,0.209879949665236
708 | 1,0.616728756921145,0.598782674158815
709 | 0,0.348737364342779,0.196038946628466
710 | 1,0.584272236083607,0.5583952910681
711 | 0,0.491048239993434,-0.0341748906928977
712 | 0,0.363221300121058,0.00311416891583936
713 | 0,0.294428489999627,0.0740381167710181
714 | 0,0.314484068321389,0.174130968448632
715 | 1,0.711674830724482,0.722225529524769
716 | 0,0.429497325683598,-0.00338571057528961
717 | 1,0.693051453242482,0.552981854141337
718 | 1,0.694557977087814,0.344335319534401
719 | 1,0.698380454854423,0.50235305891111
720 | 1,0.546240528463992,0.484799737777533
721 | 1,0.84456999822897,0.511630529094712
722 | 1,0.829631649689789,0.449013520814537
723 | 1,0.611370787230976,0.47537648123942
724 | 1,0.48794441957928,0.352474619443002
725 | 1,0.678121429032912,0.717614501803218
726 | 1,0.656238979787779,0.506295703756179
727 | 1,0.808704355917707,0.544289437087048
728 | 0,0.291000786174897,0.18756242990602
729 | 0,0.286008831075059,0.183256270690621
730 | 1,0.778314686712391,0.551749873922335
731 | 0,0.348163414892062,0.0526164568572623
732 | 1,0.698985646368548,0.450526988393171
733 | 1,0.652419269932391,0.551775767280179
734 | 1,0.714540613442356,0.499080188789111
735 | 0,0.288269544293992,0.176780433458229
736 | 1,0.701376295676095,0.634380794788028
737 | 0,0.16985426142413,0.130183460048874
738 | 1,0.665964081261148,0.497872021783534
739 | 1,0.659681610788045,0.440149383368503
740 | 0,0.295506048188131,0.19356452237853
741 | 0,0.3581883242448,0.188038263309915
742 | 0,0.2518389930837,0.141931476447984
743 | 0,0.255088820307825,0.136113796079466
744 | 1,0.788789907139158,0.472736976049706
745 | 0,0.254720082517004,0.278841949769336
746 | 0,0.400096489689716,0.169806941395634
747 | 0,0.389325388925308,0.0428207959406402
748 | 0,0.303596644217423,-0.0211356942252308
749 | 1,0.598075982202267,0.71634440679072
750 | 1,0.846531856758702,0.630666796364398
751 | 1,0.636987579847441,0.523821578330955
752 | 1,0.704328174035875,0.464595076574923
753 | 0,0.333602864533136,0.196001784870715
754 | 1,0.718377687551127,0.446878867271396
755 | 0,0.292129461945253,0.208819776462064
756 | 0,0.18141881158465,0.305065208146363
757 | 0,0.297265442638535,0.0490799554703924
758 | 1,0.575074707384884,0.546141853062778
759 | 0,0.360464793278257,0.205406105841231
760 | 1,0.819074760454348,0.412413216527676
761 | 0,0.278193244660972,0.322133841947987
762 | 0,0.234574413009205,0.226352645739012
763 | 1,0.755055764844823,0.736836426380321
764 | 1,0.733022751201858,0.59267341061093
765 | 1,0.85348490448349,0.657929419547266
766 | 1,0.806216184454454,0.500627923967346
767 | 1,0.621731433191579,0.48695714453071
768 | 0,0.258087814184909,-0.0635578237114404
769 | 1,0.814553234525968,0.43746209852251
770 | 0,0.231235130634974,-0.00268733052683408
771 | 0,0.338905885312007,0.252978187630702
772 | 0,0.277807732751421,-0.0399785462420387
773 | 0,0.516568826420593,0.151487916700293
774 | 1,0.64204797455313,0.430535910119798
775 | 1,0.78355445902446,0.437011098351206
776 | 0,0.404887232102823,-0.209793610987081
777 | 1,0.831418300560301,0.665979212193514
778 | 1,0.71338519230368,0.562375116909461
779 | 0,0.110918432853517,0.179360867226739
780 | 1,0.817393901114933,0.570687722319416
781 | 0,0.362123656488518,0.133749838746908
782 | 1,0.678044743030206,0.552285480722416
783 | 1,0.639184966099981,0.389256202074398
784 | 0,0.301418046996501,0.0251739111388578
785 | 0,0.296389031914539,0.14813263086704
786 | 1,0.641962355577562,0.492083970998301
787 | 0,0.350710788108678,0.272359213958655
788 | 1,0.607008203403116,0.342777977349272
789 | 1,0.745329209679339,0.58144982782331
790 | 0,0.378766570158344,-0.028604268323575
791 | 0,0.319729888125828,0.0968840107029267
792 | 1,0.691802673441845,0.640285255996932
793 | 0,0.242278150555005,-0.0275442358243946
794 | 0,0.332368857276272,0.107204488883822
795 | 0,0.246628887420573,-0.0598483717461624
796 | 1,0.689927289944315,0.643969077903893
797 | 0,0.167477837598874,0.221552520108923
798 | 0,0.361225831315617,0.126766741525405
799 | 1,0.599910596476218,0.597604157226741
800 | 1,0.705296772650519,0.562224723426892
801 | 1,0.610859050617126,0.561555935527899
802 | 1,0.779042693428166,0.469742258952299
803 | 1,0.721953383296772,0.330504205034856
804 | 0,0.199177670493139,-0.0541702682912998
805 | 1,0.632324877663293,0.2988795550039
806 | 1,0.710946606108542,0.65070982991859
807 | 1,0.646660609202825,0.554404571510107
808 | 0,0.185876611085429,0.0934318346929306
809 | 0,0.273094770627089,0.101234103940518
810 | 1,0.670510063788298,0.524744046683915
811 | 0,0.330588997705384,1.97621294152073e-05
812 | 0,0.220551652501929,0.00677419470080122
813 | 0,0.509820742534314,0.0474168402911526
814 | 0,0.349215800162633,0.0975564168091252
815 | 0,0.255324166357047,0.0142135303100417
816 | 0,0.207544184859608,0.225988923069435
817 | 1,0.685340920227042,0.494319470006428
818 | 0,0.416679571493822,0.245677237237383
819 | 1,0.812975541830634,0.415628016222127
820 | 0,0.320471039231804,0.150145735237842
821 | 1,0.829461547895229,0.493776376960508
822 | 1,0.680994960542808,0.43400030029373
823 | 0,0.259987042270739,0.114005486392992
824 | 0,0.168848685578161,-0.00954704765822642
825 | 1,0.635145828898241,0.51360447532633
826 | 0,0.421903259687952,0.180794066319864
827 | 1,0.625057332452724,0.641587131914647
828 | 1,0.675848273512748,0.632010071910655
829 | 1,0.533133257051366,0.578125658611273
830 | 0,0.197305182165994,0.151226509649199
831 | 1,0.74653428994078,0.62364340524231
832 | 0,0.253651159020339,0.168892052960315
833 | 1,0.765788074642117,0.454842753406659
834 | 1,0.666398682662433,0.608995629881867
835 | 1,0.635291232115157,0.3766058471493
836 | 1,0.578223848012206,0.577032660203558
837 | 0,0.26551333704833,0.0354747123289688
838 | 1,0.720383364756263,0.314354519714492
839 | 1,0.687289229941347,0.479171746692858
840 | 0,0.327435187790049,-0.0457038326212
841 | 1,0.512954312657834,0.605408005890925
842 | 1,0.574372740473772,0.464359352552996
843 | 1,0.519101638860377,0.567569038032269
844 | 1,0.623855871712756,0.483145048552174
845 | 1,0.599179754034089,0.521952878469356
846 | 0,0.423768817415759,0.0882421894269756
847 | 1,0.746294608042051,0.414510923167562
848 | 1,0.675670405056974,0.422696071313821
849 | 1,0.788703159498108,0.67386104031044
850 | 0,0.378106707841851,0.0760879104127337
851 | 0,0.114270526158963,0.201381513049239
852 | 1,0.747673611917776,0.573924654000711
853 | 1,0.566333032387794,0.743419588671139
854 | 0,0.239739624143197,-0.0275283030751193
855 | 0,0.336807563075062,0.118372100185875
856 | 0,0.39949816825635,-0.0363911604776991
857 | 0,0.226185608435648,-0.0957692926033634
858 | 1,0.709608753767254,0.487886920405059
859 | 1,0.659727666342834,0.721933812585661
860 | 0,0.280539014460227,0.206328172933842
861 | 1,0.622483852839431,0.403868231044358
862 | 1,0.638215910949604,0.391402203658114
863 | 0,0.182465612985859,0.077579336035961
864 | 0,0.16548469680623,0.187504949483702
865 | 1,0.642055730547472,0.496933987331819
866 | 0,0.30462177800554,0.253548749981266
867 | 0,0.360320221194284,0.283823890069657
868 | 1,0.641996569571562,0.507677787015584
869 | 1,0.780896014850636,0.5308701386998
870 | 1,0.705949157648258,0.594286581653946
871 | 1,0.684006329129067,0.418479179364012
872 | 1,0.710585414310567,0.571614895412168
873 | 0,0.301067650445162,-0.166699008606091
874 | 0,0.305202990441317,0.364626716148074
875 | 0,0.164086121321754,0.131222497755963
876 | 0,0.529876888274968,0.041069330989599
877 | 0,0.303736937964446,0.136527110501533
878 | 0,0.28505744342113,0.137579845924959
879 | 1,0.728257086597695,0.48967616688098
880 | 1,0.604446478729784,0.545729909789597
881 | 0,0.293159603172581,0.0490178735520561
882 | 1,0.868714367863232,0.531447483866697
883 | 1,0.67379793891572,0.499687817279283
884 | 1,0.680067364015127,0.537958834329096
885 | 1,0.769445918004292,0.612545393802719
886 | 0,0.229587118303191,0.0932053663345317
887 | 1,0.756073026620105,0.519522905570089
888 | 0,0.172997458030445,0.0116226432767597
889 | 1,0.628765292423085,0.440481162064944
890 | 0,0.435077816729383,0.144463845082589
891 | 1,0.548409328550115,0.453305211297501
892 | 1,0.83986780410854,0.506971986322197
893 | 0,0.33073949997258,0.249754687175591
894 | 1,0.74949720500869,0.410347665801642
895 | 0,0.0929085591442369,0.215360941218623
896 | 1,0.7359470510237,0.599612237896847
897 | 1,0.600964708812595,0.611895797507082
898 | 0,0.339469243009024,0.267604028758517
899 | 1,0.694002550766549,0.40246039615025
900 | 1,0.552501058823616,0.627440184414506
901 | 0,0.40353874235505,0.336157266667048
902 | 0,0.462315574097761,-0.00573901316312267
903 | 0,0.349231556863921,0.086163354915397
904 | 0,0.39193148549771,0.17590698916113
905 | 1,0.676930575172694,0.346510128462135
906 | 1,0.679645828045732,0.577121148661782
907 | 1,0.864397242160299,0.506012221528977
908 | 0,0.286349599507294,0.0284234784210164
909 | 0,0.301056208183654,0.0971042890365868
910 | 1,0.70080768404139,0.49690866368696
911 | 1,0.703500736492505,0.415943311086894
912 | 1,0.635860406671523,0.662055846971972
913 | 1,0.596187712280143,0.520124065364811
914 | 0,0.150957974800847,0.0464335172377109
915 | 1,0.520139166943832,0.596836499299212
916 | 0,0.397568770966058,0.0306454851285853
917 | 0,0.331845949389302,0.11252982223052
918 | 1,0.746466296132988,0.576399481730183
919 | 0,0.27187367746434,0.33376370949036
920 | 0,0.470757410145599,0.078661207553236
921 | 0,0.536172439128294,0.0777459259455492
922 | 0,0.303458038868193,0.2051544655177
923 | 0,0.176054167244673,0.322091359574964
924 | 1,0.766857719111549,0.50691519493913
925 | 0,0.242532011092533,0.129756668505565
926 | 0,0.250988840875535,0.0931375276589551
927 | 1,0.699394326993209,0.452299419298282
928 | 0,0.321874033709486,0.0151074577957314
929 | 1,0.678111194051389,0.392634175140063
930 | 0,0.248833485186545,0.0432042460120094
931 | 1,0.694286452178599,0.602081068167167
932 | 1,0.753212641749318,0.485998157208827
933 | 1,0.721406654309081,0.430254653604519
934 | 1,0.518593077174776,0.456272115507621
935 | 0,0.192697787614544,0.116152945192816
936 | 0,0.217411882851379,-0.0645645709560981
937 | 0,0.149417245045527,-0.0509644534767213
938 | 0,0.333200271991093,0.0126829710560955
939 | 0,0.330071760445476,-0.163593017708845
940 | 1,0.534877952568945,0.602286414156199
941 | 0,0.427137123832751,-0.00212997267714535
942 | 0,0.465209007558267,0.190092842029924
943 | 0,0.387820847910937,0.147036212372151
944 | 1,0.677075663688624,0.393688203202669
945 | 0,0.340122927992711,0.199102849848583
946 | 0,0.375936597647849,0.0576857517171014
947 | 0,0.19680461716877,0.126825098566748
948 | 0,0.345613244418262,0.111968101894549
949 | 0,0.415306271487636,0.0274765339028115
950 | 1,0.759206126508585,0.361066865632757
951 | 1,0.627329454187417,0.70470321713622
952 | 1,0.750035120504359,0.671914272368653
953 | 1,0.658400551559745,0.765364048505117
954 | 1,0.809852288886275,0.369827122090055
955 | 1,0.742155427459077,0.475197149670467
956 | 0,0.472977082954314,0.0414537702779115
957 | 0,0.456431422915802,0.0269498051853626
958 | 0,0.206938238018846,0.194749956359312
959 | 1,0.757067760279864,0.559505907275914
960 | 1,0.734348564804222,0.582643128118847
961 | 0,0.121993378223797,0.163177154070997
962 | 0,0.358354979816009,0.145474920493865
963 | 1,0.670281007673169,0.458492128030286
964 | 1,0.753318178108946,0.478502396023747
965 | 0,0.479495384633774,0.165887540584127
966 | 0,0.356497398658085,0.107194390487611
967 | 0,0.224403509860098,0.0291987439921982
968 | 1,0.609590262907632,0.553526932603486
969 | 0,0.24296702403368,0.0588362296331379
970 | 0,0.420545659324178,0.125156289067077
971 | 0,0.228213758920545,0.143136893584809
972 | 0,0.191837685799215,0.132193944906839
973 | 1,0.661831288773944,0.652660707954536
974 | 0,0.298577371906541,0.0552648411022183
975 | 1,0.730213677155348,0.582079483739723
976 | 0,0.204568420435368,0.0350601196982612
977 | 1,0.716255356890322,0.363472186953968
978 | 0,0.386798314348276,-0.027664840420184
979 | 0,0.342563217571713,-0.158701499702784
980 | 1,0.869206933523169,0.517271382682064
981 | 1,0.773020413744283,0.549650593897377
982 | 1,0.667303669642038,0.437695155146516
983 | 1,0.63592064294667,0.535376831534549
984 | 0,0.319719761117838,0.146917927674461
985 | 1,0.818872517719446,0.47752038987334
986 | 1,0.583970376921327,0.566355241690794
987 | 0,0.37328464075495,0.101685122091795
988 | 0,0.294380885719476,0.0685428528864088
989 | 0,0.347644215517767,0.221266167808915
990 | 1,0.68545554285279,0.500169334192856
991 | 1,0.612060851132193,0.413868137102724
992 | 0,0.266821442944639,0.0992953966705651
993 | 0,0.395630463368185,0.149027011545546
994 | 0,0.369444556046919,0.278277875406332
995 | 1,0.719392196460534,0.465720539664018
996 | 1,0.680332620830717,0.534328581664379
997 | 0,0.179529285117794,-0.110245947644673
998 | 1,0.573306543090293,0.608713744427839
999 | 1,0.500384090018299,0.579511796304665
1000 | 1,0.790973981269152,0.563663506890512
1001 |
--------------------------------------------------------------------------------
/simdata/linear_data_train.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jasonbaldridge/try-tf/b9c24949c26c6f36bfec0466bfa951e1e903d3c3/simdata/linear_data_train.jpg
--------------------------------------------------------------------------------
/simdata/moon_data_eval.csv:
--------------------------------------------------------------------------------
1 | 0,-0.500568579838,0.687106471955
2 | 1,0.190067977988,-0.341116711905
3 | 0,0.995019651532,0.663292952846
4 | 0,-1.03053733564,0.342392729177
5 | 1,0.0376749555484,-0.836548188848
6 | 0,-0.113745482508,0.740204108847
7 | 1,0.56769119889,-0.375810486522
8 | 1,0.029053020776,0.0662553115138
9 | 1,1.97115418849,0.273806254766
10 | 0,0.709015172669,0.240575923747
11 | 1,0.370316033321,-0.128201864483
12 | 0,-1.02519250655,0.0804386196736
13 | 1,0.889419539348,-0.426764508858
14 | 0,-0.815197473739,0.343830106878
15 | 1,0.103984761456,-0.125316575481
16 | 0,0.669872794385,0.924316677526
17 | 0,0.643513479899,0.885369948437
18 | 0,0.915430504991,0.870963917519
19 | 0,-1.26352201845,0.49571290049
20 | 0,1.10881006012,0.40141246155
21 | 1,0.0587265717682,0.502672407917
22 | 0,-0.505897969713,0.673650585128
23 | 0,0.738273195336,0.315131977418
24 | 1,0.0312439516817,0.302219516823
25 | 0,0.852464709394,0.178128395499
26 | 0,1.03186111678,0.750161618571
27 | 0,-0.526986140623,0.902092605754
28 | 0,-0.728635377324,0.99557396959
29 | 1,-0.045113142414,-0.257386127347
30 | 1,1.1589669104,-0.708845157316
31 | 1,1.28773366287,-0.528537227076
32 | 1,1.52103538165,-0.216994816084
33 | 0,0.85418807633,0.302561598632
34 | 1,1.7178437562,-0.671111198659
35 | 0,1.00991587529,0.208138479285
36 | 1,1.61366375141,-0.670751979351
37 | 1,1.85725351971,0.410922532194
38 | 1,2.18778916414,0.490173856148
39 | 1,0.263793192141,0.289378947091
40 | 0,-0.0268276293996,0.945425957263
41 | 0,0.442747631256,0.746807692066
42 | 1,2.28496175734,0.344245685947
43 | 1,1.38868004544,-0.460490488794
44 | 0,-0.816915234614,0.678684687018
45 | 0,-0.969295736661,0.562134179627
46 | 1,0.421868617904,-0.160956446
47 | 1,1.65991109412,-0.161081712533
48 | 0,0.244923922682,1.28880016617
49 | 1,1.63994385529,-0.504280047541
50 | 0,-1.18354333026,0.280662043538
51 | 0,0.130678969482,1.06590483506
52 | 1,0.94072134659,-0.380472006359
53 | 1,0.146393475566,-0.0162829413787
54 | 0,-0.791589791592,0.440089333235
55 | 0,0.00504798694106,0.929785582337
56 | 0,-0.865552177444,0.753468024589
57 | 0,-0.50282742293,0.741718820023
58 | 0,1.14409048682,0.197227604297
59 | 0,1.16702295045,0.449377253948
60 | 1,0.600089936361,-0.400640362415
61 | 0,0.153842022423,0.840515489352
62 | 0,-0.79643958858,1.0728758
63 | 0,-1.20773990343,0.267500509668
64 | 1,1.45214427024,0.0149673429589
65 | 1,0.744825457248,-0.241765583514
66 | 0,0.288968251496,1.22810054843
67 | 1,1.67177425154,-0.409514729047
68 | 0,-0.493185888365,0.707390146389
69 | 0,1.45996751699,0.273040122219
70 | 0,-0.265659336101,0.557911527113
71 | 1,0.0438188233332,0.498771357863
72 | 1,0.0506683538813,0.244630492389
73 | 0,-0.785394748084,0.120495170178
74 | 1,0.337049890476,0.163531703107
75 | 1,1.91196643477,0.670347172482
76 | 0,0.886559998655,0.70673358167
77 | 0,0.708110522436,0.20313866569
78 | 0,-0.00411909521608,1.21525516428
79 | 0,0.879578121565,0.82065421626
80 | 1,1.83407249551,0.214984784668
81 | 0,0.158249592248,1.22381846345
82 | 1,0.181357446941,0.314191888911
83 | 1,0.976675382628,-0.504692808602
84 | 0,1.06363674153,0.565594108545
85 | 0,1.38234192051,-0.273929206989
86 | 0,0.951863389848,0.188485401247
87 | 0,1.1612162475,0.671266118062
88 | 1,0.107351195124,-0.21265283768
89 | 0,-0.836313677978,0.645398950949
90 | 1,0.382223075621,-0.537343217518
91 | 1,0.493972870861,-0.248895266867
92 | 1,1.79749243157,-0.339941184048
93 | 0,0.673576395063,0.740919992567
94 | 1,1.56488156502,-0.547435453857
95 | 1,0.283255605382,-0.272028678096
96 | 1,0.92864769938,-0.741613047216
97 | 0,0.764117634376,0.430576067686
98 | 1,0.0762457688673,-0.101809529848
99 | 1,1.72594203865,0.302238014226
100 | 0,1.00514406494,0.247745252348
101 | 1,1.38938318207,-0.257624576821
102 | 0,-0.931927702563,0.0675579191061
103 | 1,2.02938940901,0.13549029869
104 | 1,1.44859525772,-0.538998572774
105 | 1,0.700241223027,-0.361242153024
106 | 0,0.205834548607,1.44717862999
107 | 1,0.998468087238,-0.454851979249
108 | 1,0.225827258027,0.162004837618
109 | 0,-0.407173782159,1.12198349566
110 | 0,0.675315831968,0.261008548803
111 | 1,1.90914208359,0.426653816909
112 | 0,0.0326319273704,0.536122139578
113 | 0,-0.619279755742,0.733199388941
114 | 0,-0.305851111601,0.667074949583
115 | 0,0.862378026831,0.394710963025
116 | 1,-0.172970309409,-0.25946923535
117 | 1,0.0670121276831,-0.0107427495743
118 | 1,-0.240439232193,0.123569182785
119 | 1,0.770192362663,-0.538960163377
120 | 0,0.840656686909,0.0765833430865
121 | 0,-1.04600399047,0.32281582
122 | 1,0.828528103094,-0.307714969518
123 | 0,0.34369152987,0.682833694472
124 | 0,-1.32532111342,0.260047950996
125 | 0,-0.645457853581,0.306101617957
126 | 1,2.22508472656,0.0985767047671
127 | 1,0.0245613648638,0.473611322223
128 | 1,1.34976334545,-0.621716730531
129 | 1,0.238341039544,-0.0849359952151
130 | 0,-0.580077209378,0.938925319107
131 | 0,0.408660254369,0.628515702044
132 | 0,-0.792477555963,0.333412677731
133 | 1,2.040010107,0.149023064338
134 | 1,1.01449590591,-0.493702731778
135 | 1,0.0452220342886,-0.0735491457806
136 | 0,-1.16354492755,0.240147386224
137 | 1,1.90772199827,-0.329657624553
138 | 0,0.29222637769,0.663005773924
139 | 1,2.08284126455,-0.194327595223
140 | 0,-0.892865315857,0.550181529996
141 | 0,1.11989505927,0.389768175027
142 | 1,1.11283267633,-0.569860415603
143 | 0,0.981117340253,0.290874442009
144 | 1,-0.208768559933,0.49691591102
145 | 1,0.223157179352,-0.155301677429
146 | 0,0.974290332833,0.264243634928
147 | 1,1.97826098995,0.0382282796752
148 | 0,-0.792467232902,0.952109371168
149 | 1,1.77899834703,0.125006070293
150 | 0,1.31043319695,0.399964415084
151 | 0,1.28912031352,-0.0962515185103
152 | 0,1.11302727657,0.542690458519
153 | 0,-0.319443626718,0.574935081075
154 | 0,0.197898895522,0.920459719762
155 | 0,-0.697958271859,0.640357024613
156 | 0,0.399327411114,0.892302160506
157 | 0,0.446493297985,0.750343460128
158 | 0,0.182047259486,1.19627502797
159 | 0,-0.981873704735,-0.00506800345832
160 | 1,0.761044475582,-0.270602027139
161 | 1,0.313005654844,-0.703530974923
162 | 1,1.76606362413,-0.455494314853
163 | 0,1.04661591405,0.251745844837
164 | 0,-0.190903584019,0.892676318393
165 | 0,0.788058754414,0.499999229894
166 | 0,-0.863884866882,0.281023653406
167 | 1,0.140036546395,0.0485307507966
168 | 0,0.525805505031,0.976955180177
169 | 0,-0.92507325346,0.0862767916624
170 | 0,1.03751301371,-0.370267388378
171 | 0,0.401263948245,1.07196613668
172 | 1,1.81260469014,0.0529897791196
173 | 1,1.28853329339,-0.329717850403
174 | 1,-0.309294127871,0.522958845929
175 | 0,-0.457299260413,1.21565053973
176 | 1,2.00230776938,0.402742869313
177 | 1,0.111563440724,0.269655928622
178 | 0,0.61009528834,0.865678224513
179 | 1,2.34019921911,0.366089287087
180 | 1,0.106820740855,-0.0968336456934
181 | 0,0.618768935282,0.661150089006
182 | 0,-0.920077807508,0.126842254088
183 | 1,1.50679565073,-0.324878487468
184 | 0,0.329571499418,0.687207730756
185 | 1,0.90701930276,0.000480070368653
186 | 1,0.455043513844,-0.282414674462
187 | 0,-0.500476849483,1.03525825136
188 | 1,2.14686664722,-0.492680631767
189 | 0,-0.208937614787,0.822033540544
190 | 1,2.19628449397,0.555594181052
191 | 0,0.317211851849,0.761583411244
192 | 0,0.50975544596,0.646565797364
193 | 1,0.201246167989,-0.227969920327
194 | 0,1.11431016584,-0.0976160569456
195 | 1,1.92205982598,0.39730607743
196 | 0,0.679256420188,0.704702500103
197 | 1,-0.153098498998,0.291525110317
198 | 1,1.20729470485,-0.565176805684
199 | 1,1.63505867005,-0.397232219047
200 | 1,0.357300082212,0.180953553469
201 | 1,0.116232840289,-0.0532782389998
202 | 0,-0.0109308116929,0.591251052507
203 | 0,0.831388290209,-0.0884159079792
204 | 1,-0.333940906473,0.502312242657
205 | 1,-0.210369112827,0.275670310792
206 | 1,2.00784257277,0.0585065579679
207 | 1,0.400840264821,-0.381066497931
208 | 0,-0.0429083282847,0.941155877774
209 | 0,-0.109029375323,1.01996728791
210 | 0,-1.22322074901,0.354479298857
211 | 0,-0.705619498992,0.582144965132
212 | 0,-0.264859675377,0.736719648845
213 | 1,0.540655702278,-0.135988980301
214 | 0,-0.00352930884236,0.665773485492
215 | 0,0.30614390096,0.797809286883
216 | 1,1.86789105587,0.141808414077
217 | 0,0.370062363887,0.555286147327
218 | 1,0.76820919997,-0.302894865119
219 | 1,0.535787516666,0.0592859075073
220 | 1,-0.0920474346528,0.218168290314
221 | 1,1.806392007,-0.356942682685
222 | 0,-1.06201348939,0.738111940409
223 | 1,1.62560542636,-0.0602246704803
224 | 1,1.82408820337,-0.190515586857
225 | 1,0.0696754375727,0.246721644507
226 | 0,-0.63888569008,0.276294424056
227 | 1,1.46577320497,-0.331123695578
228 | 1,0.055164206904,0.169508850346
229 | 0,-0.263938767665,1.08704741339
230 | 0,1.10370039365,0.403414972954
231 | 1,-0.199920171283,0.179201110229
232 | 0,0.482779058141,0.793141531312
233 | 1,1.76520636297,0.475182113774
234 | 0,-0.978513224347,0.294106718796
235 | 0,-0.605911912161,0.702224818733
236 | 0,0.438469154507,0.740149614642
237 | 0,0.911959981931,0.0294764524784
238 | 1,2.16934382851,0.3543076863
239 | 0,-0.671523712698,0.627370791205
240 | 1,1.44405853725,0.396580724065
241 | 1,2.03989420841,0.530067924946
242 | 0,-0.594866468573,0.696145798026
243 | 1,1.81678206865,-0.644236497701
244 | 1,0.0897835745471,-0.59413296516
245 | 1,1.60196067129,-0.237408898153
246 | 1,0.797718768431,-0.636800875209
247 | 0,-0.965597954582,-0.026172911804
248 | 1,1.9883645686,-0.463657944415
249 | 0,-0.400657776452,0.831319432056
250 | 1,0.930002464342,-0.378879317242
251 | 0,0.11698216212,0.83039999669
252 | 0,-0.789995550774,0.354648332647
253 | 0,-1.04594869733,0.482230063712
254 | 0,-0.164502635191,0.598430586044
255 | 1,0.0814883977049,0.615110712599
256 | 1,-0.142839910472,0.181684638411
257 | 0,-0.890384521368,0.148939393312
258 | 0,1.06971909645,-0.270198741893
259 | 1,1.92818913974,0.216947130817
260 | 0,0.213426255591,1.3671492026
261 | 1,0.255484463861,-0.678073516784
262 | 0,0.422721068853,0.85049316628
263 | 1,0.581754932076,-0.282398370563
264 | 1,0.790331233716,-0.705695902001
265 | 1,0.105469704101,-0.585848260916
266 | 1,1.28492842041,-0.183736939008
267 | 1,0.41059235276,-0.587424648497
268 | 0,0.0609079309726,0.90640367526
269 | 0,0.899219103512,0.420964946367
270 | 0,0.472226065985,1.20499100353
271 | 1,1.30818657164,0.0725487069588
272 | 0,-0.479890799046,0.99391529209
273 | 0,0.0963304698014,0.712147288487
274 | 0,-0.0142489080111,1.09118539104
275 | 1,0.765782527936,-0.354339542316
276 | 0,0.2464411457,0.989765203826
277 | 1,1.42643526312,-0.517263300469
278 | 0,-0.194914105012,0.681682944104
279 | 0,-0.20614481661,1.29966328626
280 | 1,2.11561714364,0.086243497607
281 | 0,0.588164776879,0.805090833305
282 | 0,-0.408546477384,0.833979733297
283 | 0,-0.653868177738,0.915582093916
284 | 0,-0.00213093009824,0.936985276926
285 | 1,0.247329141203,-0.1178159406
286 | 1,1.85277774445,0.203955659998
287 | 0,-0.440761197372,0.0598313843123
288 | 1,1.75671591286,0.283311384031
289 | 0,0.275487427158,0.857448789916
290 | 1,0.423102887841,-0.89425967556
291 | 0,-1.05856774487,0.235397976785
292 | 0,-1.16579065253,0.331748721686
293 | 0,1.19409216335,0.544276692319
294 | 0,0.551852084882,1.00515169693
295 | 0,-0.516539247871,0.954093784075
296 | 1,0.569236475422,-0.685623044145
297 | 1,0.103745050988,-0.466260147947
298 | 1,1.29312592343,-0.386383660974
299 | 0,0.964861051727,0.0461266289913
300 | 1,0.287884340204,0.80423013044
301 | 1,1.86888945173,0.474452781706
302 | 1,0.309580621557,0.423093767548
303 | 1,1.90645443936,0.193772127221
304 | 1,0.550947391366,-0.640639190852
305 | 0,0.706637080006,0.934757040559
306 | 1,0.87752087995,-0.240729475855
307 | 0,0.242964324715,0.861637292311
308 | 0,1.1173249684,0.176065382799
309 | 1,0.9649609358,-0.472948175002
310 | 1,1.60191748452,-0.486899518808
311 | 1,2.03797892651,0.298220421394
312 | 1,0.71440673523,-0.348290133643
313 | 0,0.937376358915,0.480189276148
314 | 1,1.27018242036,-0.793713814891
315 | 0,-0.220590596022,0.740683512836
316 | 0,0.0986817181909,0.646701630994
317 | 0,0.711229653015,0.572084433676
318 | 1,1.59932435479,-0.579195180959
319 | 1,2.064290708,0.363454182557
320 | 0,0.0699926046574,1.092313673
321 | 0,0.249626598833,1.31970449089
322 | 0,-0.0479621588092,1.14144124756
323 | 0,-0.0896742711037,1.00480860247
324 | 0,-0.870576650265,0.549066137282
325 | 0,0.982719704136,0.483405907955
326 | 0,0.975944932591,-0.0249652859916
327 | 0,-1.07094293688,0.741923281329
328 | 1,1.84302429292,0.322314912275
329 | 1,1.83618066856,0.446377827576
330 | 0,-0.187826565563,1.20494189033
331 | 0,0.387886023236,1.05944816909
332 | 0,-0.817417325005,0.226644559505
333 | 0,-0.706645379238,0.862825386741
334 | 1,1.77072689457,0.270705830592
335 | 0,-0.558662737326,1.09648263433
336 | 1,0.290181884978,-0.395636367525
337 | 1,1.18865131146,-0.304782998705
338 | 0,0.00680768309034,0.933341275857
339 | 0,-0.501778560806,0.150169620488
340 | 1,0.000684111210317,-0.212580492642
341 | 1,0.327375344166,0.342555694678
342 | 1,0.878876836858,-0.30302781919
343 | 1,1.83634441803,0.187405842584
344 | 0,-0.829971105816,0.979464864188
345 | 1,1.62488532982,-0.272749489567
346 | 1,1.3494184456,-0.532627465378
347 | 0,1.02222922556,0.443022896156
348 | 0,-0.437297913548,1.07937410636
349 | 0,0.576412051865,0.611008163463
350 | 0,1.39754631822,0.12533924565
351 | 1,2.11095093635,-0.0481933754077
352 | 1,1.85183890821,0.221740004084
353 | 1,1.14691109376,-0.358578911297
354 | 1,1.52006816416,0.214397189334
355 | 1,1.88629985719,0.0102883349053
356 | 0,0.876003634114,0.295459567553
357 | 0,-0.933747507396,0.305683070705
358 | 0,-0.817918433172,0.102317372397
359 | 1,-0.0475785095493,0.429152163289
360 | 0,-0.880658233917,0.47282120102
361 | 1,-0.0957863530801,-0.279991466841
362 | 1,1.277259887,-0.557265990789
363 | 0,1.02819527334,0.261484962205
364 | 1,1.72285048457,0.409103710525
365 | 1,-0.171325847749,0.186822977579
366 | 0,0.82339277632,0.225490532041
367 | 0,0.742347697155,0.136713290553
368 | 0,-0.665645935673,0.44749615588
369 | 1,0.096109534823,-0.0466271674399
370 | 1,0.8126683244,-0.582101290115
371 | 1,0.149191430872,0.309839036193
372 | 1,1.53408347079,-0.440756322121
373 | 1,-0.182692524499,0.141196449794
374 | 0,0.454013521956,1.29555149979
375 | 0,-0.413348879605,0.845386165939
376 | 1,0.427118212629,-0.445917725221
377 | 0,0.772625542892,0.633557724458
378 | 0,0.83509050861,0.69788886127
379 | 0,0.000431210743488,0.842140658294
380 | 1,1.68799911148,0.328497318303
381 | 1,0.0460056377961,0.360375128781
382 | 0,0.608956373608,0.652347102797
383 | 0,-0.201473918884,0.973343258674
384 | 1,2.03935430213,0.726375885636
385 | 0,0.818719291371,0.397351453678
386 | 1,0.534779859333,-0.77230349515
387 | 1,0.549813445568,-0.659565025978
388 | 0,-0.649702803741,0.539232072019
389 | 1,-0.0986623532166,0.0953528696375
390 | 1,1.26956606031,-0.475260156691
391 | 1,1.4630987064,-0.169100683406
392 | 0,-0.384370094264,0.762132904366
393 | 0,1.13348525681,0.209168457145
394 | 0,0.616501395993,0.721597402281
395 | 0,0.474036333281,1.03539093905
396 | 1,0.389979302217,0.148403076568
397 | 1,1.71562254142,0.124999589443
398 | 0,1.05805672221,0.105499111962
399 | 1,1.97708001143,0.311091277247
400 | 1,0.104196698584,0.219307993931
401 | 0,0.563035558778,0.759987249838
402 | 1,1.56210183604,-0.453390627396
403 | 1,1.95917838217,-0.219560175277
404 | 0,0.736998152077,0.595352767982
405 | 0,-0.241380695635,1.05712882719
406 | 1,1.42318596248,-0.393257808897
407 | 0,0.103143593601,1.18284468225
408 | 1,0.789748797426,-0.268118022924
409 | 0,-0.432141809473,0.692665876437
410 | 1,0.491309531582,0.179777967178
411 | 0,0.592066181729,0.469164791204
412 | 0,-1.11599017838,0.480918543658
413 | 0,-0.768676474695,0.621186320221
414 | 1,0.252554169966,0.422016976515
415 | 0,-0.822944976602,0.452442069873
416 | 1,1.0305970498,-0.219619304841
417 | 0,-0.0920654555611,1.29375468143
418 | 1,-0.261761788339,0.586258642631
419 | 0,-0.14472707767,0.841698183099
420 | 0,0.795230610088,0.515129297587
421 | 1,0.344562881641,-0.497032795989
422 | 1,0.205518782785,-0.248494131342
423 | 0,0.689358145007,1.03164550647
424 | 0,0.37173934178,0.885860787065
425 | 0,0.390004332981,0.767665326373
426 | 1,1.81728836478,-0.214402670962
427 | 1,0.633316480533,-0.630312756942
428 | 1,0.342824391542,-0.306237164795
429 | 0,-0.564010420098,0.757130682221
430 | 0,-0.289584048256,0.793466044713
431 | 0,-0.992286810554,0.35913785518
432 | 0,0.44209357742,0.315512977033
433 | 1,0.868343499058,-0.589327508286
434 | 1,1.61801987371,-0.3980008854
435 | 0,-1.00216593593,0.121421016154
436 | 1,0.759766692325,-0.231745776831
437 | 1,1.67339643553,0.234896200792
438 | 1,0.0207866230105,0.259359536738
439 | 1,1.79941746195,0.252256063467
440 | 0,1.05022146834,0.332709418709
441 | 1,1.58044425863,-0.251909310802
442 | 1,1.71164467342,-0.347692122031
443 | 1,1.70346904399,0.154974519727
444 | 0,0.197577729307,0.989830143116
445 | 0,-0.962036768937,0.0262891406316
446 | 1,0.204375105342,-0.116660274618
447 | 0,-0.960841096343,0.599079570247
448 | 1,1.5732865901,-0.256220304892
449 | 0,-0.137848654388,0.880646590459
450 | 1,1.22644151212,-0.308672349468
451 | 1,2.00576859984,-0.19537899381
452 | 0,-0.390629878897,0.829712511605
453 | 0,-0.811751559865,0.669421161132
454 | 0,-0.821231768877,0.853092049748
455 | 0,0.875881186506,0.871288490974
456 | 1,0.37453602929,-0.293873552004
457 | 1,1.72985479317,0.616181011368
458 | 1,1.67215695206,-0.100271986833
459 | 0,-0.0642223230348,0.851449166922
460 | 1,0.297802566863,-0.168468026484
461 | 1,0.681844938107,-0.361216405709
462 | 0,0.641252746697,0.442147795364
463 | 0,0.428107180193,0.501561641937
464 | 1,1.06480122643,-0.481773371739
465 | 1,1.57894952485,-0.494392601318
466 | 1,1.95954004085,0.505436641651
467 | 1,1.87555871046,-0.064594500955
468 | 0,-0.378053764257,0.842511604465
469 | 0,-1.14463478696,0.546575438668
470 | 0,0.934382558289,0.271497321551
471 | 0,-0.193840456646,0.815513562004
472 | 1,1.79471607877,0.028836631069
473 | 1,0.625663645462,-0.677750256297
474 | 1,1.42269138335,-0.723885358297
475 | 0,-0.368859446548,0.995157624693
476 | 0,0.436487878656,1.31507635115
477 | 0,0.0971011670191,1.20621939606
478 | 0,1.00225415109,0.477311401737
479 | 0,0.334298930087,0.951735250692
480 | 1,0.930917723503,-0.379588561352
481 | 0,0.835401015715,0.57421786711
482 | 0,-1.26039627157,0.0848939834129
483 | 1,2.15877573294,0.876630032594
484 | 1,1.87773710026,-0.256694312231
485 | 1,1.603711564,-0.524161505225
486 | 0,1.01069248669,0.139875034297
487 | 0,-0.00761929573226,0.958552867883
488 | 0,-0.067693562028,1.10569576479
489 | 1,0.586805317626,-0.380935211169
490 | 0,0.953613935862,0.432776509329
491 | 1,1.85578574385,0.178138622369
492 | 0,-0.165521478262,1.29208162974
493 | 1,1.6427221297,-0.00723198523201
494 | 1,1.72093718548,0.220716044611
495 | 1,0.651399590298,-0.436374691699
496 | 1,1.57295912195,-0.354526481554
497 | 1,1.65053365836,-0.21138310363
498 | 1,0.365950746129,-0.212073517326
499 | 1,0.119855415116,-0.569123578054
500 | 0,-0.930861209964,0.465999399457
501 | 0,0.743449314107,0.7429794028
502 | 0,-0.238351170932,1.28126179305
503 | 1,-0.089937379638,0.00878779916791
504 | 0,0.156185082017,0.913173223416
505 | 0,0.707370897844,0.120763198594
506 | 0,-0.418276339158,0.735695529998
507 | 1,0.107264916876,-0.142711351177
508 | 0,-0.781786986746,0.435849152188
509 | 1,0.724731069145,-0.404902671309
510 | 1,-0.00434426763965,-0.338898535407
511 | 0,-0.723763331117,0.984463572871
512 | 1,0.946968160968,-0.665805385399
513 | 1,-0.250619096152,0.218601521697
514 | 1,0.201690893976,0.1050691073
515 | 1,1.62483574908,0.128523513719
516 | 1,0.491695380262,-0.13263510477
517 | 1,0.516754657469,-0.342128141392
518 | 1,0.406844263223,-0.222011163262
519 | 1,0.99617406827,-0.565807843038
520 | 0,-0.283086854576,1.18539931743
521 | 1,1.94674249626,0.384788831486
522 | 0,0.554316369566,1.02081754625
523 | 1,1.96706087713,0.298411186344
524 | 0,-0.234333975226,0.877527450828
525 | 1,0.396975682121,-0.436320795443
526 | 1,1.15254519448,-0.576469947244
527 | 1,1.56139789127,0.0629599146949
528 | 0,0.868377714612,0.441739776578
529 | 1,0.119552073017,0.44840852513
530 | 1,0.273948710144,-0.102266739033
531 | 0,-0.411406949233,0.896440877063
532 | 0,-1.01364469106,-0.0626845448161
533 | 1,1.93403931561,0.00865172459937
534 | 1,1.85921844248,-0.341112855066
535 | 1,1.60064540886,0.0321341839111
536 | 0,-0.786408860681,0.793619197476
537 | 1,0.838444779559,-0.18598896123
538 | 0,-0.419846988373,0.942682586648
539 | 0,-0.750887888752,0.0432408481614
540 | 0,0.366833554497,1.02066516163
541 | 0,0.327248715138,1.15958427837
542 | 0,-0.145722955551,0.982249027992
543 | 0,0.772384179324,0.983583459351
544 | 0,0.866354186187,0.070347214666
545 | 0,1.08341503417,0.409462316571
546 | 0,1.14974596037,0.584038646949
547 | 0,1.10842684282,0.376053757062
548 | 0,-0.734208628742,0.389604878218
549 | 0,-0.523334889506,0.851713543422
550 | 0,-0.725755271994,0.467468415493
551 | 0,1.14250209609,0.00414187726763
552 | 0,-0.352553877707,0.936383628448
553 | 1,1.9613157104,0.467100342009
554 | 0,0.706157352029,0.640954233476
555 | 1,1.79053552806,-0.275549611754
556 | 0,0.151789899282,0.932102250712
557 | 0,-0.157840269267,0.838548025258
558 | 1,0.98915311957,-0.232619318801
559 | 0,1.03608450422,0.921467700023
560 | 1,0.0298560618995,-0.0330882413552
561 | 1,1.83445715817,-0.222812362574
562 | 1,0.192426862718,-0.226429823336
563 | 0,-1.01455485674,0.468990414416
564 | 1,2.04091513662,0.0797306413526
565 | 0,0.651283461774,0.51630952635
566 | 0,0.207081120562,1.17521352574
567 | 0,0.443022773293,1.06786658343
568 | 1,1.93685132797,0.569617716745
569 | 1,1.68867402963,0.300115892106
570 | 1,0.379007921561,-0.595275557525
571 | 1,0.105116834077,0.406374702306
572 | 1,0.15036097391,-0.106518748866
573 | 0,-0.0914859816364,0.434430319652
574 | 0,-0.578554556512,0.776627775038
575 | 0,-0.51673475366,0.948405142123
576 | 1,-0.0942511793712,0.508434413647
577 | 0,0.21134550351,0.693764923893
578 | 1,1.71494935397,-0.393260495711
579 | 0,-0.0146538409121,1.18374701254
580 | 0,-1.06744746621,0.791987610421
581 | 0,1.32764014662,-0.158756738461
582 | 0,0.303548812779,1.1875380666
583 | 1,1.73871987667,-0.079741156639
584 | 1,1.39735120435,-0.218226203016
585 | 1,1.71034476413,-0.917203985471
586 | 1,1.88859577892,0.30141936834
587 | 0,-0.688638794744,0.534821110369
588 | 0,-0.455164938491,1.01860679985
589 | 0,0.132913541243,0.832394947945
590 | 1,1.50244574447,-0.271442350357
591 | 1,0.584829792285,-0.723363625072
592 | 0,1.08262373263,0.324617316271
593 | 1,1.58816678927,-0.387629573074
594 | 1,0.47441092755,0.154986249067
595 | 1,-0.183377837022,0.230370258544
596 | 0,0.884457492205,0.788548839091
597 | 1,0.159919191311,0.484192505214
598 | 1,0.315779315697,-0.288818954785
599 | 1,1.24863308955,-0.768071399058
600 | 0,-1.20534283256,-0.0254506321275
601 | 0,0.96263720424,0.0338608173187
602 | 1,-0.0867675273427,-0.13859002644
603 | 1,0.815451236704,-0.306455257295
604 | 0,-0.492025340023,0.997197121347
605 | 1,1.91874368726,0.154260984785
606 | 0,-0.485526123406,0.844087374719
607 | 1,1.21269181309,-0.788558018346
608 | 1,1.37895173496,-0.232217714483
609 | 0,0.867544483903,0.897846528714
610 | 0,0.976829642665,0.249753559999
611 | 1,0.0485510232846,0.172771017053
612 | 1,1.63486516182,-0.218947541496
613 | 0,0.270463384761,0.89529719498
614 | 1,1.80602261066,0.208560089035
615 | 1,1.47849450863,-0.685485586749
616 | 1,0.0503720507471,0.174119012647
617 | 1,0.294976798043,0.0438375061464
618 | 1,1.19160689721,-0.509183070429
619 | 0,-1.10759540352,0.344459297438
620 | 0,0.646359625814,0.301800845266
621 | 1,0.296371894722,-0.339355325751
622 | 1,-0.0274720276684,0.239107538027
623 | 1,1.88711198623,0.366790551638
624 | 1,0.164404462085,0.212488303644
625 | 0,-0.356059270591,1.06702432689
626 | 1,0.0667442645627,0.227873510462
627 | 0,-0.760545384178,1.07086165823
628 | 1,1.17242500395,-0.543434701348
629 | 1,1.89643674101,0.612825488071
630 | 0,-0.991584820998,0.7620535927
631 | 0,-0.581421276772,0.687341863207
632 | 0,0.0134360095445,0.780188108435
633 | 1,1.85316279527,-0.0872849033223
634 | 1,1.23624309165,-0.477689414026
635 | 1,0.717508315195,-0.511013409626
636 | 0,1.12363523468,0.335647665964
637 | 0,0.60411179157,0.179710191765
638 | 0,0.985119757974,0.0672157117289
639 | 1,0.15100186486,0.543861732802
640 | 1,2.2348858476,0.434740359161
641 | 0,-0.546764052326,0.843832173045
642 | 1,0.337663304004,-0.215372795155
643 | 1,1.66397298588,0.074031049436
644 | 0,1.22667213064,0.686889888691
645 | 1,1.16208457295,-0.612771139953
646 | 1,1.75578649958,0.0390627476176
647 | 1,0.577093464401,-0.325639340424
648 | 0,0.918574581186,1.02683651066
649 | 1,1.83235255637,-0.295192550631
650 | 0,0.789480475934,0.186826259471
651 | 1,1.35302774738,-0.0892766013228
652 | 1,1.22436410933,-0.815889823042
653 | 1,1.32848629953,-0.476164825077
654 | 1,0.981514060805,-0.496620338839
655 | 1,1.48432031544,-0.00754834790732
656 | 1,0.0532529019723,0.115223303726
657 | 0,-0.251686464553,0.73788945525
658 | 0,-0.268351445508,0.694449318195
659 | 0,-0.276440309243,1.08228719542
660 | 0,-0.530773687674,0.578894049009
661 | 0,-1.28520473406,-0.237795051984
662 | 0,0.483684835399,0.932596225192
663 | 0,-1.07657807743,0.243648548885
664 | 1,0.899076330125,-0.504559292823
665 | 1,0.402370211802,0.331727630692
666 | 0,-0.502183434848,0.909421812182
667 | 0,-1.09392941022,0.320879987165
668 | 1,1.15823534914,0.0791025046672
669 | 1,1.73232313499,0.210620646833
670 | 1,1.64072996385,-0.265505318001
671 | 1,0.68139052491,-0.228566541792
672 | 1,0.667072442708,-0.477925900765
673 | 1,0.923262503593,-0.161260602327
674 | 0,0.497629236193,0.937119468151
675 | 0,0.321596114462,0.919229156649
676 | 0,0.506641136603,0.53162635435
677 | 0,-0.680137408232,0.860070985829
678 | 0,-0.633231976145,1.0581169238
679 | 0,0.453839471198,1.0315238122
680 | 0,-0.613268298623,0.678124150182
681 | 0,0.78514168925,0.804711865643
682 | 0,-0.169121207176,0.874769626219
683 | 1,1.49169966409,-0.333706699674
684 | 1,0.260686740723,0.501775088631
685 | 1,1.98063612142,-0.0971310706775
686 | 0,1.16216102352,0.126265358854
687 | 1,1.38926361501,-0.444384697323
688 | 0,0.895270834079,0.652969290928
689 | 1,1.56044100073,-0.726466977224
690 | 1,0.886011145769,-0.304463073096
691 | 0,0.425472276051,0.793598514961
692 | 0,0.587416138069,0.971809993262
693 | 1,0.515228645037,-0.153986189489
694 | 0,1.20694143205,0.188262820308
695 | 1,2.19125563282,0.400449355314
696 | 0,-0.0620103280035,0.858430000752
697 | 1,0.22576236156,0.0650725524482
698 | 1,0.65363170147,-0.231092589032
699 | 0,0.156115691685,0.848483078701
700 | 1,0.647315055136,-0.820866909835
701 | 1,0.384022945245,-0.64717309134
702 | 1,1.7429637937,0.0796745508831
703 | 0,0.388389570961,0.835685657839
704 | 0,-1.06918978197,0.522486728388
705 | 1,1.78671358917,0.201528181766
706 | 0,0.250775098245,0.510317584097
707 | 1,1.44689524788,-0.59582746814
708 | 0,0.676980004809,0.868704481969
709 | 0,0.568869448961,0.19981478778
710 | 1,0.816604221994,-0.374473903165
711 | 0,-0.184209453634,0.980400157264
712 | 1,1.76141976289,0.210521217598
713 | 1,0.903905751566,-0.225160362917
714 | 0,-0.611133847148,0.111026192647
715 | 1,1.60891958881,-0.183002789765
716 | 1,1.29399197725,-0.290292254724
717 | 1,1.87059686822,-0.763764240543
718 | 1,0.772980615176,-0.484967867083
719 | 1,0.430623701005,-0.637905534201
720 | 1,2.08722794539,0.399239246318
721 | 1,0.0892383484693,0.155984983371
722 | 0,-1.11687973905,0.369187165214
723 | 1,0.171485419393,-0.00507046332496
724 | 1,0.15233117885,0.0133716580721
725 | 0,1.26645596464,0.179337354725
726 | 0,-0.952224065303,0.539732892991
727 | 0,0.454043379056,0.834341108734
728 | 1,1.4296186708,-0.624861558523
729 | 0,1.26394478993,0.0980860289858
730 | 1,0.554235024538,-0.395056684623
731 | 0,-0.241808508995,0.795185888785
732 | 1,1.04833639155,-0.859541524893
733 | 0,0.521338827164,1.06021332179
734 | 0,-0.000157847411437,0.793652865633
735 | 0,-1.01646954654,0.48080552693
736 | 1,0.107915964402,-0.101666265949
737 | 1,1.77684745069,-0.484135042513
738 | 1,0.864689574095,-0.682256937581
739 | 1,0.289829060987,0.140806948511
740 | 1,-0.112987425146,0.202946307279
741 | 1,1.40833601312,-0.582144658923
742 | 1,0.755742646936,-0.429048567017
743 | 1,0.795564469777,-0.489354190156
744 | 1,1.67499863748,-0.431958470644
745 | 0,0.175110311843,0.734498684703
746 | 0,0.175352546837,0.992684808106
747 | 1,1.91752890229,0.0583009215354
748 | 0,-0.946941046452,0.353822144566
749 | 1,0.330908242194,-0.210071041124
750 | 0,-0.810782694618,0.845588111708
751 | 0,0.46319558379,0.528641018885
752 | 0,-1.0319082359,0.402434431518
753 | 0,0.53652507723,0.948131895188
754 | 1,1.74946485959,-0.51077399697
755 | 1,0.289800079733,0.45891281834
756 | 1,1.38605102428,-0.658292175467
757 | 0,1.05704487153,0.240416127694
758 | 0,-0.429010840151,0.904014945211
759 | 1,1.79033637644,-0.44743451061
760 | 0,-0.687038089552,0.599792444631
761 | 0,-0.889178076096,0.578259444697
762 | 1,1.16690711201,-0.873085360849
763 | 1,1.68505200281,-0.438370722449
764 | 1,0.451854729941,-0.323751360047
765 | 1,1.39168517208,-0.466379737011
766 | 0,-1.05241091451,0.559891842007
767 | 1,0.0214979934811,0.641793845375
768 | 0,-0.870489902791,0.185361373274
769 | 0,1.15492936913,0.571118949927
770 | 0,-0.635588695489,0.843118321097
771 | 0,1.34440724122,0.497278337366
772 | 1,0.611401800988,-0.391604428791
773 | 1,0.438683753072,-0.443927263035
774 | 0,1.20038711516,0.183711103078
775 | 1,0.973361610576,-0.43610324432
776 | 1,1.57415729543,-0.31834963115
777 | 0,-0.73159198209,0.507722547596
778 | 0,0.397765520989,0.479285894889
779 | 1,0.627445242314,-0.428401851656
780 | 1,0.163309295499,-0.170593938866
781 | 0,-0.908190950857,0.246798498979
782 | 1,1.48076063749,-0.339331874046
783 | 0,-1.12568276547,-0.129195581345
784 | 1,-0.156116598391,0.161929583188
785 | 0,-0.323661304219,1.26529616694
786 | 0,-1.09406065084,0.573288808413
787 | 0,-0.627976304619,0.82784831038
788 | 0,0.364051648371,0.745774125609
789 | 1,1.06741860718,-0.333917270724
790 | 1,1.34029216504,-0.572224039878
791 | 0,-0.794389208208,0.85689401223
792 | 1,0.0369683503231,0.277547081733
793 | 1,0.485556837707,-0.115612533987
794 | 0,0.839971889792,0.19186199962
795 | 1,0.332495935517,-0.183885914904
796 | 1,0.22282201505,-0.24234760329
797 | 1,0.469997518682,-0.232896191201
798 | 1,0.942857237785,-0.130488184413
799 | 0,0.883533236365,0.537011718141
800 | 1,1.19731021163,-0.608672182291
801 | 0,0.201145960811,1.48982528842
802 | 0,0.886459270485,0.269160123194
803 | 1,1.39608334522,-0.576028726875
804 | 1,0.873310404171,-0.379685630772
805 | 1,1.9143196917,0.965445807083
806 | 1,-0.101623185184,0.306728322715
807 | 0,1.11236310183,0.416263013862
808 | 1,0.0783165602419,0.38420007435
809 | 1,1.88324940842,-0.0300783657648
810 | 1,0.445849419746,-0.196436646684
811 | 1,1.08381708326,-0.258429938474
812 | 1,0.0822721770662,0.366442096267
813 | 0,-0.557241694333,0.155006499605
814 | 1,0.268440863192,0.125287751353
815 | 0,0.520443754375,1.19427212534
816 | 0,0.598065791189,0.905551782675
817 | 0,0.622378200359,0.328532237069
818 | 1,1.89216594715,0.0279590734872
819 | 0,-0.780060155894,0.404163032778
820 | 0,0.538190560455,1.23361555566
821 | 1,1.86347315615,-0.204607538074
822 | 0,1.03398999767,0.284344806262
823 | 1,1.5002619409,-0.631262450111
824 | 1,1.09050972899,-0.504123372956
825 | 1,0.257251413283,-0.24593429109
826 | 0,-0.696020199547,0.253701123424
827 | 1,1.20070741869,-0.59931614113
828 | 1,1.71338393097,0.131871003287
829 | 0,-0.796600906055,0.384468093075
830 | 0,-0.65266629529,1.14264982469
831 | 0,-0.50490664239,0.232504622007
832 | 0,-0.353713863822,0.725612860398
833 | 1,0.819521119383,-0.747316472337
834 | 1,1.58236263187,-0.115092650738
835 | 1,-0.0962248508252,0.224321217835
836 | 1,0.489153573686,-0.017654277661
837 | 0,-0.0538172458635,1.28295372103
838 | 0,-0.594560451933,0.48963087521
839 | 0,0.956638314388,0.855957141459
840 | 1,0.285059168023,-0.345271351084
841 | 1,0.278998104875,-0.460670111287
842 | 0,-0.191653178909,0.951518690744
843 | 1,1.75718740917,-0.278398864638
844 | 1,1.90848271507,0.111050537552
845 | 0,-0.766695719838,0.989709890538
846 | 1,0.446172658682,-0.116178068116
847 | 0,0.0713600611704,1.06166402535
848 | 0,-1.14997905266,0.26121762798
849 | 0,0.703289197505,0.099644730814
850 | 0,-0.259577334626,0.902383494658
851 | 1,0.0522103563328,0.402156056885
852 | 0,0.602849144131,0.782431767457
853 | 1,0.405116682264,-0.379244819982
854 | 1,1.68227255175,-0.040609560876
855 | 0,-0.199365754167,1.01038151311
856 | 1,0.848276770568,0.0145430328042
857 | 0,-0.275858356265,0.98103215699
858 | 0,1.12912122624,0.208917753686
859 | 1,1.31487542091,-0.300321533094
860 | 0,-0.233178019775,0.592139647261
861 | 1,0.828172737152,-0.211333085891
862 | 1,2.01786070981,-0.267168395743
863 | 1,-0.251526477547,0.705788471017
864 | 1,-0.0877398902259,0.822626941837
865 | 0,-0.410220080814,0.471448317826
866 | 1,1.62827497759,-0.281611744011
867 | 0,-0.63186200695,1.04085807012
868 | 1,1.60152682919,0.439980207179
869 | 0,0.438033029502,0.970668233834
870 | 0,-0.601699074472,0.724245328352
871 | 1,1.9094894538,0.247298157696
872 | 0,-0.13880450676,1.04003391402
873 | 0,0.211200997931,0.735115562875
874 | 1,1.8998505198,0.0245345970145
875 | 0,0.669752898355,1.15231728949
876 | 1,0.949555520066,-0.271816203801
877 | 0,-0.209027689718,0.812549863946
878 | 0,0.670522980038,0.865776948889
879 | 1,0.136725306416,-0.251129588697
880 | 1,1.15905262034,-0.318027316963
881 | 1,0.418363816942,-0.197997769035
882 | 1,1.44145534695,-0.519591151968
883 | 1,0.416435928389,-0.517193817827
884 | 0,0.11015283636,0.965670637884
885 | 1,1.33646295193,-0.393681601373
886 | 0,0.575285675728,1.00352641466
887 | 0,-0.725664579711,0.55285585065
888 | 0,-0.295662903014,1.03863195547
889 | 1,1.18662197406,-0.211478909361
890 | 0,0.678482060901,0.532051695835
891 | 1,2.20406303231,0.207373404657
892 | 1,0.408289611369,-0.205937104438
893 | 0,0.175118229699,0.980469676555
894 | 1,1.16248934429,-0.38339799472
895 | 0,1.21915066322,0.512607183493
896 | 0,0.367303712017,0.751369157279
897 | 0,-1.39884147259,0.269026549456
898 | 1,-0.0137603309699,-0.0828339073335
899 | 0,0.56819679116,0.577773306341
900 | 0,-0.826689215327,0.504759467103
901 | 1,0.0441626017252,0.0220648768889
902 | 1,0.751687056063,-0.456959174637
903 | 0,-0.562750161253,0.70920259828
904 | 0,0.823246870464,0.575467139199
905 | 1,2.02904227175,-0.727644554785
906 | 1,2.02855685191,-0.282594426907
907 | 0,-0.466668591796,0.640975843428
908 | 0,0.726529375078,1.15116164222
909 | 1,0.823826795955,-0.820716386208
910 | 0,0.54196653942,1.02682036378
911 | 0,0.65177090589,0.632180720523
912 | 1,1.11591540205,-0.451155134042
913 | 1,0.7664165754,-0.549639714815
914 | 0,1.38331636949,-0.0951623495007
915 | 0,-0.210682334642,1.07565428568
916 | 1,0.116311660485,0.0797957537731
917 | 0,0.112547236305,0.966643372712
918 | 0,0.311100177281,0.936524457457
919 | 0,0.950563983288,0.430509452166
920 | 0,0.394063201814,0.9730011896
921 | 0,-0.892847697026,0.397740078114
922 | 1,0.916436446865,-0.563920535972
923 | 0,0.889241566922,0.297120408926
924 | 1,1.7111909073,-0.141373724784
925 | 1,2.07723847346,0.418862310879
926 | 0,-0.894987785708,0.63294261628
927 | 1,0.177983834469,0.187479667356
928 | 1,0.148087996234,0.494125412885
929 | 0,-0.293798387984,0.702832021773
930 | 1,1.34778356077,-0.321549501417
931 | 1,1.95524494912,0.526100623643
932 | 1,1.41079300289,-0.100338668649
933 | 1,1.83938950008,-0.253318150904
934 | 0,1.08970351127,0.0424923391418
935 | 1,0.0142490836485,-0.134322399008
936 | 0,0.0380951900532,1.03895254989
937 | 0,-0.877618536503,0.0204506384276
938 | 1,0.288737357545,-0.51542129293
939 | 1,-0.174011882083,0.11885256315
940 | 0,0.842483759429,0.68766666843
941 | 1,-0.156151931864,0.448519657968
942 | 1,1.53747865476,-0.600384406513
943 | 1,0.323467202487,-0.476444316391
944 | 0,-0.336210191121,0.755961491574
945 | 1,1.85709242746,0.125878526954
946 | 1,0.673305588526,-0.324604689439
947 | 0,-0.0318362019541,1.13653920136
948 | 0,-0.688011313318,0.790894586581
949 | 0,0.520050958247,0.53579429877
950 | 1,0.964025331548,-0.168270671107
951 | 1,1.55918456169,-0.0777705154301
952 | 0,0.922614714864,0.348146863681
953 | 0,-0.302092169544,0.401612257517
954 | 0,-1.10996406042,0.572238417361
955 | 0,0.482192620204,0.449824717121
956 | 0,-0.984908142158,-0.0576560821186
957 | 1,0.312029607564,0.0266851999446
958 | 1,2.10939076236,-0.0561971372391
959 | 1,1.30631175374,-0.684295808522
960 | 0,-0.925049222097,0.161666822203
961 | 0,1.04641580645,0.214324662342
962 | 1,0.858993177974,-0.336070264191
963 | 0,-0.815452990939,0.0227312240277
964 | 1,1.02407575788,-0.986430921825
965 | 1,0.46607047661,0.272210175023
966 | 0,0.306741079445,0.525678701175
967 | 1,1.08835446456,-0.410516776154
968 | 0,-0.730525509546,0.0290620988818
969 | 1,0.706289009872,-0.280560677821
970 | 0,-0.785823298718,0.361824128485
971 | 1,1.78369940848,-0.205332220408
972 | 0,1.01352118986,0.210766406967
973 | 1,2.02450228141,0.516395666601
974 | 1,0.723898554231,-0.632665781548
975 | 1,1.97538696948,-0.111643076217
976 | 1,0.352620549491,0.0104819351831
977 | 0,0.89921669123,0.834767819075
978 | 0,0.250113501093,1.14593624776
979 | 0,0.30165928697,1.0284392498
980 | 1,1.50710876754,-0.637236476927
981 | 0,-0.743216418981,0.742508811859
982 | 0,1.27384102026,0.706226172349
983 | 1,2.05204051198,0.411193596365
984 | 0,-0.140204957382,0.810011972138
985 | 0,-0.495353772359,1.01328222
986 | 0,-0.845127370529,0.527162497097
987 | 1,2.03478799288,-0.173860813606
988 | 0,-1.27230645986,0.161587330431
989 | 0,0.587308928274,0.378486290691
990 | 0,-0.556618181053,0.782472034253
991 | 0,0.896269905,0.419096705483
992 | 0,-1.56533840711,0.0684563341062
993 | 1,2.0877198213,0.63536812862
994 | 0,-1.03738443481,0.605922491608
995 | 1,0.247268581187,-0.287227735368
996 | 1,0.91002351533,-0.460794079773
997 | 0,-1.09350703899,-0.0510927948365
998 | 0,-0.604634938413,0.861479523127
999 | 1,-0.219688807042,0.101656030166
1000 | 1,0.309964018933,0.108210920672
1001 |
--------------------------------------------------------------------------------
/simdata/moon_data_train.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jasonbaldridge/try-tf/b9c24949c26c6f36bfec0466bfa951e1e903d3c3/simdata/moon_data_train.jpg
--------------------------------------------------------------------------------
/simdata/output_curve_hidden_nodes.txt:
--------------------------------------------------------------------------------
1 | 1,0.52
2 | 1,0.67
3 | 1,0.55
4 | 1,0.48
5 | 1,0.64
6 | 1,0.52
7 | 1,0.53
8 | 1,0.52
9 | 1,0.58
10 | 1,0.51
11 | 2,0.79
12 | 2,0.88
13 | 2,0.77
14 | 2,0.82
15 | 2,0.81
16 | 2,0.74
17 | 2,0.61
18 | 2,0.81
19 | 2,0.55
20 | 2,0.85
21 | 3,0.84
22 | 3,0.83
23 | 3,0.78
24 | 3,0.98
25 | 3,0.97
26 | 3,0.95
27 | 3,0.74
28 | 3,0.76
29 | 3,0.99
30 | 3,0.88
31 | 4,0.96
32 | 4,0.94
33 | 4,0.87
34 | 4,0.84
35 | 4,0.75
36 | 4,0.97
37 | 4,0.91
38 | 4,0.98
39 | 4,0.89
40 | 4,1.0
41 | 5,0.88
42 | 5,0.93
43 | 5,0.99
44 | 5,0.97
45 | 5,0.96
46 | 5,1.0
47 | 5,0.77
48 | 5,0.94
49 | 5,0.94
50 | 5,0.8
51 | 6,0.99
52 | 6,0.98
53 | 6,1.0
54 | 6,0.97
55 | 6,0.98
56 | 6,0.99
57 | 6,0.89
58 | 6,0.9
59 | 6,0.96
60 | 6,0.93
61 | 7,1.0
62 | 7,0.99
63 | 7,0.96
64 | 7,0.98
65 | 7,1.0
66 | 7,1.0
67 | 7,0.96
68 | 7,1.0
69 | 7,0.96
70 | 7,0.98
71 | 8,1.0
72 | 8,0.97
73 | 8,1.0
74 | 8,0.99
75 | 8,0.96
76 | 8,0.99
77 | 8,0.96
78 | 8,1.0
79 | 8,0.93
80 | 8,0.99
81 | 9,1.0
82 | 9,1.0
83 | 9,0.98
84 | 9,0.95
85 | 9,0.94
86 | 9,1.0
87 | 9,1.0
88 | 9,0.93
89 | 9,1.0
90 | 9,1.0
91 | 10,0.97
92 | 10,0.99
93 | 10,1.0
94 | 10,1.0
95 | 10,1.0
96 | 10,0.97
97 | 10,0.92
98 | 10,1.0
99 | 10,1.0
100 | 10,1.0
101 | 15,1.0
102 | 15,0.99
103 | 15,0.98
104 | 15,0.99
105 | 15,1.0
106 | 15,1.0
107 | 15,0.99
108 | 15,1.0
109 | 15,1.0
110 | 15,1.0
111 | 14,1.0
112 | 14,0.99
113 | 14,1.0
114 | 14,0.98
115 | 14,1.0
116 | 14,0.99
117 | 14,1.0
118 | 14,1.0
119 | 14,1.0
120 | 14,1.0
121 | 13,0.98
122 | 13,1.0
123 | 13,1.0
124 | 13,1.0
125 | 13,1.0
126 | 13,1.0
127 | 13,0.99
128 | 13,0.97
129 | 13,1.0
130 | 13,0.97
131 | 12,0.98
132 | 12,0.99
133 | 12,0.99
134 | 12,1.0
135 | 12,1.0
136 | 12,1.0
137 | 12,1.0
138 | 12,0.99
139 | 12,0.96
140 | 12,1.0
141 | 11,1.0
142 | 11,0.98
143 | 11,0.92
144 | 11,0.99
145 | 11,1.0
146 | 11,0.99
147 | 11,0.95
148 | 11,0.99
149 | 11,1.0
150 | 11,1.0
151 |
--------------------------------------------------------------------------------
/simdata/plot_data.R:
--------------------------------------------------------------------------------
1 | filename = "linear_data_train"
2 | #filename = "moon_data_train"
3 | #filename = "saturn_data_train"
4 |
5 | data = read.table(paste(filename,".csv",sep=''),header=F,sep=',')
6 | colnames(data) = c("label","x","y")
7 |
8 | jpeg(paste(filename,".jpg",sep=''))
9 | plot(data[,c("x","y")],pch=21,bg=c("orange","blue")[data[,"label"]+1])
10 | dev.off()  # must be called (with parentheses) to close the device and write the jpeg
11 |
--------------------------------------------------------------------------------
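A quick note on the plotting script above: bg=c("orange","blue")[data[,"label"]+1] relies on R's 1-based indexing, shifting the 0/1 class label to 1/2 so that class-0 points are drawn orange and class-1 points blue. The same trick appears in plot_hyperplane.R below.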
/simdata/plot_hidden_curve.R:
--------------------------------------------------------------------------------
1 | d = read.table("output_curve_hidden_nodes.txt",sep=",")
2 | d$V2 = d$V2*100
3 | means = aggregate(V2 ~ V1, d, mean)
4 |
5 | #jpeg("hidden_node_curve.jpg")
6 | plot(means$V1,means$V2,xlab="Number of hidden nodes",ylab="Accuracy (%)",ylim=c(40,100),pch=21,bg="blue")
7 | lines(means$V1,means$V2)
8 | points(d$V1,jitter(d$V2,1),pch=1,cex=.5)
9 | #dev.off()
--------------------------------------------------------------------------------
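This script averages the ten accuracy runs per hidden-node count from output_curve_hidden_nodes.txt (shown earlier), rescales the stored proportions to percentages, and plots the per-count means with the individual runs jittered around them. For readers who prefer Python, a rough equivalent is sketched below (assuming pandas and matplotlib are available; it is not part of the repo):

import pandas as pd
import matplotlib.pyplot as plt

# Each row of the file is "hidden_node_count,accuracy_proportion".
d = pd.read_csv("output_curve_hidden_nodes.txt", header=None, names=["nodes", "acc"])
d["acc"] *= 100  # accuracies are stored as proportions; plot as percentages

# Mean accuracy for each hidden-node count (ten runs per count).
means = d.groupby("nodes", as_index=False)["acc"].mean()

plt.plot(means["nodes"], means["acc"], marker="o")
plt.scatter(d["nodes"], d["acc"], s=8, facecolors="none", edgecolors="gray")
plt.xlabel("Number of hidden nodes")
plt.ylabel("Accuracy (%)")
plt.ylim(40, 100)
plt.show()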
/simdata/plot_hyperplane.R:
--------------------------------------------------------------------------------
1 | filename = "linear_data_train"
2 |
3 | data = read.table(paste(filename,".csv",sep=''),header=F,sep=',')
4 | colnames(data) = c("label","x","y")
5 |
6 | #jpeg("linear_data_hyperplane.jpg")
7 | plot(data[,c("x","y")],pch=21,bg=c("orange","blue")[data[,"label"]+1],xlim=c(-.25,1),ylim=c(-.25,1))
8 | abline(h=0,lty=2)
9 | abline(v=0,lty=2)
10 |
11 | w0 = -1.87038445
12 | w1 = -2.23716712
13 | b = 1.57296884
14 | slope = -1*(w0/w1)
15 | intercept = -1*(b/w1)
16 | abline(coef=c(intercept,slope),lwd=3)
17 | #dev.off()
--------------------------------------------------------------------------------
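The hard-coded w0, w1, and b above appear to come from one trained run of softmax.py on the linear data, so the exact values will vary run to run. The slope/intercept algebra follows from the decision boundary being the set of points where the two class scores are equal, i.e. w0*x + w1*y + b = 0, which rearranges to y = -(w0/w1)*x - (b/w1). A minimal Python check of that algebra:

# Sketch of the boundary algebra used in plot_hyperplane.R; the weight
# values are copied from the R script and come from one particular run.
w0, w1, b = -1.87038445, -2.23716712, 1.57296884
slope = -w0 / w1       # -1*(w0/w1) in the R script
intercept = -b / w1    # -1*(b/w1) in the R script
print("boundary: y = %.4f*x + %.4f" % (slope, intercept))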
/simdata/saturn_data_eval.csv:
--------------------------------------------------------------------------------
1 | 0,-2.95364440838872,0.4240724153547
2 | 1,9.05324338708622,3.8332299316063
3 | 1,-9.41898808565923,-5.15292208063294
4 | 0,0.0259142145221616,0.0131769510996405
5 | 1,-5.98582112483023,9.74872712136347
6 | 1,-7.49101797542126,-5.48394497181704
7 | 1,8.65910174722262,-2.32602493955759
8 | 1,8.26922103849562,-6.56763171920307
9 | 0,2.40182827343185,1.10657173843081
10 | 0,-0.465755780497113,-0.807678174831206
11 | 1,10.2846732789519,0.828400996154731
12 | 0,1.88926691446674,-3.06093070941668
13 | 1,6.04467655182339,7.74140971333368
14 | 1,9.61871090831236,-3.80296065221943
15 | 0,2.92950492924933,0.665512002904368
16 | 0,1.87992216704095,-4.43105250577418
17 | 1,7.28892290086414,2.66698918808811
18 | 0,4.02450666602898,2.244583780411
19 | 1,-7.66630871112399,-5.29749623431877
20 | 0,-0.781453169961538,0.562678104064411
21 | 0,0.635011298824518,-3.6878002110401
22 | 1,8.22170742942158,2.88799387574692
23 | 1,-1.47455736050755,9.2203860297421
24 | 0,-0.121270871486501,0.063986309790505
25 | 1,-2.31266607975405,-10.9717639669406
26 | 1,2.83009424760612,-9.8210260630275
27 | 1,-9.19635813111571,-3.34855954713015
28 | 0,0.147725367512453,-0.437736939119105
29 | 1,-9.17331746858361,6.36392553718533
30 | 0,-2.28980339192164,1.60501617702423
31 | 1,5.13240717849589,9.86906556059293
32 | 0,1.65654360679859,-1.66581386791298
33 | 0,-0.00622256336644169,-0.0832218750907117
34 | 0,0.903538741758324,-1.27861276550525
35 | 0,1.25052117606507,-1.41863885064092
36 | 1,2.27623616139974,8.35271948733344
37 | 1,-2.19622946975649,9.26066202994795
38 | 1,4.65886462517106,7.92763500009444
39 | 1,-9.12600785652553,6.38516605589955
40 | 0,3.4851777563707,-2.41070406961164
41 | 1,-10.0955814178556,0.00676320264738719
42 | 0,-1.06378102886385,0.501713925048591
43 | 1,3.88451320638887,-8.9013454545313
44 | 1,9.6704324419551,1.31561096269033
45 | 1,-4.17881680474775,8.00173835147622
46 | 1,-5.00775406999558,9.24837974467148
47 | 1,3.7598328281536,-9.63895057965262
48 | 0,-1.46568630445723,1.02514709218113
49 | 1,2.52889846201837,-9.2666832637993
50 | 0,0.0667992413611389,0.28670701698946
51 | 1,9.10554765145899,5.68039941265474
52 | 0,-0.118192087098342,1.29788524800616
53 | 0,-2.19266842228583,-2.44434085822266
54 | 0,-0.0631441415283714,0.0365706184771397
55 | 0,0.756437258940677,3.85698106911192
56 | 0,-2.46969508412011,1.83238321355437
57 | 0,1.10193728949443,0.756008287697793
58 | 0,-1.01028268728193,-4.87447977282223
59 | 0,0.0946061442932678,0.536838992880969
60 | 1,4.5727928037117,9.40983787276743
61 | 0,-1.34938481807617,-2.68034970483454
62 | 1,-1.60719444645805,9.93588541231975
63 | 1,-7.93697056741509,3.05689582684673
64 | 1,-9.00829730587327,4.07249342645771
65 | 1,-11.2684518042161,-0.843134900869058
66 | 0,-3.36107317587188,0.663007407488394
67 | 0,-4.39609285144382,1.41304472178053
68 | 0,0.993790652959637,1.61726905315253
69 | 1,9.05467956228427,-6.47174612409913
70 | 1,7.96844249991419,5.37736349820582
71 | 1,-9.36976664961836,0.871088431804888
72 | 0,-0.758472521152815,-0.65404711174728
73 | 1,9.82751795911665,3.82413930092729
74 | 1,0.662620432079108,-11.8200982882237
75 | 1,-9.02908779011979,0.153310302751011
76 | 1,-0.101067080565379,-9.88393539126925
77 | 1,10.0054197909241,-1.50409519211489
78 | 1,-2.3857895511175,-9.91232681015645
79 | 1,9.64950043910508,-1.62499249925805
80 | 1,3.67888830791673,-10.1744112831109
81 | 0,-0.622873408062069,1.4372817097535
82 | 0,-0.45644767226127,1.33286205033499
83 | 0,3.77501530804962,-0.20492061753698
84 | 0,-0.705977520233487,0.535090235637785
85 | 0,0.952722120603435,1.39849898225992
86 | 0,2.43241042072037,-4.01804860356159
87 | 1,6.17248393242332,-6.79019861171794
88 | 0,0.332296597197245,2.84862313836927
89 | 0,-2.08466292404208,-1.26782105518914
90 | 1,7.90726565243414,5.19311974848883
91 | 1,-1.23513454896338,9.24435498482269
92 | 0,1.57853483990754,1.55662803439644
93 | 0,-0.615468713457144,0.177447373766895
94 | 1,-5.8458798259574,7.88083060792
95 | 0,1.07584883346741,0.281565147442874
96 | 1,11.3917396542548,1.58992609468347
97 | 1,-7.54119390428137,8.01151373936753
98 | 0,3.71898328296341,3.01997451231519
99 | 0,2.5103810625784,1.47566193245145
100 | 0,-1.81246514486709,2.65784907064089
101 |
--------------------------------------------------------------------------------
/simdata/saturn_data_train.csv:
--------------------------------------------------------------------------------
1 | 1,-7.1239700674365,-5.05175898010314
2 | 0,1.80771566423302,0.770505522143023
3 | 1,8.43184823707231,-4.2287794074931
4 | 0,0.451276074541732,0.669574142606103
5 | 0,1.52519959303934,-0.953055551414968
6 | 0,2.6244640096065,-3.57517214574177
7 | 1,-9.67984097020372,2.43273985198413
8 | 0,1.76662111169025,-0.599065119089674
9 | 1,-9.54072620047797,-2.44549459765326
10 | 1,-4.04223072334724,9.13685386209343
11 | 0,2.12168571110583,1.09436523562788
12 | 0,-0.683614264815307,0.401363797043046
13 | 0,-0.976997367730473,-0.594055002855628
14 | 0,-2.77224622666051,3.94657421847961
15 | 1,-9.27205318506747,1.24216037652738
16 | 0,-1.71173169597796,0.584266032844234
17 | 0,-2.15368098415523,2.98009468989564
18 | 0,0.426199438668344,-2.04674949884645
19 | 1,-1.01334285269519,-10.3345489508607
20 | 1,-9.54101504300598,-4.55047974493486
21 | 1,10.1957066568018,0.254420103019349
22 | 1,-8.6778473681861,-3.27270683674903
23 | 1,6.47809481278742,8.96100657254648
24 | 0,-1.95411001422583,-1.14169962947211
25 | 1,11.8505019698607,-0.880526995586208
26 | 1,9.85201428822543,-1.98092728566354
27 | 1,-2.61930621246295,-9.91735995764943
28 | 1,-9.57702511533498,0.856047877803597
29 | 1,1.07211379460816,10.0475383208642
30 | 1,10.2114266188495,2.56413556126083
31 | 1,-1.0837815752106,10.4989367949914
32 | 0,-2.67378400738832,-0.261205121872508
33 | 1,-8.53007726968589,-5.67998129587997
34 | 1,-10.7663138511873,-1.29515496010343
35 | 0,-2.55338710545442,1.75125814188713
36 | 0,1.35469670851131,0.223877677617805
37 | 1,-1.84504413574705,-9.33192341793193
38 | 1,-5.80946534065145,-9.47640833659501
39 | 0,-1.04008162688905,-4.49097940004103
40 | 1,-5.32311107475766,7.53870059615664
41 | 1,2.53744100730897,-11.2206190940572
42 | 1,10.0004938360844,0.95507808057832
43 | 1,-7.63830322423253,-2.50856029879941
44 | 1,8.43366809886814,-6.2195651470462
45 | 1,-2.91023170983615,8.48032045150948
46 | 0,1.45578752738673,0.308974309209017
47 | 1,6.47268350416766,-6.61767616970341
48 | 1,7.5798589614718,-5.95840297257091
49 | 1,3.57609238872552,-10.8153763028483
50 | 0,-1.94175602111866,3.25323642785476
51 | 1,-8.57430992466047,0.0162163356664646
52 | 1,-2.28973318237138,-9.18916703684439
53 | 0,-2.07413895223307,-2.84246386431687
54 | 0,1.82662162321882,1.19275832563431
55 | 1,7.53222089679843,3.50851984721159
56 | 1,-8.63645268048514,-5.51919306313108
57 | 1,-1.38283835552521,-9.20403015220651
58 | 0,-0.0693929757023144,-0.120923632938589
59 | 1,8.94157505646838,4.1511541665216
60 | 0,-2.21638326393234,-0.743642766186011
61 | 1,6.92970234039982,-5.7299855716108
62 | 1,-9.96045721031378,-3.55960137096896
63 | 1,-5.08716904419667,-7.71311752380464
64 | 0,0.432315244441807,0.146519155172826
65 | 0,3.81359256480327,-0.949161768843296
66 | 0,-2.67813066150075,0.155970525921464
67 | 1,7.06099787525839,7.47853003826184
68 | 0,-3.6022241782501,2.87563655800124
69 | 1,-8.56205298519004,-5.59296554895657
70 | 0,2.27194803379196,-0.697309431763461
71 | 0,0.414869754722005,0.773457746708829
72 | 0,4.57901963248404,-0.581826336269988
73 | 0,0.345418471438973,0.341721440272724
74 | 1,2.13088143827608,11.1248802622644
75 | 1,-9.28621855455942,3.66602302094549
76 | 0,-1.05044463591027,-2.26198313837955
77 | 0,-0.203877849613951,1.77850652695302
78 | 1,-11.5347464575896,4.04493198309583
79 | 0,0.515441830198029,-1.48118858476755
80 | 0,-1.34492291408213,-2.21045863112778
81 | 0,-2.74619790193218,2.01868126297085
82 | 0,-0.0610050091577645,2.45055169641884
83 | 1,-10.9333944290643,2.16597032988926
84 | 0,2.10185914521208,2.50441889905131
85 | 0,-1.95060135192445,1.0221208136267
86 | 1,-3.93631134497805,-11.5948459098007
87 | 1,-1.77353334274379,10.9805317915229
88 | 1,-10.0125131496373,2.87350793431108
89 | 0,0.224152299355375,1.05534430338697
90 | 1,-7.56460268914936,6.34358229167367
91 | 1,-4.73459089779698,8.21880829353707
92 | 0,0.507713445092494,1.51044358214825
93 | 1,-0.92935781010476,10.138681083383
94 | 1,-7.1254412107155,-6.959519579293
95 | 0,-2.77394317025641,2.32841485779483
96 | 1,1.48198390389601,-9.20739827901074
97 | 0,-0.626710721992932,0.88992747103403
98 | 0,0.00481475570487226,-0.0076767885430906
99 | 0,1.36976033043542,-0.911796278185519
100 | 0,0.580325270116439,0.194246997750822
101 | 0,0.0269864269953635,0.308211189424495
102 | 1,9.62694841049478,1.73787010859398
103 | 1,3.31743060164764,-8.27434267939844
104 | 1,9.29743641003553,1.83876116471785
105 | 0,-1.56634889087478,1.12087375501131
106 | 0,0.234380373042369,-0.190852897944166
107 | 0,-2.2265741418854,0.541654777530827
108 | 1,-7.91642966575721,-6.56424201265321
109 | 1,-3.37861182479723,-8.15311367827467
110 | 0,-2.30671347876545,1.27602436352785
111 | 0,0.567730139692622,0.658488776316152
112 | 0,-1.8765363614178,3.31819158474277
113 | 1,9.32250255379486,6.67570571550008
114 | 1,6.87236057429256,6.31206298993164
115 | 0,1.24283341259685,-2.43335284418338
116 | 1,-8.85389030392406,-4.86565417551327
117 | 0,0.422414911283583,-1.78131047947536
118 | 0,0.00463276189188866,1.91938006596196
119 | 0,0.788171009954018,-1.40060317554696
120 | 1,-5.15648839125392,7.93637022030082
121 | 1,-8.88719630635514,-4.77641418214191
122 | 0,4.65847607993914,0.573533671794049
123 | 0,4.10489797677036,1.46331719664337
124 | 1,-5.64988236745269,7.59535604779533
125 | 1,1.97988634662085,-9.34147961488475
126 | 1,-9.56786717092285,6.15062630347382
127 | 0,-0.305509920415386,-0.816687799655518
128 | 0,4.37640279752176,-0.208692920893062
129 | 0,-2.99618409119118,-1.33982190816661
130 | 0,-3.5051709982934,-3.26706600506422
131 | 1,-0.444385483407193,-11.1351261211535
132 | 0,-0.908983892610002,4.33860668853704
133 | 0,0.308859742393393,0.668636267443342
134 | 1,4.26133424202836,-10.9742307794595
135 | 0,0.0549530959883597,0.0130070694151358
136 | 1,7.53390793895624,-5.07453255571314
137 | 0,0.040282157515584,-0.0413994226910768
138 | 0,0.364807994500752,-1.41501146682248
139 | 1,-9.8371849383414,-0.532494657299655
140 | 1,-8.92446101991649,-6.55325261517569
141 | 1,5.42742176639377,7.64488548818019
142 | 0,-0.0231829153121243,-0.0826624332581433
143 | 1,9.98354791862655,-2.21244212488613
144 | 0,-0.245484760362165,0.708915138075223
145 | 0,1.00809963140419,1.80528790612361
146 | 1,7.96253309805136,-7.85313040622627
147 | 1,-4.85802016477551,8.84980056346979
148 | 1,-2.2377711461735,12.1651953006766
149 | 1,-9.26259991078727,6.12948894225519
150 | 1,-2.50174647515082,9.94275644334929
151 | 0,1.69989746467005,-1.85933593163952
152 | 0,-1.55162808416059,2.51764424788514
153 | 1,10.2067319750296,1.44620992046507
154 | 0,0.395047528277998,2.75099869255495
155 | 0,3.47235286038504,0.312787622637351
156 | 0,0.572346258154881,0.128281214038418
157 | 0,-2.71563517805396,-0.555094438250771
158 | 0,0.0327901304785445,0.38291078109353
159 | 0,-0.498060478251128,-1.7347019542345
160 | 0,3.36656051937686,3.4067759448776
161 | 1,5.46174197424899,-9.54155164348352
162 | 1,8.1986107522222,-4.78277561362211
163 | 0,-3.41158980446985,1.82009660206347
164 | 1,-8.73380239355277,6.55987672658631
165 | 1,4.16415800597914,-9.96329401057575
166 | 1,0.0635559384201306,-9.97106599315458
167 | 1,10.6010735746994,-4.48171435604685
168 | 1,-7.29353120905952,7.77094608053753
169 | 1,-4.28656352697919,7.86961326195471
170 | 1,6.87878620757886,-8.5880001448525
171 | 0,-0.00596364426059484,-0.312808418406871
172 | 0,2.44510351847744,-0.61797711888748
173 | 1,-10.0840960767766,1.16202560086554
174 | 1,2.62845369667177,-8.69414146855081
175 | 1,5.55113670271272,-9.39642126569543
176 | 0,-0.594048098785318,0.996897492935077
177 | 0,-4.40699157911697,-1.52196900587115
178 | 0,1.12118760598014,0.457381308714515
179 | 0,2.94573263955246,1.71445582937027
180 | 0,0.421372932702674,-0.72130233384774
181 | 0,1.30805321432315,1.71906146831119
182 | 0,-1.02594303892191,-2.97572485299017
183 | 0,-2.27067165274015,0.422842952161015
184 | 1,10.6456910698005,-3.31028616534108
185 | 1,-6.20302052061548,7.36628634276438
186 | 0,-0.890365500283691,4.49479121231346
187 | 0,2.06094009366537,-0.492963725757071
188 | 0,-0.0932576441584378,0.035383386386386
189 | 0,-0.945507509093508,-1.59607616730411
190 | 1,7.88114689193623,-4.89056480426439
191 | 1,0.223906883492946,11.7432077852699
192 | 0,2.84591368720827,2.46926767037653
193 | 0,3.55565289348271,-0.258461484780092
194 | 1,-9.50178734944532,2.69678575087567
195 | 1,-10.342252581035,-1.84082834217332
196 | 1,-8.90024124086661,4.12315811788013
197 | 1,-9.15481878470657,-6.78170261760227
198 | 0,-3.2979728880191,-1.12113005646469
199 | 1,-8.20732169834139,-7.00081442556793
200 | 0,0.0992702573431094,-1.17817785449192
201 | 1,4.17694627875903,-11.543308379581
202 | 0,-0.036663809603862,0.498632860654854
203 | 0,-0.201069585544108,1.17033543172605
204 | 0,2.80328218434308,-2.43129523797881
205 | 1,-4.21178149153774,-7.46428961884393
206 | 0,2.09921429016358,0.55304807869477
207 | 0,1.31248181814165,3.20562148619731
208 | 0,2.85883233366319,-0.759820972487253
209 | 1,-10.1650537770967,-2.0801791464585
210 | 0,-2.1670183576218,1.13972157472014
211 | 0,0.0108340841362071,1.95019307061518
212 | 1,-8.90542819752757,-8.09167777511843
213 | 1,-7.18547658943247,-6.38352000910207
214 | 0,-3.83651353187629,-2.6679417051265
215 | 1,-8.55731495030576,-5.12139053015818
216 | 1,-4.37309536404361,9.02683827698961
217 | 1,-9.95153189289823,7.25419730239411
218 | 1,-5.77792243229521,3.29897405088575
219 | 0,-0.978257858632217,1.33994057196346
220 | 1,-11.0490912634278,-3.63285164237176
221 | 0,4.34998518587393,-1.66190225921625
222 | 1,-2.28037410757814,-11.0065414827351
223 | 1,-6.48182023032073,-5.32008165848267
224 | 1,10.9321896085361,2.57625752854501
225 | 0,-0.855414880783667,-1.54582066524341
226 | 0,-2.48694191253009,-3.59242366223534
227 | 1,5.86833030067717,6.3060069020669
228 | 0,-4.52524274333023,0.597159014354813
229 | 1,9.65570106735639,3.69546395614109
230 | 1,-0.0145260662996758,10.1955086142379
231 | 1,5.86424469886416,7.53022893335345
232 | 1,10.3293467448726,-1.46270278118589
233 | 1,-9.6979144376165,-0.823221977645361
234 | 0,-0.0864989675056687,-0.0145488715039846
235 | 1,-2.3009184215667,7.25363336122079
236 | 0,0.0539809946522458,-0.011196844242836
237 | 1,11.2360322931931,-3.54284413133745
238 | 0,0.46915024957354,2.50185938269418
239 | 0,1.44720642012143,-3.67414872787175
240 | 1,4.96807922437839,7.57036410988707
241 | 1,-10.2400248695332,-1.00556394001087
242 | 0,1.79701924182876,-3.08521139838892
243 | 0,2.77693436795065,-0.0483701255937625
244 | 0,1.79652802751289,-2.11565505711717
245 | 0,-1.92427883670382,-0.709061204830383
246 | 1,5.61764309511674,-8.73229448001942
247 | 0,-1.00322969639514,2.59680977945393
248 | 1,-4.68928444646833,10.1266922500795
249 | 0,-1.65336618508314,3.73537411747847
250 | 1,-4.53519757096137,8.34607508200337
251 | 0,-2.62093993693279,-4.03020564623166
252 | 1,3.13516337853046,9.56780525203028
253 | 1,-11.5693229568301,-1.08355293885503
254 | 1,6.65823956882569,6.19625646514916
255 | 0,0.486472535548376,-0.341617441886455
256 | 1,-4.74131778067451,9.17698644950461
257 | 1,-9.31629777687801,6.13871622365375
258 | 0,0.169753410009909,1.51553810602403
259 | 1,4.51265360811713,9.31056986576143
260 | 1,-11.5298906428818,-4.02648933898779
261 | 0,0.556936314561493,2.08485141648561
262 | 1,-9.30419408079191,4.83282206801119
263 | 0,2.64016942808667,-0.384054680864585
264 | 1,3.43184413979082,11.7058713082919
265 | 1,-8.07712675897402,-2.93810931806213
266 | 1,8.63215175866594,3.36668334278562
267 | 1,5.73393846246632,7.39759173411551
268 | 0,-0.662201424760142,0.215099882614474
269 | 0,4.60918258996901,0.650773023760416
270 | 0,-0.261473663189791,-2.19519201727349
271 | 0,-3.84077466633499,-0.212959082915917
272 | 1,1.82636522939007,-7.44304156943001
273 | 1,-7.21525698398697,-7.33394786861225
274 | 1,-10.3118393281703,3.2358546270545
275 | 0,0.215654356311328,-0.289456577175344
276 | 1,4.2529449732302,8.28218987873138
277 | 1,3.76374781341379,7.76200699974713
278 | 0,-0.94078232675739,-1.24243636864211
279 | 0,-1.4265522936326,-1.27078895818712
280 | 0,0.0572953680023405,1.73248590303423
281 | 1,-5.68157377068265,5.98595810182127
282 | 0,-2.63384081068015,-0.892352521483994
283 | 1,6.10474472964248,-7.67250589797721
284 | 0,-0.602510651621868,2.45918007234019
285 | 1,-7.80571298811699,-8.33643924522971
286 | 1,8.78000776532509,-2.15033601616451
287 | 0,-1.46430040138404,-3.77495770956946
288 | 0,-2.66103963980917,0.412474104123609
289 | 0,-0.530804429669786,-0.564925540744405
290 | 0,-0.650359299721333,-3.02282869266444
291 | 0,0.414369656705535,-0.349880043226695
292 | 0,-0.128547997046947,-1.01830367639969
293 | 1,5.55668360698597,-9.12570236535847
294 | 0,0.355100082643081,1.37300364593414
295 | 1,2.12057341641049,9.04247486828233
296 | 0,2.19931919280748,-1.45505785720368
297 | 1,6.53987795735302,5.57674559973847
298 | 1,-8.85329571875129,-2.86357142773942
299 | 1,-9.48955292204604,3.15834735037315
300 | 1,-0.913342950045948,-9.25404050453199
301 | 0,-1.10655097576856,1.75146348802199
302 | 1,-9.39974823202092,-1.01560623441376
303 | 1,5.14815392183269,9.33131342390757
304 | 1,1.04909320547358,9.82224135822256
305 | 1,1.03149158375957,10.8579495514345
306 | 1,4.01261486260903,-8.67833646589891
307 | 0,2.02632360049172,-2.03154697616943
308 | 1,9.47359843510278,-2.49418195261994
309 | 1,-1.82965121553759,-11.0386285491209
310 | 1,7.96272092800234,-5.01336554234833
311 | 1,3.76887331843534,8.40803649028227
312 | 1,9.7383931076913,3.3538227961213
313 | 1,1.78152776056893,9.05720468556054
314 | 1,4.33463663877414,9.00675114796895
315 | 1,11.0459363233516,-2.47102305504371
316 | 0,0.0160930568002092,-0.00527210785890367
317 | 0,0.200360393279074,-1.20473808506873
318 | 1,-9.03404028932402,2.4524119406066
319 | 1,4.82756479874551,9.54294111536609
320 | 1,9.82608323971342,-0.490140627425152
321 | 0,0.317561289908611,1.29827531053109
322 | 1,-6.04556676130117,-7.88520748369331
323 | 0,2.59457392106183,-4.15472619292352
324 | 0,-3.37986460368526,1.52391489458251
325 | 0,0.021585687554024,2.33853514499505
326 | 0,1.034240013822,4.59312695611448
327 | 0,-1.23703103466187,-2.86647507908145
328 | 1,6.77387554884675,-7.7432809690283
329 | 1,-9.33954022371436,5.05647852994881
330 | 1,10.7084590543593,-3.30382360443407
331 | 0,-2.2842516670118,-1.05802289945986
332 | 0,-0.888627852525752,-0.199529689433506
333 | 0,-3.49937141832903,-2.78920098209679
334 | 1,-8.54444734703623,5.15290642411556
335 | 1,-10.9991359516716,1.56550203945429
336 | 0,-1.62271758630075,-1.06679195796304
337 | 0,3.40938612174097,-1.743378716954
338 | 1,-2.71823572876396,-8.65082288280461
339 | 0,-1.14854086435445,2.17489782751366
340 | 0,-0.196117913297385,-0.469117003631568
341 | 0,1.8380083954889,-0.875301647316829
342 | 0,4.64455220124379,0.409699374145785
343 | 1,8.47405129064004,-4.105974342275
344 | 1,-8.24221229974011,-4.59862113718691
345 | 1,3.81127217175893,-10.1050492972145
346 | 0,0.519431081604702,1.80614226236606
347 | 1,7.63615706995123,-7.75610168323522
348 | 0,-1.08437481639501,1.36288900412428
349 | 1,2.02905335541104,-10.1295889685264
350 | 1,0.161528961632774,-10.137125846199
351 | 1,10.4765957714668,4.1090897240005
352 | 1,-0.166629728788782,-8.85661502275487
353 | 0,-2.28121826990176,2.53351787365609
354 | 0,1.13928572721706,-1.8571950727544
355 | 0,2.64666722634757,0.0684503922307458
356 | 0,0.988953107819114,3.04082827262379
357 | 0,-1.39114927907483,1.4733394477615
358 | 1,0.649922120356765,-9.50083325228564
359 | 0,-2.51072609791792,-3.73602525799966
360 | 0,1.76110299666534,-1.87344808043147
361 | 0,2.71502354960309,-0.380105408579504
362 | 1,9.91666560314039,-2.4011289843972
363 | 0,0.00276493386156799,-0.00047389739896824
364 | 1,-8.31177930184917,3.14755376054514
365 | 1,-1.39691313380189,10.7843324422809
366 | 1,8.41027575454507,8.15286478379426
367 | 0,0.0538155161432796,-0.0274959922475747
368 | 1,-6.68173143062251,-4.86130800934844
369 | 1,2.06226906947099,-8.83087995033647
370 | 0,-0.514244407468289,0.56340709581946
371 | 0,0.904204500451216,0.145890291084222
372 | 1,-0.373772532438791,-10.5540046056149
373 | 0,-0.64733166168444,-2.50972745926982
374 | 0,2.49066331770437,-0.38980117684316
375 | 0,-0.121861266350661,-0.975691969493998
376 | 0,0.0136936554077183,0.00905354378238528
377 | 1,-1.35634217401442,-10.2435578196301
378 | 0,1.19400319228345,0.0425435453573504
379 | 1,-6.31685439869954,7.05270156576201
380 | 0,3.95613090853654,2.26118736341486
381 | 1,-3.80935681740784,-6.93877248608993
382 | 1,9.16078006408909,5.21362287942432
383 | 0,-0.0961663520905861,1.14391133898005
384 | 1,6.69807957205987,8.01721502179085
385 | 1,6.10061383293858,-10.1254957027826
386 | 0,-0.145830006419021,-3.21051314597282
387 | 1,-9.644505921852,-4.68322890529518
388 | 0,3.49401374272763,-1.93328418402963
389 | 1,-8.1777170322131,3.67898840300946
390 | 0,0.789934350816148,-3.61381835417044
391 | 0,0.12557832514292,0.0574660263274467
392 | 0,3.66138256455025,0.746035857019821
393 | 0,-1.74978761846171,-3.83359047434783
394 | 0,-0.120644041263509,-0.15610430606642
395 | 1,-8.95462335917438,1.53367857645982
396 | 1,-8.45802746715784,-9.01314440064012
397 | 1,8.50331338295575,-1.67841696346795
398 | 1,-9.04203661019369,-0.655294863928458
399 | 1,10.062124111155,0.707837481479977
400 | 1,5.52377610489494,-9.66710501600297
401 | 1,5.94168173950735,-7.32976055377187
402 | 1,8.72984887597856,-4.45601084674185
403 | 0,0.247164836253612,0.191640153898579
404 | 1,2.51675618592276,-8.90377643185965
405 | 0,0.938453174791395,1.2401746914312
406 | 0,-1.61315367175214,-1.74181961997504
407 | 1,-2.00770937575851,-9.79543686818336
408 | 1,-10.6081758993541,-2.30586962693893
409 | 1,4.37593142103152,8.82205859828211
410 | 1,-0.807364582015188,-9.02865735574655
411 | 1,-8.65026309883851,-5.01060749421735
412 | 0,-1.56773710593331,-4.04392399326692
413 | 1,7.47825752278111,7.63682951990349
414 | 1,5.06004894214798,6.73863502211331
415 | 0,4.00022648929161,0.0372091773790018
416 | 0,-1.42421041251485,-0.356806512913475
417 | 0,-4.31549925866844,0.0160878368964697
418 | 0,-1.41624769404774,0.643758491802536
419 | 1,9.88666834786708,-3.90849935346593
420 | 1,2.2416201243829,-9.30374355745025
421 | 1,-10.0514333178811,1.33169925415424
422 | 0,-1.49293875157692,-0.0803784815129737
423 | 0,-0.373542134581573,0.0739314629850316
424 | 1,7.1304325571835,7.61169055909981
425 | 0,-2.78948913081346,2.64235449427601
426 | 1,8.5590872976283,-5.90499063570252
427 | 0,-2.21492907088892,-0.198719575664393
428 | 0,-1.27420071548796,0.880972164214771
429 | 1,2.20572171345621,8.92586340554095
430 | 1,-4.37764848123899,-8.49532796666674
431 | 0,0.661821170388373,0.702696272016525
432 | 0,1.09455979165081,-0.3171514027456
433 | 1,5.85232430912185,7.51292054362969
434 | 1,-8.7572670015153,-5.29486930179609
435 | 1,-8.71995807316605,2.73843845779256
436 | 1,-10.469306922401,0.677912104009433
437 | 0,-0.238834278775273,2.66637814479462
438 | 1,-9.97402925726769,0.516751990249783
439 | 0,4.52499972514605,-0.210723879797662
440 | 1,-9.50349393933319,-4.90914644583363
441 | 0,3.09292443183826,0.691094755846385
442 | 1,6.52451117529468,-6.16749398471218
443 | 1,-7.89142284201564,6.22771592008867
444 | 0,1.34160906824512,-4.29457210905755
445 | 0,3.37690030445055,3.23823171065721
446 | 0,-1.74756460040412,0.115161636705417
447 | 1,2.09259891027732,10.7133984068071
448 | 0,-1.55897832550468,-0.54500742923725
449 | 1,-10.1510948130152,0.705391307620071
450 | 1,9.87819060488774,1.56149257723905
451 | 1,0.664193123754842,-10.3450571304519
452 | 1,-2.16919495074005,11.5422720203636
453 | 1,-10.5964911037683,-4.48693512558455
454 | 0,-0.322151945415054,-0.149942682756622
455 | 1,10.0468947277568,-1.69126438563845
456 | 1,-7.20530950510333,8.35002058558393
457 | 1,-10.199068740954,1.69761763138409
458 | 0,-0.886715286531061,-1.15315250839658
459 | 0,1.00028439905777,4.78286803165714
460 | 0,-0.796134471471364,3.3803218609938
461 | 0,0.879628835052097,-0.347048944632013
462 | 1,3.316360621084,-9.29064304093852
463 | 1,9.52490785592518,0.768107493010713
464 | 0,-3.6179464683711,3.08367245980074
465 | 0,-2.50129521328432,2.16638503017564
466 | 1,3.50796070658719,-8.82085259291101
467 | 0,1.7874192589039,-1.28557477831587
468 | 0,1.47090174592303,-1.49000209727276
469 | 1,2.187792192926,10.98122734842
470 | 0,3.18521173851152,1.64120290792636
471 | 0,0.0804722743095204,-0.00491169081515815
472 | 1,2.72292660820893,10.3267262991723
473 | 1,-5.50481123132574,10.8808726886333
474 | 1,-0.752106622674431,-11.9449770648393
475 | 0,0.597743773625992,0.118823361672357
476 | 1,7.74804187657779,-2.64517373831667
477 | 1,9.41725961538244,4.47617853340803
478 | 1,8.7005416319304,1.46344559888605
479 | 1,2.13873428675928,-9.99996555050612
480 | 1,6.23703955182193,-8.33833450626592
481 | 1,-7.34246515228146,-7.28411491888873
482 | 1,8.19256506542502,4.66499857335704
483 | 0,1.08386770608604,2.04170123691303
484 | 0,0.792918657651094,-0.665874555793935
485 | 1,8.01455681864082,-6.91718657478915
486 | 1,-6.60565490957551,-6.77312146274872
487 | 0,-0.698826810679956,-0.132635132827256
488 | 1,-8.07291096364784,3.60841779203007
489 | 0,-2.86976134371763,1.61068512039112
490 | 0,-0.153519843634597,0.143814571936753
491 | 0,0.751526133391785,4.83144598607007
492 | 1,2.30364533084953,9.63050615755962
493 | 1,-8.78162413153049,0.604223388050903
494 | 1,4.40980981866989,7.88898941741628
495 | 1,7.47449640415546,-4.77502520934164
496 | 1,-4.74145482074682,7.30976083268096
497 | 0,-0.909361015237169,-2.24009871761788
498 | 0,0.509643813824878,1.72612970266538
499 | 1,7.53013498254882,-7.65865946560226
500 | 0,-0.0925880663722012,-0.111777543954269
501 |
--------------------------------------------------------------------------------
/simdata/saturn_data_train.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/jasonbaldridge/try-tf/b9c24949c26c6f36bfec0466bfa951e1e903d3c3/simdata/saturn_data_train.jpg
--------------------------------------------------------------------------------
/softmax-python3.py:
--------------------------------------------------------------------------------
1 | import tensorflow.python.platform
2 |
3 | import numpy as np
4 | import tensorflow as tf
5 |
6 | import plot_boundary_on_data
7 |
8 | # Global variables.
9 | NUM_LABELS = 2 # The number of labels.
10 | BATCH_SIZE = 100 # The number of training examples to use per training step.
11 |
12 | # Define the flags useable from the command line.
13 | tf.app.flags.DEFINE_string('train', None,
14 | 'File containing the training data (labels & features).')
15 | tf.app.flags.DEFINE_string('test', None,
16 | 'File containing the test data (labels & features).')
17 | tf.app.flags.DEFINE_integer('num_epochs', 1,
18 |                             'Number of passes over the '
19 |                             'training data.')
20 | tf.app.flags.DEFINE_boolean('verbose', False, 'Produce verbose output.')
21 | tf.app.flags.DEFINE_boolean('plot', True, 'Plot the final decision boundary on the data.')
22 | FLAGS = tf.app.flags.FLAGS
23 |
24 | # Extract numpy representations of the labels and features given rows consisting of:
25 | # label, feat_0, feat_1, ..., feat_n
26 | def extract_data(filename):
27 |
28 | # Arrays to hold the labels and feature vectors.
29 | labels = []
30 | fvecs = []
31 |
32 | # Iterate over the rows, splitting the label from the features. Convert labels
33 | # to integers and features to floats.
34 | for line in open(filename):
35 | row = line.split(",")
36 | labels.append(int(row[0]))
37 | fvecs.append([float(x) for x in row[1:]])
38 |
39 | # Convert the array of float arrays into a numpy float matrix.
40 | fvecs_np = np.matrix(fvecs).astype(np.float32)
41 |
42 | # Convert the array of int labels into a numpy array.
43 | labels_np = np.array(labels).astype(dtype=np.uint8)
44 |
45 | # Convert the int numpy array into a one-hot matrix.
46 | labels_onehot = (np.arange(NUM_LABELS) == labels_np[:, None]).astype(np.float32)
47 |
48 | # Return a pair of the feature matrix and the one-hot label matrix.
49 | return fvecs_np,labels_onehot
50 |
51 | def main(argv=None):
52 | # Be verbose?
53 | verbose = FLAGS.verbose
54 |
55 | # Plot?
56 | plot = FLAGS.plot
57 |
58 | # Get the data.
59 | train_data_filename = FLAGS.train
60 | test_data_filename = FLAGS.test
61 |
62 | # Extract it into numpy matrices.
63 | train_data,train_labels = extract_data(train_data_filename)
64 | test_data, test_labels = extract_data(test_data_filename)
65 |
66 | # Get the shape of the training data.
67 | train_size,num_features = train_data.shape
68 |
69 | # Get the number of epochs for training.
70 | num_epochs = FLAGS.num_epochs
71 |
72 | # This is where training samples and labels are fed to the graph.
73 | # These placeholder nodes will be fed a batch of training data at each
74 | # training step using the {feed_dict} argument to the Run() call below.
75 | x = tf.placeholder("float", shape=[None, num_features])
76 | y_ = tf.placeholder("float", shape=[None, NUM_LABELS])
77 |
78 | # These are the weights that inform how much each feature contributes to
79 | # the classification.
80 | W = tf.Variable(tf.zeros([num_features,NUM_LABELS]))
81 | b = tf.Variable(tf.zeros([NUM_LABELS]))
82 | y = tf.nn.softmax(tf.matmul(x,W) + b)
83 |
84 | # Optimization.
85 | cross_entropy = -tf.reduce_sum(y_*tf.log(y))
86 | train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)
87 |
88 |
89 | # For the test data, hold the entire dataset in one constant node.
90 | test_data_node = tf.constant(test_data)
91 |
92 | # Evaluation.
93 |     predicted_class = tf.argmax(y,1)
94 | correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))
95 | accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
96 |
97 | # Create a local session to run this computation.
98 | with tf.Session() as s:
99 |
100 | # Run all the initializers to prepare the trainable parameters.
101 | tf.global_variables_initializer().run()
102 |
103 | # Iterate and train.
104 | for step in range(num_epochs * train_size // BATCH_SIZE):
105 |
106 | offset = (step * BATCH_SIZE) % train_size
107 |
108 | # get a batch of data
109 | batch_data = train_data[offset:(offset + BATCH_SIZE), :]
110 | batch_labels = train_labels[offset:(offset + BATCH_SIZE)]
111 |
112 | # feed data into the model
113 | train_step.run(feed_dict={x: batch_data, y_: batch_labels})
114 |
115 |
116 |
117 | # Give very detailed output.
118 |         if verbose:
119 |             print()
120 |             print('Weight matrix.')
121 |             print(s.run(W))
122 |             print()
123 |             print('Bias vector.')
124 |             print(s.run(b))
125 |             print()
126 |             print("Applying model to first test instance.")
127 |             first = test_data[:1]
128 |             print("Point =", first)
129 |             print("Wx+b = ", s.run(tf.matmul(first,W)+b))
130 |             print("softmax(Wx+b) = ", s.run(tf.nn.softmax(tf.matmul(first,W)+b)))
131 |             print()
132 |
133 |         print("Accuracy:", accuracy.eval(feed_dict={x: test_data, y_: test_labels}))
134 |
135 | if plot:
136 |             eval_fun = lambda X: predicted_class.eval(feed_dict={x:X})
137 | plot_boundary_on_data.plot(test_data, test_labels, eval_fun)
138 |
139 | if __name__ == '__main__':
140 | tf.app.run()
141 |
--------------------------------------------------------------------------------
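The densest line in extract_data is the one-hot conversion. A self-contained illustration of the broadcasting trick it uses (the values here are made up for the example):

import numpy as np

NUM_LABELS = 2
labels_np = np.array([0, 1, 1], dtype=np.uint8)

# np.arange(NUM_LABELS) is [0, 1]; comparing it against the column vector
# labels_np[:, None] broadcasts to a boolean matrix with one row per example,
# which the cast then turns into float one-hot rows.
labels_onehot = (np.arange(NUM_LABELS) == labels_np[:, None]).astype(np.float32)
print(labels_onehot)
# [[1. 0.]
#  [0. 1.]
#  [0. 1.]]

The script itself is driven entirely by the flags; a typical invocation (file names assumed from simdata/) would be: python softmax-python3.py --train simdata/linear_data_train.csv --test simdata/linear_data_eval.csv --num_epochs 5 --verbose True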
/softmax.py:
--------------------------------------------------------------------------------
1 | import tensorflow.python.platform
2 |
3 | import numpy as np
4 | import tensorflow as tf
5 |
6 | import plot_boundary_on_data
7 |
8 | # Global variables.
9 | NUM_LABELS = 2 # The number of labels.
10 | BATCH_SIZE = 100 # The number of training examples to use per training step.
11 |
12 | # Define the flags useable from the command line.
13 | tf.app.flags.DEFINE_string('train', None,
14 | 'File containing the training data (labels & features).')
15 | tf.app.flags.DEFINE_string('test', None,
16 | 'File containing the test data (labels & features).')
17 | tf.app.flags.DEFINE_integer('num_epochs', 1,
18 |                             'Number of passes over the '
19 |                             'training data.')
20 | tf.app.flags.DEFINE_boolean('verbose', False, 'Produce verbose output.')
21 | tf.app.flags.DEFINE_boolean('plot', True, 'Plot the final decision boundary on the data.')
22 | FLAGS = tf.app.flags.FLAGS
23 |
24 | # Extract numpy representations of the labels and features given rows consisting of:
25 | # label, feat_0, feat_1, ..., feat_n
26 | def extract_data(filename):
27 |
28 | # Arrays to hold the labels and feature vectors.
29 | labels = []
30 | fvecs = []
31 |
32 | # Iterate over the rows, splitting the label from the features. Convert labels
33 | # to integers and features to floats.
34 | for line in file(filename):
35 | row = line.split(",")
36 | labels.append(int(row[0]))
37 | fvecs.append([float(x) for x in row[1:]])
38 |
39 | # Convert the array of float arrays into a numpy float matrix.
40 | fvecs_np = np.matrix(fvecs).astype(np.float32)
41 |
42 | # Convert the array of int labels into a numpy array.
43 | labels_np = np.array(labels).astype(dtype=np.uint8)
44 |
45 | # Convert the int numpy array into a one-hot matrix.
46 | labels_onehot = (np.arange(NUM_LABELS) == labels_np[:, None]).astype(np.float32)
47 |
48 | # Return a pair of the feature matrix and the one-hot label matrix.
49 | return fvecs_np,labels_onehot
50 |
51 | def main(argv=None):
52 | # Be verbose?
53 | verbose = FLAGS.verbose
54 |
55 | # Plot?
56 | plot = FLAGS.plot
57 |
58 | # Get the data.
59 | train_data_filename = FLAGS.train
60 | test_data_filename = FLAGS.test
61 |
62 | # Extract it into numpy matrices.
63 | train_data,train_labels = extract_data(train_data_filename)
64 | test_data, test_labels = extract_data(test_data_filename)
65 |
66 | # Get the shape of the training data.
67 | train_size,num_features = train_data.shape
68 |
69 | # Get the number of epochs for training.
70 | num_epochs = FLAGS.num_epochs
71 |
72 | # This is where training samples and labels are fed to the graph.
73 | # These placeholder nodes will be fed a batch of training data at each
74 | # training step using the {feed_dict} argument to the Run() call below.
75 | x = tf.placeholder("float", shape=[None, num_features])
76 | y_ = tf.placeholder("float", shape=[None, NUM_LABELS])
77 |
78 | # These are the weights that inform how much each feature contributes to
79 | # the classification.
80 | W = tf.Variable(tf.zeros([num_features,NUM_LABELS]))
81 | b = tf.Variable(tf.zeros([NUM_LABELS]))
82 | y = tf.nn.softmax(tf.matmul(x,W) + b)
83 |
84 | # Optimization.
85 | cross_entropy = -tf.reduce_sum(y_*tf.log(y))
86 | train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)
87 |
88 |
89 | # For the test data, hold the entire dataset in one constant node.
90 | test_data_node = tf.constant(test_data)
91 |
92 | # Evaluation.
93 |     predicted_class = tf.argmax(y,1)
94 | correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))
95 | accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
96 |
97 | # Create a local session to run this computation.
98 | with tf.Session() as s:
99 |
100 | # Run all the initializers to prepare the trainable parameters.
101 | tf.initialize_all_variables().run()
102 |
103 | # Iterate and train.
104 |         for step in xrange(num_epochs * train_size // BATCH_SIZE):
105 |
106 | offset = (step * BATCH_SIZE) % train_size
107 |
108 | # get a batch of data
109 | batch_data = train_data[offset:(offset + BATCH_SIZE), :]
110 | batch_labels = train_labels[offset:(offset + BATCH_SIZE)]
111 |
112 | # feed data into the model
113 | train_step.run(feed_dict={x: batch_data, y_: batch_labels})
114 |
115 |
116 |
117 | # Give very detailed output.
118 | if verbose:
119 | print
120 | print 'Weight matrix.'
121 | print s.run(W)
122 | print
123 | print 'Bias vector.'
124 | print s.run(b)
125 | print
126 | print "Applying model to first test instance."
127 | first = test_data[:1]
128 | print "Point =", first
129 | print "Wx+b = ", s.run(tf.matmul(first,W)+b)
130 | print "softmax(Wx+b) = ", s.run(tf.nn.softmax(tf.matmul(first,W)+b))
131 | print
132 |
133 | print "Accuracy:", accuracy.eval(feed_dict={x: test_data, y_: test_labels})
134 |
135 | if plot:
136 |             eval_fun = lambda X: predicted_class.eval(feed_dict={x:X})
137 | plot_boundary_on_data.plot(test_data, test_labels, eval_fun)
138 |
139 | if __name__ == '__main__':
140 | tf.app.run()
141 |
--------------------------------------------------------------------------------
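One detail worth spelling out for both softmax variants: the total number of gradient-descent steps is num_epochs * train_size // BATCH_SIZE. With the 1,000-row moon training set above and BATCH_SIZE = 100, one epoch is exactly 10 minibatch steps (5 for the 500-row saturn set), so --num_epochs 5 runs 50 steps on the moon data. The modulo in offset = (step * BATCH_SIZE) % train_size wraps the batch window back to the start of the data at each new epoch.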
/softmax_checkpoints.py:
--------------------------------------------------------------------------------
1 | import tensorflow.python.platform
2 |
3 | import numpy as np
4 | import tensorflow as tf
5 | import os
6 |
7 | # Global variables.
8 | NUM_LABELS = 2 # The number of labels.
9 | BATCH_SIZE = 100 # The number of training examples to use per training step.
10 |
11 |
12 | # Define the flags useable from the command line.
13 | tf.app.flags.DEFINE_string('train', None,
14 | 'File containing the training data (labels & features).')
15 | tf.app.flags.DEFINE_string('test', None,
16 | 'File containing the test data (labels & features).')
17 |
18 | tf.app.flags.DEFINE_integer('num_epochs', 1,
19 |                             'Number of passes over the '
20 |                             'training data.')
21 | tf.app.flags.DEFINE_boolean('verbose', False, 'Produce verbose output.')
22 | tf.app.flags.DEFINE_string('checkpoint_dir', 'saved_models',
23 | 'Directory in which to save model checkpoints')
24 | tf.app.flags.DEFINE_boolean('training', True, 'If True, train a model; if False, restore a saved checkpoint and evaluate it.')
25 | FLAGS = tf.app.flags.FLAGS
26 |
27 | # Extract numpy representations of the labels and features given rows consisting of:
28 | # label, feat_0, feat_1, ..., feat_n
29 | def extract_data(filename):
30 |
31 | # Arrays to hold the labels and feature vectors.
32 | labels = []
33 | fvecs = []
34 |
35 | # Iterate over the rows, splitting the label from the features. Convert labels
36 | # to integers and features to floats.
37 | for line in open(filename):
38 | row = line.split(",")
39 | labels.append(int(row[0]))
40 | fvecs.append([float(x) for x in row[1:]])
41 |
42 | # Convert the array of float arrays into a numpy float matrix.
43 | fvecs_np = np.matrix(fvecs).astype(np.float32)
44 |
45 | # Convert the array of int labels into a numpy array.
46 | labels_np = np.array(labels).astype(dtype=np.uint8)
47 |
48 | # Convert the int numpy array into a one-hot matrix.
49 | labels_onehot = (np.arange(NUM_LABELS) == labels_np[:, None]).astype(np.float32)
50 |
51 | # Return a pair of the feature matrix and the one-hot label matrix.
52 | return fvecs_np,labels_onehot
53 |
54 | def main(argv=None):
55 | # Be verbose?
56 | verbose = FLAGS.verbose
57 |
58 | # Get the data.
59 | train_data_filename = FLAGS.train
60 | test_data_filename = FLAGS.test
61 |
62 | # Extract it into numpy matrices.
63 | train_data,train_labels = extract_data(train_data_filename)
64 | test_data, test_labels = extract_data(test_data_filename)
65 |
66 | # Get the shape of the training data.
67 | train_size,num_features = train_data.shape
68 |
69 | # Get the number of epochs for training.
70 | num_epochs = FLAGS.num_epochs
71 |
72 | # This is where training samples and labels are fed to the graph.
73 | # These placeholder nodes will be fed a batch of training data at each
74 | # training step using the {feed_dict} argument to the Run() call below.
75 | x = tf.placeholder("float", shape=[None, num_features])
76 | y_ = tf.placeholder("float", shape=[None, NUM_LABELS])
77 |
78 | # These are the weights that inform how much each feature contributes to
79 | # the classification.
80 | W = tf.Variable(tf.zeros([num_features,NUM_LABELS]))
81 | b = tf.Variable(tf.zeros([NUM_LABELS]))
82 | y = tf.nn.softmax(tf.matmul(x,W) + b)
83 |
84 | # Optimization.
85 | cross_entropy = -tf.reduce_sum(y_*tf.log(y))
86 | train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)
87 |
88 |
89 | # Hold the entire test set in one constant node (unused below: evaluation feeds test_data through the x placeholder instead).
90 | test_data_node = tf.constant(test_data)
91 |
92 |
93 | # Evaluation.
94 | correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))
95 | accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
96 |
97 | saver = tf.train.Saver()
98 |
99 | # Create a local session to run this computation.
100 | with tf.Session() as s:
101 |
102 | # Run all the initializers to prepare the trainable parameters.
103 | tf.initialize_all_variables().run()
104 |
105 | epoch = 0
106 |
107 | if FLAGS.training:
108 |
109 | # Iterate and train.
110 | for step in xrange(num_epochs * train_size // BATCH_SIZE):
111 |
112 | if step % (train_size // BATCH_SIZE) == 0:  # one epoch = train_size // BATCH_SIZE steps
113 | saver.save(s, 'model_epoch.ckpt', global_step=epoch)
114 | epoch += 1
115 |
116 | offset = (step * BATCH_SIZE) % train_size
117 |
118 | # get a batch of data
119 | batch_data = train_data[offset:(offset + BATCH_SIZE), :]
120 | batch_labels = train_labels[offset:(offset + BATCH_SIZE)]
121 |
122 | # feed data into the model
123 | train_step.run(feed_dict={x: batch_data, y_: batch_labels})
124 |
125 | else:
126 |
127 | saver.restore(s, 'model_epoch.ckpt-9')  # the '-9' suffix is the global_step passed to saver.save
128 |
129 | model_accuracy = accuracy.eval(feed_dict={x: test_data, y_: test_labels})
130 |
131 | print "Accuracy: ", model_accuracy
132 |
133 |
134 |
135 | if __name__ == '__main__':
136 | tf.app.run()
137 |
--------------------------------------------------------------------------------
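
softmax_checkpoints.py above defines a --checkpoint_dir flag and imports os, but never uses either: checkpoints are saved to, and restored from, hard-coded paths in the working directory. A minimal sketch of wiring the flag in; the helper names are illustrative, not part of the repo, and it assumes the directory should be created on first use:

    import os
    import tensorflow as tf

    def save_epoch_checkpoint(saver, sess, epoch, checkpoint_dir):
        # Saver.save does not create the target directory, so create it here.
        if not os.path.isdir(checkpoint_dir):
            os.makedirs(checkpoint_dir)
        return saver.save(sess, os.path.join(checkpoint_dir, 'model_epoch.ckpt'),
                          global_step=epoch)

    def restore_latest_checkpoint(saver, sess, checkpoint_dir):
        # Consult the directory's 'checkpoint' index file for the newest save
        # instead of hard-coding a '-9' suffix.
        ckpt = tf.train.get_checkpoint_state(checkpoint_dir)
        if ckpt and ckpt.model_checkpoint_path:
            saver.restore(sess, ckpt.model_checkpoint_path)
            return True
        return False
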
/truncnorm_hidden.py:
--------------------------------------------------------------------------------
1 | import tensorflow.python.platform
2 |
3 | import numpy as np
4 | import tensorflow as tf
5 |
6 | # Global variables.
7 | NUM_LABELS = 2 # The number of labels.
8 | BATCH_SIZE = 100 # The number of training examples to use per training step.
9 | SEED = None        # Seed for the weight initializers; None gives a fresh random seed each run.
10 |
11 | tf.app.flags.DEFINE_string('train', None,
12 | 'File containing the training data (labels & features).')
13 | tf.app.flags.DEFINE_string('test', None,
14 | 'File containing the test data (labels & features).')
15 | tf.app.flags.DEFINE_integer('num_epochs', 1,
16 | 'Number of passes over the training data.')
17 | tf.app.flags.DEFINE_integer('num_hidden', 1,
18 | 'Number of nodes in the hidden layer.')
19 | tf.app.flags.DEFINE_boolean('verbose', False, 'Produce verbose output.')
20 | FLAGS = tf.app.flags.FLAGS
21 |
22 | # Extract numpy representations of the labels and features given rows consisting of:
23 | # label, feat_0, feat_1, ..., feat_n
24 | def extract_data(filename):
25 |
26 | # Arrays to hold the labels and feature vectors.
27 | labels = []
28 | fvecs = []
29 |
30 | # Iterate over the rows, splitting the label from the features. Convert labels
31 | # to integers and features to floats.
32 | for line in open(filename):
33 | row = line.split(",")
34 | labels.append(int(row[0]))
35 | fvecs.append([float(x) for x in row[1:]])
36 |
37 | # Convert the array of float arrays into a numpy float matrix.
38 | fvecs_np = np.matrix(fvecs).astype(np.float32)
39 |
40 | # Convert the array of int labels into a numpy array.
41 | labels_np = np.array(labels).astype(dtype=np.uint8)
42 |
43 | # Convert the int numpy array into a one-hot matrix.
44 | labels_onehot = (np.arange(NUM_LABELS) == labels_np[:, None]).astype(np.float32)
45 |
46 | # Return a pair of the feature matrix and the one-hot label matrix.
47 | return fvecs_np,labels_onehot
48 |
49 | def main(argv=None):
50 | # Be verbose?
51 | verbose = FLAGS.verbose
52 |
53 | # Get the data.
54 | train_data_filename = FLAGS.train
55 | test_data_filename = FLAGS.test
56 |
57 | # Extract it into numpy arrays.
58 | train_data,train_labels = extract_data(train_data_filename)
59 | test_data, test_labels = extract_data(test_data_filename)
60 |
61 | # Get the shape of the training data.
62 | train_size,num_features = train_data.shape
63 |
64 | # Get the number of epochs for training.
65 | num_epochs = FLAGS.num_epochs
66 |
67 | # Get the size of layer one.
68 | num_hidden = FLAGS.num_hidden
69 |
70 | # This is where training samples and labels are fed to the graph.
71 | # These placeholder nodes will be fed a batch of training data at each
72 | # training step using the {feed_dict} argument to the Run() call below.
73 | x = tf.placeholder("float", shape=[None, num_features])
74 | y_ = tf.placeholder("float", shape=[None, NUM_LABELS])
75 |
76 | # Hold the entire test set in one constant node (unused below: evaluation feeds test_data through the x placeholder instead).
77 | test_data_node = tf.constant(test_data)
78 |
79 | # Define and initialize the network.
80 |
81 | # Initialize the hidden weights and biases.
82 | w_hidden = tf.Variable(
83 | tf.truncated_normal([num_features, num_hidden],
84 | stddev=0.1,
85 | seed=SEED))
86 |
87 | b_hidden = tf.Variable(tf.constant(0.1, shape=[num_hidden]))
88 |
89 | # The hidden layer.
90 | hidden = tf.nn.relu(tf.matmul(x,w_hidden) + b_hidden)
91 |
92 | # Initialize the output weights and biases.
93 | w_out = tf.Variable(
94 | tf.truncated_normal([num_hidden, NUM_LABELS],
95 | stddev=0.1,
96 | seed=SEED))
97 |
98 | b_out = tf.Variable(tf.constant(0.1, shape=[NUM_LABELS]))
99 |
100 | # The output layer.
101 | y = tf.nn.softmax(tf.matmul(hidden, w_out) + b_out)
102 |
103 | # Optimization.
104 | cross_entropy = -tf.reduce_sum(y_*tf.log(y))
105 | train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)
106 |
107 | # Evaluation.
108 | correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))
109 | accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
110 |
111 | # Create a local session to run this computation.
112 | with tf.Session() as s:
113 | # Run all the initializers to prepare the trainable parameters.
114 | tf.initialize_all_variables().run()
115 | if verbose:
116 | print 'Initialized!'
117 | print
118 | print 'Training.'
119 |
120 | # Iterate and train.
121 | for step in xrange(num_epochs * train_size // BATCH_SIZE):
122 | if verbose:
123 | print step,
124 |
125 | offset = (step * BATCH_SIZE) % train_size
126 | batch_data = train_data[offset:(offset + BATCH_SIZE), :]
127 | batch_labels = train_labels[offset:(offset + BATCH_SIZE)]
128 | train_step.run(feed_dict={x: batch_data, y_: batch_labels})
129 | if verbose and offset >= train_size-BATCH_SIZE:
130 | print
131 | print "Accuracy:", accuracy.eval(feed_dict={x: test_data, y_: test_labels})
132 |
133 | if __name__ == '__main__':
134 | tf.app.run()
135 |
--------------------------------------------------------------------------------
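
All three scripts share the same broadcasting trick for one-hot encoding in extract_data; a worked example with NUM_LABELS = 2 makes the shapes explicit:

    import numpy as np

    NUM_LABELS = 2
    labels_np = np.array([0, 1, 1], dtype=np.uint8)

    # labels_np[:, None] has shape (3, 1); np.arange(NUM_LABELS) has shape (2,).
    # Broadcasting the equality yields a (3, 2) boolean matrix with one True
    # per row, which casts to a one-hot float matrix.
    labels_onehot = (np.arange(NUM_LABELS) == labels_np[:, None]).astype(np.float32)

    print(labels_onehot)
    # [[1, 0],
    #  [0, 1],
    #  [0, 1]]  (as float32)

For reference, a typical invocation of truncnorm_hidden.py against the moon data shipped in simdata/ might look like the following (the epoch and hidden-node counts here are illustrative):

    python truncnorm_hidden.py --train simdata/moon_data_train.csv \
        --test simdata/moon_data_eval.csv --num_epochs 100 --num_hidden 4 --verbose
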