├── .gitignore
├── CONTRIBUTORS
├── LICENSE
├── README.md
├── deep_learning
│   ├── example1
│   │   ├── airplane.jpg
│   │   ├── example1.go
│   │   ├── gopher.jpg
│   │   └── pug.jpg
│   ├── example2
│   │   └── example2.go
│   └── example3
│       └── example3.go
├── linear_regression
│   ├── data
│   │   └── advertising.csv
│   ├── example1
│   │   └── example1.go
│   ├── example2
│   │   └── example2.go
│   ├── example3
│   │   └── example3.go
│   ├── example4
│   │   └── example4.go
│   ├── example5
│   │   └── example5.go
│   └── example6
│       └── example6.go
├── logistic_regression
│   ├── data
│   │   └── loan_data.csv
│   ├── example1
│   │   └── example1.go
│   ├── example2
│   │   └── example2.go
│   ├── example3
│   │   └── example3.go
│   ├── example4
│   │   └── example4.go
│   ├── example5
│   │   └── example5.go
│   ├── example6
│   │   └── example6.go
│   └── example7
│       └── example7.go
├── neural_networks
│   ├── data
│   │   ├── test.csv
│   │   └── train.csv
│   ├── example1
│   │   └── example1.go
│   └── example2
│       └── example2.go
└── projects.md
/.gitignore:
--------------------------------------------------------------------------------
1 | "vim
2 | *.sw*
--------------------------------------------------------------------------------
/CONTRIBUTORS:
--------------------------------------------------------------------------------
1 | # This is a list of people who have contributed to the go-ml
2 | # repository. It is maintained on a best-effort basis. If you
3 | # have made a contribution but do not appear on this list,
4 | # please open a pull request.
5 |
6 | # Names should be added to this file like so:
7 | # Name
8 |
9 | # Please keep the list sorted.
10 |
11 | Daniel Whitenack
12 | Miriah Peterson
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | Apache License
2 | Version 2.0, January 2004
3 | http://www.apache.org/licenses/
4 |
5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
6 |
7 | 1. Definitions.
8 |
9 | "License" shall mean the terms and conditions for use, reproduction,
10 | and distribution as defined by Sections 1 through 9 of this document.
11 |
12 | "Licensor" shall mean the copyright owner or entity authorized by
13 | the copyright owner that is granting the License.
14 |
15 | "Legal Entity" shall mean the union of the acting entity and all
16 | other entities that control, are controlled by, or are under common
17 | control with that entity. For the purposes of this definition,
18 | "control" means (i) the power, direct or indirect, to cause the
19 | direction or management of such entity, whether by contract or
20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the
21 | outstanding shares, or (iii) beneficial ownership of such entity.
22 |
23 | "You" (or "Your") shall mean an individual or Legal Entity
24 | exercising permissions granted by this License.
25 |
26 | "Source" form shall mean the preferred form for making modifications,
27 | including but not limited to software source code, documentation
28 | source, and configuration files.
29 |
30 | "Object" form shall mean any form resulting from mechanical
31 | transformation or translation of a Source form, including but
32 | not limited to compiled object code, generated documentation,
33 | and conversions to other media types.
34 |
35 | "Work" shall mean the work of authorship, whether in Source or
36 | Object form, made available under the License, as indicated by a
37 | copyright notice that is included in or attached to the work
38 | (an example is provided in the Appendix below).
39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. 
You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ![Alt text](https://docs.google.com/drawings/d/e/2PACX-1vT37glyZXd8ViXedt0LCSpzsbWCUSSLhWuR3o5_74tL92fh7zeIo3hVtCzhnpw8IeWAM-KcI419cIkm/pub?w=745&h=310) 2 | 3 | # Machine Learning and AI with Go 4 | 5 | These materials are for the 2019 "Machine learning and AI with Go" workshop at GopherCon. 
The workshop provides an intensive and idiomatic view on training, utilizing, evaluating, and deploying machine learning and AI models using Go. This class is perfect for gophers wanting an approachable, code-first introduction to ML/AI. 6 | 7 | - [Slides from the workshop](https://docs.google.com/presentation/d/1igJntH89r0qT3BhD-91AewOKz9CZ9FWfOmmicxino7k/edit?usp=sharing) 8 | - Instructors 9 | - Daniel Whitenack - [website/blog](http://www.datadan.io/), [twitter](https://twitter.com/dwhitena), [github](https://github.com/dwhitena) 10 | - Miriah Peterson - [twitter](https://twitter.com/captainnobody1), [github](https://github.com/Soypete) 11 | - During the workshop, you will also need to work a bit at the command line. If you are new to the command line or need a refresher, look through [this guide](http://bit.ly/2SytJAR) or [this quick tutorial](https://lifehacker.com/5633909/who-needs-a-mouse-learn-to-use-the-command-line-for-almost-anything). 12 | 13 | *Note: This material has been designed to be taught in a classroom environment at GopherCon. The code is well commented but missing some of the contextual concepts and ideas that will be covered in class.* 14 | 15 | ## Agenda 16 | 17 | 9:00-9:30 Introductions, Logistics 🎤 18 | 9:30-10:45 [Introduction to ML/AI](#introduction-to-machine-learning-and-ai-) 🧠 19 | 10:45-11:00 Break ☕ 20 | 11:00-12:00 [Linear and Logistic Regression](#linear-regression-) 📈 21 | 12:00-1:15 Lunch 🍕 22 | 1:15-2:45 [Neural Networks, Deep Learning](#neural-networks-) 🤖 23 | 2:45-3:00 Break ☕ 24 | 3:00-4:30 [Hands on](#hands-on-%EF%B8%8F) ⌨️ 25 | 4:30-5:00 Next steps, Conclusions 💥 26 | 27 | ## Introduction to Machine Learning and AI 🧠 28 | 29 | This session introduces the basics of Machine Learning and AI along with a whole bunch of related jargon. Are ML and AI the same thing or are they different? What differentiates them from traditional software engineering? How does a machine learn anyway? We will answer these questions and many more, such that you can communicate well with data science teams and understand the steps involved in most machine learning workflows. 30 | 31 | [See slides](https://docs.google.com/presentation/d/1igJntH89r0qT3BhD-91AewOKz9CZ9FWfOmmicxino7k/edit?usp=sharing) 32 | 33 | ## Linear Regression 📈 34 | 35 | There are a couple fundamental AI building blocks that show up all over AI applications. These are _linear regression_ and _gradient descent_. In this session, we will explore these building blocks by implementing them from scratch in Go. We will then utilize the `github.com/sajari/regression` package to extend our linear regression model to a multiple regression model. 36 | 37 | Linear regression code examples: 38 | 39 | - [Example 1: Profiling advertising/sales data](linear_regression/example1/example1.go) 40 | - [Example 2: Splitting the data into training/test](linear_regression/example2/example2.go) 41 | - [Example 3: Training a linear regression model, SGD](linear_regression/example3/example3.go) 42 | - [Example 4: Training a single regression model w/ github.com/sajari/regression](linear_regression/example4/example4.go) 43 | - [Example 5: Training a multiple regression model w/ github.com/sajari/regression](linear_regression/example5/example5.go) 44 | - [Example 6: Evaluating regression models](linear_regression/example6/example6.go) 45 | 46 | ## Logistic Regression 📈 47 | 48 | Linear regression is great, but there's only so much you can do with it. 
It is, after all, limited to modeling rather simple relationships in which one quantity is proportional to changes in another. Not all relationships are like that. In particular, classification problems require us to model discrete states or categories, not continuous values. In this session, we will move from linear regression to _logistic regression_ so that we can model these non-linear, categorical relationships.
49 |
50 | Logistic regression code examples:
51 |
52 | - [Example 1: Plotting a logistic function](logistic_regression/example1/example1.go)
53 | - [Example 2: "Clean" loan data](logistic_regression/example2/example2.go)
54 | - [Example 3: Profile the loan data](logistic_regression/example3/example3.go)
55 | - [Example 4: Split the data into training/test](logistic_regression/example4/example4.go)
56 | - [Example 5: Train a logistic regression model](logistic_regression/example5/example5.go)
57 | - [Example 6: Evaluate the logistic regression model](logistic_regression/example6/example6.go)
58 | - [Example 7: Train a logistic regression model w/ github.com/cdipaolo/goml](logistic_regression/example7/example7.go)
59 |
60 | ## Neural Networks 🤖
61 |
62 | Ok, in this session we finally get to the thing everyone is interested in... _neural nets_! Despite the intimidation that normally surrounds these seemingly magical, brain-like things, they aren't that much more complicated than our previous regression models. We will implement a simple neural network from scratch in Go to understand the fundamentals.
63 |
64 | Neural Network code examples:
65 |
66 | - [Example 1: Building a simple neural network](neural_networks/example1/example1.go)
67 | - [Example 2: Utilizing the simple neural network for classification](neural_networks/example2/example2.go)
68 |
69 | ## Deep Learning 🤖
70 |
71 | In this final teaching session, the concepts get very deep. We will take our basic neural network and figure out how we could modify it to be a _deep learning_ neural network (because, after we know about deep learning, we can all ask for a raise at our company). These super powerful networks can do amazing things, and it's not that difficult to leverage them in your Go applications!
72 |
73 | Deep Learning code examples:
74 |
75 | - [Example 1: Using the TensorFlow Go bindings for inference](deep_learning/example1/example1.go), Setup required:
76 |   - None beyond the TensorFlow Go bindings: the program itself downloads the [Inception model](https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip) into the directory passed via `-dir` if it is not already there
77 |   - Alternatively, download and unzip the model files into that directory ahead of time
78 | - [Example 2: Using GoCV to analyze video](deep_learning/example2/example2.go), Setup required:
79 |   - [Install GoCV](https://gocv.io/getting-started/)
80 |   - Download the TensorFlow Inception model from [here](https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip)
81 |   - Unzip the Inception model files
82 | - [Example 3: Sentiment analysis with MachineBox](deep_learning/example3/example3.go), Setup required:
83 |   - Run MachineBox's [Textbox](https://docs.veritone.com/#/developer/machine-box/boxes/textbox)
84 |
85 | ## Hands on ⌨️
86 |
87 | Based on feedback from previous GopherCon workshops, this iteration will include a couple of hands-on options for the final session:
88 |
89 | 1. Try your hand at training some of your own ML/AI models using publicly available data. [Miriah Peterson](https://github.com/Soypete) was nice enough to compile some data sets where you will be able to flex your regression, classification, and neural net muscles.
[See here for more information and instructions on getting started](projects.md).
90 | 2. Learn how infrastructure projects written in Go (Docker, Kubernetes, Pachyderm, and Minio) can be used to deploy production ML/AI model training and inference. If you choose this option, you will follow a tutorial that will walk you through deploying a production ML pipeline on top of Kubernetes. [See here for more information and instructions on getting started](https://github.com/dwhitena/pach-go-regression).
91 |
92 | The instructors and TAs will be available for questions during this time, but you can also follow up with the instructors during the rest of GopherCon or on Gophers Slack.
93 |
94 |
95 |
96 | ___
97 | All material is licensed under the [Apache License Version 2.0, January 2004](http://www.apache.org/licenses/LICENSE-2.0).
98 |
--------------------------------------------------------------------------------
/deep_learning/example1/airplane.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/dwhitena/gc-ml/6d945fc9be1264a260a8f5dcb9cbe617790f4d94/deep_learning/example1/airplane.jpg
--------------------------------------------------------------------------------
/deep_learning/example1/example1.go:
--------------------------------------------------------------------------------
1 | package main
2 |
3 | import (
4 | "archive/zip"
5 | "bufio"
6 | "flag"
7 | "fmt"
8 | "io"
9 | "io/ioutil"
10 | "log"
11 | "net/http"
12 | "os"
13 | "path/filepath"
14 |
15 | tf "github.com/tensorflow/tensorflow/tensorflow/go"
16 | "github.com/tensorflow/tensorflow/tensorflow/go/op"
17 | )
18 |
19 | func main() {
20 | // An example for using the TensorFlow Go API for image recognition
21 | // using a pre-trained inception model (http://arxiv.org/abs/1512.00567).
22 | //
23 | // Sample usage: <program> -dir=/tmp/modeldir -image=/path/to/some/jpeg
24 | //
25 | // The pre-trained model takes input in the form of a 4-dimensional
26 | // tensor with shape [ BATCH_SIZE, IMAGE_HEIGHT, IMAGE_WIDTH, 3 ],
27 | // where:
28 | // - BATCH_SIZE allows for inference of multiple images in one pass through the graph
29 | // - IMAGE_HEIGHT is the height of the images on which the model was trained
30 | // - IMAGE_WIDTH is the width of the images on which the model was trained
31 | // - 3 is the (R, G, B) values of the pixel colors represented as a float.
32 | //
33 | // And produces as output a vector with shape [ NUM_LABELS ].
34 | // output[i] is the probability that the input image was recognized as
35 | // having the i-th label.
36 | //
37 | // A separate file contains a list of string labels corresponding to the
38 | // integer indices of the output.
39 | //
40 | // This example:
41 | // - Loads the serialized representation of the pre-trained model into a Graph
42 | // - Creates a Session to execute operations on the Graph
43 | // - Converts an image file to a Tensor to provide as input to a Session run
44 | // - Executes the Session and prints out the label with the highest probability
45 | //
46 | // To convert an image file to a Tensor suitable for input to the Inception model,
47 | // this example:
48 | // - Constructs another TensorFlow graph to normalize the image into a
49 | // form suitable for the model (for example, resizing the image)
50 | // - Creates and executes a Session to obtain a Tensor in this normalized form.
51 | modeldir := flag.String("dir", "", "Directory containing the trained model files.
The directory will be created and the model downloaded into it if necessary")
52 | imagefile := flag.String("image", "", "Path of a JPEG-image to extract labels for")
53 | flag.Parse()
54 | if *modeldir == "" || *imagefile == "" {
55 | flag.Usage()
56 | return
57 | }
58 | // Load the serialized GraphDef from a file.
59 | modelfile, labelsfile, err := modelFiles(*modeldir)
60 | if err != nil {
61 | log.Fatal(err)
62 | }
63 | model, err := ioutil.ReadFile(modelfile)
64 | if err != nil {
65 | log.Fatal(err)
66 | }
67 |
68 | // Construct an in-memory graph from the serialized form.
69 | graph := tf.NewGraph()
70 | if err := graph.Import(model, ""); err != nil {
71 | log.Fatal(err)
72 | }
73 |
74 | // Create a session for inference over graph.
75 | session, err := tf.NewSession(graph, nil)
76 | if err != nil {
77 | log.Fatal(err)
78 | }
79 | defer session.Close()
80 |
81 | // Run inference on *imagefile.
82 | // For multiple images, session.Run() can be called in a loop (and
83 | // concurrently). Alternatively, images can be batched since the model
84 | // accepts batches of image data as input.
85 | tensor, err := makeTensorFromImage(*imagefile)
86 | if err != nil {
87 | log.Fatal(err)
88 | }
89 | output, err := session.Run(
90 | map[tf.Output]*tf.Tensor{
91 | graph.Operation("input").Output(0): tensor,
92 | },
93 | []tf.Output{
94 | graph.Operation("output").Output(0),
95 | },
96 | nil)
97 | if err != nil {
98 | log.Fatal(err)
99 | }
100 | // output[0].Value() is a vector containing probabilities of
101 | // labels for each image in the "batch". The batch size was 1.
102 | // Find the most probable label index.
103 | probabilities := output[0].Value().([][]float32)[0]
104 | printBestLabel(probabilities, labelsfile)
105 | }
106 |
107 | func printBestLabel(probabilities []float32, labelsFile string) {
108 | bestIdx := 0
109 | for i, p := range probabilities {
110 | if p > probabilities[bestIdx] {
111 | bestIdx = i
112 | }
113 | }
114 | // Found the best match. Read the string from labelsFile, which
115 | // contains one line per label.
116 | file, err := os.Open(labelsFile)
117 | if err != nil {
118 | log.Fatal(err)
119 | }
120 | defer file.Close()
121 | scanner := bufio.NewScanner(file)
122 | var labels []string
123 | for scanner.Scan() {
124 | labels = append(labels, scanner.Text())
125 | }
126 | if err := scanner.Err(); err != nil {
127 | log.Printf("ERROR: failed to read %s: %v", labelsFile, err)
128 | }
129 | fmt.Printf("BEST MATCH: (%2.0f%% likely) %s\n", probabilities[bestIdx]*100.0, labels[bestIdx])
130 | }
131 |
132 | // Convert the image in filename to a Tensor suitable as input to the Inception model.
133 | func makeTensorFromImage(filename string) (*tf.Tensor, error) {
134 | bytes, err := ioutil.ReadFile(filename)
135 | if err != nil {
136 | return nil, err
137 | }
138 | // DecodeJpeg uses a scalar String-valued tensor as input.
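// The raw JPEG bytes are therefore passed along unmodified here; the
// decoding, resizing, and normalization of the image all happen inside
// the separate normalization graph constructed below.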
139 | tensor, err := tf.NewTensor(string(bytes))
140 | if err != nil {
141 | return nil, err
142 | }
143 | // Construct a graph to normalize the image
144 | graph, input, output, err := constructGraphToNormalizeImage()
145 | if err != nil {
146 | return nil, err
147 | }
148 | // Execute that graph to normalize this one image
149 | session, err := tf.NewSession(graph, nil)
150 | if err != nil {
151 | return nil, err
152 | }
153 | defer session.Close()
154 | normalized, err := session.Run(
155 | map[tf.Output]*tf.Tensor{input: tensor},
156 | []tf.Output{output},
157 | nil)
158 | if err != nil {
159 | return nil, err
160 | }
161 | return normalized[0], nil
162 | }
163 |
164 | // The inception model takes as input the image described by a Tensor in a very
165 | // specific normalized format (a particular image size, shape of the input tensor,
166 | // normalized pixel values etc.).
167 | //
168 | // This function constructs a graph of TensorFlow operations which takes as
169 | // input a JPEG-encoded string and returns a tensor suitable as input to the
170 | // inception model.
171 | func constructGraphToNormalizeImage() (graph *tf.Graph, input, output tf.Output, err error) {
172 | // Some constants specific to the pre-trained model at:
173 | // https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip
174 | //
175 | // - The model was trained with images scaled to 224x224 pixels.
176 | // - The colors, represented as R, G, B in 1 byte each, were converted to
177 | // float using (value - Mean)/Scale.
178 | const (
179 | H, W = 224, 224
180 | Mean = float32(117)
181 | Scale = float32(1)
182 | )
183 | // - input is a String-Tensor, where the string is the JPEG-encoded image.
184 | // - The inception model takes a 4D tensor of shape
185 | // [BatchSize, Height, Width, Colors=3], where each pixel is
186 | // represented as a triplet of floats
187 | // - Apply normalization on each pixel and use ExpandDims to make
188 | // this single image be a "batch" of size 1 for ResizeBilinear.
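// Putting the pieces above together, the graph below computes
// output = (ResizeBilinear(ExpandDims(Cast(DecodeJpeg(input)))) - Mean) / Scale,
// yielding a [1, H, W, 3] float32 tensor ready for the model.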
189 | s := op.NewScope() 190 | input = op.Placeholder(s, tf.String) 191 | output = op.Div(s, 192 | op.Sub(s, 193 | op.ResizeBilinear(s, 194 | op.ExpandDims(s, 195 | op.Cast(s, 196 | op.DecodeJpeg(s, input, op.DecodeJpegChannels(3)), tf.Float), 197 | op.Const(s.SubScope("make_batch"), int32(0))), 198 | op.Const(s.SubScope("size"), []int32{H, W})), 199 | op.Const(s.SubScope("mean"), Mean)), 200 | op.Const(s.SubScope("scale"), Scale)) 201 | graph, err = s.Finalize() 202 | return graph, input, output, err 203 | } 204 | 205 | func modelFiles(dir string) (modelfile, labelsfile string, err error) { 206 | const URL = "https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip" 207 | var ( 208 | model = filepath.Join(dir, "tensorflow_inception_graph.pb") 209 | labels = filepath.Join(dir, "imagenet_comp_graph_label_strings.txt") 210 | zipfile = filepath.Join(dir, "inception5h.zip") 211 | ) 212 | if filesExist(model, labels) == nil { 213 | return model, labels, nil 214 | } 215 | log.Println("Did not find model in", dir, "downloading from", URL) 216 | if err := os.MkdirAll(dir, 0755); err != nil { 217 | return "", "", err 218 | } 219 | if err := download(URL, zipfile); err != nil { 220 | return "", "", fmt.Errorf("failed to download %v - %v", URL, err) 221 | } 222 | if err := unzip(dir, zipfile); err != nil { 223 | return "", "", fmt.Errorf("failed to extract contents from model archive: %v", err) 224 | } 225 | os.Remove(zipfile) 226 | return model, labels, filesExist(model, labels) 227 | } 228 | 229 | func filesExist(files ...string) error { 230 | for _, f := range files { 231 | if _, err := os.Stat(f); err != nil { 232 | return fmt.Errorf("unable to stat %s: %v", f, err) 233 | } 234 | } 235 | return nil 236 | } 237 | 238 | func download(URL, filename string) error { 239 | resp, err := http.Get(URL) 240 | if err != nil { 241 | return err 242 | } 243 | defer resp.Body.Close() 244 | file, err := os.OpenFile(filename, os.O_RDWR|os.O_CREATE, 0644) 245 | if err != nil { 246 | return err 247 | } 248 | defer file.Close() 249 | _, err = io.Copy(file, resp.Body) 250 | return err 251 | } 252 | 253 | func unzip(dir, zipfile string) error { 254 | r, err := zip.OpenReader(zipfile) 255 | if err != nil { 256 | return err 257 | } 258 | defer r.Close() 259 | for _, f := range r.File { 260 | src, err := f.Open() 261 | if err != nil { 262 | return err 263 | } 264 | log.Println("Extracting", f.Name) 265 | dst, err := os.OpenFile(filepath.Join(dir, f.Name), os.O_WRONLY|os.O_CREATE, 0644) 266 | if err != nil { 267 | return err 268 | } 269 | if _, err := io.Copy(dst, src); err != nil { 270 | return err 271 | } 272 | dst.Close() 273 | } 274 | return nil 275 | } 276 | -------------------------------------------------------------------------------- /deep_learning/example1/gopher.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dwhitena/gc-ml/6d945fc9be1264a260a8f5dcb9cbe617790f4d94/deep_learning/example1/gopher.jpg -------------------------------------------------------------------------------- /deep_learning/example1/pug.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dwhitena/gc-ml/6d945fc9be1264a260a8f5dcb9cbe617790f4d94/deep_learning/example1/pug.jpg -------------------------------------------------------------------------------- /deep_learning/example2/example2.go: -------------------------------------------------------------------------------- 1 | package main 
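// An example of using GoCV (via OpenCV's DNN module) to classify webcam
// frames with a TensorFlow image classification model. Sample usage,
// assuming the model and label files unzipped from the Inception archive
// referenced in the README:
//
//   go run example2.go 0 tensorflow_inception_graph.pb imagenet_comp_graph_label_strings.txt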
2 |
3 | import (
4 | "bufio"
5 | "fmt"
6 | "image"
7 | "image/color"
8 | "log"
9 | "os"
10 | "strconv"
11 |
12 | "gocv.io/x/gocv"
13 | )
14 |
15 | // readDescriptions reads the descriptions from a file
16 | // and returns a slice of its lines.
17 | func readDescriptions(path string) ([]string, error) {
18 | file, err := os.Open(path)
19 | if err != nil {
20 | return nil, err
21 | }
22 | defer file.Close()
23 |
24 | var lines []string
25 | scanner := bufio.NewScanner(file)
26 | for scanner.Scan() {
27 | lines = append(lines, scanner.Text())
28 | }
29 | return lines, scanner.Err()
30 | }
31 |
32 | func main() {
33 | if len(os.Args) < 4 {
34 | fmt.Println("How to run:\ntf-classifier [camera ID] [modelfile] [descriptionsfile]")
35 | return
36 | }
37 |
38 | // Parse the command line arguments.
39 | deviceID, err := strconv.Atoi(os.Args[1])
40 | if err != nil {
41 | log.Fatal(err)
42 | }
43 |
44 | model := os.Args[2]
45 |
46 | descriptions, err := readDescriptions(os.Args[3])
47 | if err != nil {
48 | log.Fatal(err)
49 | }
50 |
51 | // Open the video capture device.
52 | webcam, err := gocv.VideoCaptureDevice(deviceID)
53 | if err != nil {
54 | fmt.Printf("Error opening video capture device: %v\n", deviceID)
55 | return
56 | }
57 | defer webcam.Close()
58 |
59 | window := gocv.NewWindow("Tensorflow Classifier")
60 | defer window.Close()
61 |
62 | img := gocv.NewMat()
63 | defer img.Close()
64 |
65 | // Read in the TensorFlow model.
66 | net := gocv.ReadNetFromTensorflow(model)
67 | defer net.Close()
68 |
69 | status := "Ready"
70 | statusColor := color.RGBA{0, 255, 0, 0}
71 | fmt.Printf("Start reading camera device: %v\n", deviceID)
72 |
73 | // Begin reading in frames from the camera.
74 | for {
75 | if ok := webcam.Read(&img); !ok {
76 | fmt.Printf("Error cannot read device %d\n", deviceID)
77 | return
78 | }
79 | if img.Empty() {
80 | continue
81 | }
82 |
83 | // Convert the image Mat to a 224x224 blob that the classifier can analyze.
84 | blob := gocv.BlobFromImage(img, 1.0, image.Pt(224, 224), gocv.NewScalar(0, 0, 0, 0), true, false)
85 |
86 | // Feed the blob into the classifier.
87 | net.SetInput(blob, "input")
88 |
89 | // Run a forward pass through the network.
90 | prob := net.Forward("softmax2")
91 |
92 | // Reshape the results into a 1x1000 matrix.
93 | probMat := prob.Reshape(1, 1)
94 |
95 | // Determine the most probable classification.
96 | _, maxVal, _, maxLoc := gocv.MinMaxLoc(probMat)
97 |
98 | // Close the per-frame Mats explicitly. Deferred Close calls would
99 | // not run until main returns, leaking memory on every frame.
100 | probMat.Close()
101 | prob.Close()
102 | blob.Close()
103 |
104 | // Display the classification.
105 | desc := "Unknown"
106 | if maxLoc.X < 1000 {
107 | desc = descriptions[maxLoc.X]
108 | }
109 | status = fmt.Sprintf("description: %v, maxVal: %v\n", desc, maxVal)
110 | gocv.PutText(&img, status, image.Pt(10, 20), gocv.FontHersheyPlain, 1.2, statusColor, 2)
111 |
112 | window.IMShow(img)
113 | if window.WaitKey(1) >= 0 {
114 | break
115 | }
116 | }
117 | }
118 |
--------------------------------------------------------------------------------
/deep_learning/example3/example3.go:
--------------------------------------------------------------------------------
1 | package main
2 |
3 | import (
4 | "fmt"
5 | "log"
6 | "strings"
7 |
8 | "github.com/machinebox/sdk-go/textbox"
9 | )
10 |
11 | func main() {
12 |
13 | // Connect to MachineBox.
14 | machBoxIP := "http://localhost:8080"
15 | client := textbox.New(machBoxIP)
16 |
17 | // Example text input.
18 | positiveStatement := "I am so excited to be teaching this super awesome, fun workshop!"
19 | //negativeStatement := "It is sad, depressing, and unfortunate that this workshop will terminate at the end of the day."
20 |
21 | // Analyze the text with TextBox.
22 | analysis, err := client.Check(strings.NewReader(positiveStatement))
23 | if err != nil {
24 | log.Fatal(err)
25 | }
26 |
27 | // Calculate the sentiment.
28 | sentimentTotal := 0.0
29 | for _, sentence := range analysis.Sentences {
30 | sentimentTotal += sentence.Sentiment
31 | }
32 |
33 | // Higher sentiment is more positive, and lower is more negative.
34 | fmt.Printf("\nSentiment: %0.2f\n\n", sentimentTotal/float64(len(analysis.Sentences)))
35 | }
36 |
--------------------------------------------------------------------------------
/linear_regression/data/advertising.csv:
--------------------------------------------------------------------------------
1 | tv,radio,newspaper,sales
2 | 0.7757862698681096,0.7620967741935483,0.605980650835532,0.8070866141732284
3 | 0.14812309773419008,0.7923387096774193,0.3940193491644679,0.3464566929133858
4 | 0.05579979709164694,0.9254032258064515,0.6068601583113455,0.3031496062992126
5 | 0.5099763273588097,0.8326612903225805,0.5118733509234827,0.6653543307086615
6 | 0.6090632397700373,0.21774193548387097,0.5109938434476692,0.4448818897637795
7 | 0.02705444707473791,0.9858870967741935,0.6569920844327176,0.2204724409448819
8 | 0.19208657423063918,0.6612903225806451,0.20404573438874227,0.4015748031496063
9 | 0.40412580317889757,0.3951612903225807,0.0993843447669305,0.45669291338582674
10 | 0.026716266486303687,0.04233870967741936,0.006156552330694811,0.12598425196850394
11 | 0.6733175515725399,0.05241935483870968,0.18381706244503074,0.3543307086614173
12 | 0.22117010483598243,0.11693548387096774,0.21020228671943708,0.2755905511811023
13 | 0.7237064592492392,0.4838709677419355,0.03254177660510114,0.6220472440944881
14 | 0.07811971592830573,0.7076612903225806,0.5769569041336851,0.2992125984251968
15 | 0.3273588096043287,0.15322580645161288,0.06068601583113455,0.3188976377952756
16 | 0.6878593168752114,0.6633064516129031,0.40193491644678975,0.6850393700787402
17 | 0.658437605681434,0.9616935483870968,0.46262093227792433,0.8188976377952755
18 | 0.22691917483936425,0.7379032258064516,0.9999999999999999,0.42913385826771655
19 | 0.9492729117348665,0.7983870967741935,0.4881266490765171,0.8976377952755905
20 | 0.2316537030774434,0.4133064516129032,0.15831134564643798,0.3818897637795276
21 | 0.4957727426445723,0.4818548387096774,0.16534740545294635,0.5118110236220472
22 | 0.7362191410213055,0.5584677419354839,0.46701846965699206,0.6456692913385826
23 | 0.800473452823808,0.10282258064516128,0.20404573438874227,0.42913385826771655
24 | 0.04227257355427799,0.32056451612903225,0.4335971855760774,0.1574803149606299
25 | 0.7696990192762937,0.3407258064516129,0.22779243623570797,0.547244094488189
26 | 0.20831924247548192,0.2540322580645161,0.15831134564643798,0.3188976377952756
27 | 0.8867095028745351,0.07056451612903225,0.1688654353562005,0.4094488188976378
28 | 0.4808927967534664,0.5907258064516129,0.10817941952506595,0.5275590551181102
29 | 0.8096043287115321,0.33669354838709675,0.19876868953386098,0.562992125984252
30 | 0.8390260399053096,0.5463709677419355,0.19876868953386098,0.6811023622047243
31 | 0.2363882313155225,0.3225806451612903,0.35620052770448546,0.35039370078740156
32 | 0.9881636794048022,0.5705645161290323,0.37730870712401055,0.7795275590551181
33 | 0.37943862022319924,0.3508064516129032,0.3368513632365875,0.40551181102362205
34 |
0.32634426783902604,0.03024193548387097,0.26121372031662266,0.31496062992125984 35 | 0.8958403787622592,0.4032258064516129,0.0,0.6220472440944881 36 | 0.3212715590125127,0.0282258064516129,0.06244503078276165,0.3110236220472441 37 | 0.9807237064592493,0.08266129032258064,0.07211961301671065,0.4409448818897638 38 | 0.900236726411904,0.8830645161290321,0.04133685136323658,0.9370078740157479 39 | 0.2502536354413257,0.9959677419354838,0.3992963940193492,0.515748031496063 40 | 0.14338856949611095,0.5383064516129032,0.30606860158311344,0.33464566929133854 41 | 0.768684477510991,0.7600806451612904,0.27880386983289357,0.7834645669291338 42 | 0.6824484274602639,0.4495967741935484,0.2752858399296394,0.5905511811023623 43 | 0.5962123774095368,0.6733870967741935,0.33773087071240104,0.610236220472441 44 | 0.990530943523842,0.5584677419354839,0.013192612137203165,0.7519685039370079 45 | 0.6973283733513698,0.16935483870967744,0.22955145118733505,0.4448818897637795 46 | 0.08251606357795065,0.5181451612903225,0.37818821459982405,0.27165354330708663 47 | 0.5897869462292865,0.4536290322580645,0.27440633245382584,0.5236220472440944 48 | 0.3009807237064593,0.19959677419354838,0.3113456464379947,0.3543307086614173 49 | 0.8089279675346637,0.8366935483870968,0.16007036059806506,0.8503937007874015 50 | 0.7659790328035171,0.3185483870967742,0.43623570800351796,0.5196850393700788 51 | 0.22387554954345626,0.23588709677419353,0.3210202286719437,0.3188976377952756 52 | 0.6733175515725399,0.0625,0.3016710642040457,0.38582677165354334 53 | 0.33716604666892125,0.1935483870967742,0.029023746701846962,0.35826771653543305 54 | 0.729455529252621,0.840725806451613,0.34564643799472294,0.8267716535433072 55 | 0.6151504903618533,0.9314516129032259,0.5136323658751099,0.7716535433070866 56 | 0.8860331416976667,0.5806451612903226,0.13720316622691292,0.732283464566929 57 | 0.6702739262766318,0.9959677419354838,0.5250659630606859,0.8700787401574803 58 | 0.022319918836658778,0.5665322580645161,0.36147757255936674,0.15354330708661418 59 | 0.45823469732837335,0.3870967741935484,0.14335971855760773,0.45669291338582674 60 | 0.7105174163003045,1.0,0.32893579595426564,0.8740157480314961 61 | 0.7101792357118702,0.594758064516129,0.079155672823219,0.6614173228346456 62 | 0.17855935069327022,0.04032258064516129,0.18557607739665782,0.2559055118110236 63 | 0.8812986134595876,0.8608870967741936,0.47845206684256814,0.889763779527559 64 | 0.8068988840040583,0.3125,0.23746701846965695,0.5551181102362205 65 | 0.3449442002029084,0.5967741935483871,0.0712401055408971,0.4881889763779528 66 | 0.44098748731822796,0.8629032258064515,0.2515391380826737,0.6456692913385826 67 | 0.23097734190057492,0.1875,0.005277044854881266,0.3031496062992126 68 | 0.10415962123774097,0.4959677419354839,0.016710642040457344,0.3110236220472441 69 | 0.46871829556983435,0.2923387096774194,0.08707124010554089,0.46456692913385833 70 | 0.800473452823808,0.5544354838709677,0.09410729991204925,0.6811023622047243 71 | 0.730808251606358,0.8850806451612903,0.2365875109938434,0.8149606299212598 72 | 0.6709502874535003,0.6169354838709677,0.33773087071240104,0.65748031496063 73 | 0.36895502198173824,0.28830645161290325,0.27616534740545295,0.4251968503937008 74 | 0.08826513358133245,0.6653225806451613,0.1671064204045734,0.2834645669291339 75 | 0.43523841731484614,0.11491935483870967,0.2726473175021988,0.3700787401574803 76 | 0.7193101115995943,0.4959677419354839,0.11257695690413368,0.6062992125984252 77 | 0.05478525532634426,0.8810483870967742,0.783641160949868,0.27952755905511806 78 | 
0.09063239770037201,0.03225806451612903,0.17941952506596304,0.20866141732283466 79 | 0.4051403449442002,0.5745967741935484,0.12225153913808266,0.49606299212598426 80 | 0.015894487656408524,0.6028225806451613,0.08003518029903255,0.14566929133858267 81 | 0.38992221846466013,0.15524193548387097,0.20052770448548812,0.3700787401574803 82 | 0.2560027054447075,0.5383064516129032,0.19349164467897975,0.4015748031496063 83 | 0.8085897869462294,0.08266129032258064,0.32189973614775724,0.42125984251968507 84 | 0.252282718971931,0.4092741935483871,0.2832014072119613,0.3818897637795276 85 | 0.2289482583699696,0.8971774193548386,0.31046613896218117,0.4724409448818897 86 | 0.7196482921880285,0.8669354838709677,0.2946350043975373,0.7913385826771653 87 | 0.650997632735881,0.3709677419354838,0.5751978891820579,0.5354330708661417 88 | 0.2556645248562732,0.5544354838709677,0.13808267370272645,0.4094488188976378 89 | 0.3719986472776463,0.8185483870967742,0.5532102022867194,0.5669291338582677 90 | 0.2962461954683801,0.5141129032258064,0.6429199648197009,0.4448818897637795 91 | 0.36895502198173824,0.9637096774193548,0.44942832014072115,0.5944881889763779 92 | 0.45180926614812317,0.09879032258064517,0.079155672823219,0.37795275590551175 93 | 0.09435238417314848,0.03024193548387097,0.287598944591029,0.22440944881889763 94 | 0.7338518769022658,0.6754032258064516,0.5162708883025505,0.7007874015748031 95 | 0.8461278322624283,0.7358870967741935,0.6332453825857518,0.8110236220472441 96 | 0.3608386878593169,0.282258064516129,0.0932277924362357,0.38976377952755903 97 | 0.5498816367940481,0.6370967741935484,0.46262093227792433,0.6023622047244094 98 | 0.6658775786269869,0.07056451612903225,0.04925241864555848,0.3976377952755905 99 | 0.6229286438958405,0.42338709677419356,0.1908531222515391,0.547244094488189 100 | 0.9773419005749071,0.8528225806451613,0.4476693051890941,0.9370078740157479 101 | 0.45485289144403107,0.840725806451613,0.4010554089709762,0.6141732283464566 102 | 0.7497463645586745,0.08669354838709677,0.4353562005277044,0.3976377952755905 103 | 1.0,0.7318548387096774,0.8847845206684256,0.8740157480314961 104 | 0.9452147446736558,0.2036290322580645,0.18557607739665782,0.5196850393700788 105 | 0.6330740615488673,0.34677419354838707,0.15479331574318378,0.515748031496063 106 | 0.8031788975312818,0.691532258064516,0.04397537379067721,0.7519685039370079 107 | 0.46398376733175517,0.9354838709677419,0.5162708883025505,0.6929133858267716 108 | 0.08217788298951641,0.2217741935483871,0.25857519788918204,0.2204724409448819 109 | 0.3033479878254988,0.006048387096774193,0.20140721196130162,0.27952755905511806 110 | 0.04193439296584377,0.008064516129032258,0.2225153913808267,0.14566929133858267 111 | 0.8613459587419684,0.5423387096774193,0.0457343887423043,0.7165354330708662 112 | 0.7612445045654381,0.16532258064516128,0.49428320140721194,0.46456692913385833 113 | 0.8150152181264796,0.7661290322580645,0.20140721196130162,0.7952755905511811 114 | 0.5918160297598918,0.31048387096774194,0.01846965699208443,0.4921259842519685 115 | 0.7064592492390938,0.4153225806451613,0.0914687774846086,0.562992125984252 116 | 0.2620899560365235,0.9435483870967741,0.30079155672823216,0.5118110236220472 117 | 0.2516063577950625,0.7056451612903225,0.4608619173262973,0.43307086614173224 118 | 0.46838011498140003,0.28830645161290325,0.2225153913808267,0.41732283464566927 119 | 0.2560027054447075,0.016129032258064516,0.12752858399296393,0.30708661417322836 120 | 0.42272573554277987,0.7439516129032258,0.6939313984168864,0.562992125984252 121 | 
0.06323977003719987,0.3225806451612903,0.19349164467897975,0.19685039370078738 122 | 0.4754819073385188,0.5403225806451613,0.4036939313984169,0.547244094488189 123 | 0.06121068650659453,0.4375,0.4406332453825857,0.2125984251968504 124 | 0.755157253973622,0.04838709677419355,0.13456464379947228,0.39370078740157477 125 | 0.41393304024349004,0.6975806451612904,0.10642040457343888,0.5354330708661417 126 | 0.7737571863375043,0.6512096774193548,0.6499560246262093,0.7125984251968503 127 | 0.29252620899560366,0.23790322580645162,0.22515391380826733,0.3543307086614173 128 | 0.024010821778829898,0.784274193548387,0.4423922603342128,0.19685039370078738 129 | 0.26885356780520797,0.0,0.07827616534740545,0.2834645669291339 130 | 0.7426445722015558,0.9879032258064516,0.025505716798592787,0.9094488188976377 131 | 0.1991883665877579,0.24193548387096775,0.376429199648197,0.3188976377952756 132 | 0.0,0.7983870967741935,0.07387862796833772,0.0 133 | 0.8944876564085222,0.05846774193548387,0.37554969217238343,0.437007874015748 134 | 0.026039905309435243,0.5483870967741935,0.0158311345646438,0.1614173228346457 135 | 0.7409536692593847,0.6754032258064516,0.3940193491644679,0.7086614173228347 136 | 0.12242137301318905,0.7782258064516129,0.5743183817062444,0.36220472440944884 137 | 0.16097396009469056,0.9475806451612903,0.07211961301671065,0.39370078740157477 138 | 0.08420696652012176,0.7862903225806451,0.079155672823219,0.3110236220472441 139 | 0.9232330064254313,0.5826612903225806,0.5224274406332453,0.7559055118110236 140 | 0.1430503889076767,0.5221774193548386,0.17766051011433595,0.31496062992125984 141 | 0.6229286438958405,0.8850806451612903,0.01231310466138962,0.7519685039370079 142 | 0.2458572877916808,0.34274193548387094,0.11081794195250659,0.3661417322834646 143 | 0.6526885356780522,0.7137096774193548,0.6622691292875987,0.6929133858267716 144 | 0.7433209333784242,0.6693548387096775,0.33069481090589264,0.7283464566929134 145 | 0.3513696313831586,0.11491935483870967,0.2999120492524186,0.3464566929133858 146 | 0.3229624619546838,0.29838709677419356,0.3394898856640281,0.38582677165354334 147 | 0.4721001014541766,0.03830645161290322,0.07651715039577836,0.3425196850393701 148 | 0.8096043287115321,0.1471774193548387,0.07387862796833772,0.45669291338582674 149 | 0.8200879269529929,0.9879032258064516,0.3869832893579595,0.9370078740157479 150 | 0.12614135948596553,0.8124999999999999,0.10202286719437115,0.3661417322834646 151 | 0.14879945891105853,0.5201612903225806,0.1785400175901495,0.33464566929133854 152 | 0.946905647615827,0.28024193548387094,0.3227792436235708,0.5708661417322836 153 | 0.4068312478863713,0.16935483870967744,0.4256816182937555,0.39370078740157477 154 | 0.6658775786269869,0.46975806451612906,0.12225153913808266,0.5905511811023623 155 | 0.576936083868786,0.8004032258064516,0.32893579595426564,0.6850393700787402 156 | 0.632735880960433,0.4254032258064516,0.08091468777484608,0.5511811023622047 157 | 0.011498140006763611,0.23387096774193547,0.047493403693931395,0.06299212598425197 158 | 0.31518430842069667,0.877016129032258,0.44151275285839925,0.5393700787401575 159 | 0.504227257355428,0.02620967741935484,0.21108179419525064,0.33464566929133854 160 | 0.037199864727764625,0.7439516129032258,0.39489885664028146,0.22440944881889763 161 | 0.4430165708488333,0.3709677419354838,0.3016710642040457,0.4448818897637795 162 | 0.5809942509299967,0.3649193548387097,0.2673702726473175,0.5039370078740157 163 | 0.2874535001690903,0.721774193548387,0.43095866314863673,0.4606299212598426 164 | 
0.6347649644910384,0.3649193548387097,0.2225153913808267,0.5236220472440944 165 | 0.5505579979709165,0.7419354838709676,0.06244503078276165,0.6456692913385826 166 | 0.39398038552587084,0.29637096774193544,0.04485488126649076,0.40551181102362205 167 | 0.7906662157592155,0.06854838709677419,0.7431838170624449,0.40551181102362205 168 | 0.0581670612106865,0.7580645161290323,0.18733509234828494,0.25196850393700787 169 | 0.6969901927629355,0.10483870967741936,0.16798592788038694,0.41732283464566927 170 | 0.7260737233682788,0.47580645161290325,0.5039577836411608,0.610236220472441 171 | 0.959080148799459,0.21370967741935482,0.0536499560246262,0.5275590551181102 172 | 0.16672303009807238,0.23387096774193547,0.1591908531222515,0.2677165354330709 173 | 0.5539398038552588,0.42137096774193544,0.4142480211081794,0.5078740157480315 174 | 0.06391613121406833,0.405241935483871,0.1468777484608619,0.23622047244094485 175 | 0.5671288468041935,0.14314516129032256,0.10993843447669305,0.3976377952755905 176 | 0.7497463645586745,0.06854838709677419,0.11257695690413368,0.38976377952755903 177 | 0.9340547852553264,0.9858870967741935,0.36499560246262086,0.9999999999999999 178 | 0.8376733175515727,0.6088709677419355,0.17590149516270887,0.732283464566929 179 | 0.5732160973960095,0.15725806451612903,0.306948109058927,0.3976377952755905 180 | 0.933378424078458,0.04637096774193548,0.20580474934036935,0.4015748031496063 181 | 0.5576597903280353,0.20161290322580644,0.15215479331574316,0.43307086614173224 182 | 0.5272235373689551,0.05241935483870968,0.07036059806508356,0.35039370078740156 183 | 0.7365573216097397,0.10887096774193548,0.23834652594547048,0.41732283464566927 184 | 0.18769022658099427,0.11491935483870967,0.25857519788918204,0.27952755905511806 185 | 0.9702401082177885,0.8669354838709677,0.6288478452066841,0.968503937007874 186 | 0.8559350693270208,0.42943548387096775,0.26121372031662266,0.6299212598425198 187 | 0.6909029421711195,0.9092741935483871,0.16974494283201405,0.8267716535433072 188 | 0.4693946567467028,0.04233870967741936,0.23131046613896217,0.3425196850393701 189 | 0.6438958403787624,0.5786290322580645,0.15743183817062442,0.6181102362204725 190 | 0.9648292188028409,0.28024193548387094,0.029903254177660512,0.562992125984252 191 | 0.060872505918160305,0.24395161290322578,0.2031662269129287,0.20078740157480318 192 | 0.1312140683124789,0.8286290322580645,0.04837291116974493,0.36220472440944884 193 | 0.2529590801487995,0.21774193548387097,0.050131926121372024,0.3267716535433071 194 | 0.05579979709164694,0.08266129032258064,0.2752858399296394,0.16929133858267717 195 | 0.561717957389246,0.8467741935483871,0.029023746701846962,0.7086614173228347 196 | 0.5038890767669936,0.717741935483871,0.050131926121372024,0.6181102362204725 197 | 0.12681772066283398,0.07459677419354839,0.11873350923482849,0.23622047244094485 198 | 0.31619885018599936,0.09879032258064517,0.06860158311345646,0.3188976377952756 199 | 0.5962123774095368,0.1875,0.0536499560246262,0.4409448818897638 200 | 0.9567128846804196,0.8467741935483871,0.5795954265611257,0.9409448818897638 201 | 0.7825498816367942,0.17338709677419353,0.07387862796833772,0.46456692913385833 202 | -------------------------------------------------------------------------------- /linear_regression/example1/example1.go: -------------------------------------------------------------------------------- 1 | package main 2 | 3 | import ( 4 | "image/color" 5 | "log" 6 | "os" 7 | 8 | "github.com/go-gota/gota/dataframe" 9 | "gonum.org/v1/plot" 10 | "gonum.org/v1/plot/plotter" 11 | 
"gonum.org/v1/plot/vg" 12 | ) 13 | 14 | func main() { 15 | 16 | // Open the advertising dataset file. 17 | f, err := os.Open("../data/advertising.csv") 18 | if err != nil { 19 | log.Fatal(err) 20 | } 21 | defer f.Close() 22 | 23 | // Create a dataframe from the CSV file. 24 | // The types of the columns will be inferred. 25 | adDF := dataframe.ReadCSV(f) 26 | 27 | // Extract the target column. 28 | yVals := adDF.Col("sales").Float() 29 | 30 | // Create a scatter plot for each of the features in the dataset. 31 | for _, colName := range adDF.Names() { 32 | 33 | // pts will hold the values for plotting 34 | pts := make(plotter.XYs, adDF.Nrow()) 35 | 36 | // Fill pts with data. 37 | for i, floatVal := range adDF.Col(colName).Float() { 38 | pts[i].X = floatVal 39 | pts[i].Y = yVals[i] 40 | } 41 | 42 | // Create the plot. 43 | p, err := plot.New() 44 | if err != nil { 45 | log.Fatal(err) 46 | } 47 | p.X.Label.Text = colName 48 | p.Y.Label.Text = "y" 49 | p.Add(plotter.NewGrid()) 50 | 51 | s, err := plotter.NewScatter(pts) 52 | if err != nil { 53 | log.Fatal(err) 54 | } 55 | s.GlyphStyle.Color = color.RGBA{R: 255, B: 128, A: 255} 56 | s.GlyphStyle.Radius = vg.Points(3) 57 | 58 | // Save the plot to a PNG file. 59 | p.Add(s) 60 | if err := p.Save(4*vg.Inch, 4*vg.Inch, colName+"_scatter.png"); err != nil { 61 | log.Fatal(err) 62 | } 63 | } 64 | } 65 | -------------------------------------------------------------------------------- /linear_regression/example2/example2.go: -------------------------------------------------------------------------------- 1 | package main 2 | 3 | import ( 4 | "bufio" 5 | "log" 6 | "os" 7 | "path" 8 | 9 | "github.com/go-gota/gota/dataframe" 10 | ) 11 | 12 | func main() { 13 | 14 | // Open the advertising dataset file. 15 | f, err := os.Open("../data/advertising.csv") 16 | if err != nil { 17 | log.Fatal(err) 18 | } 19 | defer f.Close() 20 | 21 | // Create a dataframe from the CSV file. 22 | // The types of the columns will be inferred. 23 | adDF := dataframe.ReadCSV(f) 24 | 25 | // Calculate the number of elements in each set. 26 | trainingNum := (4 * adDF.Nrow()) / 5 27 | testNum := adDF.Nrow() / 5 28 | if trainingNum+testNum < adDF.Nrow() { 29 | trainingNum++ 30 | } 31 | 32 | // Create the subset indices. 33 | trainingIdx := make([]int, trainingNum) 34 | testIdx := make([]int, testNum) 35 | 36 | // Enumerate the training indices. 37 | for i := 0; i < trainingNum; i++ { 38 | trainingIdx[i] = i 39 | } 40 | 41 | // Enumerate the test indices. 42 | for i := 0; i < testNum; i++ { 43 | testIdx[i] = trainingNum + i 44 | } 45 | 46 | // Create the subset dataframes. 47 | trainingDF := adDF.Subset(trainingIdx) 48 | testDF := adDF.Subset(testIdx) 49 | 50 | // Create a map that will be used in writing the data 51 | // to files. 52 | setMap := map[int]dataframe.DataFrame{ 53 | 0: trainingDF, 54 | 1: testDF, 55 | } 56 | 57 | // Create the respective files. 58 | for idx, setName := range []string{"training.csv", "test.csv"} { 59 | 60 | // Save the filtered dataset file. 61 | f, err := os.Create(path.Join("../data/", setName)) 62 | if err != nil { 63 | log.Fatal(err) 64 | } 65 | 66 | // Create a buffered writer. 67 | w := bufio.NewWriter(f) 68 | 69 | // Write the dataframe out as a CSV. 
70 | 		if err := setMap[idx].WriteCSV(w); err != nil {
71 | 			log.Fatal(err)
72 | 		}
73 | 
74 | 		// Flush the buffered writer and close the file so that
75 | 		// all of the buffered data actually lands on disk.
76 | 		if err := w.Flush(); err != nil {
77 | 			log.Fatal(err)
78 | 		}
79 | 		if err := f.Close(); err != nil {
80 | 			log.Fatal(err)
81 | 		}
82 | 	}
83 | }
84 | 
--------------------------------------------------------------------------------
/linear_regression/example3/example3.go:
--------------------------------------------------------------------------------
1 | package main
2 | 
3 | import (
4 | 	"encoding/csv"
5 | 	"fmt"
6 | 	"io"
7 | 	"log"
8 | 	"math"
9 | 	"os"
10 | 	"strconv"
11 | )
12 | 
13 | func main() {
14 | 
15 | 	// Open the training dataset file.
16 | 	f, err := os.Open("../data/training.csv")
17 | 	if err != nil {
18 | 		log.Fatal(err)
19 | 	}
20 | 	defer f.Close()
21 | 
22 | 	// Create a new CSV reader reading from the opened file.
23 | 	reader := csv.NewReader(f)
24 | 
25 | 	// We should have 4 fields per line. By setting
26 | 	// FieldsPerRecord to 4, we can validate that each of
27 | 	// the rows in our CSV has the correct number of fields.
28 | 	reader.FieldsPerRecord = 4
29 | 
30 | 	// features and response will hold our successfully parsed
31 | 	// TV and Sales values respectively.
32 | 	line := 1
33 | 	var features []float64
34 | 	var response []float64
35 | 	for {
36 | 
37 | 		// Read in a row. Check if we are at the end of
38 | 		// the file.
39 | 		record, err := reader.Read()
40 | 		if err == io.EOF {
41 | 			break
42 | 		}
43 | 
44 | 		// Skip the header.
45 | 		if line == 1 {
46 | 			line++
47 | 			continue
48 | 		}
49 | 
50 | 		// If we had a parsing error, log the error
51 | 		// and move on.
52 | 		if err != nil {
53 | 			log.Println(err)
54 | 			line++
55 | 			continue
56 | 		}
57 | 
58 | 		// Try to parse the values we want (the TV and Sales values)
59 | 		// as floats.
60 | 		var tv float64
61 | 		if tv, err = strconv.ParseFloat(record[0], 64); err != nil {
62 | 			log.Printf("Unexpected type in TV column at line %d\n", line)
63 | 			fmt.Println(record)
64 | 			line++
65 | 			continue
66 | 		}
67 | 
68 | 		var sales float64
69 | 		if sales, err = strconv.ParseFloat(record[3], 64); err != nil {
70 | 			log.Printf("Unexpected type in Sales column at line %d\n", line)
71 | 			line++
72 | 			continue
73 | 		}
74 | 
75 | 		// Append the records to our feature and response slices.
76 | 		features = append(features, tv)
77 | 		response = append(response, sales)
78 | 		line++
79 | 	}
80 | 
81 | 	// Train our linear regression model.
82 | 	w, b, _ := sgdTrain(features, response, 0.1, 200)
83 | 
84 | 	// Print our results.
85 | 	fmt.Printf("\nRegression Formula:\ny = %0.2f * x + %0.2f\n\n", w, b)
86 | }
87 | 
88 | // sgdTrain calculates the parameters for a linear regression
89 | // model using gradient descent (full-batch updates, despite the name).
90 | func sgdTrain(features, response []float64, lr float64, epochs int) (float64, float64, float64) {
91 | 
92 | 	// Initialize the weight and bias.
93 | 	w := 0.0
94 | 	b := 0.0
95 | 
96 | 	// Set the number of observations in the data.
97 | 	n := float64(len(response))
98 | 
99 | 	// Loop over the number of epochs.
100 | 	loss := 0.0
101 | 	for i := 0; i < epochs; i++ {
102 | 
103 | 		// Calculate current predictions.
104 | 		var predictions []float64
105 | 		for _, x := range features {
106 | 			predictions = append(predictions, w*x+b)
107 | 		}
108 | 
109 | 		// Calculate the loss for this epoch.
110 | 		loss = 0.0
111 | 		for idx, p := range predictions {
112 | 			loss += squaredError(response[idx], p) / n
113 | 		}
114 | 
115 | 		// Output some info to standard out so we know
116 | 		// how training is progressing.
117 | 		if i%10 == 0 {
118 | 			fmt.Printf("Epoch %d, Loss %0.4f\n", i, loss)
119 | 		}
120 | 
121 | 		// Calculate the gradients for w and b.
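122 | 		// For the mean squared error loss
123 | 		//   L = (1/n) * sum_i (y_i - (w*x_i + b))^2,
124 | 		// the partial derivatives with respect to w and b are
125 | 		//   dL/dw = -(2/n) * sum_i x_i * (y_i - (w*x_i + b))
126 | 		//   dL/db = -(2/n) * sum_i (y_i - (w*x_i + b)),
127 | 		// which is exactly what the loop below accumulates.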
128 | 		wGradient := 0.0
129 | 		bGradient := 0.0
130 | 		for idx, p := range predictions {
131 | 			wGradient += -(2 / n) * (features[idx] * (response[idx] - p))
132 | 			bGradient += -(2 / n) * (response[idx] - p)
133 | 		}
134 | 
135 | 		// Update the weight and bias.
136 | 		w = w - (lr * wGradient)
137 | 		b = b - (lr * bGradient)
138 | 	}
139 | 
140 | 	return w, b, loss
141 | }
142 | 
143 | // squaredError returns the error associated with a particular
144 | // pair of prediction and observation.
145 | func squaredError(observation, prediction float64) float64 {
146 | 	return math.Pow(observation-prediction, 2)
147 | }
148 | 
--------------------------------------------------------------------------------
/linear_regression/example4/example4.go:
--------------------------------------------------------------------------------
1 | package main
2 | 
3 | import (
4 | 	"encoding/csv"
5 | 	"fmt"
6 | 	"log"
7 | 	"os"
8 | 	"strconv"
9 | 
10 | 	"github.com/sajari/regression"
11 | )
12 | 
13 | func main() {
14 | 
15 | 	// Open the training dataset file.
16 | 	f, err := os.Open("../data/training.csv")
17 | 	if err != nil {
18 | 		log.Fatal(err)
19 | 	}
20 | 	defer f.Close()
21 | 
22 | 	// Create a new CSV reader reading from the opened file.
23 | 	reader := csv.NewReader(f)
24 | 
25 | 	// Read in all of the CSV records.
26 | 	reader.FieldsPerRecord = 4
27 | 	trainingData, err := reader.ReadAll()
28 | 	if err != nil {
29 | 		log.Fatal(err)
30 | 	}
31 | 
32 | 	// In this case we are going to try and model our sales
33 | 	// y by the tv feature plus an intercept. As such, let's create
34 | 	// the struct needed to train a model using github.com/sajari/regression.
35 | 	var r regression.Regression
36 | 	r.SetObserved("sales")
37 | 	r.SetVar(0, "tv")
38 | 
39 | 	// Loop over the records in the CSV, adding the training data to the regression value.
40 | 	for i, record := range trainingData {
41 | 
42 | 		// Skip the header.
43 | 		if i == 0 {
44 | 			continue
45 | 		}
46 | 
47 | 		// Parse the sales, or "y".
48 | 		yVal, err := strconv.ParseFloat(record[3], 64)
49 | 		if err != nil {
50 | 			log.Fatal(err)
51 | 		}
52 | 
53 | 		// Parse the tv value.
54 | 		tvVal, err := strconv.ParseFloat(record[0], 64)
55 | 		if err != nil {
56 | 			log.Fatal(err)
57 | 		}
58 | 
59 | 		// Add these points to the regression value.
60 | 		r.Train(regression.DataPoint(yVal, []float64{tvVal}))
61 | 	}
62 | 
63 | 	// Train/fit the regression model, checking for errors
64 | 	// such as having too few observations.
65 | 	if err := r.Run(); err != nil {
66 | 		log.Fatal(err)
67 | 	}
68 | 
69 | 	// Output the trained model parameters.
70 | 	fmt.Printf("\nRegression Formula:\n%v\n\n", r.Formula)
71 | }
72 | 
--------------------------------------------------------------------------------
/linear_regression/example5/example5.go:
--------------------------------------------------------------------------------
1 | package main
2 | 
3 | import (
4 | 	"encoding/csv"
5 | 	"fmt"
6 | 	"log"
7 | 	"os"
8 | 	"strconv"
9 | 
10 | 	"github.com/sajari/regression"
11 | )
12 | 
13 | func main() {
14 | 
15 | 	// Open the training dataset file.
16 | 	f, err := os.Open("../data/training.csv")
17 | 	if err != nil {
18 | 		log.Fatal(err)
19 | 	}
20 | 	defer f.Close()
21 | 
22 | 	// Create a new CSV reader reading from the opened file.
23 | 	reader := csv.NewReader(f)
24 | 
25 | 	// Read in all of the CSV records.
26 | 	reader.FieldsPerRecord = 4
27 | 	trainingData, err := reader.ReadAll()
28 | 	if err != nil {
29 | 		log.Fatal(err)
30 | 	}
31 | 
32 | 	// In this case we are going to try and model our sales
33 | 	// y by the tv and radio features plus an intercept. As such, let's create
34 | 	// the struct needed to train a model using github.com/sajari/regression.
35 | 	var r regression.Regression
36 | 	r.SetObserved("sales")
37 | 	r.SetVar(0, "tv")
38 | 	r.SetVar(1, "radio")
39 | 
40 | 	// Loop over the CSV records adding the training data.
41 | 	for i, record := range trainingData {
42 | 
43 | 		// Skip the header.
44 | 		if i == 0 {
45 | 			continue
46 | 		}
47 | 
48 | 		// Parse the sales, or "y".
49 | 		yVal, err := strconv.ParseFloat(record[3], 64)
50 | 		if err != nil {
51 | 			log.Fatal(err)
52 | 		}
53 | 
54 | 		// Parse the tv value.
55 | 		tvVal, err := strconv.ParseFloat(record[0], 64)
56 | 		if err != nil {
57 | 			log.Fatal(err)
58 | 		}
59 | 
60 | 		// Parse the radio value.
61 | 		radioVal, err := strconv.ParseFloat(record[1], 64)
62 | 		if err != nil {
63 | 			log.Fatal(err)
64 | 		}
65 | 
66 | 		// Add these points to the regression value.
67 | 		r.Train(regression.DataPoint(yVal, []float64{tvVal, radioVal}))
68 | 	}
69 | 
70 | 	// Train/fit the regression model, checking for errors
71 | 	// such as having too few observations.
72 | 	if err := r.Run(); err != nil {
73 | 		log.Fatal(err)
74 | 	}
75 | 
76 | 	// Output the trained model parameters.
77 | 	fmt.Printf("\nRegression Formula:\n%v\n\n", r.Formula)
78 | }
79 | 
--------------------------------------------------------------------------------
/linear_regression/example6/example6.go:
--------------------------------------------------------------------------------
1 | package main
2 | 
3 | import (
4 | 	"encoding/csv"
5 | 	"fmt"
6 | 	"log"
7 | 	"math"
8 | 	"os"
9 | 	"strconv"
10 | )
11 | 
12 | // scratchPredict makes a prediction for sales
13 | // based on the from-scratch SGD-trained model.
14 | func scratchPredict(tv float64) float64 {
15 | 	return 0.55*tv + 0.23
16 | }
17 | 
18 | // sajariSinglePredict makes a prediction for sales
19 | // based on the sajari single regression model.
20 | func sajariSinglePredict(tv float64) float64 {
21 | 	return 0.57*tv + 0.22
22 | }
23 | 
24 | // sajariMultiPredict makes a prediction for sales
25 | // based on the sajari multiple regression model.
26 | func sajariMultiPredict(tv, radio float64) float64 {
27 | 	return 0.55*tv + 0.35*radio + 0.05
28 | }
29 | 
30 | func main() {
31 | 
32 | 	// Open the test dataset file.
33 | 	f, err := os.Open("../data/test.csv")
34 | 	if err != nil {
35 | 		log.Fatal(err)
36 | 	}
37 | 	defer f.Close()
38 | 
39 | 	// Create a CSV reader reading from the opened file.
40 | 	reader := csv.NewReader(f)
41 | 
42 | 	// Read in all of the CSV records.
43 | 	reader.FieldsPerRecord = 4
44 | 	testData, err := reader.ReadAll()
45 | 	if err != nil {
46 | 		log.Fatal(err)
47 | 	}
48 | 
49 | 	// Loop over the test data predicting y and evaluating the prediction
50 | 	// with the mean squared error.
51 | 	var mSEScratch float64
52 | 	var mSESajariSingle float64
53 | 	var mSESajariMulti float64
54 | 	for i, record := range testData {
55 | 
56 | 		// Skip the header.
57 | 		if i == 0 {
58 | 			continue
59 | 		}
60 | 
61 | 		// Parse the observed sales, or "y".
62 | 		yObserved, err := strconv.ParseFloat(record[3], 64)
63 | 		if err != nil {
64 | 			log.Fatal(err)
65 | 		}
66 | 
67 | 		// Parse the tv value.
68 | 		tvVal, err := strconv.ParseFloat(record[0], 64)
69 | 		if err != nil {
70 | 			log.Fatal(err)
71 | 		}
72 | 
73 | 		// Parse the radio value.
74 | 		radioVal, err := strconv.ParseFloat(record[1], 64)
75 | 		if err != nil {
76 | 			log.Fatal(err)
77 | 		}
78 | 
79 | 		// Predict y with our trained models.
80 | 		yScratch := scratchPredict(tvVal)
81 | 		ySajariSingle := sajariSinglePredict(tvVal)
82 | 		ySajariMulti := sajariMultiPredict(tvVal, radioVal)
83 | 
84 | 		// Add to the running mean squared errors.
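85 | 		// Each test example contributes (y - yhat)^2 / N to the
86 | 		// running MSE, where N is the number of test examples
87 | 		// (len(testData) minus the header row). The square root
88 | 		// of each accumulated MSE is reported as the RMSE below.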
89 | 		mSEScratch += math.Pow(yObserved-yScratch, 2) / float64(len(testData)-1)
90 | 		mSESajariSingle += math.Pow(yObserved-ySajariSingle, 2) / float64(len(testData)-1)
91 | 		mSESajariMulti += math.Pow(yObserved-ySajariMulti, 2) / float64(len(testData)-1)
92 | 	}
93 | 
94 | 	// Output the RMSE to standard out.
95 | 	fmt.Printf("\nRMSE (Scratch) = %0.2f\n", math.Sqrt(mSEScratch))
96 | 	fmt.Printf("RMSE (Sajari Single) = %0.2f\n", math.Sqrt(mSESajariSingle))
97 | 	fmt.Printf("RMSE (Sajari Multi) = %0.2f\n\n", math.Sqrt(mSESajariMulti))
98 | }
99 | 
--------------------------------------------------------------------------------
/logistic_regression/data/loan_data.csv:
--------------------------------------------------------------------------------
1 | FICO.Range,Interest.Rate 2 | 735-739,8.90% 3 | 715-719,12.12% 4 | 690-694,21.98% 5 | 695-699,9.99% 6 | 695-699,11.71% 7 | 670-674,15.31% 8 | 720-724,7.90% 9 | 705-709,17.14% 10 | 685-689,14.33% 11 | 715-719,6.91% 12 | 670-674,19.72% 13 | 665-669,14.27% 14 | 670-674,21.67% 15 | 735-739,8.90% 16 | 725-729,7.62% 17 | 730-734,15.65% 18 | 695-699,12.12% 19 | 740-744,10.37% 20 | 730-734,9.76% 21 | 760-764,9.99% 22 | 665-669,21.98% 23 | 695-699,19.05% 24 | 665-669,17.99% 25 | 695-699,11.99% 26 | 670-674,16.82% 27 | 705-709,7.90% 28 | 675-679,14.42% 29 | 675-679,15.31% 30 | 765-769,8.59% 31 | 760-764,7.90% 32 | 685-689,21.00% 33 | 685-689,12.12% 34 | 720-724,16.49% 35 | 685-689,15.80% 36 | 675-679,13.55% 37 | 780-784,7.90% 38 | 720-724,7.90% 39 | 830-834,7.62% 40 | 715-719,8.90% 41 | 660-664,12.49% 42 | 670-674,17.27% 43 | 720-724,11.14% 44 | 660-664,19.13% 45 | 660-664,21.74% 46 | 675-679,17.27% 47 | 715-719,11.86% 48 | 710-714,10.38% 49 | 670-674,23.91% 50 | 785-789,7.49% 51 | 705-709,12.12% 52 | 750-754,10.38% 53 | 660-664,17.44% 54 | 700-704,17.27% 55 | 665-669,21.49% 56 | 680-684,17.80% 57 | 725-729,11.14% 58 | 670-674,12.98% 59 | 715-719,9.99% 60 | 690-694,14.33% 61 | 755-759,8.59% 62 | 705-709,11.49% 63 | 715-719,10.16% 64 | 680-684,14.33% 65 | 665-669,15.80% 66 | 730-734,9.91% 67 | 725-729,7.90% 68 | 685-689,13.11% 69 | 685-689,16.29% 70 | 705-709,18.25% 71 | 695-699,14.09% 72 | 695-699,19.03% 73 | 715-719,8.90% 74 | 735-739,11.83% 75 | 665-669,17.27% 76 | 670-674,15.31% 77 | 670-674,13.12% 78 | 790-794,6.76% 79 | 700-704,10.83% 80 | 665-669,17.77% 81 | 725-729,7.90% 82 | 710-714,13.06% 83 | 760-764,7.62% 84 | 680-684,17.77% 85 | 690-694,12.12% 86 | 695-699,16.49% 87 | 725-729,11.12% 88 | 810-814,8.90% 89 | 675-679,14.09% 90 | 750-754,7.51% 91 | 685-689,13.11% 92 | 665-669,16.29% 93 | 765-769,7.62% 94 | 670-674,15.81% 95 | 675-679,14.35% 96 | 675-679,14.96% 97 | 750-754,6.62% 98 | 765-769,5.99% 99 | 735-739,11.14% 100 | 665-669,19.72% 101 | 670-674,18.75% 102 | 700-704,13.11% 103 | 705-709,7.90% 104 | 740-744,6.62% 105 | 690-694,21.48% 106 | 720-724,10.65% 107 | 765-769,7.62% 108 | 715-719,6.03% 109 | 710-714,13.11% 110 | 690-694,14.09% 111 | 710-714,14.09% 112 | 775-779,7.90% 113 | 815-819,6.03% 114 | 745-749,6.17% 115 | 660-664,13.99% 116 | 690-694,15.80% 117 | 710-714,10.99% 118 | 725-729,10.16% 119 | 675-679,14.09% 120 | 765-769,6.03% 121 | 735-739,7.66% 122 | 700-704,19.05% 123 | 735-739,8.90% 124 | 690-694,11.14% 125 | 760-764,8.49% 126 | 680-684,15.21% 127 | 695-699,19.72% 128 | 675-679,14.65% 129 | 700-704,14.33% 130 | 695-699,10.74% 131 | 675-679,15.70% 132 | 730-734,9.76% 133 | 665-669,16.32% 134 | 675-679,14.07% 135 | 680-684,14.09% 136 | 725-729,9.91% 137 | 660-664,16.40% 138 | 805-809,6.62% 139 | 745-749,5.79% 140 | 740-744,9.88% 141 | 
700-704,15.99% 142 | 720-724,9.76% 143 | 660-664,23.28% 144 | 735-739,9.63% 145 | 700-704,13.11% 146 | 705-709,11.14% 147 | 725-729,13.11% 148 | 660-664,17.27% 149 | 735-739,7.51% 150 | 725-729,14.11% 151 | 705-709,14.33% 152 | 670-674,21.49% 153 | 720-724,8.90% 154 | 715-719,10.75% 155 | 705-709,16.69% 156 | 710-714,13.98% 157 | 690-694,10.91% 158 | 705-709,11.48% 159 | 685-689,13.49% 160 | 660-664,18.25% 161 | 665-669,17.77% 162 | 745-749,6.92% 163 | 740-744,7.90% 164 | 665-669,14.33% 165 | 695-699,10.16% 166 | 680-684,16.29% 167 | 700-704,8.90% 168 | 715-719,12.53% 169 | 710-714,15.31% 170 | 720-724,15.27% 171 | 665-669,15.23% 172 | 700-704,11.14% 173 | 685-689,13.11% 174 | 725-729,8.90% 175 | 685-689,12.12% 176 | 680-684,21.28% 177 | 750-754,10.38% 178 | 800-804,7.90% 179 | 740-744,11.14% 180 | 730-734,7.62% 181 | 720-724,11.14% 182 | 700-704,14.33% 183 | 695-699,7.90% 184 | 695-699,12.69% 185 | 700-704,10.16% 186 | 670-674,19.05% 187 | 745-749,6.03% 188 | 675-679,16.89% 189 | 740-744,8.59% 190 | 660-664,18.49% 191 | 730-734,12.69% 192 | 745-749,13.11% 193 | 700-704,15.80% 194 | 675-679,19.05% 195 | 665-669,19.47% 196 | 730-734,8.90% 197 | 745-749,6.17% 198 | 755-759,10.59% 199 | 790-794,11.48% 200 | 725-729,7.88% 201 | 670-674,14.09% 202 | 665-669,15.80% 203 | 685-689,14.83% 204 | 750-754,8.00% 205 | 715-719,10.00% 206 | 700-704,13.98% 207 | 750-754,8.94% 208 | 730-734,7.90% 209 | 755-759,7.90% 210 | 690-694,12.12% 211 | 690-694,13.11% 212 | 735-739,7.90% 213 | 700-704,8.90% 214 | 690-694,14.09% 215 | 675-679,14.09% 216 | 665-669,17.77% 217 | 730-734,13.11% 218 | 705-709,10.16% 219 | 680-684,13.11% 220 | 760-764,11.14% 221 | 725-729,10.65% 222 | 685-689,15.31% 223 | 710-714,12.12% 224 | 665-669,22.47% 225 | 720-724,7.90% 226 | 660-664,18.67% 227 | 750-754,6.03% 228 | 710-714,12.12% 229 | 730-734,7.51% 230 | 665-669,24.89% 231 | 690-694,12.12% 232 | 670-674,15.31% 233 | 660-664,19.91% 234 | 670-674,14.33% 235 | 665-669,20.77% 236 | 665-669,14.09% 237 | 750-754,12.84% 238 | 720-724,13.11% 239 | 680-684,13.67% 240 | 685-689,10.16% 241 | 760-764,6.99% 242 | 700-704,18.49% 243 | 695-699,16.29% 244 | 750-754,7.74% 245 | 680-684,16.29% 246 | 750-754,6.62% 247 | 790-794,8.59% 248 | 740-744,10.75% 249 | 715-719,8.49% 250 | 710-714,12.12% 251 | 725-729,7.90% 252 | 760-764,8.94% 253 | 730-734,7.49% 254 | 685-689,11.03% 255 | 705-709,10.16% 256 | 670-674,16.89% 257 | 675-679,14.35% 258 | 670-674,14.09% 259 | 700-704,15.57% 260 | 665-669,14.59% 261 | 665-669,16.29% 262 | 750-754,5.99% 263 | 720-724,14.65% 264 | 730-734,7.90% 265 | 800-804,10.99% 266 | 805-809,9.62% 267 | 680-684,20.50% 268 | 730-734,7.62% 269 | 725-729,6.91% 270 | 695-699,17.77% 271 | 690-694,15.31% 272 | 695-699,10.16% 273 | 715-719,10.36% 274 | 665-669,17.27% 275 | 670-674,15.27% 276 | 665-669,17.49% 277 | 685-689,17.77% 278 | 670-674,11.54% 279 | 715-719,12.69% 280 | 715-719,10.74% 281 | 720-724,13.99% 282 | 665-669,15.80% 283 | 780-784,6.62% 284 | 705-709,16.49% 285 | 690-694,13.06% 286 | 670-674,16.29% 287 | 705-709,17.49% 288 | 675-679,18.75% 289 | 685-689,11.71% 290 | 675-679,17.27% 291 | 730-734,10.36% 292 | 680-684,15.31% 293 | 670-674,15.80% 294 | 750-754,6.03% 295 | 745-749,11.71% 296 | 655-659,14.12% 297 | 690-694,19.99% 298 | 690-694,14.09% 299 | 775-779,6.62% 300 | 750-754,8.90% 301 | 665-669,12.92% 302 | 660-664,18.67% 303 | 730-734,9.88% 304 | 800-804,6.62% 305 | 755-759,10.38% 306 | 790-794,7.90% 307 | 685-689,14.33% 308 | 750-754,9.91% 309 | 695-699,19.05% 310 | 685-689,13.11% 311 | 770-774,7.90% 312 | 
705-709,7.90% 313 | 795-799,7.90% 314 | 715-719,12.99% 315 | 685-689,11.14% 316 | 665-669,17.77% 317 | 705-709,14.27% 318 | 660-664,20.49% 319 | 740-744,14.33% 320 | 680-684,11.71% 321 | 730-734,9.91% 322 | 785-789,6.03% 323 | 755-759,11.99% 324 | 665-669,19.72% 325 | 665-669,15.96% 326 | 660-664,20.49% 327 | 685-689,14.09% 328 | 680-684,10.46% 329 | 700-704,19.72% 330 | 670-674,14.22% 331 | 730-734,5.79% 332 | 670-674,15.31% 333 | 710-714,15.31% 334 | 710-714,12.12% 335 | 755-759,10.38% 336 | 780-784,5.79% 337 | 715-719,11.49% 338 | 710-714,10.16% 339 | 685-689,19.05% 340 | 690-694,12.12% 341 | 660-664,22.47% 342 | 785-789,5.42% 343 | 720-724,7.90% 344 | 720-724,12.12% 345 | 710-714,15.31% 346 | 675-679,22.78% 347 | 690-694,11.11% 348 | 685-689,17.04% 349 | 675-679,12.61% 350 | 675-679,15.80% 351 | 690-694,10.99% 352 | 690-694,18.25% 353 | 790-794,5.99% 354 | 710-714,9.62% 355 | 675-679,12.12% 356 | 760-764,13.49% 357 | 670-674,14.65% 358 | 715-719,10.16% 359 | 740-744,7.90% 360 | 675-679,17.77% 361 | 695-699,13.11% 362 | 710-714,13.06% 363 | 710-714,16.49% 364 | 685-689,18.17% 365 | 690-694,22.78% 366 | 735-739,10.59% 367 | 660-664,19.05% 368 | 800-804,7.43% 369 | 710-714,15.31% 370 | 740-744,6.62% 371 | 735-739,13.11% 372 | 690-694,14.09% 373 | 660-664,17.77% 374 | 770-774,5.42% 375 | 755-759,10.75% 376 | 710-714,10.16% 377 | 665-669,22.95% 378 | 700-704,13.11% 379 | 705-709,13.11% 380 | 790-794,6.03% 381 | 685-689,16.77% 382 | 690-694,16.70% 383 | 715-719,14.91% 384 | 705-709,19.42% 385 | 700-704,14.11% 386 | 725-729,13.99% 387 | 690-694,12.12% 388 | 720-724,8.90% 389 | 690-694,13.11% 390 | 720-724,10.25% 391 | 685-689,14.09% 392 | 665-669,15.31% 393 | 700-704,10.65% 394 | 680-684,15.31% 395 | 670-674,18.75% 396 | 680-684,19.72% 397 | 670-674,14.33% 398 | 690-694,14.33% 399 | 700-704,8.90% 400 | 780-784,6.62% 401 | 775-779,11.71% 402 | 735-739,7.29% 403 | 685-689,12.29% 404 | 755-759,6.03% 405 | 715-719,8.90% 406 | 730-734,7.90% 407 | 680-684,16.29% 408 | 670-674,22.47% 409 | 685-689,13.11% 410 | 750-754,6.62% 411 | 705-709,12.12% 412 | 680-684,13.11% 413 | 675-679,10.37% 414 | 665-669,15.80% 415 | 695-699,15.96% 416 | 690-694,22.47% 417 | 700-704,15.81% 418 | 700-704,13.11% 419 | 700-704,10.16% 420 | 730-734,6.91% 421 | 715-719,8.90% 422 | 690-694,15.80% 423 | 675-679,14.65% 424 | 725-729,14.09% 425 | 730-734,7.29% 426 | 735-739,15.80% 427 | 685-689,14.33% 428 | 670-674,13.11% 429 | 685-689,13.11% 430 | 715-719,11.14% 431 | 705-709,9.91% 432 | 690-694,16.49% 433 | 695-699,12.68% 434 | 715-719,7.90% 435 | 710-714,16.29% 436 | 735-739,10.38% 437 | 760-764,5.79% 438 | 715-719,12.12% 439 | 705-709,13.85% 440 | 725-729,10.16% 441 | 730-734,10.00% 442 | 675-679,15.31% 443 | 695-699,18.75% 444 | 660-664,19.05% 445 | 760-764,6.91% 446 | 680-684,14.65% 447 | 730-734,9.99% 448 | 730-734,10.16% 449 | 705-709,12.42% 450 | 720-724,10.16% 451 | 705-709,7.90% 452 | 685-689,14.09% 453 | 670-674,21.00% 454 | 730-734,9.91% 455 | 730-734,7.62% 456 | 685-689,15.31% 457 | 700-704,15.21% 458 | 675-679,14.11% 459 | 665-669,15.31% 460 | 720-724,12.87% 461 | 725-729,14.33% 462 | 700-704,18.49% 463 | 670-674,13.11% 464 | 780-784,9.63% 465 | 725-729,17.27% 466 | 720-724,7.90% 467 | 660-664,19.05% 468 | 700-704,13.11% 469 | 685-689,13.11% 470 | 730-734,10.99% 471 | 680-684,14.33% 472 | 680-684,16.63% 473 | 740-744,13.11% 474 | 670-674,22.95% 475 | 780-784,9.76% 476 | 730-734,10.38% 477 | 690-694,13.67% 478 | 695-699,15.31% 479 | 660-664,23.28% 480 | 665-669,18.25% 481 | 695-699,17.49% 482 | 680-684,11.99% 483 | 
685-689,11.71% 484 | 690-694,19.05% 485 | 675-679,10.16% 486 | 665-669,14.09% 487 | 720-724,11.71% 488 | 665-669,19.72% 489 | 715-719,11.12% 490 | 690-694,12.12% 491 | 705-709,14.84% 492 | 705-709,11.97% 493 | 675-679,17.43% 494 | 725-729,7.90% 495 | 675-679,19.72% 496 | 690-694,13.11% 497 | 670-674,15.31% 498 | 725-729,16.00% 499 | 700-704,12.21% 500 | 670-674,15.31% 501 | 680-684,15.27% 502 | 670-674,17.27% 503 | 700-704,12.12% 504 | 680-684,17.77% 505 | 690-694,13.49% 506 | 685-689,16.32% 507 | 675-679,14.09% 508 | 745-749,9.99% 509 | 680-684,12.12% 510 | 685-689,12.12% 511 | 670-674,13.16% 512 | 735-739,14.09% 513 | 730-734,7.90% 514 | 690-694,10.16% 515 | 725-729,12.21% 516 | 760-764,11.14% 517 | 745-749,10.74% 518 | 775-779,10.25% 519 | 710-714,11.36% 520 | 760-764,7.14% 521 | 715-719,12.12% 522 | 715-719,10.00% 523 | 750-754,6.03% 524 | 700-704,12.29% 525 | 755-759,6.76% 526 | 670-674,14.84% 527 | 665-669,15.96% 528 | 750-754,6.03% 529 | 670-674,15.31% 530 | 675-679,15.80% 531 | 680-684,12.12% 532 | 670-674,11.14% 533 | 775-779,9.32% 534 | 665-669,17.77% 535 | 660-664,17.77% 536 | 665-669,22.47% 537 | 675-679,16.29% 538 | 680-684,15.21% 539 | 680-684,8.90% 540 | 670-674,13.06% 541 | 730-734,7.49% 542 | 705-709,16.29% 543 | 740-744,7.62% 544 | 685-689,13.99% 545 | 670-674,17.77% 546 | 720-724,10.75% 547 | 685-689,18.62% 548 | 675-679,12.69% 549 | 725-729,14.65% 550 | 740-744,6.62% 551 | 715-719,11.14% 552 | 670-674,21.00% 553 | 700-704,7.66% 554 | 685-689,14.18% 555 | 685-689,11.14% 556 | 660-664,19.05% 557 | 720-724,13.98% 558 | 670-674,14.09% 559 | 710-714,14.65% 560 | 740-744,11.89% 561 | 680-684,17.14% 562 | 740-744,9.76% 563 | 665-669,22.47% 564 | 710-714,7.90% 565 | 685-689,13.11% 566 | 670-674,17.27% 567 | 695-699,14.09% 568 | 720-724,8.94% 569 | 770-774,6.17% 570 | 705-709,10.16% 571 | 670-674,18.49% 572 | 690-694,22.11% 573 | 730-734,11.49% 574 | 685-689,12.99% 575 | 690-694,14.09% 576 | 695-699,10.59% 577 | 725-729,7.29% 578 | 730-734,11.71% 579 | 675-679,16.77% 580 | 680-684,19.72% 581 | 720-724,7.62% 582 | 690-694,15.31% 583 | 665-669,15.31% 584 | 690-694,12.12% 585 | 660-664,16.77% 586 | 680-684,15.81% 587 | 685-689,13.49% 588 | 695-699,9.76% 589 | 690-694,18.75% 590 | 735-739,7.51% 591 | 760-764,10.16% 592 | 750-754,6.99% 593 | 790-794,7.51% 594 | 740-744,12.18% 595 | 720-724,11.14% 596 | 660-664,15.96% 597 | 700-704,10.16% 598 | 700-704,14.33% 599 | 680-684,13.11% 600 | 690-694,12.12% 601 | 700-704,10.99% 602 | 670-674,15.23% 603 | 780-784,7.88% 604 | 665-669,15.31% 605 | 700-704,13.11% 606 | 705-709,10.74% 607 | 675-679,20.53% 608 | 685-689,16.07% 609 | 685-689,12.53% 610 | 690-694,19.05% 611 | 800-804,7.90% 612 | 725-729,16.29% 613 | 690-694,13.11% 614 | 750-754,8.90% 615 | 700-704,11.14% 616 | 790-794,5.42% 617 | 685-689,10.65% 618 | 720-724,10.65% 619 | 675-679,17.99% 620 | 705-709,12.42% 621 | 680-684,18.75% 622 | 735-739,6.62% 623 | 705-709,12.53% 624 | 675-679,13.98% 625 | 740-744,6.03% 626 | 755-759,7.51% 627 | 705-709,11.12% 628 | 710-714,9.76% 629 | 700-704,12.12% 630 | 785-789,9.63% 631 | 690-694,16.29% 632 | 670-674,12.42% 633 | 710-714,14.33% 634 | 720-724,7.90% 635 | 700-704,10.74% 636 | 695-699,15.31% 637 | 720-724,10.16% 638 | 710-714,13.35% 639 | 695-699,17.27% 640 | 715-719,11.83% 641 | 725-729,18.25% 642 | 735-739,7.51% 643 | 710-714,13.67% 644 | 675-679,10.37% 645 | 675-679,11.71% 646 | 710-714,12.12% 647 | 730-734,8.63% 648 | 730-734,7.90% 649 | 720-724,6.62% 650 | 640-644,14.70% 651 | 700-704,10.00% 652 | 680-684,14.09% 653 | 735-739,6.91% 654 | 
720-724,13.99% 655 | 675-679,13.11% 656 | 690-694,12.12% 657 | 725-729,7.66% 658 | 720-724,15.28% 659 | 730-734,12.12% 660 | 710-714,8.90% 661 | 760-764,14.33% 662 | 795-799,6.54% 663 | 660-664,16.77% 664 | 680-684,19.03% 665 | 695-699,12.12% 666 | 735-739,7.90% 667 | 670-674,17.49% 668 | 685-689,10.74% 669 | 705-709,10.16% 670 | 690-694,14.33% 671 | 695-699,18.25% 672 | 665-669,13.35% 673 | 680-684,19.22% 674 | 745-749,6.03% 675 | 685-689,14.65% 676 | 665-669,15.95% 677 | 745-749,6.17% 678 | 675-679,16.29% 679 | 725-729,11.49% 680 | 725-729,7.90% 681 | 710-714,9.99% 682 | 725-729,7.90% 683 | 690-694,13.48% 684 | 670-674,14.11% 685 | 695-699,10.99% 686 | 720-724,7.90% 687 | 785-789,7.74% 688 | 675-679,18.49% 689 | 695-699,17.27% 690 | 685-689,14.33% 691 | 665-669,23.33% 692 | 660-664,14.09% 693 | 700-704,15.58% 694 | 690-694,17.58% 695 | 665-669,13.72% 696 | 700-704,13.11% 697 | 720-724,10.62% 698 | 670-674,13.43% 699 | 725-729,11.83% 700 | 705-709,9.91% 701 | 690-694,19.05% 702 | 745-749,11.09% 703 | 640-644,15.45% 704 | 705-709,7.90% 705 | 665-669,19.22% 706 | 665-669,16.89% 707 | 675-679,14.09% 708 | 780-784,6.03% 709 | 690-694,18.25% 710 | 665-669,18.75% 711 | 665-669,17.77% 712 | 675-679,13.67% 713 | 680-684,15.31% 714 | 660-664,18.55% 715 | 715-719,8.90% 716 | 710-714,11.14% 717 | 670-674,13.11% 718 | 775-779,7.62% 719 | 680-684,14.33% 720 | 665-669,12.12% 721 | 665-669,17.77% 722 | 675-679,13.11% 723 | 710-714,19.72% 724 | 695-699,15.58% 725 | 680-684,12.69% 726 | 710-714,17.27% 727 | 715-719,11.14% 728 | 670-674,13.49% 729 | 715-719,7.49% 730 | 695-699,12.12% 731 | 670-674,17.77% 732 | 685-689,21.00% 733 | 695-699,18.49% 734 | 675-679,14.33% 735 | 695-699,14.72% 736 | 680-684,13.11% 737 | 770-774,6.17% 738 | 735-739,6.99% 739 | 715-719,13.99% 740 | 720-724,11.26% 741 | 665-669,17.27% 742 | 685-689,14.09% 743 | 680-684,15.31% 744 | 670-674,14.33% 745 | 675-679,21.49% 746 | 670-674,15.80% 747 | 725-729,17.27% 748 | 660-664,21.49% 749 | 700-704,12.53% 750 | 675-679,17.90% 751 | 705-709,11.14% 752 | 690-694,13.11% 753 | 720-724,7.90% 754 | 665-669,23.76% 755 | 680-684,15.31% 756 | 675-679,14.11% 757 | 680-684,21.98% 758 | 670-674,15.31% 759 | 680-684,14.09% 760 | 685-689,19.05% 761 | 680-684,13.49% 762 | 690-694,16.77% 763 | 720-724,10.74% 764 | 665-669,14.65% 765 | 730-734,9.91% 766 | 670-674,21.97% 767 | 680-684,14.65% 768 | 665-669,17.27% 769 | 660-664,18.49% 770 | 705-709,12.53% 771 | 690-694,13.11% 772 | 720-724,12.69% 773 | 700-704,8.90% 774 | 705-709,7.90% 775 | 720-724,10.65% 776 | 710-714,7.90% 777 | 660-664,15.33% 778 | 710-714,12.99% 779 | 665-669,15.99% 780 | 745-749,10.59% 781 | 710-714,16.29% 782 | 745-749,15.31% 783 | 660-664,16.89% 784 | 730-734,10.16% 785 | 730-734,10.25% 786 | 670-674,13.11% 787 | 730-734,9.91% 788 | 705-709,17.27% 789 | 730-734,8.49% 790 | 665-669,22.78% 791 | 700-704,11.71% 792 | 670-674,17.27% 793 | 745-749,13.11% 794 | 670-674,18.55% 795 | 700-704,13.47% 796 | 725-729,13.99% 797 | 700-704,13.99% 798 | 670-674,15.27% 799 | 665-669,21.48% 800 | 745-749,12.18% 801 | 710-714,9.76% 802 | 670-674,15.21% 803 | 720-724,13.85% 804 | 705-709,19.69% 805 | 680-684,16.29% 806 | 715-719,11.48% 807 | 735-739,13.11% 808 | 790-794,9.63% 809 | 755-759,10.08% 810 | 750-754,10.25% 811 | 680-684,17.27% 812 | 670-674,14.33% 813 | 660-664,18.49% 814 | 685-689,16.77% 815 | 670-674,19.03% 816 | 725-729,11.14% 817 | 780-784,6.03% 818 | 690-694,13.67% 819 | 705-709,11.86% 820 | 685-689,15.31% 821 | 720-724,14.91% 822 | 660-664,16.29% 823 | 665-669,12.98% 824 | 
680-684,15.31% 825 | 675-679,15.37% 826 | 765-769,10.36% 827 | 700-704,11.12% 828 | 700-704,8.90% 829 | 730-734,7.49% 830 | 725-729,9.91% 831 | 660-664,17.27% 832 | 710-714,14.09% 833 | 700-704,11.14% 834 | 790-794,6.62% 835 | 775-779,11.12% 836 | 690-694,11.49% 837 | 745-749,6.99% 838 | 730-734,13.99% 839 | 675-679,23.28% 840 | 690-694,13.47% 841 | 680-684,15.68% 842 | 740-744,7.29% 843 | 670-674,20.50% 844 | 780-784,11.49% 845 | 810-814,7.90% 846 | 700-704,16.89% 847 | 695-699,13.67% 848 | 670-674,17.27% 849 | 720-724,11.71% 850 | 765-769,7.90% 851 | 760-764,10.62% 852 | 680-684,11.03% 853 | 675-679,13.11% 854 | 685-689,13.11% 855 | 720-724,10.36% 856 | 675-679,14.33% 857 | 765-769,6.62% 858 | 665-669,16.35% 859 | 690-694,12.68% 860 | 675-679,15.20% 861 | 680-684,15.31% 862 | 700-704,11.14% 863 | 720-724,10.99% 864 | 735-739,19.05% 865 | 705-709,14.59% 866 | 750-754,6.03% 867 | 700-704,12.12% 868 | 660-664,14.72% 869 | 770-774,6.62% 870 | 660-664,23.76% 871 | 715-719,12.12% 872 | 675-679,19.05% 873 | 740-744,6.62% 874 | 695-699,16.89% 875 | 725-729,6.54% 876 | 780-784,10.38% 877 | 705-709,8.90% 878 | 745-749,8.90% 879 | 735-739,14.27% 880 | 665-669,14.27% 881 | 670-674,13.11% 882 | 700-704,15.28% 883 | 670-674,17.27% 884 | 785-789,7.51% 885 | 705-709,12.12% 886 | 740-744,6.62% 887 | 745-749,7.90% 888 | 765-769,7.14% 889 | 680-684,13.11% 890 | 695-699,13.11% 891 | 695-699,18.75% 892 | 705-709,11.83% 893 | 730-734,13.67% 894 | 690-694,14.33% 895 | 770-774,6.03% 896 | 725-729,7.62% 897 | 725-729,12.53% 898 | 715-719,14.33% 899 | 670-674,15.33% 900 | 670-674,18.49% 901 | 690-694,14.59% 902 | 665-669,13.75% 903 | 780-784,5.42% 904 | 685-689,14.09% 905 | 780-784,9.32% 906 | 675-679,13.99% 907 | 670-674,11.14% 908 | 675-679,17.99% 909 | 735-739,16.29% 910 | 730-734,6.54% 911 | 720-724,7.90% 912 | 675-679,12.12% 913 | 675-679,15.80% 914 | 665-669,17.77% 915 | 670-674,15.31% 916 | 675-679,12.73% 917 | 685-689,19.72% 918 | 700-704,11.11% 919 | 735-739,7.62% 920 | 750-754,7.14% 921 | 770-774,7.90% 922 | 670-674,19.91% 923 | 760-764,8.07% 924 | 675-679,18.64% 925 | 730-734,8.90% 926 | 705-709,12.12% 927 | 730-734,11.83% 928 | 660-664,19.74% 929 | 685-689,11.14% 930 | 710-714,14.33% 931 | 695-699,15.31% 932 | 740-744,7.66% 933 | 695-699,9.99% 934 | 760-764,8.90% 935 | 680-684,12.12% 936 | 720-724,10.59% 937 | 670-674,12.84% 938 | 710-714,7.90% 939 | 720-724,7.62% 940 | 685-689,12.12% 941 | 675-679,13.49% 942 | 675-679,12.61% 943 | 690-694,13.47% 944 | 730-734,15.23% 945 | 710-714,13.99% 946 | 695-699,13.61% 947 | 680-684,14.22% 948 | 730-734,6.62% 949 | 700-704,16.32% 950 | 695-699,11.89% 951 | 700-704,10.74% 952 | 740-744,13.11% 953 | 705-709,15.95% 954 | 765-769,5.79% 955 | 725-729,11.71% 956 | 755-759,11.71% 957 | 700-704,12.12% 958 | 690-694,10.36% 959 | 660-664,18.25% 960 | 710-714,13.11% 961 | 720-724,7.90% 962 | 705-709,11.71% 963 | 665-669,12.12% 964 | 710-714,10.16% 965 | 750-754,8.32% 966 | 695-699,13.11% 967 | 670-674,15.31% 968 | 710-714,12.12% 969 | 700-704,12.12% 970 | 765-769,7.62% 971 | 725-729,12.69% 972 | 700-704,10.99% 973 | 765-769,10.16% 974 | 665-669,18.75% 975 | 695-699,14.09% 976 | 710-714,13.48% 977 | 700-704,14.33% 978 | 745-749,13.43% 979 | 705-709,15.23% 980 | 675-679,14.33% 981 | 760-764,15.23% 982 | 685-689,22.95% 983 | 740-744,12.18% 984 | 790-794,6.03% 985 | 675-679,15.05% 986 | 750-754,11.12% 987 | 760-764,7.75% 988 | 690-694,12.12% 989 | 670-674,17.27% 990 | 710-714,12.69% 991 | 660-664,22.47% 992 | 660-664,17.27% 993 | 665-669,12.69% 994 | 725-729,7.66% 995 | 
695-699,16.29% 996 | 685-689,18.25% 997 | 755-759,7.90% 998 | 680-684,15.96% 999 | 670-674,20.77% 1000 | 695-699,16.32% 1001 | 730-734,8.63% 1002 | 760-764,6.62% 1003 | 705-709,21.49% 1004 | 685-689,12.42% 1005 | 720-724,10.74% 1006 | 675-679,14.33% 1007 | 790-794,7.88% 1008 | 720-724,12.12% 1009 | 735-739,13.23% 1010 | 790-794,12.12% 1011 | 685-689,12.12% 1012 | 680-684,14.09% 1013 | 745-749,7.88% 1014 | 690-694,11.14% 1015 | 685-689,11.97% 1016 | 695-699,13.11% 1017 | 695-699,16.29% 1018 | 735-739,10.00% 1019 | 715-719,8.90% 1020 | 660-664,20.49% 1021 | 680-684,16.49% 1022 | 690-694,19.72% 1023 | 675-679,15.80% 1024 | 695-699,15.96% 1025 | 705-709,12.12% 1026 | 695-699,11.89% 1027 | 725-729,8.90% 1028 | 675-679,23.28% 1029 | 735-739,6.62% 1030 | 675-679,24.70% 1031 | 705-709,12.12% 1032 | 695-699,20.30% 1033 | 675-679,14.74% 1034 | 695-699,15.31% 1035 | 730-734,14.65% 1036 | 685-689,14.33% 1037 | 670-674,14.96% 1038 | 660-664,17.99% 1039 | 690-694,10.36% 1040 | 785-789,5.42% 1041 | 785-789,7.88% 1042 | 690-694,15.31% 1043 | 695-699,21.48% 1044 | 710-714,10.16% 1045 | 665-669,15.31% 1046 | 765-769,5.99% 1047 | 670-674,14.72% 1048 | 720-724,11.71% 1049 | 710-714,13.11% 1050 | 745-749,11.99% 1051 | 675-679,19.22% 1052 | 680-684,11.97% 1053 | 720-724,12.41% 1054 | 680-684,12.12% 1055 | 690-694,12.99% 1056 | 685-689,12.12% 1057 | 670-674,24.33% 1058 | 680-684,17.77% 1059 | 730-734,10.25% 1060 | 695-699,11.71% 1061 | 700-704,13.16% 1062 | 760-764,7.90% 1063 | 720-724,9.91% 1064 | 665-669,19.05% 1065 | 680-684,14.17% 1066 | 700-704,13.11% 1067 | 735-739,11.14% 1068 | 705-709,11.71% 1069 | 780-784,6.03% 1070 | 690-694,12.12% 1071 | 660-664,23.28% 1072 | 715-719,18.49% 1073 | 660-664,15.31% 1074 | 680-684,11.59% 1075 | 680-684,12.12% 1076 | 715-719,12.18% 1077 | 680-684,17.56% 1078 | 670-674,19.05% 1079 | 715-719,8.90% 1080 | 680-684,13.11% 1081 | 730-734,13.23% 1082 | 760-764,7.90% 1083 | 745-749,11.89% 1084 | 720-724,18.64% 1085 | 705-709,20.50% 1086 | 680-684,13.16% 1087 | 675-679,15.80% 1088 | 710-714,10.16% 1089 | 670-674,14.65% 1090 | 715-719,10.16% 1091 | 670-674,17.27% 1092 | 720-724,13.99% 1093 | 705-709,11.12% 1094 | 760-764,5.99% 1095 | 675-679,14.27% 1096 | 695-699,14.65% 1097 | 675-679,12.12% 1098 | 720-724,7.90% 1099 | 680-684,22.95% 1100 | 710-714,10.59% 1101 | 750-754,12.12% 1102 | 695-699,17.27% 1103 | 675-679,18.49% 1104 | 805-809,7.51% 1105 | 725-729,9.76% 1106 | 720-724,11.89% 1107 | 685-689,21.49% 1108 | 660-664,23.63% 1109 | 675-679,18.75% 1110 | 695-699,13.49% 1111 | 730-734,9.20% 1112 | 795-799,11.14% 1113 | 670-674,19.91% 1114 | 685-689,11.36% 1115 | 665-669,17.77% 1116 | 680-684,15.96% 1117 | 670-674,14.27% 1118 | 700-704,9.25% 1119 | 725-729,8.90% 1120 | 720-724,18.49% 1121 | 695-699,17.49% 1122 | 765-769,6.62% 1123 | 765-769,6.03% 1124 | 750-754,9.99% 1125 | 660-664,22.78% 1126 | 765-769,8.90% 1127 | 740-744,12.99% 1128 | 670-674,15.31% 1129 | 675-679,17.27% 1130 | 740-744,5.99% 1131 | 675-679,14.09% 1132 | 745-749,16.89% 1133 | 720-724,10.65% 1134 | 755-759,8.90% 1135 | 720-724,13.49% 1136 | 670-674,16.77% 1137 | 765-769,5.79% 1138 | 660-664,20.49% 1139 | 665-669,14.61% 1140 | 660-664,21.74% 1141 | 675-679,15.99% 1142 | 755-759,11.99% 1143 | 715-719,13.98% 1144 | 770-774,5.99% 1145 | 715-719,12.12% 1146 | 705-709,9.76% 1147 | 760-764,7.88% 1148 | 680-684,19.22% 1149 | 715-719,17.99% 1150 | 685-689,12.12% 1151 | 685-689,17.99% 1152 | 660-664,18.49% 1153 | 720-724,11.99% 1154 | 755-759,6.76% 1155 | 675-679,19.72% 1156 | 695-699,13.99% 1157 | 750-754,9.07% 1158 | 
700-704,10.74% 1159 | 690-694,20.50% 1160 | 690-694,12.87% 1161 | 700-704,11.11% 1162 | 790-794,6.62% 1163 | 740-744,10.74% 1164 | 710-714,12.12% 1165 | 655-659,14.82% 1166 | 750-754,7.90% 1167 | 710-714,10.16% 1168 | 765-769,7.51% 1169 | 705-709,9.76% 1170 | 715-719,7.90% 1171 | 670-674,17.77% 1172 | 745-749,13.67% 1173 | 725-729,7.90% 1174 | 755-759,10.16% 1175 | 745-749,11.48% 1176 | 715-719,13.11% 1177 | 760-764,6.62% 1178 | 715-719,11.58% 1179 | 660-664,20.49% 1180 | 710-714,12.12% 1181 | 670-674,16.77% 1182 | 695-699,14.09% 1183 | 670-674,17.56% 1184 | 690-694,14.33% 1185 | 720-724,11.83% 1186 | 705-709,13.11% 1187 | 710-714,7.90% 1188 | 675-679,13.67% 1189 | 695-699,12.12% 1190 | 725-729,17.58% 1191 | 755-759,7.14% 1192 | 720-724,14.33% 1193 | 695-699,13.11% 1194 | 750-754,6.62% 1195 | 705-709,11.71% 1196 | 710-714,19.99% 1197 | 670-674,22.47% 1198 | 675-679,18.64% 1199 | 730-734,7.88% 1200 | 705-709,14.65% 1201 | 665-669,15.31% 1202 | 675-679,18.39% 1203 | 770-774,7.90% 1204 | 730-734,7.90% 1205 | 700-704,14.09% 1206 | 740-744,7.90% 1207 | 695-699,14.27% 1208 | 710-714,13.11% 1209 | 745-749,10.00% 1210 | 665-669,17.77% 1211 | 735-739,12.69% 1212 | 695-699,12.12% 1213 | 695-699,12.12% 1214 | 705-709,18.49% 1215 | 695-699,18.25% 1216 | 710-714,14.27% 1217 | 755-759,6.03% 1218 | 695-699,15.01% 1219 | 680-684,14.46% 1220 | 675-679,13.85% 1221 | 690-694,14.65% 1222 | 685-689,16.32% 1223 | 695-699,11.14% 1224 | 670-674,14.17% 1225 | 775-779,13.11% 1226 | 685-689,13.11% 1227 | 700-704,10.74% 1228 | 705-709,11.14% 1229 | 745-749,10.65% 1230 | 725-729,11.49% 1231 | 670-674,17.77% 1232 | 675-679,14.33% 1233 | 710-714,16.49% 1234 | 730-734,7.91% 1235 | 750-754,7.90% 1236 | 715-719,8.90% 1237 | 670-674,17.49% 1238 | 710-714,14.09% 1239 | 715-719,12.12% 1240 | 700-704,15.95% 1241 | 710-714,17.77% 1242 | 770-774,7.40% 1243 | 735-739,15.65% 1244 | 725-729,9.76% 1245 | 695-699,13.11% 1246 | 725-729,15.81% 1247 | 675-679,15.31% 1248 | 755-759,10.75% 1249 | 660-664,15.96% 1250 | 685-689,15.31% 1251 | 645-649,14.82% 1252 | 760-764,6.03% 1253 | 730-734,7.90% 1254 | 685-689,12.12% 1255 | 680-684,19.22% 1256 | 675-679,15.99% 1257 | 695-699,20.50% 1258 | 705-709,13.49% 1259 | 675-679,13.99% 1260 | 730-734,10.59% 1261 | 765-769,5.99% 1262 | 680-684,14.09% 1263 | 675-679,14.27% 1264 | 705-709,8.90% 1265 | 665-669,17.27% 1266 | 690-694,15.99% 1267 | 715-719,10.16% 1268 | 695-699,13.11% 1269 | 665-669,15.31% 1270 | 720-724,10.16% 1271 | 690-694,12.12% 1272 | 700-704,13.49% 1273 | 670-674,14.27% 1274 | 665-669,19.69% 1275 | 705-709,7.90% 1276 | 700-704,21.49% 1277 | 695-699,12.12% 1278 | 660-664,18.25% 1279 | 755-759,6.62% 1280 | 715-719,12.12% 1281 | 690-694,14.09% 1282 | 705-709,13.99% 1283 | 660-664,21.98% 1284 | 690-694,19.05% 1285 | 725-729,7.90% 1286 | 700-704,11.71% 1287 | 675-679,15.31% 1288 | 775-779,11.14% 1289 | 680-684,15.80% 1290 | 750-754,7.62% 1291 | 685-689,12.69% 1292 | 760-764,8.59% 1293 | 765-769,8.00% 1294 | 705-709,12.42% 1295 | 760-764,6.62% 1296 | 725-729,7.90% 1297 | 810-814,7.90% 1298 | 685-689,19.03% 1299 | 715-719,12.12% 1300 | 750-754,5.79% 1301 | 685-689,16.29% 1302 | 700-704,18.49% 1303 | 700-704,12.12% 1304 | 660-664,14.59% 1305 | 665-669,22.45% 1306 | 645-649,14.70% 1307 | 680-684,18.75% 1308 | 695-699,12.12% 1309 | 670-674,15.62% 1310 | 800-804,5.42% 1311 | 735-739,7.62% 1312 | 685-689,17.77% 1313 | 745-749,7.49% 1314 | 665-669,23.76% 1315 | 715-719,7.90% 1316 | 800-804,10.20% 1317 | 695-699,15.62% 1318 | 690-694,15.58% 1319 | 710-714,14.65% 1320 | 765-769,10.59% 1321 | 
710-714,9.91% 1322 | 670-674,18.49% 1323 | 785-789,9.76% 1324 | 695-699,6.00% 1325 | 695-699,14.33% 1326 | 705-709,11.49% 1327 | 800-804,6.03% 1328 | 730-734,6.62% 1329 | 700-704,18.55% 1330 | 730-734,8.90% 1331 | 750-754,6.03% 1332 | 705-709,17.27% 1333 | 675-679,15.99% 1334 | 710-714,10.16% 1335 | 775-779,7.74% 1336 | 730-734,11.26% 1337 | 660-664,19.05% 1338 | 720-724,13.11% 1339 | 730-734,14.61% 1340 | 730-734,8.90% 1341 | 785-789,8.90% 1342 | 695-699,15.81% 1343 | 770-774,5.79% 1344 | 705-709,12.69% 1345 | 780-784,6.92% 1346 | 670-674,15.80% 1347 | 685-689,12.12% 1348 | 755-759,13.16% 1349 | 690-694,12.68% 1350 | 760-764,6.76% 1351 | 670-674,13.67% 1352 | 700-704,14.09% 1353 | 705-709,10.74% 1354 | 660-664,16.29% 1355 | 780-784,6.03% 1356 | 665-669,13.67% 1357 | 760-764,6.03% 1358 | 695-699,12.12% 1359 | 665-669,16.29% 1360 | 660-664,22.47% 1361 | 725-729,12.99% 1362 | 675-679,14.09% 1363 | 695-699,13.11% 1364 | 755-759,9.32% 1365 | 685-689,14.09% 1366 | 685-689,20.49% 1367 | 750-754,10.62% 1368 | 705-709,11.14% 1369 | 755-759,6.62% 1370 | 740-744,8.90% 1371 | 720-724,11.14% 1372 | 750-754,5.42% 1373 | 670-674,15.80% 1374 | 700-704,14.09% 1375 | 690-694,14.33% 1376 | 720-724,7.90% 1377 | 660-664,16.29% 1378 | 680-684,16.45% 1379 | 730-734,9.62% 1380 | 720-724,12.12% 1381 | 735-739,5.79% 1382 | 680-684,11.11% 1383 | 735-739,13.06% 1384 | 785-789,7.62% 1385 | 700-704,13.48% 1386 | 645-649,15.13% 1387 | 720-724,8.90% 1388 | 680-684,15.31% 1389 | 750-754,10.37% 1390 | 675-679,18.49% 1391 | 675-679,21.67% 1392 | 725-729,13.11% 1393 | 780-784,7.40% 1394 | 735-739,7.62% 1395 | 660-664,13.24% 1396 | 720-724,7.90% 1397 | 715-719,14.50% 1398 | 735-739,6.92% 1399 | 760-764,10.25% 1400 | 725-729,7.90% 1401 | 735-739,7.66% 1402 | 685-689,19.05% 1403 | 660-664,17.58% 1404 | 720-724,7.29% 1405 | 690-694,12.12% 1406 | 675-679,14.91% 1407 | 670-674,13.87% 1408 | 785-789,15.58% 1409 | 670-674,12.12% 1410 | 715-719,7.49% 1411 | 765-769,11.49% 1412 | 730-734,8.90% 1413 | 690-694,11.14% 1414 | 660-664,16.71% 1415 | 745-749,9.63% 1416 | 765-769,7.29% 1417 | 715-719,13.11% 1418 | 765-769,7.51% 1419 | 700-704,10.99% 1420 | 725-729,7.88% 1421 | 695-699,10.16% 1422 | 785-789,10.75% 1423 | 670-674,16.29% 1424 | 790-794,7.68% 1425 | 725-729,12.61% 1426 | 795-799,8.90% 1427 | 670-674,13.49% 1428 | 730-734,12.42% 1429 | 710-714,21.00% 1430 | 660-664,18.39% 1431 | 695-699,12.84% 1432 | 720-724,10.65% 1433 | 675-679,15.31% 1434 | 705-709,10.62% 1435 | 720-724,11.11% 1436 | 760-764,6.03% 1437 | 685-689,11.71% 1438 | 665-669,12.61% 1439 | 725-729,10.38% 1440 | 660-664,24.70% 1441 | 710-714,6.62% 1442 | 660-664,16.29% 1443 | 750-754,6.03% 1444 | 670-674,17.99% 1445 | 715-719,12.12% 1446 | 720-724,7.90% 1447 | 705-709,12.12% 1448 | 800-804,5.42% 1449 | 665-669,14.61% 1450 | 750-754,6.62% 1451 | 715-719,8.90% 1452 | 705-709,9.76% 1453 | 685-689,22.47% 1454 | 670-674,14.33% 1455 | 665-669,20.25% 1456 | 735-739,7.49% 1457 | 680-684,16.29% 1458 | 680-684,15.31% 1459 | 695-699,18.75% 1460 | 665-669,17.77% 1461 | 670-674,14.27% 1462 | 675-679,15.31% 1463 | 700-704,16.29% 1464 | 680-684,19.72% 1465 | 760-764,12.84% 1466 | 685-689,13.11% 1467 | 685-689,13.11% 1468 | 675-679,15.23% 1469 | 790-794,8.00% 1470 | 675-679,14.33% 1471 | 660-664,15.62% 1472 | 660-664,18.79% 1473 | 740-744,7.49% 1474 | 745-749,7.90% 1475 | 695-699,14.09% 1476 | 665-669,21.48% 1477 | 720-724,8.90% 1478 | 690-694,15.80% 1479 | 675-679,18.49% 1480 | 745-749,15.31% 1481 | 730-734,16.77% 1482 | 680-684,18.25% 1483 | 670-674,11.49% 1484 | 720-724,14.42% 1485 
| 680-684,12.84% 1486 | 680-684,15.65% 1487 | 690-694,21.97% 1488 | 695-699,12.12% 1489 | 660-664,18.49% 1490 | 700-704,12.98% 1491 | 690-694,14.09% 1492 | 665-669,11.14% 1493 | 660-664,23.76% 1494 | 730-734,7.90% 1495 | 680-684,13.49% 1496 | 670-674,13.99% 1497 | 705-709,16.32% 1498 | 760-764,9.25% 1499 | 700-704,15.80% 1500 | 675-679,15.80% 1501 | 695-699,12.12% 1502 | 695-699,11.49% 1503 | 675-679,13.67% 1504 | 680-684,14.35% 1505 | 695-699,10.59% 1506 | 665-669,17.27% 1507 | 670-674,15.31% 1508 | 705-709,9.76% 1509 | 725-729,11.71% 1510 | 755-759,12.12% 1511 | 705-709,15.31% 1512 | 695-699,10.16% 1513 | 725-729,10.16% 1514 | 700-704,16.07% 1515 | 725-729,7.90% 1516 | 780-784,6.03% 1517 | 680-684,19.22% 1518 | 680-684,14.27% 1519 | 675-679,18.49% 1520 | 710-714,12.12% 1521 | 685-689,12.12% 1522 | 775-779,6.03% 1523 | 680-684,15.31% 1524 | 705-709,12.98% 1525 | 700-704,10.65% 1526 | 705-709,9.76% 1527 | 700-704,12.12% 1528 | 670-674,15.31% 1529 | 665-669,17.27% 1530 | 735-739,7.62% 1531 | 715-719,11.89% 1532 | 725-729,6.54% 1533 | 695-699,12.12% 1534 | 730-734,12.69% 1535 | 735-739,8.49% 1536 | 680-684,13.48% 1537 | 745-749,5.79% 1538 | 750-754,7.90% 1539 | 685-689,15.80% 1540 | 680-684,13.49% 1541 | 725-729,12.12% 1542 | 670-674,16.29% 1543 | 690-694,12.53% 1544 | 700-704,14.33% 1545 | 665-669,16.29% 1546 | 690-694,18.64% 1547 | 675-679,15.31% 1548 | 705-709,12.23% 1549 | 715-719,10.74% 1550 | 715-719,9.33% 1551 | 735-739,7.51% 1552 | 745-749,7.90% 1553 | 660-664,17.99% 1554 | 665-669,17.27% 1555 | 660-664,15.31% 1556 | 715-719,8.90% 1557 | 685-689,13.11% 1558 | 730-734,9.76% 1559 | 660-664,21.14% 1560 | 770-774,7.90% 1561 | 680-684,13.11% 1562 | 695-699,9.76% 1563 | 735-739,7.51% 1564 | 690-694,13.22% 1565 | 700-704,12.73% 1566 | 715-719,10.74% 1567 | 700-704,11.71% 1568 | 700-704,15.80% 1569 | 685-689,9.99% 1570 | 705-709,10.37% 1571 | 710-714,10.16% 1572 | 765-769,7.75% 1573 | 680-684,13.06% 1574 | 685-689,14.09% 1575 | 695-699,16.32% 1576 | 710-714,17.77% 1577 | 720-724,16.29% 1578 | 685-689,13.11% 1579 | 755-759,7.90% 1580 | 670-674,20.99% 1581 | 745-749,6.62% 1582 | 670-674,12.69% 1583 | 750-754,10.99% 1584 | 660-664,16.29% 1585 | 730-734,12.12% 1586 | 705-709,10.74% 1587 | 775-779,7.62% 1588 | 725-729,17.77% 1589 | 735-739,7.88% 1590 | 795-799,8.00% 1591 | 685-689,14.65% 1592 | 705-709,14.09% 1593 | 660-664,19.99% 1594 | 740-744,6.99% 1595 | 710-714,17.27% 1596 | 685-689,10.28% 1597 | 690-694,19.05% 1598 | 675-679,15.33% 1599 | 660-664,15.27% 1600 | 730-734,7.90% 1601 | 700-704,8.90% 1602 | 720-724,7.91% 1603 | 665-669,23.76% 1604 | 690-694,15.80% 1605 | 705-709,8.90% 1606 | 750-754,5.79% 1607 | 705-709,19.42% 1608 | 720-724,14.33% 1609 | 805-809,6.62% 1610 | 685-689,14.65% 1611 | 670-674,12.86% 1612 | 740-744,13.49% 1613 | 680-684,15.27% 1614 | 725-729,11.99% 1615 | 700-704,13.67% 1616 | 665-669,17.99% 1617 | 755-759,7.90% 1618 | 665-669,15.80% 1619 | 685-689,15.95% 1620 | 750-754,6.03% 1621 | 735-739,11.48% 1622 | 740-744,10.99% 1623 | 815-819,6.03% 1624 | 695-699,12.12% 1625 | 780-784,12.53% 1626 | 695-699,15.31% 1627 | 680-684,11.99% 1628 | 670-674,14.96% 1629 | 735-739,7.90% 1630 | 765-769,6.62% 1631 | 665-669,18.55% 1632 | 690-694,20.49% 1633 | 710-714,12.12% 1634 | 640-644,13.87% 1635 | 730-734,7.62% 1636 | 740-744,8.32% 1637 | 710-714,17.27% 1638 | 725-729,14.91% 1639 | 665-669,15.81% 1640 | 725-729,7.49% 1641 | 695-699,13.11% 1642 | 695-699,14.33% 1643 | 670-674,17.99% 1644 | 760-764,6.62% 1645 | 790-794,7.90% 1646 | 680-684,12.12% 1647 | 695-699,15.31% 1648 | 
730-734,7.14% 1649 | 720-724,10.25% 1650 | 665-669,18.49% 1651 | 755-759,9.88% 1652 | 660-664,19.99% 1653 | 745-749,7.66% 1654 | 660-664,13.11% 1655 | 660-664,19.91% 1656 | 760-764,7.14% 1657 | 720-724,7.90% 1658 | 665-669,16.29% 1659 | 755-759,10.36% 1660 | 810-814,7.62% 1661 | 700-704,13.11% 1662 | 765-769,7.88% 1663 | 705-709,12.12% 1664 | 690-694,18.79% 1665 | 820-824,7.90% 1666 | 675-679,16.40% 1667 | 685-689,11.66% 1668 | 685-689,16.49% 1669 | 705-709,18.25% 1670 | 670-674,17.15% 1671 | 670-674,14.33% 1672 | 700-704,12.68% 1673 | 690-694,13.43% 1674 | 675-679,14.65% 1675 | 665-669,16.29% 1676 | 680-684,16.32% 1677 | 715-719,11.14% 1678 | 710-714,7.88% 1679 | 685-689,19.42% 1680 | 705-709,11.14% 1681 | 730-734,6.62% 1682 | 730-734,7.90% 1683 | 675-679,21.98% 1684 | 665-669,21.98% 1685 | 695-699,17.27% 1686 | 700-704,15.23% 1687 | 810-814,11.83% 1688 | 640-644,13.75% 1689 | 680-684,15.31% 1690 | 775-779,8.94% 1691 | 680-684,11.71% 1692 | 680-684,14.27% 1693 | 670-674,14.65% 1694 | 675-679,15.31% 1695 | 725-729,11.71% 1696 | 660-664,16.29% 1697 | 695-699,14.09% 1698 | 670-674,13.99% 1699 | 815-819,6.03% 1700 | 730-734,11.86% 1701 | 685-689,11.49% 1702 | 705-709,12.12% 1703 | 705-709,10.75% 1704 | 715-719,6.91% 1705 | 730-734,7.62% 1706 | 695-699,13.11% 1707 | 660-664,15.31% 1708 | 710-714,13.67% 1709 | 680-684,17.14% 1710 | 705-709,10.16% 1711 | 680-684,13.11% 1712 | 775-779,7.74% 1713 | 685-689,19.05% 1714 | 755-759,7.88% 1715 | 680-684,12.12% 1716 | 680-684,13.11% 1717 | 720-724,18.25% 1718 | 745-749,11.14% 1719 | 735-739,8.90% 1720 | 685-689,15.31% 1721 | 760-764,7.62% 1722 | 685-689,14.65% 1723 | 685-689,10.59% 1724 | 725-729,11.99% 1725 | 695-699,17.51% 1726 | 695-699,18.75% 1727 | 705-709,10.16% 1728 | 775-779,7.90% 1729 | 700-704,13.98% 1730 | 690-694,11.14% 1731 | 715-719,7.88% 1732 | 665-669,15.81% 1733 | 685-689,12.12% 1734 | 690-694,14.09% 1735 | 720-724,14.09% 1736 | 740-744,7.90% 1737 | 670-674,22.47% 1738 | 685-689,14.33% 1739 | 740-744,8.90% 1740 | 700-704,17.27% 1741 | 695-699,13.11% 1742 | 675-679,14.09% 1743 | 720-724,7.90% 1744 | 660-664,17.27% 1745 | 730-734,11.86% 1746 | 665-669,19.69% 1747 | 675-679,15.31% 1748 | 700-704,13.99% 1749 | 680-684,13.11% 1750 | 730-734,6.62% 1751 | 710-714,13.67% 1752 | 690-694,11.14% 1753 | 700-704,14.65% 1754 | 750-754,10.95% 1755 | 665-669,14.65% 1756 | 660-664,23.83% 1757 | 690-694,15.80% 1758 | 670-674,13.99% 1759 | 720-724,13.98% 1760 | 695-699,15.21% 1761 | 675-679,15.80% 1762 | 715-719,11.48% 1763 | 710-714,16.02% 1764 | 660-664,13.30% 1765 | 655-659,13.24% 1766 | 700-704,13.11% 1767 | 675-679,20.50% 1768 | 755-759,7.49% 1769 | 725-729,8.88% 1770 | 670-674,14.07% 1771 | 700-704,13.92% 1772 | 770-774,7.62% 1773 | 720-724,8.90% 1774 | 750-754,14.09% 1775 | 805-809,10.00% 1776 | 680-684,21.49% 1777 | 700-704,13.11% 1778 | 710-714,10.65% 1779 | 700-704,16.29% 1780 | 695-699,11.71% 1781 | 680-684,13.57% 1782 | 745-749,10.83% 1783 | 715-719,11.99% 1784 | 710-714,7.90% 1785 | 800-804,7.49% 1786 | 685-689,14.46% 1787 | 695-699,16.29% 1788 | 675-679,14.09% 1789 | 725-729,10.74% 1790 | 660-664,13.11% 1791 | 715-719,6.91% 1792 | 680-684,14.09% 1793 | 685-689,13.11% 1794 | 685-689,16.82% 1795 | 710-714,7.90% 1796 | 660-664,12.92% 1797 | 700-704,14.33% 1798 | 715-719,13.99% 1799 | 755-759,7.51% 1800 | 705-709,12.12% 1801 | 700-704,10.65% 1802 | 695-699,9.76% 1803 | 690-694,11.86% 1804 | 725-729,10.99% 1805 | 740-744,11.99% 1806 | 665-669,14.35% 1807 | 695-699,13.11% 1808 | 750-754,11.49% 1809 | 715-719,18.55% 1810 | 720-724,13.11% 1811 | 
710-714,10.16% 1812 | 715-719,8.49% 1813 | 755-759,6.03% 1814 | 740-744,7.49% 1815 | 705-709,17.27% 1816 | 780-784,6.03% 1817 | 700-704,15.99% 1818 | 705-709,11.78% 1819 | 665-669,15.95% 1820 | 755-759,6.54% 1821 | 695-699,12.12% 1822 | 675-679,16.89% 1823 | 685-689,16.77% 1824 | 680-684,19.91% 1825 | 660-664,16.82% 1826 | 670-674,18.55% 1827 | 710-714,14.33% 1828 | 705-709,11.71% 1829 | 720-724,7.90% 1830 | 725-729,14.33% 1831 | 725-729,7.90% 1832 | 700-704,10.99% 1833 | 695-699,19.72% 1834 | 700-704,15.58% 1835 | 670-674,17.93% 1836 | 700-704,8.90% 1837 | 680-684,17.27% 1838 | 725-729,7.90% 1839 | 670-674,14.33% 1840 | 680-684,15.80% 1841 | 720-724,10.74% 1842 | 700-704,11.58% 1843 | 700-704,19.74% 1844 | 690-694,14.46% 1845 | 685-689,21.97% 1846 | 690-694,20.30% 1847 | 680-684,13.11% 1848 | 670-674,17.77% 1849 | 715-719,8.90% 1850 | 745-749,6.62% 1851 | 685-689,12.12% 1852 | 680-684,14.65% 1853 | 805-809,7.62% 1854 | 810-814,6.99% 1855 | 725-729,7.62% 1856 | 670-674,14.22% 1857 | 685-689,14.09% 1858 | 740-744,14.09% 1859 | 665-669,15.80% 1860 | 805-809,6.03% 1861 | 675-679,14.33% 1862 | 680-684,13.92% 1863 | 730-734,7.62% 1864 | 680-684,21.98% 1865 | 660-664,15.05% 1866 | 680-684,12.12% 1867 | 775-779,5.42% 1868 | 790-794,6.99% 1869 | 675-679,18.49% 1870 | 740-744,14.33% 1871 | 680-684,18.79% 1872 | 715-719,10.38% 1873 | 670-674,15.31% 1874 | 760-764,6.03% 1875 | 710-714,11.71% 1876 | 705-709,7.90% 1877 | 705-709,10.99% 1878 | 670-674,16.29% 1879 | 705-709,12.69% 1880 | 695-699,10.99% 1881 | 690-694,15.80% 1882 | 750-754,9.63% 1883 | 690-694,11.78% 1884 | 725-729,7.90% 1885 | 670-674,22.95% 1886 | 680-684,14.09% 1887 | 745-749,7.88% 1888 | 730-734,12.69% 1889 | 670-674,19.22% 1890 | 705-709,12.87% 1891 | 725-729,7.29% 1892 | 755-759,7.90% 1893 | 780-784,8.49% 1894 | 730-734,12.69% 1895 | 705-709,17.27% 1896 | 665-669,24.20% 1897 | 670-674,15.80% 1898 | 735-739,7.88% 1899 | 740-744,13.11% 1900 | 690-694,11.14% 1901 | 685-689,14.96% 1902 | 705-709,15.96% 1903 | 715-719,11.14% 1904 | 760-764,10.74% 1905 | 675-679,13.67% 1906 | 715-719,12.12% 1907 | 690-694,17.77% 1908 | 695-699,12.12% 1909 | 745-749,6.62% 1910 | 745-749,7.62% 1911 | 715-719,11.14% 1912 | 660-664,21.00% 1913 | 785-789,6.03% 1914 | 700-704,17.58% 1915 | 680-684,13.49% 1916 | 695-699,14.09% 1917 | 690-694,12.12% 1918 | 720-724,7.90% 1919 | 705-709,13.11% 1920 | 700-704,12.12% 1921 | 665-669,15.31% 1922 | 695-699,12.42% 1923 | 685-689,13.11% 1924 | 710-714,7.90% 1925 | 660-664,17.27% 1926 | 665-669,16.29% 1927 | 780-784,7.90% 1928 | 715-719,16.29% 1929 | 705-709,11.36% 1930 | 675-679,18.49% 1931 | 700-704,23.76% 1932 | 720-724,14.27% 1933 | 680-684,12.42% 1934 | 680-684,19.03% 1935 | 700-704,11.14% 1936 | 695-699,12.12% 1937 | 760-764,7.51% 1938 | 675-679,14.33% 1939 | 720-724,10.38% 1940 | 735-739,10.65% 1941 | 685-689,11.03% 1942 | 775-779,6.03% 1943 | 675-679,17.27% 1944 | 750-754,13.11% 1945 | 695-699,13.80% 1946 | 755-759,6.62% 1947 | 660-664,17.27% 1948 | 710-714,17.49% 1949 | 700-704,18.64% 1950 | 735-739,10.99% 1951 | 670-674,17.27% 1952 | 815-819,6.62% 1953 | 730-734,12.23% 1954 | 715-719,6.91% 1955 | 700-704,12.12% 1956 | 690-694,21.49% 1957 | 715-719,14.33% 1958 | 685-689,14.26% 1959 | 720-724,10.74% 1960 | 795-799,8.88% 1961 | 705-709,11.86% 1962 | 665-669,14.65% 1963 | 665-669,21.49% 1964 | 665-669,22.95% 1965 | 710-714,7.90% 1966 | 690-694,17.88% 1967 | 675-679,12.42% 1968 | 725-729,7.90% 1969 | 690-694,10.37% 1970 | 695-699,14.33% 1971 | 720-724,10.38% 1972 | 690-694,21.00% 1973 | 715-719,12.23% 1974 | 
705-709,21.00% 1975 | 750-754,14.33% 1976 | 665-669,17.49% 1977 | 710-714,6.62% 1978 | 745-749,8.49% 1979 | 685-689,18.49% 1980 | 675-679,22.47% 1981 | 660-664,19.99% 1982 | 655-659,17.54% 1983 | 755-759,6.92% 1984 | 710-714,10.74% 1985 | 710-714,6.91% 1986 | 745-749,9.32% 1987 | 755-759,7.49% 1988 | 680-684,13.11% 1989 | 680-684,15.27% 1990 | 670-674,18.49% 1991 | 780-784,5.42% 1992 | 715-719,7.49% 1993 | 705-709,13.49% 1994 | 710-714,10.16% 1995 | 710-714,8.90% 1996 | 710-714,10.16% 1997 | 710-714,8.88% 1998 | 750-754,10.25% 1999 | 680-684,13.11% 2000 | 705-709,16.29% 2001 | 675-679,14.09% 2002 | 740-744,10.59% 2003 | 700-704,7.66% 2004 | 680-684,15.80% 2005 | 660-664,19.05% 2006 | 705-709,11.14% 2007 | 670-674,17.93% 2008 | 680-684,15.31% 2009 | 690-694,19.05% 2010 | 710-714,10.59% 2011 | 665-669,15.31% 2012 | 710-714,10.16% 2013 | 680-684,10.36% 2014 | 715-719,13.61% 2015 | 695-699,12.12% 2016 | 670-674,14.09% 2017 | 665-669,13.11% 2018 | 690-694,13.11% 2019 | 750-754,12.21% 2020 | 695-699,12.69% 2021 | 705-709,10.16% 2022 | 675-679,14.27% 2023 | 675-679,12.12% 2024 | 740-744,7.49% 2025 | 675-679,15.80% 2026 | 695-699,11.49% 2027 | 690-694,13.49% 2028 | 720-724,13.11% 2029 | 665-669,13.11% 2030 | 805-809,7.40% 2031 | 790-794,6.03% 2032 | 690-694,17.27% 2033 | 715-719,11.14% 2034 | 680-684,13.11% 2035 | 660-664,17.56% 2036 | 770-774,6.03% 2037 | 735-739,11.86% 2038 | 685-689,13.06% 2039 | 735-739,11.14% 2040 | 665-669,19.69% 2041 | 675-679,17.19% 2042 | 735-739,7.90% 2043 | 720-724,11.49% 2044 | 780-784,6.03% 2045 | 735-739,7.51% 2046 | 690-694,13.11% 2047 | 665-669,17.58% 2048 | 675-679,14.33% 2049 | 755-759,13.11% 2050 | 670-674,12.12% 2051 | 670-674,14.65% 2052 | 750-754,7.90% 2053 | 745-749,10.00% 2054 | 715-719,7.49% 2055 | 690-694,13.11% 2056 | 795-799,7.90% 2057 | 665-669,20.49% 2058 | 675-679,14.83% 2059 | 670-674,14.33% 2060 | 745-749,10.65% 2061 | 805-809,6.03% 2062 | 735-739,6.62% 2063 | 720-724,16.29% 2064 | 770-774,6.03% 2065 | 670-674,18.64% 2066 | 750-754,7.49% 2067 | 695-699,21.00% 2068 | 725-729,17.49% 2069 | 680-684,16.89% 2070 | 730-734,12.42% 2071 | 680-684,13.79% 2072 | 775-779,13.49% 2073 | 665-669,17.99% 2074 | 675-679,13.49% 2075 | 795-799,5.42% 2076 | 680-684,14.09% 2077 | 665-669,21.98% 2078 | 740-744,7.62% 2079 | 785-789,6.03% 2080 | 665-669,13.99% 2081 | 800-804,9.99% 2082 | 755-759,17.27% 2083 | 665-669,18.49% 2084 | 675-679,15.31% 2085 | 670-674,16.29% 2086 | 665-669,14.09% 2087 | 685-689,13.11% 2088 | 745-749,12.42% 2089 | 695-699,13.67% 2090 | 680-684,13.99% 2091 | 685-689,13.23% 2092 | 670-674,14.33% 2093 | 675-679,12.12% 2094 | 675-679,15.31% 2095 | 660-664,17.27% 2096 | 670-674,21.27% 2097 | 665-669,19.72% 2098 | 765-769,9.99% 2099 | 660-664,15.05% 2100 | 705-709,11.14% 2101 | 685-689,14.33% 2102 | 690-694,11.14% 2103 | 775-779,8.90% 2104 | 670-674,21.98% 2105 | 705-709,7.90% 2106 | 680-684,13.49% 2107 | 685-689,13.11% 2108 | 680-684,14.83% 2109 | 705-709,13.11% 2110 | 725-729,8.90% 2111 | 695-699,12.12% 2112 | 690-694,13.11% 2113 | 710-714,11.12% 2114 | 720-724,10.74% 2115 | 665-669,17.27% 2116 | 660-664,23.83% 2117 | 700-704,15.27% 2118 | 670-674,17.49% 2119 | 715-719,11.14% 2120 | 700-704,10.16% 2121 | 665-669,16.82% 2122 | 695-699,12.12% 2123 | 680-684,13.99% 2124 | 815-819,7.90% 2125 | 755-759,7.51% 2126 | 700-704,13.11% 2127 | 715-719,12.87% 2128 | 675-679,14.09% 2129 | 685-689,17.27% 2130 | 680-684,12.12% 2131 | 690-694,10.74% 2132 | 660-664,18.25% 2133 | 750-754,5.79% 2134 | 795-799,13.11% 2135 | 685-689,8.90% 2136 | 795-799,6.03% 2137 | 
665-669,20.30% 2138 | 730-734,7.49% 2139 | 740-744,13.11% 2140 | 675-679,20.49% 2141 | 720-724,7.49% 2142 | 710-714,7.49% 2143 | 705-709,9.63% 2144 | 700-704,17.27% 2145 | 690-694,15.31% 2146 | 705-709,18.39% 2147 | 690-694,11.14% 2148 | 660-664,22.47% 2149 | 695-699,21.98% 2150 | 675-679,21.49% 2151 | 695-699,9.63% 2152 | 660-664,19.99% 2153 | 690-694,14.33% 2154 | 660-664,18.49% 2155 | 740-744,7.49% 2156 | 720-724,13.11% 2157 | 670-674,13.43% 2158 | 700-704,8.90% 2159 | 735-739,6.03% 2160 | 695-699,12.61% 2161 | 705-709,9.25% 2162 | 665-669,19.69% 2163 | 685-689,12.12% 2164 | 640-644,18.29% 2165 | 700-704,11.14% 2166 | 765-769,5.99% 2167 | 745-749,11.71% 2168 | 660-664,18.49% 2169 | 690-694,12.42% 2170 | 720-724,11.71% 2171 | 705-709,11.99% 2172 | 730-734,13.23% 2173 | 675-679,13.11% 2174 | 675-679,18.49% 2175 | 785-789,7.68% 2176 | 755-759,7.14% 2177 | 685-689,11.14% 2178 | 740-744,8.90% 2179 | 715-719,20.49% 2180 | 795-799,8.49% 2181 | 675-679,17.27% 2182 | 740-744,7.51% 2183 | 760-764,14.96% 2184 | 725-729,6.54% 2185 | 750-754,6.62% 2186 | 800-804,6.03% 2187 | 760-764,10.99% 2188 | 695-699,11.03% 2189 | 700-704,20.49% 2190 | 685-689,20.49% 2191 | 695-699,13.98% 2192 | 675-679,15.80% 2193 | 785-789,8.94% 2194 | 660-664,13.30% 2195 | 675-679,14.65% 2196 | 660-664,22.11% 2197 | 765-769,7.62% 2198 | 680-684,21.00% 2199 | 705-709,13.06% 2200 | 670-674,13.11% 2201 | 670-674,17.77% 2202 | 705-709,13.11% 2203 | 675-679,19.22% 2204 | 725-729,7.90% 2205 | 730-734,13.49% 2206 | 670-674,14.33% 2207 | 720-724,7.49% 2208 | 675-679,15.31% 2209 | 725-729,7.90% 2210 | 705-709,10.99% 2211 | 660-664,18.25% 2212 | 695-699,12.21% 2213 | 765-769,6.03% 2214 | 680-684,14.33% 2215 | 685-689,17.77% 2216 | 765-769,11.11% 2217 | 710-714,7.90% 2218 | 695-699,20.25% 2219 | 695-699,14.65% 2220 | 670-674,15.27% 2221 | 745-749,10.62% 2222 | 705-709,11.36% 2223 | 735-739,6.62% 2224 | 755-759,9.63% 2225 | 690-694,19.22% 2226 | 660-664,23.76% 2227 | 780-784,5.99% 2228 | 735-739,10.99% 2229 | 680-684,13.67% 2230 | 700-704,12.68% 2231 | 665-669,17.77% 2232 | 705-709,7.90% 2233 | 675-679,11.71% 2234 | 685-689,13.67% 2235 | 695-699,11.49% 2236 | 660-664,17.14% 2237 | 660-664,23.28% 2238 | 690-694,17.93% 2239 | 695-699,14.27% 2240 | 680-684,13.99% 2241 | 660-664,17.77% 2242 | 795-799,10.75% 2243 | 720-724,11.14% 2244 | 750-754,7.51% 2245 | 730-734,17.27% 2246 | 700-704,13.11% 2247 | 695-699,16.29% 2248 | 665-669,14.33% 2249 | 690-694,15.80% 2250 | 675-679,14.09% 2251 | 685-689,21.00% 2252 | 690-694,11.71% 2253 | 665-669,14.22% 2254 | 735-739,8.59% 2255 | 670-674,15.80% 2256 | 720-724,8.49% 2257 | 670-674,21.49% 2258 | 710-714,10.75% 2259 | 710-714,18.75% 2260 | 680-684,12.12% 2261 | 675-679,18.25% 2262 | 675-679,15.31% 2263 | 675-679,13.17% 2264 | 680-684,16.82% 2265 | 760-764,7.90% 2266 | 730-734,9.63% 2267 | 735-739,13.23% 2268 | 745-749,11.89% 2269 | 775-779,10.75% 2270 | 750-754,6.03% 2271 | 670-674,20.49% 2272 | 695-699,14.09% 2273 | 685-689,11.71% 2274 | 670-674,16.29% 2275 | 760-764,9.63% 2276 | 700-704,12.69% 2277 | 735-739,8.90% 2278 | 705-709,11.14% 2279 | 725-729,9.76% 2280 | 735-739,11.89% 2281 | 680-684,13.99% 2282 | 675-679,15.81% 2283 | 695-699,14.74% 2284 | 740-744,9.07% 2285 | 670-674,13.99% 2286 | 695-699,11.49% 2287 | 695-699,18.75% 2288 | 685-689,20.49% 2289 | 700-704,11.14% 2290 | 690-694,13.67% 2291 | 675-679,17.99% 2292 | 675-679,15.31% 2293 | 665-669,19.72% 2294 | 680-684,15.31% 2295 | 660-664,17.77% 2296 | 670-674,14.11% 2297 | 660-664,17.27% 2298 | 660-664,20.49% 2299 | 720-724,11.99% 2300 | 
705-709,17.58% 2301 | 670-674,15.81% 2302 | 720-724,16.29% 2303 | 690-694,12.12% 2304 | 665-669,15.05% 2305 | 665-669,16.29% 2306 | 680-684,17.51% 2307 | 795-799,6.03% 2308 | 725-729,11.89% 2309 | 695-699,8.90% 2310 | 765-769,7.90% 2311 | 670-674,12.42% 2312 | 740-744,8.94% 2313 | 695-699,12.53% 2314 | 705-709,16.29% 2315 | 690-694,12.12% 2316 | 715-719,12.42% 2317 | 695-699,11.71% 2318 | 675-679,14.33% 2319 | 770-774,6.03% 2320 | 715-719,14.27% 2321 | 745-749,12.42% 2322 | 670-674,13.11% 2323 | 670-674,13.99% 2324 | 670-674,12.92% 2325 | 695-699,16.29% 2326 | 720-724,14.26% 2327 | 710-714,11.99% 2328 | 660-664,20.49% 2329 | 750-754,5.79% 2330 | 785-789,12.68% 2331 | 695-699,12.12% 2332 | 720-724,8.90% 2333 | 695-699,18.64% 2334 | 730-734,8.49% 2335 | 720-724,11.49% 2336 | 665-669,18.49% 2337 | 660-664,16.02% 2338 | 725-729,13.49% 2339 | 680-684,12.12% 2340 | 745-749,7.90% 2341 | 680-684,14.33% 2342 | 700-704,15.80% 2343 | 810-814,8.59% 2344 | 680-684,15.31% 2345 | 780-784,7.68% 2346 | 675-679,17.77% 2347 | 665-669,14.61% 2348 | 690-694,14.33% 2349 | 750-754,6.62% 2350 | 780-784,6.99% 2351 | 690-694,14.91% 2352 | 690-694,13.11% 2353 | 725-729,7.91% 2354 | 705-709,13.72% 2355 | 690-694,13.49% 2356 | 660-664,19.41% 2357 | 690-694,14.09% 2358 | 700-704,12.53% 2359 | 705-709,10.37% 2360 | 805-809,6.62% 2361 | 680-684,12.12% 2362 | 725-729,7.90% 2363 | 805-809,7.62% 2364 | 730-734,14.96% 2365 | 695-699,19.72% 2366 | 665-669,18.25% 2367 | 785-789,10.37% 2368 | 695-699,12.21% 2369 | 670-674,14.33% 2370 | 725-729,18.25% 2371 | 735-739,7.74% 2372 | 715-719,11.49% 2373 | 660-664,22.45% 2374 | 680-684,14.27% 2375 | 710-714,9.91% 2376 | 765-769,7.51% 2377 | 725-729,7.90% 2378 | 705-709,9.76% 2379 | 725-729,6.03% 2380 | 670-674,15.27% 2381 | 710-714,16.29% 2382 | 710-714,13.67% 2383 | 710-714,12.12% 2384 | 680-684,16.83% 2385 | 665-669,19.72% 2386 | 740-744,7.29% 2387 | 755-759,11.83% 2388 | 710-714,12.69% 2389 | 695-699,12.12% 2390 | 775-779,6.03% 2391 | 705-709,12.42% 2392 | 695-699,17.51% 2393 | 675-679,12.12% 2394 | 695-699,20.89% 2395 | 675-679,17.77% 2396 | 660-664,19.03% 2397 | 800-804,7.88% 2398 | 750-754,14.09% 2399 | 680-684,19.03% 2400 | 815-819,8.94% 2401 | 745-749,13.99% 2402 | 705-709,12.69% 2403 | 690-694,14.17% 2404 | 660-664,13.93% 2405 | 690-694,14.09% 2406 | 705-709,13.67% 2407 | 725-729,6.92% 2408 | 675-679,12.12% 2409 | 710-714,11.14% 2410 | 675-679,22.47% 2411 | 775-779,6.99% 2412 | 790-794,6.03% 2413 | 710-714,20.49% 2414 | 740-744,10.99% 2415 | 735-739,8.90% 2416 | 660-664,24.89% 2417 | 660-664,15.96% 2418 | 730-734,10.59% 2419 | 660-664,18.55% 2420 | 680-684,14.11% 2421 | 670-674,18.49% 2422 | 695-699,11.71% 2423 | 760-764,5.79% 2424 | 665-669,22.78% 2425 | 720-724,7.88% 2426 | 670-674,17.77% 2427 | 665-669,14.09% 2428 | 660-664,18.49% 2429 | 690-694,20.49% 2430 | 745-749,13.49% 2431 | 670-674,17.77% 2432 | 660-664,19.05% 2433 | 675-679,21.00% 2434 | 700-704,11.49% 2435 | 695-699,10.16% 2436 | 700-704,14.33% 2437 | 675-679,20.52% 2438 | 710-714,14.79% 2439 | 750-754,9.88% 2440 | 685-689,13.11% 2441 | 665-669,17.27% 2442 | 690-694,18.30% 2443 | 705-709,12.68% 2444 | 675-679,14.33% 2445 | 660-664,14.59% 2446 | 750-754,6.62% 2447 | 725-729,8.90% 2448 | 740-744,8.90% 2449 | 715-719,18.49% 2450 | 680-684,19.04% 2451 | 665-669,22.95% 2452 | 710-714,14.79% 2453 | 665-669,16.00% 2454 | 765-769,7.90% 2455 | 780-784,10.16% 2456 | 680-684,18.64% 2457 | 690-694,16.29% 2458 | 710-714,17.27% 2459 | 705-709,13.22% 2460 | 685-689,13.85% 2461 | 720-724,11.14% 2462 | 710-714,14.09% 2463 | 
720-724,7.90% 2464 | 690-694,11.49% 2465 | 735-739,7.90% 2466 | 685-689,13.67% 2467 | 755-759,6.03% 2468 | 670-674,22.47% 2469 | 730-734,7.29% 2470 | 715-719,7.29% 2471 | 755-759,10.74% 2472 | 665-669,22.95% 2473 | 770-774,7.90% 2474 | 685-689,22.45% 2475 | 650-654,15.13% 2476 | 660-664,18.75% 2477 | 675-679,14.09% 2478 | 675-679,14.09% 2479 | 730-734,8.90% 2480 | 725-729,11.71% 2481 | 680-684,15.80% 2482 | 760-764,6.03% 2483 | 810-814,6.62% 2484 | 720-724,7.51% 2485 | 675-679,14.33% 2486 | 690-694,10.16% 2487 | 765-769,10.75% 2488 | 665-669,17.27% 2489 | 660-664,19.99% 2490 | 685-689,15.81% 2491 | 670-674,18.75% 2492 | 710-714,11.71% 2493 | 720-724,7.62% 2494 | 710-714,10.08% 2495 | 675-679,23.28% 2496 | 685-689,14.65% 2497 | 705-709,16.77% 2498 | 740-744,14.09% 2499 | 680-684,13.99% 2500 | 675-679,12.42% 2501 | 670-674,13.79% 2502 | -------------------------------------------------------------------------------- /logistic_regression/example1/example1.go: -------------------------------------------------------------------------------- 1 | package main 2 | 3 | import ( 4 | "image/color" 5 | "log" 6 | "math" 7 | 8 | "gonum.org/v1/plot" 9 | "gonum.org/v1/plot/plotter" 10 | "gonum.org/v1/plot/vg" 11 | ) 12 | 13 | // logistic implements the logistic function, which 14 | // is used in logistic regression. 15 | func logistic(x float64) float64 { 16 | return 1 / (1 + math.Exp(-x)) 17 | } 18 | 19 | func main() { 20 | 21 | // Create a new plot. 22 | p, err := plot.New() 23 | if err != nil { 24 | log.Fatal(err) 25 | } 26 | p.Title.Text = "Logistic Function" 27 | p.X.Label.Text = "x" 28 | p.Y.Label.Text = "f(x)" 29 | 30 | // Create the plotter function. 31 | logisticPlotter := plotter.NewFunction(func(x float64) float64 { return logistic(x) }) 32 | logisticPlotter.Color = color.RGBA{B: 255, A: 255} 33 | 34 | // Add the plotter function to the plot. 35 | p.Add(logisticPlotter) 36 | 37 | // Set the axis ranges. Unlike other data sets, 38 | // functions don't set the axis ranges automatically 39 | // since functions don't necessarily have a 40 | // finite range of x and y values. 41 | p.X.Min = -10 42 | p.X.Max = 10 43 | p.Y.Min = -0.1 44 | p.Y.Max = 1.1 45 | 46 | // Save the plot to a PNG file. 47 | if err := p.Save(4*vg.Inch, 4*vg.Inch, "logistic.png"); err != nil { 48 | log.Fatal(err) 49 | } 50 | } 51 | -------------------------------------------------------------------------------- /logistic_regression/example2/example2.go: -------------------------------------------------------------------------------- 1 | package main 2 | 3 | import ( 4 | "encoding/csv" 5 | "log" 6 | "os" 7 | "strconv" 8 | "strings" 9 | ) 10 | 11 | func main() { 12 | 13 | // Open the loan dataset file. 14 | f, err := os.Open("../data/loan_data.csv") 15 | if err != nil { 16 | log.Fatal(err) 17 | } 18 | defer f.Close() 19 | 20 | // Create a new CSV reader reading from the opened file. 21 | reader := csv.NewReader(f) 22 | reader.FieldsPerRecord = 2 23 | 24 | // Read in all of the CSV records 25 | rawCSVData, err := reader.ReadAll() 26 | if err != nil { 27 | log.Fatal(err) 28 | } 29 | 30 | // Create the output file. 31 | f, err = os.Create("../data/clean_loan_data.csv") 32 | if err != nil { 33 | log.Fatal(err) 34 | } 35 | defer f.Close() 36 | 37 | // Create a CSV writer. 38 | w := csv.NewWriter(f) 39 | 40 | // Sequentially move the rows writing out the parsed values. 41 | for idx, record := range rawCSVData { 42 | 43 | // Skip the header row. 44 | if idx == 0 { 45 | 46 | // Write the header to the output file. 
47 | if err := w.Write([]string{"FICO_score", "class"}); err != nil {
48 | log.Fatal(err)
49 | }
50 | continue
51 | }
52 |
53 | // Initialize a slice to hold our parsed values.
54 | outRecord := make([]string, 2)
55 |
56 | // Parse and normalize the FICO score.
57 | score, err := strconv.ParseFloat(strings.Split(record[0], "-")[0], 64)
58 | if err != nil {
59 | log.Fatal(err)
60 | }
61 |
62 | outRecord[0] = strconv.FormatFloat((score-640.0)/(830.0-640.0), 'f', 4, 64)
63 |
64 | // Parse the interest rate class.
65 | rate, err := strconv.ParseFloat(strings.TrimSuffix(record[1], "%"), 64)
66 | if err != nil {
67 | log.Fatal(err)
68 | }
69 |
70 | if rate <= 12.0 {
71 | outRecord[1] = "1.0"
72 |
73 | // Write the record to the output file.
74 | if err := w.Write(outRecord); err != nil {
75 | log.Fatal(err)
76 | }
77 | continue
78 | }
79 |
80 | outRecord[1] = "0.0"
81 |
82 | // Write the record to the output file.
83 | if err := w.Write(outRecord); err != nil {
84 | log.Fatal(err)
85 | }
86 | }
87 |
88 | // Write any buffered data to the underlying writer (the output file).
89 | w.Flush()
90 |
91 | if err := w.Error(); err != nil {
92 | log.Fatal(err)
93 | }
94 | }
95 |
--------------------------------------------------------------------------------
/logistic_regression/example3/example3.go:
--------------------------------------------------------------------------------
1 | package main
2 |
3 | import (
4 | "fmt"
5 | "log"
6 | "os"
7 |
8 | "github.com/go-gota/gota/dataframe"
9 | "gonum.org/v1/plot"
10 | "gonum.org/v1/plot/plotter"
11 | "gonum.org/v1/plot/vg"
12 | )
13 |
14 | func main() {
15 |
16 | // Open the CSV file.
17 | loanDataFile, err := os.Open("../data/clean_loan_data.csv")
18 | if err != nil {
19 | log.Fatal(err)
20 | }
21 | defer loanDataFile.Close()
22 |
23 | // Create a dataframe from the CSV file.
24 | loanDF := dataframe.ReadCSV(loanDataFile)
25 |
26 | // Use the Describe method to calculate summary statistics
27 | // for all of the columns in one shot.
28 | loanSummary := loanDF.Describe()
29 |
30 | // Output the summary statistics to stdout.
31 | fmt.Println(loanSummary)
32 |
33 | // Create a histogram for each of the columns in the dataset.
34 | for _, colName := range loanDF.Names() {
35 |
36 | // Create a plotter.Values value and fill it with the
37 | // values from the respective column of the dataframe.
38 | plotVals := make(plotter.Values, loanDF.Nrow())
39 | for i, floatVal := range loanDF.Col(colName).Float() {
40 | plotVals[i] = floatVal
41 | }
42 |
43 | // Make a plot and set its title.
44 | p, err := plot.New()
45 | if err != nil {
46 | log.Fatal(err)
47 | }
48 | p.Title.Text = fmt.Sprintf("Histogram of %s", colName)
49 |
50 | // Create a histogram of our values.
51 | h, err := plotter.NewHist(plotVals, 16)
52 | if err != nil {
53 | log.Fatal(err)
54 | }
55 |
56 | // Normalize the histogram.
57 | h.Normalize(1)
58 |
59 | // Add the histogram to the plot.
60 | p.Add(h)
61 |
62 | // Save the plot to a PNG file.
63 | if err := p.Save(4*vg.Inch, 4*vg.Inch, colName+"_hist.png"); err != nil {
64 | log.Fatal(err)
65 | }
66 | }
67 | }
68 |
--------------------------------------------------------------------------------
/logistic_regression/example4/example4.go:
--------------------------------------------------------------------------------
1 | package main
2 |
3 | import (
4 | "bufio"
5 | "log"
6 | "os"
7 |
8 | "github.com/go-gota/gota/dataframe"
9 | )
10 |
11 | func main() {
12 |
13 | // Open the clean loan dataset file.
14 | f, err := os.Open("../data/clean_loan_data.csv")
15 | if err != nil {
16 | log.Fatal(err)
17 | }
18 | defer f.Close()
19 |
20 | // Create a dataframe from the CSV file.
21 | // The types of the columns will be inferred.
22 | loanDF := dataframe.ReadCSV(f)
23 |
24 | // Calculate the number of elements in each set.
25 | trainingNum := (4 * loanDF.Nrow()) / 5
26 | testNum := loanDF.Nrow() / 5
27 | if trainingNum+testNum < loanDF.Nrow() {
28 | trainingNum++
29 | }
30 |
31 | // Create the subset indices.
32 | trainingIdx := make([]int, trainingNum)
33 | testIdx := make([]int, testNum)
34 |
35 | // Enumerate the training indices.
36 | for i := 0; i < trainingNum; i++ {
37 | trainingIdx[i] = i
38 | }
39 |
40 | // Enumerate the test indices.
41 | for i := 0; i < testNum; i++ {
42 | testIdx[i] = trainingNum + i
43 | }
44 |
45 | // Create the subset dataframes.
46 | trainingDF := loanDF.Subset(trainingIdx)
47 | testDF := loanDF.Subset(testIdx)
48 |
49 | // Create a map that will be used in writing the data
50 | // to files.
51 | setMap := map[int]dataframe.DataFrame{
52 | 0: trainingDF,
53 | 1: testDF,
54 | }
55 |
56 | // Create the respective files.
57 | for idx, setName := range []string{"../data/training.csv", "../data/test.csv"} {
58 |
59 | // Save the filtered dataset file.
60 | f, err := os.Create(setName)
61 | if err != nil {
62 | log.Fatal(err)
63 | }
64 |
65 | // Create a buffered writer.
66 | w := bufio.NewWriter(f)
67 |
68 | // Write the dataframe out as a CSV.
69 | if err := setMap[idx].WriteCSV(w); err != nil {
70 | log.Fatal(err)
71 | }
72 |
73 | // Flush any buffered data to the underlying file and
74 | // close it, so no rows are lost in the writer's buffer.
75 | w.Flush()
76 | f.Close()
77 | }
78 | }
79 |
--------------------------------------------------------------------------------
/logistic_regression/example5/example5.go:
--------------------------------------------------------------------------------
1 | package main
2 |
3 | import (
4 | "encoding/csv"
5 | "fmt"
6 | "log"
7 | "math"
8 | "math/rand"
9 | "os"
10 | "strconv"
11 | "time"
12 |
13 | "github.com/gonum/matrix/mat64"
14 | )
15 |
16 | func main() {
17 |
18 | // Open the training dataset file.
19 | f, err := os.Open("../data/training.csv")
20 | if err != nil {
21 | log.Fatal(err)
22 | }
23 | defer f.Close()
24 |
25 | // Create a new CSV reader reading from the opened file.
26 | reader := csv.NewReader(f)
27 | reader.FieldsPerRecord = 2
28 |
29 | // Read in all of the CSV records
30 | rawCSVData, err := reader.ReadAll()
31 | if err != nil {
32 | log.Fatal(err)
33 | }
34 |
35 | // featureData and labels will hold all the float values that
36 | // will eventually be used in our training.
37 | featureData := make([]float64, 2*(len(rawCSVData)-1))
38 | labels := make([]float64, len(rawCSVData)-1)
39 |
40 | // featureIndex will track the current index of the features
41 | // matrix values.
42 | var featureIndex int
43 |
44 | // Sequentially move the rows into the slices of floats.
45 | for idx, record := range rawCSVData {
46 |
47 | // Skip the header row.
48 | if idx == 0 {
49 | continue
50 | }
51 |
52 | // Add the FICO score feature.
53 | featureVal, err := strconv.ParseFloat(record[0], 64)
54 | if err != nil {
55 | log.Fatal(err)
56 | }
57 |
58 | featureData[featureIndex] = featureVal
59 |
60 | // Add an intercept.
61 | featureData[featureIndex+1] = 1.0
62 |
63 | // Increment our feature row.
64 | featureIndex += 2
65 |
66 | // Add the class label.
67 | labelVal, err := strconv.ParseFloat(record[1], 64)
68 | if err != nil {
69 | log.Fatal(err)
70 | }
71 |
72 | labels[idx-1] = labelVal
73 | }
74 |
75 | // Form a matrix from the features.
76 | features := mat64.NewDense(len(rawCSVData)-1, 2, featureData)
77 |
78 | // Train the logistic regression model.
79 | weights := logisticRegression(features, labels, 4000, 1e-4)
80 |
81 | // Output the Logistic Regression model formula to stdout.
82 | formula := "p = 1 / ( 1 + exp(- m1 * FICO.score - m2) )"
83 | fmt.Printf("\n%s\n\nm1 = %0.2f\nm2 = %0.2f\n\n", formula, weights[0], weights[1])
84 | }
85 |
86 | // logistic implements the logistic function, which
87 | // is used in logistic regression.
88 | func logistic(x float64) float64 {
89 | return 1.0 / (1.0 + math.Exp(-x))
90 | }
91 |
92 | // logisticRegression fits a logistic regression model
93 | // for the given data.
94 | func logisticRegression(features *mat64.Dense, labels []float64, numSteps int, learningRate float64) []float64 {
95 |
96 | // Initialize random weights.
97 | _, numWeights := features.Dims()
98 | weights := make([]float64, numWeights)
99 |
100 | s := rand.NewSource(time.Now().UnixNano())
101 | r := rand.New(s)
102 |
103 | for idx := range weights {
104 | weights[idx] = r.Float64()
105 | }
106 |
107 | // Iteratively optimize the weights.
108 | for i := 0; i < numSteps; i++ {
109 |
110 | // Initialize a variable to accumulate error for this iteration.
111 | var sumError float64
112 |
113 | // Make predictions for each label and accumulate error.
114 | for idx, label := range labels {
115 |
116 | // Get the features corresponding to this label.
117 | featureRow := mat64.Row(nil, idx, features)
118 |
119 | // Calculate the error for this iteration's weights.
120 | pred := logistic(featureRow[0]*weights[0] + featureRow[1]*weights[1])
121 | predError := label - pred
122 | sumError += math.Pow(predError, 2)
123 |
124 | // Update the feature weights.
125 | for j := 0; j < len(featureRow); j++ {
126 | weights[j] += learningRate * predError * pred * (1 - pred) * featureRow[j]
127 | }
128 | }
129 | }
130 |
131 | return weights
132 | }
133 |
--------------------------------------------------------------------------------
/logistic_regression/example6/example6.go:
--------------------------------------------------------------------------------
1 | package main
2 |
3 | import (
4 | "encoding/csv"
5 | "fmt"
6 | "io"
7 | "log"
8 | "math"
9 | "os"
10 | "strconv"
11 | )
12 |
13 | func main() {
14 |
15 | // Open the test examples.
16 | f, err := os.Open("../data/test.csv")
17 | if err != nil {
18 | log.Fatal(err)
19 | }
20 | defer f.Close()
21 |
22 | // Create a new CSV reader reading from the opened file.
23 | reader := csv.NewReader(f)
24 |
25 | // observed and predicted will hold the parsed observed and predicted values
26 | // from the labeled data file.
27 | var observed []float64
28 | var predicted []float64
29 |
30 | // line will track row numbers for logging.
31 | line := 1
32 |
33 | // Read in the records, looking for unexpected types in the columns.
34 | for {
35 |
36 | // Read in a row. Check if we are at the end of the file.
37 | record, err := reader.Read()
38 | if err == io.EOF {
39 | break
40 | }
41 |
42 | // Skip the header.
43 | if line == 1 {
44 | line++
45 | continue
46 | }
47 |
48 | // Read in the observed value.
49 | observedVal, err := strconv.ParseFloat(record[1], 64)
50 | if err != nil {
51 | log.Printf("Parsing line %d failed, unexpected type\n", line)
52 | continue
53 | }
54 |
55 | // Make the corresponding prediction.
56 | score, err := strconv.ParseFloat(record[0], 64)
57 | if err != nil {
58 | log.Printf("Parsing line %d failed, unexpected type\n", line)
59 | continue
60 | }
61 |
62 | predictedVal := predict(score)
63 |
64 | // Append the record to our slice, if it has the expected type.
65 | observed = append(observed, observedVal)
66 | predicted = append(predicted, predictedVal)
67 | line++
68 | }
69 |
70 | // This variable will hold our count of true positive and
71 | // true negative values.
72 | var truePosNeg int
73 |
74 | // Accumulate the true positive/negative count.
75 | for idx, oVal := range observed {
76 | if oVal == predicted[idx] {
77 | truePosNeg++
78 | }
79 | }
80 |
81 | // Calculate the accuracy (subset accuracy).
82 | accuracy := float64(truePosNeg) / float64(len(observed))
83 |
84 | // Output the Accuracy value to standard out.
85 | fmt.Printf("\nAccuracy = %0.2f\n\n", accuracy)
86 | }
87 |
88 | // predict makes a prediction based on our
89 | // trained logistic regression model.
90 | func predict(score float64) float64 {
91 |
92 | // Calculate the predicted probability.
93 | p := 1 / (1 + math.Exp(-5.45*score+2.23))
94 |
95 | // Output the corresponding class.
96 | if p >= 0.5 {
97 | return 1.0
98 | }
99 |
100 | return 0.0
101 | }
102 |
--------------------------------------------------------------------------------
/logistic_regression/example7/example7.go:
--------------------------------------------------------------------------------
1 | package main
2 |
3 | import (
4 | "encoding/csv"
5 | "log"
6 | "os"
7 | "strconv"
8 |
9 | "github.com/cdipaolo/goml/base"
10 | "github.com/cdipaolo/goml/linear"
11 | )
12 |
13 | func main() {
14 |
15 | // Open the training dataset file.
16 | f, err := os.Open("../data/training.csv")
17 | if err != nil {
18 | log.Fatal(err)
19 | }
20 | defer f.Close()
21 |
22 | // Create a new CSV reader reading from the opened file.
23 | reader := csv.NewReader(f)
24 | reader.FieldsPerRecord = 2
25 |
26 | // Read in all of the CSV records
27 | rawCSVData, err := reader.ReadAll()
28 | if err != nil {
29 | log.Fatal(err)
30 | }
31 |
32 | // features and labels will hold all the float values that
33 | // will eventually be used in our training.
34 | var features [][]float64
35 | var labels []float64
36 |
37 | // Sequentially move the rows into the slices of floats.
38 | for idx, record := range rawCSVData {
39 |
40 | // Skip the header row.
41 | if idx == 0 {
42 | continue
43 | }
44 |
45 | // Add the FICO score feature.
46 | featureVal, err := strconv.ParseFloat(record[0], 64)
47 | if err != nil {
48 | log.Fatal(err)
49 | }
50 |
51 | features = append(features, []float64{featureVal})
52 |
53 | // Add the class label.
54 | labelVal, err := strconv.ParseFloat(record[1], 64)
55 | if err != nil {
56 | log.Fatal(err)
57 | }
58 |
59 | labels = append(labels, labelVal)
60 | }
61 |
62 | // New logistic regression model from goml (optimization method:
63 | // Batch Gradient Ascent, Learning rate: 1e-4, Regularization term: 1,
64 | // Max Iterations: 800, Dataset to learn from: features,
65 | // Expected results dataset: labels)
66 | model := linear.NewLogistic(base.BatchGA, 1e-4, 1, 800, features, labels)
67 |
68 | // Train the logistic regression model.
69 | if err = model.Learn(); err != nil { 70 | log.Fatal(err) 71 | } 72 | } 73 | -------------------------------------------------------------------------------- /neural_networks/data/test.csv: -------------------------------------------------------------------------------- 1 | sepal_length,sepal_width,petal_length,petal_width,setosa,virginica,versicolor 2 | 0.583333333333,0.291666666667,0.728813559322,0.75,0.0,1.0,0.0 3 | 0.0833333333333,0.458333333333,0.0847457627119,0.0416666666667,1.0,0.0,0.0 4 | 0.194444444444,0.583333333333,0.101694915254,0.125,1.0,0.0,0.0 5 | 0.916666666667,0.416666666667,0.949152542373,0.833333333333,0.0,1.0,0.0 6 | 0.611111111111,0.416666666667,0.813559322034,0.875,0.0,1.0,0.0 7 | 0.694444444444,0.333333333333,0.64406779661,0.541666666667,0.0,0.0,1.0 8 | 0.166666666667,0.458333333333,0.0847457627119,0.0,1.0,0.0,0.0 9 | 0.138888888889,0.416666666667,0.0677966101695,0.0,1.0,0.0,0.0 10 | 0.222222222222,0.625,0.0677966101695,0.0833333333333,1.0,0.0,0.0 11 | 0.5,0.25,0.779661016949,0.541666666667,0.0,1.0,0.0 12 | 0.555555555556,0.208333333333,0.661016949153,0.583333333333,0.0,0.0,1.0 13 | 0.472222222222,0.583333333333,0.593220338983,0.625,0.0,0.0,1.0 14 | 0.222222222222,0.75,0.0847457627119,0.0833333333333,1.0,0.0,0.0 15 | 0.194444444444,0.0,0.423728813559,0.375,0.0,0.0,1.0 16 | 0.416666666667,0.25,0.508474576271,0.458333333333,0.0,0.0,1.0 17 | 0.444444444444,0.416666666667,0.542372881356,0.583333333333,0.0,0.0,1.0 18 | 0.722222222222,0.458333333333,0.661016949153,0.583333333333,0.0,0.0,1.0 19 | 0.805555555556,0.666666666667,0.864406779661,1.0,0.0,1.0,0.0 20 | 0.5,0.333333333333,0.508474576271,0.5,0.0,0.0,1.0 21 | 0.138888888889,0.458333333333,0.101694915254,0.0416666666667,1.0,0.0,0.0 22 | 0.416666666667,0.833333333333,0.0338983050847,0.0416666666667,1.0,0.0,0.0 23 | 0.944444444444,0.25,1.0,0.916666666667,0.0,1.0,0.0 24 | 0.222222222222,0.583333333333,0.0847457627119,0.0416666666667,1.0,0.0,0.0 25 | 0.527777777778,0.375,0.559322033898,0.5,0.0,0.0,1.0 26 | 0.638888888889,0.416666666667,0.576271186441,0.541666666667,0.0,0.0,1.0 27 | 0.0277777777778,0.416666666667,0.0508474576271,0.0416666666667,1.0,0.0,0.0 28 | 0.0,0.416666666667,0.0169491525424,0.0,1.0,0.0,0.0 29 | 0.555555555556,0.583333333333,0.779661016949,0.958333333333,0.0,1.0,0.0 30 | 0.361111111111,0.291666666667,0.542372881356,0.5,0.0,0.0,1.0 31 | 0.194444444444,0.125,0.389830508475,0.375,0.0,0.0,1.0 32 | -------------------------------------------------------------------------------- /neural_networks/data/train.csv: -------------------------------------------------------------------------------- 1 | sepal_length,sepal_width,petal_length,petal_width,setosa,virginica,versicolor 2 | 0.0833333333333,0.666666666667,0.0,0.0416666666667,1.0,0.0,0.0 3 | 0.722222222222,0.458333333333,0.694915254237,0.916666666667,0.0,1.0,0.0 4 | 0.666666666667,0.416666666667,0.677966101695,0.666666666667,0.0,0.0,1.0 5 | 0.777777777778,0.416666666667,0.830508474576,0.833333333333,0.0,1.0,0.0 6 | 0.666666666667,0.458333333333,0.779661016949,0.958333333333,0.0,1.0,0.0 7 | 0.388888888889,0.416666666667,0.542372881356,0.458333333333,0.0,0.0,1.0 8 | 0.666666666667,0.541666666667,0.796610169492,0.833333333333,0.0,1.0,0.0 9 | 0.305555555556,0.583333333333,0.0847457627119,0.125,1.0,0.0,0.0 10 | 0.416666666667,0.291666666667,0.525423728814,0.375,0.0,0.0,1.0 11 | 0.0833333333333,0.583333333333,0.0677966101695,0.0833333333333,1.0,0.0,0.0 12 | 0.222222222222,0.208333333333,0.338983050847,0.416666666667,0.0,0.0,1.0 13 | 
0.555555555556,0.208333333333,0.677966101695,0.75,0.0,1.0,0.0 14 | 0.333333333333,0.166666666667,0.457627118644,0.375,0.0,0.0,1.0 15 | 0.333333333333,0.625,0.0508474576271,0.0416666666667,1.0,0.0,0.0 16 | 0.555555555556,0.541666666667,0.627118644068,0.625,0.0,0.0,1.0 17 | 0.388888888889,1.0,0.0847457627119,0.125,1.0,0.0,0.0 18 | 0.527777777778,0.583333333333,0.745762711864,0.916666666667,0.0,1.0,0.0 19 | 0.361111111111,0.375,0.440677966102,0.5,0.0,0.0,1.0 20 | 0.944444444444,0.75,0.966101694915,0.875,0.0,1.0,0.0 21 | 0.361111111111,0.416666666667,0.593220338983,0.583333333333,0.0,0.0,1.0 22 | 0.166666666667,0.208333333333,0.593220338983,0.666666666667,0.0,1.0,0.0 23 | 0.416666666667,0.291666666667,0.694915254237,0.75,0.0,1.0,0.0 24 | 0.388888888889,0.75,0.118644067797,0.0833333333333,1.0,0.0,0.0 25 | 0.472222222222,0.291666666667,0.694915254237,0.625,0.0,0.0,1.0 26 | 0.194444444444,0.625,0.101694915254,0.208333333333,1.0,0.0,0.0 27 | 0.333333333333,0.166666666667,0.474576271186,0.416666666667,0.0,0.0,1.0 28 | 0.611111111111,0.416666666667,0.762711864407,0.708333333333,0.0,1.0,0.0 29 | 0.555555555556,0.291666666667,0.661016949153,0.708333333333,0.0,1.0,0.0 30 | 0.472222222222,0.416666666667,0.64406779661,0.708333333333,0.0,1.0,0.0 31 | 0.277777777778,0.708333333333,0.0847457627119,0.0416666666667,1.0,0.0,0.0 32 | 0.138888888889,0.583333333333,0.152542372881,0.0416666666667,1.0,0.0,0.0 33 | 0.25,0.583333333333,0.0677966101695,0.0416666666667,1.0,0.0,0.0 34 | 0.388888888889,0.333333333333,0.593220338983,0.5,0.0,0.0,1.0 35 | 0.222222222222,0.75,0.152542372881,0.125,1.0,0.0,0.0 36 | 0.361111111111,0.416666666667,0.525423728814,0.5,0.0,0.0,1.0 37 | 0.0277777777778,0.375,0.0677966101695,0.0416666666667,1.0,0.0,0.0 38 | 0.166666666667,0.166666666667,0.389830508475,0.375,0.0,0.0,1.0 39 | 0.722222222222,0.458333333333,0.745762711864,0.833333333333,0.0,1.0,0.0 40 | 0.527777777778,0.333333333333,0.64406779661,0.708333333333,0.0,1.0,0.0 41 | 0.0277777777778,0.5,0.0508474576271,0.0416666666667,1.0,0.0,0.0 42 | 0.666666666667,0.458333333333,0.576271186441,0.541666666667,0.0,0.0,1.0 43 | 0.333333333333,0.208333333333,0.508474576271,0.5,0.0,0.0,1.0 44 | 0.0833333333333,0.5,0.0677966101695,0.0416666666667,1.0,0.0,0.0 45 | 0.194444444444,0.416666666667,0.101694915254,0.0416666666667,1.0,0.0,0.0 46 | 0.5,0.375,0.627118644068,0.541666666667,0.0,0.0,1.0 47 | 0.305555555556,0.791666666667,0.118644067797,0.125,1.0,0.0,0.0 48 | 0.583333333333,0.375,0.559322033898,0.5,0.0,0.0,1.0 49 | 0.166666666667,0.416666666667,0.0677966101695,0.0416666666667,1.0,0.0,0.0 50 | 0.583333333333,0.458333333333,0.762711864407,0.708333333333,0.0,1.0,0.0 51 | 0.666666666667,0.208333333333,0.813559322034,0.708333333333,0.0,1.0,0.0 52 | 0.222222222222,0.625,0.0677966101695,0.0416666666667,1.0,0.0,0.0 53 | 0.611111111111,0.333333333333,0.610169491525,0.583333333333,0.0,0.0,1.0 54 | 0.5,0.416666666667,0.610169491525,0.541666666667,0.0,0.0,1.0 55 | 0.416666666667,0.333333333333,0.694915254237,0.958333333333,0.0,1.0,0.0 56 | 0.388888888889,0.375,0.542372881356,0.5,0.0,0.0,1.0 57 | 0.166666666667,0.458333333333,0.0847457627119,0.0,1.0,0.0,0.0 58 | 0.472222222222,0.0833333333333,0.677966101695,0.583333333333,0.0,1.0,0.0 59 | 0.444444444444,0.416666666667,0.694915254237,0.708333333333,0.0,1.0,0.0 60 | 0.361111111111,0.333333333333,0.661016949153,0.791666666667,0.0,1.0,0.0 61 | 0.555555555556,0.125,0.576271186441,0.5,0.0,0.0,1.0 62 | 0.944444444444,0.416666666667,0.864406779661,0.916666666667,0.0,1.0,0.0 63 | 
0.611111111111,0.5,0.694915254237,0.791666666667,0.0,1.0,0.0 64 | 0.555555555556,0.375,0.779661016949,0.708333333333,0.0,1.0,0.0 65 | 0.25,0.291666666667,0.491525423729,0.541666666667,0.0,0.0,1.0 66 | 0.25,0.875,0.0847457627119,0.0,1.0,0.0,0.0 67 | 0.166666666667,0.458333333333,0.0847457627119,0.0,1.0,0.0,0.0 68 | 0.138888888889,0.583333333333,0.101694915254,0.0416666666667,1.0,0.0,0.0 69 | 0.666666666667,0.541666666667,0.796610169492,1.0,0.0,1.0,0.0 70 | 0.305555555556,0.791666666667,0.0508474576271,0.125,1.0,0.0,0.0 71 | 0.75,0.5,0.627118644068,0.541666666667,0.0,0.0,1.0 72 | 0.638888888889,0.375,0.610169491525,0.5,0.0,0.0,1.0 73 | 0.833333333333,0.375,0.898305084746,0.708333333333,0.0,1.0,0.0 74 | 0.222222222222,0.541666666667,0.118644067797,0.166666666667,1.0,0.0,0.0 75 | 0.583333333333,0.333333333333,0.779661016949,0.875,0.0,1.0,0.0 76 | 0.333333333333,0.25,0.576271186441,0.458333333333,0.0,0.0,1.0 77 | 0.555555555556,0.333333333333,0.694915254237,0.583333333333,0.0,1.0,0.0 78 | 0.333333333333,0.916666666667,0.0677966101695,0.0416666666667,1.0,0.0,0.0 79 | 0.861111111111,0.333333333333,0.864406779661,0.75,0.0,1.0,0.0 80 | 0.388888888889,0.208333333333,0.677966101695,0.791666666667,0.0,1.0,0.0 81 | 0.944444444444,0.333333333333,0.966101694915,0.791666666667,0.0,1.0,0.0 82 | 0.416666666667,0.291666666667,0.694915254237,0.75,0.0,1.0,0.0 83 | 0.388888888889,0.25,0.423728813559,0.375,0.0,0.0,1.0 84 | 0.194444444444,0.625,0.0508474576271,0.0833333333333,1.0,0.0,0.0 85 | 0.0555555555556,0.125,0.0508474576271,0.0833333333333,1.0,0.0,0.0 86 | 0.222222222222,0.75,0.101694915254,0.0416666666667,1.0,0.0,0.0 87 | 0.694444444444,0.416666666667,0.762711864407,0.833333333333,0.0,1.0,0.0 88 | 0.361111111111,0.208333333333,0.491525423729,0.416666666667,0.0,0.0,1.0 89 | 0.583333333333,0.333333333333,0.779661016949,0.833333333333,0.0,1.0,0.0 90 | 0.305555555556,0.416666666667,0.593220338983,0.583333333333,0.0,0.0,1.0 91 | 0.111111111111,0.5,0.101694915254,0.0416666666667,1.0,0.0,0.0 92 | 0.694444444444,0.5,0.830508474576,0.916666666667,0.0,1.0,0.0 93 | 0.305555555556,0.708333333333,0.0847457627119,0.0416666666667,1.0,0.0,0.0 94 | 0.111111111111,0.5,0.0508474576271,0.0416666666667,1.0,0.0,0.0 95 | 0.555555555556,0.541666666667,0.847457627119,1.0,0.0,1.0,0.0 96 | 0.138888888889,0.416666666667,0.0677966101695,0.0833333333333,1.0,0.0,0.0 97 | 0.611111111111,0.416666666667,0.71186440678,0.791666666667,0.0,1.0,0.0 98 | 0.472222222222,0.0833333333333,0.508474576271,0.375,0.0,0.0,1.0 99 | 0.416666666667,0.291666666667,0.491525423729,0.458333333333,0.0,0.0,1.0 100 | 0.194444444444,0.541666666667,0.0677966101695,0.0416666666667,1.0,0.0,0.0 101 | 0.666666666667,0.416666666667,0.71186440678,0.916666666667,0.0,1.0,0.0 102 | 0.805555555556,0.416666666667,0.813559322034,0.625,0.0,1.0,0.0 103 | 0.722222222222,0.5,0.796610169492,0.916666666667,0.0,1.0,0.0 104 | 0.444444444444,0.5,0.64406779661,0.708333333333,0.0,0.0,1.0 105 | 0.527777777778,0.0833333333333,0.593220338983,0.583333333333,0.0,0.0,1.0 106 | 0.805555555556,0.5,0.847457627119,0.708333333333,0.0,1.0,0.0 107 | 0.194444444444,0.583333333333,0.0847457627119,0.0416666666667,1.0,0.0,0.0 108 | 0.5,0.416666666667,0.661016949153,0.708333333333,0.0,1.0,0.0 109 | 0.583333333333,0.5,0.728813559322,0.916666666667,0.0,1.0,0.0 110 | 0.583333333333,0.5,0.593220338983,0.583333333333,0.0,0.0,1.0 111 | 0.194444444444,0.5,0.0338983050847,0.0416666666667,1.0,0.0,0.0 112 | 0.194444444444,0.666666666667,0.0677966101695,0.0416666666667,1.0,0.0,0.0 113 | 
0.222222222222,0.708333333333,0.0847457627119,0.125,1.0,0.0,0.0
114 | 0.666666666667,0.458333333333,0.627118644068,0.583333333333,0.0,0.0,1.0
115 | 0.388888888889,0.333333333333,0.525423728814,0.5,0.0,0.0,1.0
116 | 0.25,0.625,0.0847457627119,0.0416666666667,1.0,0.0,0.0
117 | 0.5,0.333333333333,0.627118644068,0.458333333333,0.0,0.0,1.0
118 | 0.333333333333,0.125,0.508474576271,0.5,0.0,0.0,1.0
119 | 1.0,0.75,0.915254237288,0.791666666667,0.0,1.0,0.0
120 | 0.472222222222,0.375,0.593220338983,0.583333333333,0.0,0.0,1.0
121 | 0.305555555556,0.583333333333,0.118644067797,0.0416666666667,1.0,0.0,0.0
122 |
--------------------------------------------------------------------------------
/neural_networks/example1/example1.go:
--------------------------------------------------------------------------------
1 | package main
2 |
3 | import (
4 | "errors"
5 | "fmt"
6 | "log"
7 | "math"
8 | "math/rand"
9 | "time"
10 |
11 | "gonum.org/v1/gonum/floats"
12 | "gonum.org/v1/gonum/mat"
13 | )
14 |
15 | // neuralNet contains all of the information
16 | // that defines a trained neural network.
17 | type neuralNet struct {
18 | config neuralNetConfig
19 | wHidden *mat.Dense
20 | bHidden *mat.Dense
21 | wOut *mat.Dense
22 | bOut *mat.Dense
23 | }
24 |
25 | // neuralNetConfig defines our neural network
26 | // architecture and learning parameters.
27 | type neuralNetConfig struct {
28 | inputNeurons int
29 | outputNeurons int
30 | hiddenNeurons int
31 | numEpochs int
32 | learningRate float64
33 | }
34 |
35 | func main() {
36 |
37 | // Define our input attributes.
38 | input := mat.NewDense(3, 4, []float64{
39 | 1.0, 0.0, 1.0, 0.0,
40 | 1.0, 0.0, 1.0, 1.0,
41 | 0.0, 1.0, 0.0, 1.0,
42 | })
43 |
44 | // Define our labels.
45 | labels := mat.NewDense(3, 1, []float64{1.0, 1.0, 0.0})
46 |
47 | // Define our network architecture and
48 | // learning parameters.
49 | config := neuralNetConfig{
50 | inputNeurons: 4,
51 | outputNeurons: 1,
52 | hiddenNeurons: 3,
53 | numEpochs: 5000,
54 | learningRate: 0.3,
55 | }
56 |
57 | // Train the neural network.
58 | network := newNetwork(config)
59 | if err := network.train(input, labels); err != nil {
60 | log.Fatal(err)
61 | }
62 |
63 | // Output the weights that define our network!
64 | f := mat.Formatted(network.wHidden, mat.Prefix(" "))
65 | fmt.Printf("\nwHidden = % v\n\n", f)
66 |
67 | f = mat.Formatted(network.bHidden, mat.Prefix(" "))
68 | fmt.Printf("\nbHidden = % v\n\n", f)
69 |
70 | f = mat.Formatted(network.wOut, mat.Prefix(" "))
71 | fmt.Printf("\nwOut = % v\n\n", f)
72 |
73 | f = mat.Formatted(network.bOut, mat.Prefix(" "))
74 | fmt.Printf("\nbOut = % v\n\n", f)
75 | }
76 |
77 | // newNetwork initializes a new neural network.
78 | func newNetwork(config neuralNetConfig) *neuralNet {
79 | return &neuralNet{config: config}
80 | }
81 |
82 | // train trains a neural network using backpropagation.
83 | func (nn *neuralNet) train(x, y *mat.Dense) error {
84 |
85 | // Initialize biases/weights.
86 | randSource := rand.NewSource(time.Now().UnixNano())
87 | randGen := rand.New(randSource)
88 |
89 | wHiddenRaw := make([]float64, nn.config.hiddenNeurons*nn.config.inputNeurons)
90 | bHiddenRaw := make([]float64, nn.config.hiddenNeurons)
91 | wOutRaw := make([]float64, nn.config.outputNeurons*nn.config.hiddenNeurons)
92 | bOutRaw := make([]float64, nn.config.outputNeurons)
93 |
94 | for _, param := range [][]float64{wHiddenRaw, bHiddenRaw, wOutRaw, bOutRaw} {
95 | for i := range param {
96 | param[i] = randGen.Float64()
97 | }
98 | }
99 |
100 | wHidden := mat.NewDense(nn.config.inputNeurons, nn.config.hiddenNeurons, wHiddenRaw)
101 | bHidden := mat.NewDense(1, nn.config.hiddenNeurons, bHiddenRaw)
102 | wOut := mat.NewDense(nn.config.hiddenNeurons, nn.config.outputNeurons, wOutRaw)
103 | bOut := mat.NewDense(1, nn.config.outputNeurons, bOutRaw)
104 |
105 | // Define the output of the neural network.
106 | var output mat.Dense
107 |
108 | // Loop over the number of epochs utilizing
109 | // backpropagation to train our model.
110 | for i := 0; i < nn.config.numEpochs; i++ {
111 |
112 | // Complete the feed forward process.
113 | var hiddenLayerInput mat.Dense
114 | hiddenLayerInput.Mul(x, wHidden)
115 | addBHidden := func(_, col int, v float64) float64 { return v + bHidden.At(0, col) }
116 | hiddenLayerInput.Apply(addBHidden, &hiddenLayerInput)
117 |
118 | var hiddenLayerActivations mat.Dense
119 | applySigmoid := func(_, _ int, v float64) float64 { return sigmoid(v) }
120 | hiddenLayerActivations.Apply(applySigmoid, &hiddenLayerInput)
121 |
122 | var outputLayerInput mat.Dense
123 | outputLayerInput.Mul(&hiddenLayerActivations, wOut)
124 | addBOut := func(_, col int, v float64) float64 { return v + bOut.At(0, col) }
125 | outputLayerInput.Apply(addBOut, &outputLayerInput)
126 | output.Apply(applySigmoid, &outputLayerInput)
127 |
128 | // Complete the backpropagation.
129 | var networkError mat.Dense
130 | networkError.Sub(y, &output)
131 |
132 | var slopeOutputLayer mat.Dense
133 | applySigmoidPrime := func(_, _ int, v float64) float64 { return sigmoidPrime(v) }
134 | slopeOutputLayer.Apply(applySigmoidPrime, &output)
135 | var slopeHiddenLayer mat.Dense
136 | slopeHiddenLayer.Apply(applySigmoidPrime, &hiddenLayerActivations)
137 |
138 | var dOutput mat.Dense
139 | dOutput.MulElem(&networkError, &slopeOutputLayer)
140 | var errorAtHiddenLayer mat.Dense
141 | errorAtHiddenLayer.Mul(&dOutput, wOut.T())
142 |
143 | var dHiddenLayer mat.Dense
144 | dHiddenLayer.MulElem(&errorAtHiddenLayer, &slopeHiddenLayer)
145 |
146 | // Adjust the parameters.
147 | var wOutAdj mat.Dense
148 | wOutAdj.Mul(hiddenLayerActivations.T(), &dOutput)
149 | wOutAdj.Scale(nn.config.learningRate, &wOutAdj)
150 | wOut.Add(wOut, &wOutAdj)
151 |
152 | bOutAdj, err := sumAlongAxis(0, &dOutput)
153 | if err != nil {
154 | return err
155 | }
156 | bOutAdj.Scale(nn.config.learningRate, bOutAdj)
157 | bOut.Add(bOut, bOutAdj)
158 |
159 | var wHiddenAdj mat.Dense
160 | wHiddenAdj.Mul(x.T(), &dHiddenLayer)
161 | wHiddenAdj.Scale(nn.config.learningRate, &wHiddenAdj)
162 | wHidden.Add(wHidden, &wHiddenAdj)
163 |
164 | bHiddenAdj, err := sumAlongAxis(0, &dHiddenLayer)
165 | if err != nil {
166 | return err
167 | }
168 | bHiddenAdj.Scale(nn.config.learningRate, bHiddenAdj)
169 | bHidden.Add(bHidden, bHiddenAdj)
170 | }
171 |
172 | // Define our trained neural network.
173 | nn.wHidden = wHidden
174 | nn.bHidden = bHidden
175 | nn.wOut = wOut
176 | nn.bOut = bOut
177 |
178 | return nil
179 | }
180 |
181 | // sigmoid implements the sigmoid function
182 | // for use in activation functions.
183 | func sigmoid(x float64) float64 {
184 | return 1.0 / (1.0 + math.Exp(-x))
185 | }
186 |
187 | // sigmoidPrime implements the derivative of the sigmoid
188 | // function for backpropagation (x is an activation, sigmoid(z)).
189 | func sigmoidPrime(x float64) float64 {
190 | return x * (1.0 - x)
191 | }
192 |
193 | // sumAlongAxis sums a matrix along a
194 | // particular dimension, preserving the
195 | // other dimension.
196 | func sumAlongAxis(axis int, m *mat.Dense) (*mat.Dense, error) {
197 |
198 | numRows, numCols := m.Dims()
199 |
200 | var output *mat.Dense
201 |
202 | switch axis {
203 | case 0:
204 | data := make([]float64, numCols)
205 | for i := 0; i < numCols; i++ {
206 | col := mat.Col(nil, i, m)
207 | data[i] = floats.Sum(col)
208 | }
209 | output = mat.NewDense(1, numCols, data)
210 | case 1:
211 | data := make([]float64, numRows)
212 | for i := 0; i < numRows; i++ {
213 | row := mat.Row(nil, i, m)
214 | data[i] = floats.Sum(row)
215 | }
216 | output = mat.NewDense(numRows, 1, data)
217 | default:
218 | return nil, errors.New("invalid axis, must be 0 or 1")
219 | }
220 |
221 | return output, nil
222 | }
223 |
--------------------------------------------------------------------------------
/neural_networks/example2/example2.go:
--------------------------------------------------------------------------------
1 | package main
2 |
3 | import (
4 | "encoding/csv"
5 | "errors"
6 | "fmt"
7 | "log"
8 | "math"
9 | "math/rand"
10 | "os"
11 | "strconv"
12 | "time"
13 |
14 | "gonum.org/v1/gonum/floats"
15 | "gonum.org/v1/gonum/mat"
16 | )
17 |
18 | // neuralNet contains all of the information
19 | // that defines a trained neural network.
20 | type neuralNet struct {
21 | config neuralNetConfig
22 | wHidden *mat.Dense
23 | bHidden *mat.Dense
24 | wOut *mat.Dense
25 | bOut *mat.Dense
26 | }
27 |
28 | // neuralNetConfig defines our neural network
29 | // architecture and learning parameters.
30 | type neuralNetConfig struct {
31 | inputNeurons int
32 | outputNeurons int
33 | hiddenNeurons int
34 | numEpochs int
35 | learningRate float64
36 | }
37 |
38 | func main() {
39 |
40 | // Open the training dataset file.
41 | f, err := os.Open("../data/train.csv")
42 | if err != nil {
43 | log.Fatal(err)
44 | }
45 | defer f.Close()
46 |
47 | // Create a new CSV reader reading from the opened file.
48 | reader := csv.NewReader(f)
49 | reader.FieldsPerRecord = 7
50 |
51 | // Read in all of the CSV records
52 | rawCSVData, err := reader.ReadAll()
53 | if err != nil {
54 | log.Fatal(err)
55 | }
56 |
57 | // inputsData and labelsData will hold all the
58 | // float values that will eventually be
59 | // used to form our matrices (minus the header row).
60 | inputsData := make([]float64, 4*(len(rawCSVData)-1))
61 | labelsData := make([]float64, 3*(len(rawCSVData)-1))
62 |
63 | // inputsIndex will track the current index of
64 | // inputs matrix values.
65 | var inputsIndex int
66 | var labelsIndex int
67 |
68 | // Sequentially move the rows into a slice of floats.
69 | for idx, record := range rawCSVData {
70 |
71 | // Skip the header row.
72 | if idx == 0 {
73 | continue
74 | }
75 |
76 | // Loop over the float columns.
77 | for i, val := range record {
78 |
79 | // Convert the value to a float.
80 | parsedVal, err := strconv.ParseFloat(val, 64)
81 | if err != nil {
82 | log.Fatal(err)
83 | }
84 |
85 | // Add to the labelsData if relevant.
86 | if i == 4 || i == 5 || i == 6 {
87 | labelsData[labelsIndex] = parsedVal
88 | labelsIndex++
89 | continue
90 | }
91 |
92 | // Add the float value to the slice of floats.
93 | inputsData[inputsIndex] = parsedVal
94 | inputsIndex++
95 | }
96 | }
97 |
98 | // Form the matrices.
99 | inputs := mat.NewDense(len(rawCSVData)-1, 4, inputsData)
100 | labels := mat.NewDense(len(rawCSVData)-1, 3, labelsData)
101 |
102 | // Define our network architecture and
103 | // learning parameters.
104 | config := neuralNetConfig{
105 | inputNeurons: 4,
106 | outputNeurons: 3,
107 | hiddenNeurons: 3,
108 | numEpochs: 5000,
109 | learningRate: 0.3,
110 | }
111 |
112 | // Train the neural network.
113 | network := newNetwork(config)
114 | if err := network.train(inputs, labels); err != nil {
115 | log.Fatal(err)
116 | }
117 |
118 | // Open the test dataset file.
119 | f, err = os.Open("../data/test.csv")
120 | if err != nil {
121 | log.Fatal(err)
122 | }
123 | defer f.Close()
124 |
125 | // Create a new CSV reader reading from the opened file.
126 | reader = csv.NewReader(f)
127 | reader.FieldsPerRecord = 7
128 |
129 | // Read in all of the test CSV records
130 | rawCSVData, err = reader.ReadAll()
131 | if err != nil {
132 | log.Fatal(err)
133 | }
134 |
135 | // inputsData and labelsData will hold all the
136 | // float values that will eventually be
137 | // used to form our matrices (minus the header row).
138 | inputsData = make([]float64, 4*(len(rawCSVData)-1))
139 | labelsData = make([]float64, 3*(len(rawCSVData)-1))
140 |
141 | // inputsIndex will track the current index of
142 | // inputs matrix values.
143 | inputsIndex = 0
144 | labelsIndex = 0
145 |
146 | // Sequentially move the rows into a slice of floats.
147 | for idx, record := range rawCSVData {
148 |
149 | // Skip the header row.
150 | if idx == 0 {
151 | continue
152 | }
153 |
154 | // Loop over the float columns.
155 | for i, val := range record {
156 |
157 | // Convert the value to a float.
158 | parsedVal, err := strconv.ParseFloat(val, 64)
159 | if err != nil {
160 | log.Fatal(err)
161 | }
162 |
163 | // Add to the labelsData if relevant.
164 | if i == 4 || i == 5 || i == 6 {
165 | labelsData[labelsIndex] = parsedVal
166 | labelsIndex++
167 | continue
168 | }
169 |
170 | // Add the float value to the slice of floats.
171 | inputsData[inputsIndex] = parsedVal
172 | inputsIndex++
173 | }
174 | }
175 |
176 | // Form the matrices.
177 | testInputs := mat.NewDense(len(rawCSVData)-1, 4, inputsData)
178 | testLabels := mat.NewDense(len(rawCSVData)-1, 3, labelsData)
179 |
180 | // Make the predictions using the trained model.
181 | predictions, err := network.predict(testInputs)
182 | if err != nil {
183 | log.Fatal(err)
184 | }
185 |
186 | // Calculate the accuracy of our model.
187 | var truePosNeg int
188 | numPreds, _ := predictions.Dims()
189 | for i := 0; i < numPreds; i++ {
190 |
191 | // Get the label.
192 | labelRow := mat.Row(nil, i, testLabels)
193 | var species int
194 | for idx, label := range labelRow {
195 | if label == 1.0 {
196 | species = idx
197 | break
198 | }
199 | }
200 |
201 | // Accumulate the true positive/negative count.
202 | if predictions.At(i, species) == floats.Max(mat.Row(nil, i, predictions)) {
203 | truePosNeg++
204 | }
205 | }
206 |
207 | // Calculate the accuracy (subset accuracy).
208 | accuracy := float64(truePosNeg) / float64(numPreds)
209 |
210 | // Output the Accuracy value to standard out.
211 | fmt.Printf("\nAccuracy = %0.2f\n\n", accuracy)
212 | }
213 |
214 | // newNetwork initializes a new neural network.
215 | func newNetwork(config neuralNetConfig) *neuralNet {
216 | return &neuralNet{config: config}
217 | }
218 |
219 | // train trains a neural network using backpropagation.
220 | func (nn *neuralNet) train(x, y *mat.Dense) error {
221 |
222 | // Initialize biases/weights.
223 | randSource := rand.NewSource(time.Now().UnixNano())
224 | randGen := rand.New(randSource)
225 |
226 | wHiddenRaw := make([]float64, nn.config.hiddenNeurons*nn.config.inputNeurons)
227 | bHiddenRaw := make([]float64, nn.config.hiddenNeurons)
228 | wOutRaw := make([]float64, nn.config.outputNeurons*nn.config.hiddenNeurons)
229 | bOutRaw := make([]float64, nn.config.outputNeurons)
230 |
231 | for _, param := range [][]float64{wHiddenRaw, bHiddenRaw, wOutRaw, bOutRaw} {
232 | for i := range param {
233 | param[i] = randGen.Float64()
234 | }
235 | }
236 |
237 | wHidden := mat.NewDense(nn.config.inputNeurons, nn.config.hiddenNeurons, wHiddenRaw)
238 | bHidden := mat.NewDense(1, nn.config.hiddenNeurons, bHiddenRaw)
239 | wOut := mat.NewDense(nn.config.hiddenNeurons, nn.config.outputNeurons, wOutRaw)
240 | bOut := mat.NewDense(1, nn.config.outputNeurons, bOutRaw)
241 |
242 | // Define the output of the neural network.
243 | var output mat.Dense
244 |
245 | // Loop over the number of epochs utilizing
246 | // backpropagation to train our model.
247 | for i := 0; i < nn.config.numEpochs; i++ {
248 |
249 | // Complete the feed forward process.
250 | var hiddenLayerInput mat.Dense
251 | hiddenLayerInput.Mul(x, wHidden)
252 | addBHidden := func(_, col int, v float64) float64 { return v + bHidden.At(0, col) }
253 | hiddenLayerInput.Apply(addBHidden, &hiddenLayerInput)
254 |
255 | var hiddenLayerActivations mat.Dense
256 | applySigmoid := func(_, _ int, v float64) float64 { return sigmoid(v) }
257 | hiddenLayerActivations.Apply(applySigmoid, &hiddenLayerInput)
258 |
259 | var outputLayerInput mat.Dense
260 | outputLayerInput.Mul(&hiddenLayerActivations, wOut)
261 | addBOut := func(_, col int, v float64) float64 { return v + bOut.At(0, col) }
262 | outputLayerInput.Apply(addBOut, &outputLayerInput)
263 | output.Apply(applySigmoid, &outputLayerInput)
264 |
265 | // Complete the backpropagation.
266 | var networkError mat.Dense
267 | networkError.Sub(y, &output)
268 |
269 | var slopeOutputLayer mat.Dense
270 | applySigmoidPrime := func(_, _ int, v float64) float64 { return sigmoidPrime(v) }
271 | slopeOutputLayer.Apply(applySigmoidPrime, &output)
272 | var slopeHiddenLayer mat.Dense
273 | slopeHiddenLayer.Apply(applySigmoidPrime, &hiddenLayerActivations)
274 |
275 | var dOutput mat.Dense
276 | dOutput.MulElem(&networkError, &slopeOutputLayer)
277 | var errorAtHiddenLayer mat.Dense
278 | errorAtHiddenLayer.Mul(&dOutput, wOut.T())
279 |
280 | var dHiddenLayer mat.Dense
281 | dHiddenLayer.MulElem(&errorAtHiddenLayer, &slopeHiddenLayer)
282 |
283 | // Adjust the parameters.
284 | var wOutAdj mat.Dense
285 | wOutAdj.Mul(hiddenLayerActivations.T(), &dOutput)
286 | wOutAdj.Scale(nn.config.learningRate, &wOutAdj)
287 | wOut.Add(wOut, &wOutAdj)
288 |
289 | bOutAdj, err := sumAlongAxis(0, &dOutput)
290 | if err != nil {
291 | return err
292 | }
293 | bOutAdj.Scale(nn.config.learningRate, bOutAdj)
294 | bOut.Add(bOut, bOutAdj)
295 |
296 | var wHiddenAdj mat.Dense
297 | wHiddenAdj.Mul(x.T(), &dHiddenLayer)
298 | wHiddenAdj.Scale(nn.config.learningRate, &wHiddenAdj)
299 | wHidden.Add(wHidden, &wHiddenAdj)
300 |
301 | bHiddenAdj, err := sumAlongAxis(0, &dHiddenLayer)
302 | if err != nil {
303 | return err
304 | }
305 | bHiddenAdj.Scale(nn.config.learningRate, bHiddenAdj)
306 | bHidden.Add(bHidden, bHiddenAdj)
307 | }
308 |
309 | // Define our trained neural network.
310 | nn.wHidden = wHidden
311 | nn.bHidden = bHidden
312 | nn.wOut = wOut
313 | nn.bOut = bOut
314 |
315 | return nil
316 | }
317 |
318 | // predict makes a prediction based on a trained
319 | // neural network.
320 | func (nn *neuralNet) predict(x *mat.Dense) (*mat.Dense, error) {
321 |
322 | // Check to make sure that our neuralNet value
323 | // represents a trained model.
324 | if nn.wHidden == nil || nn.wOut == nil || nn.bHidden == nil || nn.bOut == nil {
325 | return nil, errors.New("the supplied neural net weights and biases are empty")
326 | }
327 |
328 | // Define the output of the neural network.
329 | var output mat.Dense
330 |
331 | // Complete the feed forward process.
332 | var hiddenLayerInput mat.Dense
333 | hiddenLayerInput.Mul(x, nn.wHidden)
334 | addBHidden := func(_, col int, v float64) float64 { return v + nn.bHidden.At(0, col) }
335 | hiddenLayerInput.Apply(addBHidden, &hiddenLayerInput)
336 |
337 | var hiddenLayerActivations mat.Dense
338 | applySigmoid := func(_, _ int, v float64) float64 { return sigmoid(v) }
339 | hiddenLayerActivations.Apply(applySigmoid, &hiddenLayerInput)
340 |
341 | var outputLayerInput mat.Dense
342 | outputLayerInput.Mul(&hiddenLayerActivations, nn.wOut)
343 | addBOut := func(_, col int, v float64) float64 { return v + nn.bOut.At(0, col) }
344 | outputLayerInput.Apply(addBOut, &outputLayerInput)
345 | output.Apply(applySigmoid, &outputLayerInput)
346 |
347 | return &output, nil
348 | }
349 |
350 | // sigmoid implements the sigmoid function
351 | // for use in activation functions.
352 | func sigmoid(x float64) float64 {
353 | return 1.0 / (1.0 + math.Exp(-x))
354 | }
355 |
356 | // sigmoidPrime implements the derivative of the sigmoid
357 | // function for backpropagation (x is an activation, sigmoid(z)).
358 | func sigmoidPrime(x float64) float64 {
359 | return x * (1.0 - x)
360 | }
361 |
362 | // sumAlongAxis sums a matrix along a
363 | // particular dimension, preserving the
364 | // other dimension.
365 | func sumAlongAxis(axis int, m *mat.Dense) (*mat.Dense, error) {
366 |
367 | numRows, numCols := m.Dims()
368 |
369 | var output *mat.Dense
370 |
371 | switch axis {
372 | case 0:
373 | data := make([]float64, numCols)
374 | for i := 0; i < numCols; i++ {
375 | col := mat.Col(nil, i, m)
376 | data[i] = floats.Sum(col)
377 | }
378 | output = mat.NewDense(1, numCols, data)
379 | case 1:
380 | data := make([]float64, numRows)
381 | for i := 0; i < numRows; i++ {
382 | row := mat.Row(nil, i, m)
383 | data[i] = floats.Sum(row)
384 | }
385 | output = mat.NewDense(numRows, 1, data)
386 | default:
387 | return nil, errors.New("invalid axis, must be 0 or 1")
388 | }
389 |
390 | return output, nil
391 | }
392 |
--------------------------------------------------------------------------------
/projects.md:
--------------------------------------------------------------------------------
1 | # Project Ideas and Data Sets
2 |
3 | Awesome! You've chosen to train and evaluate some of your own ML/AI models using new data sets. We have compiled a set of problems with corresponding public data available (listed below).
4 |
5 | 1. Assemble a group of 3-4 people (try to introduce yourself to some new gopher friends)
6 | 2. Pick a project/problem
7 | 3. Explore the data
8 | 4. Train/evaluate/integrate your model
9 |
10 | _Hint: If you don't know where to start, check out [this step-by-step guide for ML projects](https://github.com/ageron/handson-ml/blob/master/ml-project-checklist.md)_
11 |
12 | _Hint: If you are looking for more relevant Go tooling, see [here](https://github.com/gopherdata/resources/blob/master/tooling/README.md)._
13 |
14 | ## Deep Learning for Sentiment Analysis
15 |
16 | Sentiment analysis is a method that tries to determine whether a piece of text is generally _positive_ or _negative_ in tone and/or content. Some implementations do this by predicting a score that indicates sentiment, while others just return a binary indication. Gather some text data from one of the sources below and try to determine its sentiment. Some cool application/project ideas would be to:
17 |
18 | - Write a Go program that streams in tweets from Twitter and aggregates or updates a measure of sentiment on a certain topic or hashtag.
19 | - Answer some questions about the content on Hacker News. What is the overall sentiment of comments on front-page Hacker News articles? Which ones have the most positive sentiment?
20 |
21 | ### Datasets:
22 |
23 | * [twitter - api](https://developer.twitter.com/en/use-cases/analyze)
24 | * [twitter - Max Planck Institute](http://twitter.mpi-sws.org/)
25 | * [hacker news comments](https://console.cloud.google.com/marketplace/details/y-combinator/hacker-news?filter=solution-type%3Adataset&id=5227103e-0eb9-4744-872b-325a8df50bee)
26 | * [stack exchange](https://console.cloud.google.com/marketplace/details/stack-exchange/stack-overflow?filter=solution-type:dataset&id=46a148ff-896d-444c-b08d-360169911f59)
27 |
28 | ### Tools:
29 | * [Machine Box's Textbox](https://docs.veritone.com/#/developer/machine-box/boxes/textbox) - Machine Box provides state-of-the-art machine learning technology inside a Docker container which you can run, deploy and scale. Their Textbox offering allows you to do sentiment analysis of text with a deep learning model. Moreover, they have a [Go SDK](https://github.com/machinebox/sdk-go) so you can easily integrate this into your Go applications.
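If you want a quick baseline before wiring up a model or an external service, a crude lexicon-based scorer is enough to sanity-check a pipeline end to end. Below is a minimal, hypothetical sketch (the tiny word lists are made up for illustration and are not a real sentiment lexicon):

```go
package main

import (
	"fmt"
	"strings"
)

// score computes a naive sentiment score by counting hits against
// small positive and negative word lists. A positive score suggests
// positive tone; a negative score suggests the opposite.
func score(text string) int {
	positive := map[string]bool{"good": true, "great": true, "love": true, "awesome": true}
	negative := map[string]bool{"bad": true, "terrible": true, "hate": true, "awful": true}

	var s int
	for _, w := range strings.Fields(strings.ToLower(text)) {
		// Strip common punctuation before looking the word up.
		w = strings.Trim(w, ".,!?\"'")
		if positive[w] {
			s++
		}
		if negative[w] {
			s--
		}
	}
	return s
}

func main() {
	fmt.Println(score("I love this talk, the examples are great!")) // 2
	fmt.Println(score("The API is terrible and the docs are bad.")) // -2
}
```

A real project would swap `score` for calls to a trained model (such as Textbox above), but keeping a baseline like this around makes it easy to tell whether a fancier model is actually earning its complexity.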
30 |
31 | ## Deep Learning for Image/Object Classification
32 |
33 | If you are following the news, you have probably heard a bunch about deep learning methods in computer vision (think deepfakes or FaceApp). So why not try your hand at a little bit of image processing? Some cool application/project ideas would be to:
34 |
35 | - Write a Go program that can detect your face in an image.
36 | - Use MachineBox's Tagbox to automatically tag the content of images. Then try to add your own custom tags and images to update the model.
37 |
38 | ### Datasets:
39 |
40 | * [mnist](https://www.kaggle.com/c/digit-recognizer/data)
41 | * [open images object detection](https://www.kaggle.com/c/open-images-2019-object-detection/data)
42 | * [malaria detection](https://ceb.nlm.nih.gov/repositories/malaria-datasets/)
43 | * [crowdsourced object detection data](https://ai.google/tools/datasets/open-images-extended-crowdsourced/)
44 |
45 | ### Tools:
46 | * [Machinebox's Objectbox](https://docs.veritone.com/#/developer/machine-box/boxes/objectbox) - Machine Box provides state-of-the-art machine learning technology inside a Docker container which you can run, deploy and scale. Their Objectbox offering allows you to do object detection using a deep learning model. Moreover, they have a [Go SDK](https://github.com/machinebox/sdk-go) so you can easily integrate this into your Go applications.
47 | * [Machinebox's Classificationbox](https://docs.veritone.com/#/developer/machine-box/boxes/classificationbox) - Classificationbox allows you to classify text, images, structured and unstructured data using a deep learning model.
48 | * [Machinebox's Facebox](https://docs.veritone.com/#/developer/machine-box/boxes/facebox-overview) - Facebox allows you to teach and recognize faces in images or photographs using a deep learning model.
49 | * [Machinebox's Tagbox](https://docs.veritone.com/#/developer/machine-box/boxes/tagbox) - Tagbox allows you to teach and automatically understand the content of images using a deep learning model.
50 |
51 | ## Linear and Logistic Regression
52 |
53 | These methods might not be as hyped as the ones above, but they are certainly useful and a little easier to implement. You could take some of the workshop code from earlier in the day and adapt it to these new data sets! Some cool application/project ideas would be to:
54 |
55 | - Use census data to try to predict whether a person makes over 50K a year based on certain demographic info.
56 | - Use logistic regression to predict Iris flower species (a starter sketch follows the Tools list below).
57 | - Predict outcomes of an NCAA basketball matchup.
58 |
59 | ### Datasets:
60 |
61 | * [census income data](https://archive.ics.uci.edu/ml/datasets/census+income)
62 | * [iris classification](https://archive.ics.uci.edu/ml/datasets/Iris)
63 | * [Bureau of Labor income statistics](https://console.cloud.google.com/marketplace/details/bls-public-data/bureau-of-labor-statistics?filter=solution-type:dataset&id=e632a715-857e-4c41-8257-da123607ea89)
64 | * [NCAA basketball statistics](https://console.cloud.google.com/marketplace/details/ncaa-bb-public/ncaa-basketball?filter=solution-type%3Adataset&id=f262fa22-2021-44c6-a628-15eab8237de5)
65 |
66 | ### Tools:
67 | * [Workshop examples of linear regression](linear_regression)
68 | * [Workshop examples of logistic regression](logistic_regression)
69 | * [goml](https://github.com/cdipaolo/goml)
70 | * [golearn](https://github.com/sjwhitworth/golearn)
71 |
72 |
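To make the Iris idea above concrete, here is a rough starter sketch using goml, mirroring the `linear.NewLogistic`/`Learn` calls already used in `logistic_regression/example7` (the `Predict` call comes from goml's documented API, but treat this as an untested sketch). The handful of petal measurements and the one-vs-rest framing (setosa vs. not) are illustrative assumptions; a real project would parse the UCI iris CSV and hold out a test set:

```go
package main

import (
	"fmt"
	"log"

	"github.com/cdipaolo/goml/base"
	"github.com/cdipaolo/goml/linear"
)

func main() {

	// Toy petal length/width measurements (illustrative values only).
	features := [][]float64{
		{1.4, 0.2}, {1.3, 0.2}, {1.5, 0.1}, // setosa-like
		{4.7, 1.4}, {4.5, 1.5}, {5.1, 1.9}, // non-setosa
	}

	// One-vs-rest labels: 1.0 = setosa, 0.0 = everything else.
	labels := []float64{1.0, 1.0, 1.0, 0.0, 0.0, 0.0}

	// Same constructor as example7: batch gradient ascent,
	// learning rate 1e-4, regularization 1, 800 iterations.
	model := linear.NewLogistic(base.BatchGA, 1e-4, 1, 800, features, labels)

	// Train the model.
	if err := model.Learn(); err != nil {
		log.Fatal(err)
	}

	// Estimate the setosa probability for a new flower.
	p, err := model.Predict([]float64{1.4, 0.3})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("P(setosa) = %0.2f\n", p[0])
}
```

Extending this to all three species is a matter of training one such one-vs-rest model per class and picking the class with the highest predicted probability.
--------------------------------------------------------------------------------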