├── .gitignore ├── README.md ├── accordframework ├── 1_af_mnist.workbook ├── 2_af_clustering_gmm.workbook ├── README.md ├── af_gmm.png ├── af_tsne_mnist.png └── tmp │ └── mnist_tsne_result.bin ├── cntk ├── 1_cntk_getting_started.workbook ├── 2_cntk_logistic_regression.workbook ├── 3_cntk_lstm_sequence_classify.workbook ├── README.md ├── screenshot.cntk.gs1.png ├── screenshot.cntk.gs2.png └── screenshot.cntk.log_reg1.png ├── plotlib └── plotlib.workbook ├── screenshot.tfsharp.png ├── screenshot.xplot1.png ├── screenshot.xplot2.png ├── tensorflow ├── 1_tf_setup.workbook ├── 2_tf_basics.workbook ├── 2_tf_getting_started.workbook ├── 3_tf_mnist.workbook ├── 4_tf_image_recognistion.workbook ├── README.md ├── demofiles │ ├── example.png │ └── rio.jpg ├── screenshot.tf.gs.a.png └── screenshot.tf.gs.b.png └── xplot ├── xplot.3DLineData.txt ├── xplot.mt.bruno.txt └── xplot.workbook /.gitignore: -------------------------------------------------------------------------------- 1 | /.vs 2 | /tensorflow/tmp 3 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Introduction 2 | This is the LarcAI playground for Cognitive Sciences (AI, ML) in the .NET C# world. 3 | It is basically the Kaggle of the C# world. 4 | 5 | ## Xamarin Workbooks 6 | The workbooks are built in 7 | [Xamarin Workbooks](https://developer.xamarin.com/guides/cross-platform/workbooks/) 8 | 9 | ## Nuget 10 | - Each workbook may require packages available on NuGet (www.nuget.org) 11 | - The workbooks have comments and references to guide you to the packages needed 12 | 13 | ### XPlot Workbooks 14 | This workbook shows off XPlot 15 | ![Screenshot.Xplot](screenshot.xplot1.png) 16 | ![Screenshot.Xplot2](screenshot.xplot2.png) 17 | 18 | ### TensorFlow 19 | 20 | ![Screenshot.Tfsharp](screenshot.tfsharp.png) 21 | 22 | ![Screenshot.Tf.Gs.A](./tensorflow/screenshot.tf.gs.a.png) 23 | 24 |
![Screenshot.Tf.Gs.B](./tensorflow/screenshot.tf.gs.b.png) 25 | 26 | 27 | ### CNTK 28 | 29 | Unfortunately CNTK does not have proper C# bindings yet 30 | 31 | ### Accord Framework 32 | 33 | Example showing the GMM clustering 34 | ![Af Gmm](./accordframework/af_gmm.png) 35 | 36 | Example showing t-SNE on MNIST dataset 37 | ![Af Tsne Mnist](./accordframework/af_tsne_mnist.png) 38 | 39 | -------------------------------------------------------------------------------- /accordframework/1_af_mnist.workbook: -------------------------------------------------------------------------------- 1 | --- 2 | uti: com.xamarin.workbook 3 | platforms: 4 | - Console 5 | packages: 6 | - id: Accord.DataSets 7 | version: 3.5.0 8 | - id: XPlot.Plotly 9 | version: 1.4.2 10 | - id: Newtonsoft.Json 11 | version: 9.0.1 12 | - id: Accord.Neuro 13 | version: 3.5.0 14 | - id: Accord.Audio 15 | version: 3.5.0 16 | - id: Accord.Vision 17 | version: 3.5.0 18 | - id: Accord.Statistics 19 | version: 3.5.0 20 | - id: System.ValueTuple 21 | version: 4.3.1 22 | - id: Accord 23 | version: 3.5.0 24 | - id: Accord.MachineLearning 25 | version: 3.5.0 26 | - id: FSharp.Core 27 | version: 4.2.1 28 | - id: Accord.IO 29 | version: 3.5.0 30 | - id: Accord.Video 31 | version: 3.5.0 32 | - id: Accord.Imaging 33 | version: 3.5.0 34 | - id: SharpZipLib 35 | version: 0.86.0 36 | - id: Accord.Math 37 | version: 3.5.0 38 | --- 39 | 40 | When one learns how to program, there's a tradition that the first thing you do is print "Hello World." Just like programming has Hello World, machine learning has MNIST. 41 | 42 | MNIST is a simple computer vision dataset. It consists of images of handwritten digits like these: 43 | 44 | ![](https://www.tensorflow.org/images/MNIST.png) 45 | 46 | It also includes labels for each image, telling us which digit it is. For example, the labels for the above images are 5, 0, 4, and 1. 47 | 48 | In this tutorial, we're going to train a model to look at images and predict what digits they are. 
Our goal isn't to train a really elaborate model that achieves state-of-the-art performance -- although we'll give you code to do that later! -- but rather to dip a toe into machine learning in C# with the Accord Framework. As such, we're going to start simple: load MNIST and visualize it with t-SNE. 49 | 50 | The actual code for this workbook is very short. However, it is very important to understand the ideas behind it: both how the framework works and the core machine learning concepts. Because of this, we are going to very carefully work through the code. 51 | 52 | ```csharp 53 | #r "FSharp.Core" 54 | #r "XPlot.Plotly" 55 | 56 | #r "System.Numerics" 57 | 58 | #r "Accord" 59 | #r "Accord.IO" 60 | #r "Accord.Math" 61 | #r "Accord.Statistics" 62 | #r "Accord.MachineLearning" 63 | #r "Accord.Neuro" 64 | #r "Accord.DataSets" 65 | ``` 66 | 67 | The MNIST data is hosted on [Yann LeCun's website](http://yann.lecun.com/exdb/mnist/). If you are copying and pasting in the code from this tutorial, start here with these two lines of code which will download and read in the data automatically: 68 | 69 | ```csharp 70 | using XPlot.Plotly; 71 | using System.Linq; 72 | using Accord.Math; 73 | 74 | string dir = @".\tmp"; 75 | 76 | var mnistdataset = new Accord.DataSets.MNIST(dir); 77 | Console.WriteLine($"Training Set Count {mnistdataset.Training.Item1.Count()}"); 78 | mnistdataset 79 | ``` 80 | 81 | ```csharp 82 | //create 3D result 83 | string tsneresulrfile = $"{dir}\\mnist_tsne_result.bin"; 84 | 85 | double[][] result = null; 86 | if (System.IO.File.Exists(tsneresulrfile)) 87 | { 88 | Console.WriteLine($"Loading cached t-SNE result"); 89 | result = Accord.IO.Serializer.Load<double[][]>(tsneresulrfile); 90 | } 91 | else 92 | { 93 | result = new double[mnistdataset.Training.Item1.Count()][]; 94 | for (int i = 0; i < result.Length; i ++) result[i] = new double[3]; 95 | 96 | double[][] mnistinput = mnistdataset.Training.Item1.ToList().ConvertAll(r =>
r.ToDense(28*28)).ToArray(); 97 | Console.WriteLine($"MNIST Input Count {mnistinput.Length} "); 98 | 99 | Console.WriteLine($"Please be patient while t-SNE is running"); 100 | Accord.MachineLearning.Clustering.TSNE tsne = new Accord.MachineLearning.Clustering.TSNE(); 101 | tsne.Transform(mnistinput, result); 102 | 103 | Console.WriteLine($"Let's cache t-SNE results"); 104 | Accord.IO.Serializer.Save(result, tsneresulrfile); 105 | } 106 | List<Graph.Scatter3d> traces = new List<Graph.Scatter3d>(); 107 | 108 | Random rnd = new Random(); 109 | for (int digit = 0; digit <= 9; digit ++) 110 | { 111 | List<double> x = new List<double>(); 112 | List<double> y = new List<double>(); 113 | List<double> z = new List<double>(); 114 | 115 | for (int j = 0; j < mnistdataset.Training.Item2.Count(); j ++) 116 | { 117 | if (mnistdataset.Training.Item2[j] == digit) 118 | { 119 | x.Add(result[j][0]); 120 | y.Add(result[j][1]); 121 | z.Add(result[j][2]); 122 | } 123 | } 124 | string randomcolor = $"rgb({rnd.Next(50, 200)},{rnd.Next(50, 200)},{rnd.Next(50, 200)})"; 125 | var trace = new Graph.Scatter3d() { x = x, y = y, z = z, 126 | name = $"Digit {digit}", 127 | text = $"Digit {digit}", 128 | mode = "markers", 129 | marker = new Graph.Marker() { color = randomcolor, size = 2.0, symbol = "circle", 130 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 131 | }, 132 | line = new Graph.Line() { color = "#1f77b4", width = 1.0} 133 | }; 134 | traces.Add(trace); 135 | } 136 | 137 | var layout = new Layout.Layout() { title = "tSNE on MNIST", autosize = false, margin = new Graph.Margin() { l = 0, r = 0, b = 0, t = 65 }}; 138 | 139 | var scatterplot = Chart.Plot(traces.ToArray(), layout); 140 | scatterplot.WithWidth(800); 141 | scatterplot.WithHeight(500); 142 | scatterplot.GetHtml().AsHtml(); 143 | ``` -------------------------------------------------------------------------------- /accordframework/2_af_clustering_gmm.workbook: -------------------------------------------------------------------------------- 1 | --- 2 | uti: com.xamarin.workbook 3 | platforms: 4
| - Console 5 | packages: 6 | - id: Accord.DataSets 7 | version: 3.5.0 8 | - id: XPlot.Plotly 9 | version: 1.4.2 10 | - id: Newtonsoft.Json 11 | version: 9.0.1 12 | - id: Accord.Neuro 13 | version: 3.5.0 14 | - id: Accord.Audio 15 | version: 3.5.0 16 | - id: Accord.Vision 17 | version: 3.5.0 18 | - id: Accord.Statistics 19 | version: 3.5.0 20 | - id: System.ValueTuple 21 | version: 4.3.1 22 | - id: Accord 23 | version: 3.5.0 24 | - id: Accord.MachineLearning 25 | version: 3.5.0 26 | - id: FSharp.Core 27 | version: 4.2.1 28 | - id: Accord.IO 29 | version: 3.5.0 30 | - id: Accord.Video 31 | version: 3.5.0 32 | - id: Accord.Imaging 33 | version: 3.5.0 34 | - id: SharpZipLib 35 | version: 0.86.0 36 | - id: Accord.Math 37 | version: 3.5.0 38 | --- 39 | 40 | Clustering (Gaussian Mixture Models) 41 | 42 | This sample application shows how to use [Gaussian Mixture Models](http://accord-framework.net/docs/html/T_Accord_MachineLearning_GaussianMixtureModel.htm) to perform clustering and classification using soft-decision margins.
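Before stepping through the C# cells, it can help to see the shape of the experiment in a framework-neutral form. The sketch below is an illustration added for this write-up, not part of the workbook: it uses Python with NumPy only, the helper name `make_gmm_data` is made up, and it mirrors the data-generation cell that follows — k random means, random positive-definite covariances, and 150-250 samples per component.

```python
import numpy as np

def make_gmm_data(k=5, dim=2, seed=0):
    """Sample points from k random Gaussian components (hypothetical helper)."""
    rng = np.random.default_rng(seed)
    points, labels = [], []
    for i in range(k):
        mean = rng.uniform(-6.0, 6.0, size=dim)   # random centroid, like Vector.Random(2, -6, +6)
        a = rng.uniform(-5.0, 5.0, size=(dim, dim))
        cov = a @ a.T + dim * np.eye(dim)         # positive definite by construction
        n = int(rng.integers(150, 250))           # 150-249 samples per cluster
        points.append(rng.multivariate_normal(mean, cov, size=n))
        labels.append(np.full(n, i))
    return np.vstack(points), np.concatenate(labels)

X, y = make_gmm_data()
```

Fitting a mixture back to the stacked, unlabeled points (what `GaussianMixtureModel.Learn` does in the cells below) then amounts to running expectation-maximization over `X`.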
43 | 44 | ```csharp 45 | #r "FSharp.Core" 46 | #r "XPlot.Plotly" 47 | 48 | #r "System.Numerics" 49 | 50 | #r "Accord" 51 | #r "Accord.IO" 52 | #r "Accord.Math" 53 | #r "Accord.Statistics" 54 | #r "Accord.MachineLearning" 55 | #r "Accord.Neuro" 56 | ``` 57 | 58 | Create some random cluster data 59 | 60 | ```csharp 61 | using XPlot.Plotly; 62 | using System.Linq; 63 | using Accord.Math; 64 | using Accord.MachineLearning; 65 | using Accord.Statistics.Distributions.Multivariate; 66 | 67 | 68 | int k = 5;//number of clusters 69 | 70 | // Generate data with n Gaussian distributions 71 | double[][][] data = new double[k][][]; 72 | 73 | for (int i = 0; i < k; i++) 74 | { 75 | // Create random centroid to place the Gaussian distribution 76 | double[] mean = Vector.Random(2, -6.0, +6.0); 77 | 78 | // Create random covariance matrix for the distribution 79 | double[,] covariance = Accord.Statistics.Tools.RandomCovariance(2, -5, 5); 80 | 81 | // Create the Gaussian distribution 82 | var gaussian = new MultivariateNormalDistribution(mean, covariance); 83 | 84 | int samples = Accord.Math.Random.Generator.Random.Next(150, 250); 85 | data[i] = gaussian.Generate(samples); 86 | } 87 | //created 5 clusters 88 | data 89 | ``` 90 | 91 | Now we remove clustering info and create a scatterplot of all the random data 92 | 93 | ```csharp 94 | Random rnd = new Random(); 95 | 96 | // Join the generated data, and remove cluster info 97 | // what remains is x, y values 98 | double[][] observations = Matrix.Stack(data); 99 | 100 | var trace = new Graph.Scatter() 101 | { 102 | x = observations.ToList().Select(r => r[0]), 103 | y = observations.ToList().Select(r => r[1]), 104 | name = $"Mixed", 105 | mode = "markers", 106 | marker = new Graph.Marker() { color = "rgb(100,100,100)", size = 2.0, symbol = "circle", 107 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 108 | } 109 | }; 110 | 111 | var layout = new Layout.Layout() { title = "Random Data before Clustering (GMM)", autosize = 
false, margin = new Graph.Margin() { l = 0, r = 0, b = 0, t = 65 }}; 112 | 113 | var scatterplot = Chart.Plot(trace, layout); 114 | scatterplot.WithWidth(800); 115 | scatterplot.WithHeight(500); 116 | scatterplot.GetHtml().AsHtml(); 117 | ``` 118 | 119 | Now we use GMM clustering to cluster the data 120 | 121 | ```csharp 122 | var gmm = new GaussianMixtureModel(k); 123 | 124 | // Compute the model 125 | GaussianClusterCollection clustering = gmm.Learn(observations); 126 | 127 | // Classify all instances in mixture data 128 | double[] classifications = clustering.Decide(observations).Convert(c => (double)c); 129 | ``` 130 | 131 | Now we insert the resulting clustering column into the observation data 132 | 133 | ```csharp 134 | double[][] results = observations.InsertColumn(classifications); 135 | ``` 136 | 137 | Next we generate a scatter plot based on the clustered data 138 | 139 | ```csharp 140 | Random rnd = new Random(); 141 | List<Graph.Scatter> traces = new List<Graph.Scatter>(); 142 | 143 | var groupedbycluster = (from d in results 144 | orderby d[2] 145 | group d by d[2] into g 146 | select new {cluster = g.Key, data = g.Select(r => new {x = r[0], y = r[1]})}); 147 | foreach (var g in groupedbycluster) 148 | { 149 | string randomcolor = $"rgb({rnd.Next(50, 200)},{rnd.Next(50, 200)},{rnd.Next(50, 200)})"; 150 | var trace = new Graph.Scatter() 151 | { 152 | x = g.data.Select(r => r.x), 153 | y = g.data.Select(r => r.y), 154 | name = $"Cluster {g.cluster}", 155 | mode = "markers", 156 | marker = new Graph.Marker() 157 | { 158 | color = randomcolor, size = 5.0, symbol = "circle", 159 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 160 | } 161 | }; 162 | traces.Add(trace); 163 | } 164 | 165 | var scatterplot = Chart.Plot(traces.ToArray(), layout); 166 | scatterplot.WithWidth(700); 167 | scatterplot.WithHeight(500); 168 | scatterplot.GetHtml().AsHtml(); 169 | ``` -------------------------------------------------------------------------------- /accordframework/README.md:
-------------------------------------------------------------------------------- 1 | # Introduction 2 | 3 | This is a list of Xamarin Workbooks for getting started with the Accord Framework for .NET 4 | 5 | ## Xamarin Workbooks 6 | Download 7 | [Xamarin Workbooks](https://developer.xamarin.com/guides/cross-platform/workbooks/) 8 | 9 | ## Nuget 10 | - Each workbook may require packages available on NuGet (www.nuget.org) 11 | - The workbooks have comments and references to guide you to the packages needed 12 | 13 | ### Accord Framework 14 | 15 | Click here for more information on the [Accord Framework](http://accord-framework.net/) and on [Accord Github](https://github.com/accord-net/framework) 16 | 17 | Example showing the GMM clustering 18 | ![Af Gmm](../accordframework/af_gmm.png) 19 | 20 | Example showing t-SNE on the MNIST dataset 21 | ![Af Tsne Mnist](../accordframework/af_tsne_mnist.png) 22 | 23 | -------------------------------------------------------------------------------- /accordframework/af_gmm.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/accordframework/af_gmm.png -------------------------------------------------------------------------------- /accordframework/af_tsne_mnist.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/accordframework/af_tsne_mnist.png -------------------------------------------------------------------------------- /accordframework/tmp/mnist_tsne_result.bin: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/accordframework/tmp/mnist_tsne_result.bin --------------------------------------------------------------------------------
/cntk/1_cntk_getting_started.workbook: -------------------------------------------------------------------------------- 1 | --- 2 | packages: 3 | - id: FSharp.Core 4 | version: 4.2.3 5 | - id: XPlot.Plotly 6 | version: 1.4.2 7 | uti: com.xamarin.workbook 8 | id: b45738b5-f298-4a26-b2e2-5c48b0f30aa1 9 | title: GettingStarted 10 | platforms: 11 | - WPF 12 | --- 13 | 14 | Make sure CNTK is in your path 15 | 16 | ```csharp 17 | #r "FSharp.Core" 18 | #r "XPlot.Plotly" 19 | #r "J:\AI\cntk\cntk\Cntk.Core.Managed-2.2.dll" 20 | ``` 21 | 22 | Check that the managed assembly loaded correctly 23 | 24 | ```csharp 25 | using System; 26 | using System.Collections.Generic; 27 | using System.Linq; 28 | 29 | using XPlot.Plotly; 30 | using CNTK; 31 | 32 | typeof(CNTKLib).Assembly 33 | ``` 34 | 35 | ```csharp 36 | var devices = CNTK.DeviceDescriptor.AllDevices(); 37 | ``` 38 | 39 | OK, CNTK found the CPU and one GPU; let's check out the GPU 40 | 41 | ```csharp 42 | var device = CNTK.DeviceDescriptor.GPUDevice(0); 43 | device.AsString(); 44 | ``` 45 | 46 | Let's create a Variable with 3 dimensions and initialize it with positive random values 47 | 48 | After that we build a simple CNTK model to negate the values 49 | 50 | ```csharp 51 | //Variable values = Variable.InputVariable(new int[] { 3 }, DataType.Float); 52 | int Count = 100; 53 | int Dim = 3; 54 | var data = new float[Count * Dim]; 55 | var dataO = new float[Count * Dim]; 56 | 57 | // we generate random points in the positive quadrant of 3D space 58 | Random random = new Random(); 59 | for (int i = 0; i < Count; i++) 60 | { 61 | for (int d = 0; d < Dim; d++) 62 | { 63 | data[i * Dim + d] = (float)random.Next(0, 100); 64 | } 65 | } 66 | 67 | Value value = Value.CreateBatch(new int[] { Dim }, data, device); 68 | Variable valueVar = Variable.InputVariable(new int[] { Dim }, DataType.Float); 69 | List<IList<float>> densedataIn = value.GetDenseData<float>(valueVar).ToList(); 70 | 71 | Dictionary<Variable, Value> inputMap = new Dictionary<Variable, Value>() { {valueVar, value} }; 72 | ``` 73 | 74 | Now
we draw a 3D graph to show the positive and negative values 75 | 76 | ```csharp 77 | var fnneg = CNTKLib.Negate(valueVar); 78 | Dictionary<Variable, Value> outputMap = new Dictionary<Variable, Value>() { {fnneg.Output, null} }; 79 | fnneg.Evaluate(inputMap, outputMap, device); 80 | 81 | var outputValue = outputMap[fnneg.Output]; 82 | List<IList<float>> densedataOut = outputValue.GetDenseData<float>(fnneg.Output).ToList(); 83 | 84 | var scatter3din = new Graph.Scatter3d() 85 | { 86 | name = $"Random Data", 87 | text = $"Random Data", 88 | x = densedataIn.Select(f => f[0]).ToList(), 89 | y = densedataIn.Select(f => f[1]).ToList(), 90 | z = densedataIn.Select(f => f[2]).ToList(), 91 | mode = "markers", 92 | marker = new Graph.Marker() { 93 | color = "red", 94 | size = 5.0, 95 | symbol = "circle", 96 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 97 | }, 98 | line = new Graph.Line() { color = "#1f77b4", width = 1.0} 99 | }; 100 | 101 | var scatter3dout = new Graph.Scatter3d() 102 | { 103 | name = $"Negative Random Data", 104 | text = $"Negative Random Data", 105 | x = densedataOut.Select(f => f[0]).ToList(), 106 | y = densedataOut.Select(f => f[1]).ToList(), 107 | z = densedataOut.Select(f => f[2]).ToList(), 108 | mode = "markers", 109 | marker = new Graph.Marker() { 110 | color = "blue", 111 | size = 5.0, 112 | symbol = "circle", 113 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 114 | }, 115 | line = new Graph.Line() { color = "#1f77b4", width = 1.0} 116 | }; 117 | 118 | var layout = new Layout.Layout() { title = "Data", width = 600, height = 600, margin = new Graph.Margin() { l = 0, r = 0, b = 0, t = 65 }}; 119 | var scatterplot = Chart.Plot(new Graph.Scatter3d[] {scatter3din, scatter3dout}, layout); 120 | 121 | scatterplot.WithWidth(600); 122 | scatterplot.WithHeight(600); 123 | scatterplot.GetHtml().AsHtml(); 124 | 125 | ``` 126 | 127 | ```csharp 128 | var fn = CNTKLib.ElementTimes(fnneg, Constant.Scalar(3.0F, device)); 129 | 130 | outputMap = new Dictionary<Variable, Value>() { {fn.Output, null} };
131 | fn.Evaluate(inputMap, outputMap, device); 132 | 133 | var outputValue = outputMap[fn.Output]; 134 | List<IList<float>> densedataOut = outputValue.GetDenseData<float>(fn.Output).ToList(); 135 | 136 | var scatter3dout2 = new Graph.Scatter3d() 137 | { 138 | name = $"Negative Random Data Scaled", 139 | text = $"Negative Random Data Scaled", 140 | x = densedataOut.Select(f => f[0]).ToList(), 141 | y = densedataOut.Select(f => f[1]).ToList(), 142 | z = densedataOut.Select(f => f[2]).ToList(), 143 | mode = "markers", 144 | marker = new Graph.Marker() { 145 | color = "green", 146 | size = 5.0, 147 | symbol = "circle", 148 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 149 | }, 150 | line = new Graph.Line() { color = "#1f77b4", width = 1.0} 151 | }; 152 | 153 | var layout = new Layout.Layout() { title = "Data", width = 600, height = 600, margin = new Graph.Margin() { l = 0, r = 0, b = 0, t = 65 }}; 154 | var scatterplot = Chart.Plot(new Graph.Scatter3d[] {scatter3din, scatter3dout, scatter3dout2}, layout); 155 | 156 | scatterplot.WithWidth(600); 157 | scatterplot.WithHeight(600); 158 | scatterplot.GetHtml().AsHtml(); 159 | ``` -------------------------------------------------------------------------------- /cntk/2_cntk_logistic_regression.workbook: -------------------------------------------------------------------------------- 1 | --- 2 | packages: 3 | - id: FSharp.Core 4 | version: 4.2.3 5 | - id: XPlot.Plotly 6 | version: 1.4.2 7 | uti: com.xamarin.workbook 8 | id: b45738b5-f298-4a26-b2e2-5c48b0f30aa1 9 | title: Logistic Regression 10 | platforms: 11 | - WPF 12 | --- 13 | 14 | Make sure CNTK is in your path 15 | 16 | ```csharp 17 | #r "FSharp.Core" 18 | #r "XPlot.Plotly" 19 | #r "J:\AI\cntk\cntk\Cntk.Core.Managed-2.2.dll" 20 | ``` 21 | 22 | Check that the managed assembly loaded correctly and select the GPU device 23 | 24 | ```csharp 25 | using System.Linq; 26 | using CNTK; 27 | using XPlot.Plotly; 28 | 29 | var device = CNTK.DeviceDescriptor.GPUDevice(0); 30 | device.AsString(); 31 | ``` 32 | 33 |
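The next cells build a linear classifier whose loss is softmax cross-entropy, trained with minibatch SGD. As a framework-neutral reference, here is the same computation sketched in Python with NumPy — an illustration added for this write-up under stated assumptions (the data generator is a simplified stand-in for the one defined later; nothing here is the CNTK API):

```python
import numpy as np

rng = np.random.default_rng(0)
input_dim, num_classes = 3, 2

def make_batch(n):
    # Same trick as the workbook: class c gets Gaussian features scaled by (c + 1).
    labels = rng.integers(num_classes, size=n)
    feats = rng.normal(3.0, 1.0, size=(n, input_dim)) * (labels + 1)[:, None]
    return feats, labels

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract row max for numeric stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

W = np.zeros((input_dim, num_classes))     # plays the role of the "w" parameter
b = np.zeros(num_classes)                  # plays the role of the "b" parameter
lr = 0.02                                  # learning rate, as in the CNTK cell

for _ in range(1000):                      # 1000 minibatches of 64, as below
    X, y = make_batch(64)
    p = softmax(X @ W + b)
    p[np.arange(len(y)), y] -= 1.0         # dLoss/dlogits = softmax(z) - one_hot(y)
    W -= lr * X.T @ p / len(y)
    b -= lr * p.mean(axis=0)

X_test, y_test = make_batch(100)
acc = (softmax(X_test @ W + b).argmax(axis=1) == y_test).mean()
```

`CrossEntropyWithSoftmax` plus `SGDLearner` in the CNTK cells below together play the role of this manual gradient step.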
```csharp 34 | static int inputDim = 3; 35 | static int numOutputClasses = 2; 36 | 37 | Variable featureVariable = Variable.InputVariable(new int[] { inputDim }, DataType.Float); 38 | Variable labelVariable = Variable.InputVariable(new int[] { numOutputClasses }, DataType.Float); 39 | 40 | var weightParam = new Parameter(new int[] { numOutputClasses, inputDim }, DataType.Float, 1, device, "w"); 41 | var biasParam = new Parameter(new int[] { numOutputClasses }, DataType.Float, 0, device, "b"); 42 | 43 | var classifierOutput = CNTKLib.Times(weightParam, featureVariable) + biasParam; 44 | var loss = CNTKLib.CrossEntropyWithSoftmax(classifierOutput, labelVariable); 45 | var evalError = CNTKLib.ClassificationError(classifierOutput, labelVariable); 46 | 47 | CNTK.TrainingParameterScheduleDouble learningRatePerSample = new CNTK.TrainingParameterScheduleDouble(0.02, 1); 48 | IList<Learner> parameterLearners = new List<Learner>() { Learner.SGDLearner(classifierOutput.Parameters(), learningRatePerSample) }; 49 | 50 | var trainer = Trainer.CreateTrainer(classifierOutput, loss, evalError, parameterLearners); 51 | ``` 52 | 53 | ```csharp 54 | private static void GenerateValueData(int sampleSize, int inputDim, int numOutputClasses, 55 | out Value featureValue, out Value labelValue, DeviceDescriptor device) 56 | { 57 | float[] features; 58 | float[] oneHotLabels; 59 | GenerateRawDataSamples(sampleSize, inputDim, numOutputClasses, out features, out oneHotLabels); 60 | 61 | featureValue = Value.CreateBatch(new int[] { inputDim }, features, device); 62 | labelValue = Value.CreateBatch(new int[] { numOutputClasses }, oneHotLabels, device); 63 | } 64 | 65 | private static void GenerateRawDataSamples(int sampleSize, int inputDim, int numOutputClasses, 66 | out float[] features, out float[] oneHotLabels) 67 | { 68 | Random random = new Random(0); 69 | 70 | features = new float[sampleSize * inputDim]; 71 | oneHotLabels = new float[sampleSize * numOutputClasses]; 72 | 73 | for (int sample = 0; sample <
sampleSize; sample++) 74 | { 75 | int label = random.Next(numOutputClasses); 76 | for (int i = 0; i < numOutputClasses; i++) 77 | { 78 | oneHotLabels[sample * numOutputClasses + i] = label == i ? 1 : 0; 79 | } 80 | 81 | for (int i = 0; i < inputDim; i++) 82 | { 83 | features[sample * inputDim + i] = (float)GenerateGaussianNoise(3, 1, random) * (label + 1); 84 | } 85 | } 86 | } 87 | 88 | /// <summary> 89 | /// https://en.wikipedia.org/wiki/Box%E2%80%93Muller_transform 90 | /// https://stackoverflow.com/questions/218060/random-gaussian-variables 91 | /// </summary> 92 | /// <returns></returns> 93 | static double GenerateGaussianNoise(double mean, double stdDev, Random random) 94 | { 95 | double u1 = 1.0 - random.NextDouble(); 96 | double u2 = 1.0 - random.NextDouble(); 97 | double stdNormalRandomValue = Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Sin(2.0 * Math.PI * u2); 98 | return mean + stdDev * stdNormalRandomValue; 99 | } 100 | ``` 101 | 102 | Now we train the model 103 | 104 | ```csharp 105 | int minibatchSize = 64; 106 | int numMinibatchesToTrain = 1000; 107 | int updatePerMinibatches = 50; 108 | 109 | // train the model 110 | for (int minibatchCount = 0; minibatchCount < numMinibatchesToTrain; minibatchCount++) 111 | { 112 | Value features, labels; 113 | GenerateValueData(minibatchSize, inputDim, numOutputClasses, out features, out labels, device); 114 | //TODO: sweepEnd should be set properly instead of false.
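// Each iteration draws a fresh synthetic minibatch and takes one SGD step:
// TrainMinibatch computes the cross-entropy loss on the batch, backpropagates,
// and applies the SGD learner's update to the "w" and "b" parameters defined above.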
115 | #pragma warning disable 618 116 | trainer.TrainMinibatch(new Dictionary<Variable, Value>() { { featureVariable, features }, { labelVariable, labels } }, device); 117 | #pragma warning restore 618 118 | 119 | } 120 | Console.WriteLine($"Features: {featureVariable.AsString()}"); 121 | Console.WriteLine($"Labels: {labelVariable.AsString()}"); 122 | 123 | int testSize = 100; 124 | Value testFeatureValue, expectedLabelValue; 125 | GenerateValueData(testSize, inputDim, numOutputClasses, out testFeatureValue, out expectedLabelValue, device); 126 | 127 | Console.WriteLine($"Test Feature Value: {testFeatureValue.AsString()}"); 128 | Console.WriteLine($"Test Label Value: {expectedLabelValue.AsString()}"); 129 | 130 | // GetDenseData just needs the variable's shape 131 | IList<IList<float>> expectedOneHot = expectedLabelValue.GetDenseData<float>(labelVariable); 132 | List<int> expectedLabels = expectedOneHot.Select(l => l.IndexOf(1.0F)).ToList(); 133 | 134 | Console.WriteLine($"Expected: {string.Join(", ", expectedLabels)}"); 135 | 136 | List<IList<float>> features = testFeatureValue.GetDenseData<float>(featureVariable).ToList(); 137 | 138 | var scatter3d = new Graph.Scatter3d() 139 | { 140 | name = $"Data", 141 | text = $"Data", 142 | x = features.Select(f => f[0]).ToList(), 143 | y = features.Select(f => f[1]).ToList(), 144 | z = features.Select(f => f[2]).ToList(), 145 | mode = "markers", 146 | marker = new Graph.Marker() { 147 | color = "black", 148 | size = 5.0, 149 | symbol = "circle", 150 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 151 | }, 152 | line = new Graph.Line() { color = "#1f77b4", width = 1.0} 153 | }; 154 | 155 | var layout = new Layout.Layout() { title = "Data", width = 600, height = 600, margin = new Graph.Margin() { l = 0, r = 0, b = 0, t = 65 }}; 156 | var scatterplot = Chart.Plot(scatter3d, layout); 157 | 158 | scatterplot.WithWidth(600); 159 | scatterplot.WithHeight(600); 160 | scatterplot.GetHtml().AsHtml(); 161 | ``` 162 | 163 | Now we test the model 164 | 165 | ```csharp 166 | 167 |
168 | 169 | var inputDataMap = new Dictionary<Variable, Value>() { { featureVariable, testFeatureValue } }; 170 | var outputDataMap = new Dictionary<Variable, Value>() { { classifierOutput.Output, null } }; 171 | classifierOutput.Evaluate(inputDataMap, outputDataMap, device); 172 | var outputValue = outputDataMap[classifierOutput.Output]; 173 | List<IList<float>> actualLabelSoftMax = outputValue.GetDenseData<float>(classifierOutput.Output).ToList(); 174 | var actualLabels = actualLabelSoftMax.Select((IList<float> l) => l.IndexOf(l.Max())).ToList(); 175 | int misMatches = actualLabels.Zip(expectedLabels, (a, b) => a.Equals(b) ? 0 : 1).Sum(); 176 | 177 | Console.WriteLine($"Validating Model: Total Samples = {testSize}, Misclassify Count = {misMatches}"); 178 | 179 | 180 | List<string> colors = new List<string>() { "red", "blue" }; 181 | Random rnd = new Random(); 182 | List<Graph.Scatter3d> traces = new List<Graph.Scatter3d>(); 183 | foreach (int series in actualLabels.Distinct()) 184 | { 185 | traces.Add(new Graph.Scatter3d() 186 | { 187 | name = $"Class {series}", 188 | text = $"Class {series}", 189 | x = features.Where(f => series == actualLabels[features.IndexOf(f)]).Select(f => f[0]).ToList(), 190 | y = features.Where(f => series == actualLabels[features.IndexOf(f)]).Select(f => f[1]).ToList(), 191 | z = features.Where(f => series == actualLabels[features.IndexOf(f)]).Select(f => f[2]).ToList(), 192 | mode = "markers", 193 | marker = new Graph.Marker() { 194 | color = series < colors.Count ?
colors[series] : $"rgb({rnd.Next(50, 200)},{rnd.Next(50, 200)},{rnd.Next(50, 200)})", 195 | size = 5.0, 196 | symbol = "circle", 197 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 198 | }, 199 | line = new Graph.Line() { color = "#1f77b4", width = 1.0} 200 | }); 201 | } 202 | 203 | colors = new List<string>() { "rgb(0, 100, 0)", "rgb(0, 200, 0)" }; 204 | foreach (int series in actualLabels.Distinct()) 205 | { 206 | traces.Add(new Graph.Scatter3d() 207 | { 208 | name = $"Mismatch {series}", 209 | text = $"Mismatch {series}", 210 | x = features.Where(f => series == actualLabels[features.IndexOf(f)] && actualLabels[features.IndexOf(f)] != expectedLabels[features.IndexOf(f)]).Select(f => f[0]).ToList(), 211 | y = features.Where(f => series == actualLabels[features.IndexOf(f)] && actualLabels[features.IndexOf(f)] != expectedLabels[features.IndexOf(f)]).Select(f => f[1]).ToList(), 212 | z = features.Where(f => series == actualLabels[features.IndexOf(f)] && actualLabels[features.IndexOf(f)] != expectedLabels[features.IndexOf(f)]).Select(f => f[2]).ToList(), 213 | mode = "markers", 214 | marker = new Graph.Marker() { 215 | color = series < colors.Count ?
colors[series] : $"rgb({rnd.Next(50, 200)},{rnd.Next(50, 200)},{rnd.Next(50, 200)})", 216 | size = 10.0, 217 | symbol = "circle", 218 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 219 | }, 220 | line = new Graph.Line() { color = "#1f77b4", width = 1.0} 221 | }); 222 | } 223 | 224 | 225 | //traces.ToArray(); 226 | 227 | var layout = new Layout.Layout() { title = "Data", width = 600, height = 600, margin = new Graph.Margin() { l = 0, r = 0, b = 0, t = 65 }}; 228 | var scatterplot = Chart.Plot(traces.ToArray(), layout); 229 | 230 | scatterplot.WithWidth(600); 231 | scatterplot.WithHeight(600); 232 | scatterplot.GetHtml().AsHtml(); 233 | ``` -------------------------------------------------------------------------------- /cntk/3_cntk_lstm_sequence_classify.workbook: -------------------------------------------------------------------------------- 1 | --- 2 | packages: 3 | - id: FSharp.Core 4 | version: 4.2.3 5 | - id: XPlot.Plotly 6 | version: 1.4.2 7 | uti: com.xamarin.workbook 8 | id: b45738b5-f298-4a26-b2e2-5c48b0f30aa1 9 | title: CNTK LSTM Sequence Classification 10 | platforms: 11 | - WPF 12 | --- 13 | 14 | Make sure CNTK is in your path 15 | 16 | ```csharp 17 | #r "FSharp.Core" 18 | #r "XPlot.Plotly" 19 | #r "J:\AI\cntk\cntk\Cntk.Core.Managed-2.2.dll" 20 | ``` 21 | 22 | Check that the managed assembly loaded correctly and select the GPU device 23 | 24 | ```csharp 25 | using System.IO; 26 | using System.Net; 27 | using System.Linq; 28 | 29 | using CNTK; 30 | using XPlot.Plotly; 31 | 32 | var device = CNTK.DeviceDescriptor.GPUDevice(0); 33 | device.AsString(); 34 | ``` 35 | 36 | First we download the training data 37 | 38 | ```csharp 39 | const string DataFolder = @".\data"; 40 | const string baseurl = "https://raw.githubusercontent.com/Microsoft/CNTK/master/Tests/EndToEndTests/Text/SequenceClassification/Data"; 41 | Dictionary<string, string> files = new Dictionary<string, string>() { 42 | {$"{DataFolder}\\Train.ctf", $"{baseurl}/Train.ctf"}, 43 | {$"{DataFolder}\\Train.txt", $"{baseurl}/Train.txt"}, 44 |
{$"{DataFolder}\\embeddingmatrix.txt", $"{baseurl}/embeddingmatrix.txt"} 45 | }; 46 | 47 | var web = new WebClient(); 48 | files.ToList().ForEach(f => { 49 | if (!File.Exists(f.Key)) 50 | { 51 | web.DownloadFile(f.Value, f.Key); 52 | Console.WriteLine($"Downloaded {f.Key} {(new FileInfo(f.Key)).FullName}"); 53 | } else { 54 | Console.WriteLine($"File Exists {f.Key} {(new FileInfo(f.Key)).FullName}"); 55 | } 56 | }); 57 | ``` 58 | 59 | ```csharp 60 | static Function Stabilize<ElementType>(Variable x, DeviceDescriptor device) 61 | { 62 | bool isFloatType = typeof(ElementType).Equals(typeof(float)); 63 | Constant f, fInv; 64 | if (isFloatType) 65 | { 66 | f = Constant.Scalar(4.0f, device); 67 | fInv = Constant.Scalar(f.DataType, 1.0 / 4.0f); 68 | } 69 | else 70 | { 71 | f = Constant.Scalar(4.0, device); 72 | fInv = Constant.Scalar(f.DataType, 1.0 / 4.0f); 73 | } 74 | 75 | var beta = CNTKLib.ElementTimes( 76 | fInv, 77 | CNTKLib.Log( 78 | Constant.Scalar(f.DataType, 1.0) + 79 | CNTKLib.Exp(CNTKLib.ElementTimes(f, new Parameter(new NDShape(), f.DataType, 0.99537863 /* 1/f*ln (e^f-1) */, device))))); 80 | return CNTKLib.ElementTimes(beta, x); 81 | } 82 | 83 | static Tuple<Function, Function> LSTMPCellWithSelfStabilization<ElementType>( 84 | Variable input, Variable prevOutput, Variable prevCellState, DeviceDescriptor device) 85 | { 86 | int outputDim = prevOutput.Shape[0]; 87 | int cellDim = prevCellState.Shape[0]; 88 | 89 | bool isFloatType = typeof(ElementType).Equals(typeof(float)); 90 | DataType dataType = isFloatType ?
DataType.Float : DataType.Double; 91 | 92 | Func<int, Parameter> createBiasParam; 93 | if (isFloatType) createBiasParam = (dim) => new Parameter(new int[] { dim }, 0.01f, device, ""); 94 | else createBiasParam = (dim) => new Parameter(new int[] { dim }, 0.01, device, ""); 95 | 96 | uint seed2 = 1; 97 | Func<int, Parameter> createProjectionParam = (oDim) => new Parameter(new int[] { oDim, NDShape.InferredDimension }, 98 | dataType, CNTKLib.GlorotUniformInitializer(1.0, 1, 0, seed2++), device); 99 | 100 | Func<int, Parameter> createDiagWeightParam = (dim) => 101 | new Parameter(new int[] { dim }, dataType, CNTKLib.GlorotUniformInitializer(1.0, 1, 0, seed2++), device); 102 | 103 | Function stabilizedPrevOutput = Stabilize<ElementType>(prevOutput, device); 104 | Function stabilizedPrevCellState = Stabilize<ElementType>(prevCellState, device); 105 | 106 | Func<Variable> projectInput = () => 107 | createBiasParam(cellDim) + (createProjectionParam(cellDim) * input); 108 | 109 | // Input gate 110 | Function it = CNTKLib.Sigmoid( 111 | (Variable)(projectInput() + (createProjectionParam(cellDim) * stabilizedPrevOutput)) + 112 | CNTKLib.ElementTimes(createDiagWeightParam(cellDim), stabilizedPrevCellState)); 113 | 114 | Function bit = CNTKLib.ElementTimes(it, 115 | CNTKLib.Tanh(projectInput() + (createProjectionParam(cellDim) * stabilizedPrevOutput))); 116 | 117 | // Forget-me-not gate 118 | Function ft = CNTKLib.Sigmoid((Variable)( 119 | projectInput() + (createProjectionParam(cellDim) * stabilizedPrevOutput)) + 120 | CNTKLib.ElementTimes(createDiagWeightParam(cellDim), stabilizedPrevCellState)); 121 | 122 | Function bft = CNTKLib.ElementTimes(ft, prevCellState); 123 | 124 | Function ct = (Variable)bft + bit; 125 | 126 | // Output gate 127 | Function ot = CNTKLib.Sigmoid( 128 | (Variable)(projectInput() + (createProjectionParam(cellDim) * stabilizedPrevOutput)) + 129 | CNTKLib.ElementTimes(createDiagWeightParam(cellDim), Stabilize<ElementType>(ct, device))); 130 | Function ht = CNTKLib.ElementTimes(ot, CNTKLib.Tanh(ct)); 131 | 132 | Function c = ct; 133 | Function h =
(outputDim != cellDim) ? (createProjectionParam(outputDim) * Stabilize<ElementType>(ht, device)) : ht; 134 | 135 | return new Tuple<Function, Function>(h, c); 136 | } 137 | 138 | static Tuple<Function, Function> LSTMPComponentWithSelfStabilization<ElementType>(Variable input, 139 | NDShape outputShape, NDShape cellShape, 140 | Func<Variable, Function> recurrenceHookH, 141 | Func<Variable, Function> recurrenceHookC, 142 | DeviceDescriptor device) 143 | { 144 | var dh = Variable.PlaceholderVariable(outputShape, input.DynamicAxes); 145 | var dc = Variable.PlaceholderVariable(cellShape, input.DynamicAxes); 146 | 147 | var LSTMCell = LSTMPCellWithSelfStabilization<ElementType>(input, dh, dc, device); 148 | var actualDh = recurrenceHookH(LSTMCell.Item1); 149 | var actualDc = recurrenceHookC(LSTMCell.Item2); 150 | 151 | // Form the recurrence loop by replacing the dh and dc placeholders with the actualDh and actualDc 152 | (LSTMCell.Item1).ReplacePlaceholders(new Dictionary<Variable, Variable> { { dh, actualDh }, { dc, actualDc } }); 153 | 154 | return new Tuple<Function, Function>(LSTMCell.Item1, LSTMCell.Item2); 155 | } 156 | 157 | private static Function Embedding(Variable input, int embeddingDim, DeviceDescriptor device) 158 | { 159 | System.Diagnostics.Debug.Assert(input.Shape.Rank == 1); 160 | int inputDim = input.Shape[0]; 161 | var embeddingParameters = new Parameter(new int[] { embeddingDim, inputDim }, DataType.Float, CNTKLib.GlorotUniformInitializer(), device); 162 | return CNTKLib.Times(embeddingParameters, input); 163 | } 164 | 165 | public static bool MiniBatchDataIsSweepEnd(ICollection<MinibatchData> minibatchValues) 166 | { 167 | return minibatchValues.Any(a => a.sweepEnd); 168 | } 169 | 170 | public static Function FullyConnectedLinearLayer(Variable input, int outputDim, DeviceDescriptor device, string outputName = "") 171 | { 172 | System.Diagnostics.Debug.Assert(input.Shape.Rank == 1); 173 | int inputDim = input.Shape[0]; 174 | 175 | int[] s = { outputDim, inputDim }; 176 | var timesParam = new Parameter((NDShape)s, DataType.Float, 177 | CNTKLib.GlorotUniformInitializer( 178 | CNTKLib.DefaultParamInitScale, 179 |
CNTKLib.SentinelValueForInferParamInitRank, 180 | CNTKLib.SentinelValueForInferParamInitRank, 1), 181 | device, "timesParam"); 182 | var timesFunction = CNTKLib.Times(timesParam, input, "times"); 183 | 184 | int[] s2 = { outputDim }; 185 | var plusParam = new Parameter(s2, 0.0f, device, "plusParam"); 186 | return CNTKLib.Plus(plusParam, timesFunction, outputName); 187 | } 188 | 189 | static Function LSTMSequenceClassifierNet(Variable input, int numOutputClasses, int embeddingDim, int LSTMDim, int cellDim, DeviceDescriptor device, string outputName) 190 | { 191 | Function embeddingFunction = Embedding(input, embeddingDim, device); 192 | Func<Variable, Function> pastValueRecurrenceHook = (x) => CNTKLib.PastValue(x); 193 | Function LSTMFunction = LSTMPComponentWithSelfStabilization<float>( 194 | embeddingFunction, 195 | new int[] { LSTMDim }, 196 | new int[] { cellDim }, 197 | pastValueRecurrenceHook, 198 | pastValueRecurrenceHook, 199 | device).Item1; 200 | Function thoughtVectorFunction = CNTKLib.SequenceLast(LSTMFunction); 201 | 202 | return FullyConnectedLinearLayer(thoughtVectorFunction, numOutputClasses, device, outputName); 203 | } 204 | 205 | public static void PrintTrainingProgress(Trainer trainer, int minibatchIdx, int outputFrequencyInMinibatches) 206 | { 207 | if ((minibatchIdx % outputFrequencyInMinibatches) == 0 && trainer.PreviousMinibatchSampleCount() != 0) 208 | { 209 | float trainLossValue = (float)trainer.PreviousMinibatchLossAverage(); 210 | float evaluationValue = (float)trainer.PreviousMinibatchEvaluationAverage(); 211 | Console.WriteLine($"Minibatch: {minibatchIdx} CrossEntropyLoss = {trainLossValue}, EvaluationCriterion = {evaluationValue}"); 212 | } 213 | } 214 | ``` 215 | 216 | ```csharp 217 | const int inputDim = 2000; 218 | const int cellDim = 25; 219 | const int hiddenDim = 25; 220 | const int embeddingDim = 50; 221 | const int numOutputClasses = 5; 222 | 223 | // build the model 224 | var featuresName = "features"; 225 | var features = Variable.InputVariable(new
int[] { inputDim }, DataType.Float, featuresName, null, true /*isSparse*/); 226 | var labelsName = "labels"; 227 | var labels = Variable.InputVariable(new int[] { numOutputClasses }, DataType.Float, labelsName, 228 | new List<Axis>() { Axis.DefaultBatchAxis() }, true); 229 | 230 | Console.WriteLine("Build LSTM"); 231 | var classifierOutput = LSTMSequenceClassifierNet(features, numOutputClasses, embeddingDim, hiddenDim, cellDim, device, "classifierOutput"); 232 | Function trainingLoss = CNTKLib.CrossEntropyWithSoftmax(classifierOutput, labels, "lossFunction"); 233 | Function prediction = CNTKLib.ClassificationError(classifierOutput, labels, "classificationError"); 234 | 235 | Console.WriteLine("Load data..."); 236 | // prepare training data 237 | IList<StreamConfiguration> streamConfigurations = new StreamConfiguration[] 238 | { new StreamConfiguration(featuresName, inputDim, true, "x"), new StreamConfiguration(labelsName, numOutputClasses, false, "y") }; 239 | var minibatchSource = MinibatchSource.TextFormatMinibatchSource( 240 | Path.Combine(DataFolder, "Train.ctf"), streamConfigurations, 241 | MinibatchSource.InfinitelyRepeat, true); 242 | var featureStreamInfo = minibatchSource.StreamInfo(featuresName); 243 | var labelStreamInfo = minibatchSource.StreamInfo(labelsName); 244 | 245 | // prepare for training 246 | TrainingParameterScheduleDouble learningRatePerSample = new TrainingParameterScheduleDouble(0.0005, 1); 247 | TrainingParameterScheduleDouble momentumTimeConstant = CNTKLib.MomentumAsTimeConstantSchedule(256); 248 | IList<Learner> parameterLearners = new List<Learner>() { Learner.MomentumSGDLearner(classifierOutput.Parameters(), learningRatePerSample, momentumTimeConstant, /*unitGainMomentum = */true) }; 249 | var trainer = Trainer.CreateTrainer(classifierOutput, trainingLoss, prediction, parameterLearners); 250 | 251 | // train the model 252 | uint minibatchSize = 200; 253 | int outputFrequencyInMinibatches = 20; 254 | int miniBatchCount = 0; 255 | Console.WriteLine("Start training..."); 256 | for
(int numEpochs = 5; numEpochs > 0;) 257 | { 258 | var minibatchData = minibatchSource.GetNextMinibatch(minibatchSize, device); 259 | 260 | var arguments = new Dictionary<Variable, MinibatchData> 261 | { 262 | { features, minibatchData[featureStreamInfo] }, 263 | { labels, minibatchData[labelStreamInfo] } 264 | }; 265 | 266 | trainer.TrainMinibatch(arguments, device); 267 | PrintTrainingProgress(trainer, miniBatchCount++, outputFrequencyInMinibatches); 268 | 269 | // Because minibatchSource is created with MinibatchSource.InfinitelyRepeat, 270 | // batching will not end. Each time minibatchSource completes a sweep (epoch), 271 | // the last minibatch data will be marked as the end of a sweep. We use this flag 272 | // to count the number of epochs. 273 | if (MiniBatchDataIsSweepEnd(minibatchData.Values)) 274 | { 275 | numEpochs--; 276 | } 277 | } 278 | 279 | Console.WriteLine("Done training..."); 280 | ``` -------------------------------------------------------------------------------- /cntk/README.md: -------------------------------------------------------------------------------- 1 | # Introduction 2 | 3 | This is a list of Xamarin Workbooks for getting started with CNTK for .Net 4 | 5 | ## Xamarin Workbooks 6 | Download 7 | [Xamarin Workbooks](https://developer.xamarin.com/guides/cross-platform/workbooks/) 8 | 9 | ## Nuget 10 | - Each workbook may require packages available on Nuget www.nuget.org 11 | - The workbooks will have comments and references to guide you on the packages needed 12 | 13 | ### CNTK 14 | 15 | 16 | Getting Started with CNTK 17 | ![GS1](../cntk/screenshot.cntk.gs1.png) 18 | 19 | Getting Started with CNTK 20 | ![GS2](../cntk/screenshot.cntk.gs2.png) 21 | 22 | Logistic regression with CNTK 23 | ![LR2](../cntk/screenshot.cntk.log_reg1.png) 24 | 25 | -------------------------------------------------------------------------------- /cntk/screenshot.cntk.gs1.png: --------------------------------------------------------------------------------
https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/cntk/screenshot.cntk.gs1.png -------------------------------------------------------------------------------- /cntk/screenshot.cntk.gs2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/cntk/screenshot.cntk.gs2.png -------------------------------------------------------------------------------- /cntk/screenshot.cntk.log_reg1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/cntk/screenshot.cntk.log_reg1.png -------------------------------------------------------------------------------- /plotlib/plotlib.workbook: -------------------------------------------------------------------------------- 1 | --- 2 | uti: com.xamarin.workbook 3 | platforms: 4 | - Console 5 | packages: 6 | - id: XPlot.Plotly 7 | version: 1.4.2 8 | - id: Newtonsoft.Json 9 | version: 9.0.1 10 | - id: System.ValueTuple 11 | version: 4.3.0 12 | - id: FSharp.Core 13 | version: 4.1.17 14 | - id: MatplotlibCS 15 | version: 1.0.45 16 | - id: NLog 17 | version: 4.4.9 18 | --- 19 | 20 | ```csharp 21 | #r "FSharp.Core" 22 | #r "XPlot.Plotly" 23 | ``` 24 | 25 | ```csharp 26 | using XPlot.Plotly; 27 | using System.IO; 28 | using System.Linq; 29 | ``` 30 | 31 | ```csharp 32 | var bar = new Graph.Bar() 33 | { 34 | x = new int[] {20,14, 23}, 35 | y = new string[] { "giraffes", "orangutans", "monkeys"}, 36 | orientation = "h" 37 | }; 38 | Chart.Plot(bar).GetHtml().AsHtml() 39 | ``` 40 | 41 | ```csharp 42 | 43 | string[] text = System.IO.File.ReadAllLines("xplot.mt.bruno.txt"); 44 | float[][] z = text 45 | .ToList() 46 | .ConvertAll(t => t.Split(',').ToList().ConvertAll(r => Convert.ToSingle(r)).ToArray()) 47 | .ToArray(); 48 | 49 | var layout = new Layout.Layout() { 
title = "Mt Bruno Elevation", autosize = true, margin = new Graph.Margin() { l = 65, r = 50, b = 65, t = 90 }}; 50 | var surfaceplot = Chart.Plot(new Graph.Surface() { z = z }, layout); 51 | surfaceplot.WithWidth(700); 52 | surfaceplot.WithHeight(500); 53 | surfaceplot.GetHtml().AsHtml(); 54 | ``` 55 | 56 | ```csharp 57 | string[] text = System.IO.File.ReadAllLines("xplot.3DLineData.txt"); 58 | float[][] data = text 59 | .ToList() 60 | .ConvertAll(t => t.Split(',').ToList().ConvertAll(r => Convert.ToSingle(r)).ToArray()) 61 | .ToArray(); 62 | 63 | float[] x1 = data[0], y1 = data[1], z1 = data[2]; 64 | float[] x2 = data[3], y2 = data[4], z2 = data[5]; 65 | float[] x3 = data[6], y3 = data[7], z3 = data[8]; 66 | 67 | var trace1 = new Graph.Scatter3d() { x = x1, y = y1, z = z1, mode = "lines", 68 | marker = new Graph.Marker() { color = "#1f77b4", size = 12.0, symbol = "circle", 69 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 70 | }, 71 | line = new Graph.Line() { color = "#1f77b4", width = 1.0} 72 | }; 73 | 74 | var trace2 = new Graph.Scatter3d() { x = x2, y = y2, z = z2, mode = "lines", 75 | marker = new Graph.Marker() { color = "#9467bd", size = 12.0, symbol = "circle", 76 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 77 | }, 78 | line = new Graph.Line() { color = "rgb(44, 160, 44)", width = 1.0} 79 | }; 80 | 81 | var trace3 = new Graph.Scatter3d() { x = x3, y = y3, z = z3, mode = "lines", 82 | marker = new Graph.Marker() { color = "#bcbd22", size = 12.0, symbol = "circle", 83 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 84 | }, 85 | line = new Graph.Line() { color = "#bcbd22", width = 1.0} 86 | }; 87 | 88 | var layout = new Layout.Layout() { title = "3D Random Walk", autosize = false, margin = new Graph.Margin() { l = 0, r = 0, b = 0, t = 65 }}; 89 | 90 | var scatterplot = Chart.Plot(new Graph.Scatter3d[] {trace1, trace2, trace3}, layout); 91 | scatterplot.WithWidth(700); 92 | scatterplot.WithHeight(500); 93 | 
scatterplot.GetHtml().AsHtml(); 94 | ``` 95 | 96 | ```csharp 97 | #r "MatplotlibCS" 98 | #r "NLog" 99 | ``` 100 | 101 | ```csharp 102 | using MatplotlibCS; 103 | using MatplotlibCS.PlotItems; 104 | 105 | var matplotlibCs = new MatplotlibCS.MatplotlibCS(@"C:\ProgramData\Anaconda3", @"J:\Git\python\MatplotlibCS\MatplotlibCS\Python"); 106 | 107 | const int N = 100; 108 | var X = new double[N]; 109 | var Y1 = new double[N]; 110 | var Y2 = new double[N]; 111 | var x = 0.0; 112 | const double h = 2 * Math.PI / N; 113 | var rnd = new Random(); 114 | for (var i = 0; i < N; i++) 115 | { 116 | var y = Math.Sin(x); 117 | X[i] = x; 118 | Y1[i] = y; 119 | y = Math.Sin(2 * x); 120 | Y2[i] = y + rnd.NextDouble() / 10.0; 121 | x += h; 122 | } 123 | 124 | var figure = new MatplotlibCS.Figure(1, 1) 125 | { 126 | FileName = "ExampleSin.png", 127 | OnlySaveImage = true, 128 | DPI = 150, 129 | Subplots = 130 | { 131 | new MatplotlibCS.Axes(1, "The X axis", "The Y axis") 132 | { 133 | Title = "Sin(x), Sin(2x), VLines, HLines, Annotations", 134 | Grid = new MatplotlibCS.Grid() 135 | { 136 | MinorAlpha = 0.2, 137 | MajorAlpha = 1.0, 138 | XMajorTicks = new[] {0.0, 7.6, 0.5}, 139 | YMajorTicks = new[] {-1, 2.5, 0.25}, 140 | XMinorTicks = new[] {0.0, 7.25, 0.25}, 141 | YMinorTicks = new[] {-1, 2.5, 0.125} 142 | }, 143 | PlotItems = 144 | { 145 | new MatplotlibCS.PlotItems.Line2D("Sin") 146 | { 147 | X = X.Cast<object>().ToList(), 148 | Y = Y1.ToList(), 149 | LineStyle = LineStyle.Dashed 150 | }, 151 | 152 | new MatplotlibCS.PlotItems.Line2D("Sin 2x") 153 | { 154 | X = X.Cast<object>().ToList(), 155 | Y = Y2.ToList(), 156 | LineStyle = LineStyle.Solid, 157 | LineWidth = 0.5f, 158 | Color = Color.Green, 159 | Markevery = 5, 160 | MarkerSize = 10, 161 | Marker = Marker.Circle 162 | }, 163 | 164 | new Text("ant1", "Text annotation", 4.5, 0.76) 165 | { 166 | FontSize = 17 167 | }, 168 | 169 | new Annotation("ant2", "Arrow text annotation", 0.5, -0.7, 3, 0) 170 | { 171 | Color = Color.Blue 172 | }, 173 | 174 |
new Vline("vert line", new object[] {3.0}, -1, 1), 175 | new Hline("hrzt line", new[] {0.1, 0.25, 0.375}, 0, 5) {LineStyle = LineStyle.Dashed, Color = Color.Magenta} 176 | } 177 | } 178 | 179 | } 180 | }; 181 | ``` 182 | 183 | ```csharp 184 | var t = matplotlibCs.BuildFigure(figure); 185 | //t.Wait(); 186 | ``` -------------------------------------------------------------------------------- /screenshot.tfsharp.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/screenshot.tfsharp.png -------------------------------------------------------------------------------- /screenshot.xplot1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/screenshot.xplot1.png -------------------------------------------------------------------------------- /screenshot.xplot2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/screenshot.xplot2.png -------------------------------------------------------------------------------- /tensorflow/1_tf_setup.workbook: -------------------------------------------------------------------------------- 1 | --- 2 | uti: com.xamarin.workbook 3 | platforms: 4 | - Console 5 | packages: 6 | - id: System.ValueTuple 7 | version: 4.3.1 8 | --- 9 | 10 | Tensorflow Sharp 11 | 12 | How to get **TensorflowSharp.dll** and **libtensorflow.dll** with GPU support on Windows? 13 | 14 | First, install Tensorflow on Windows with GPU support 15 | 16 | 1. Install Anaconda 17 | 18 | 2. Create a Python 3.5 environment 19 | 20 | 3. Install Tensorflow with Anaconda in the py35 environment `https://www.tensorflow.org/install/install_windows` 21 | 22 | 4. 
Find `_pywrap_tensorflow_internal.pyd` in `C:\ProgramData\Anaconda3\envs\py35\Lib\site-packages\tensorflow\python` 23 | 24 | 5. Copy `_pywrap_tensorflow_internal.pyd` to this workbook folder 25 | 26 | 6. Rename `_pywrap_tensorflow_internal.pyd` to `libtensorflow.dll` 27 | 28 | Now get the **TensorflowSharp.dll** 29 | 30 | 1. Get the latest TensorflowSharp from source and compile it. `https://github.com/migueldeicaza/TensorFlowSharp` 31 | 32 | 2. Copy the TensorflowSharp.dll to this workbook's local directory. 33 | 34 | ```csharp 35 | #r "System.ValueTuple" 36 | #r "TensorFlowSharp" 37 | ``` 38 | 39 | ```csharp 40 | using System.IO; 41 | using TensorFlow; 42 | ``` 43 | 44 | Add the Anaconda py35 environment to the system path 45 | 46 | ```csharp 47 | var envpaths = new List<string> { @"C:\ProgramData\Anaconda3\envs\py35" } 48 | .Union(Environment.GetEnvironmentVariable("PATH").Split(new char[] { ';' }, StringSplitOptions.RemoveEmptyEntries)); 49 | Environment.SetEnvironmentVariable("PATH", string.Join(";", envpaths)); 50 | ``` 51 | 52 | Get the Tensorflow version… 53 | 54 | ```csharp 55 | TFCore.Version 56 | ``` 57 | 58 | Check the assembly location… 59 | 60 | ```csharp 61 | typeof(TFCore).Assembly.CodeBase 62 | ``` 63 | 64 | Starting the first session… 65 | 66 | ```csharp 67 | using (var g = new TFGraph ()) 68 | { 69 | var s = new TFSession (g); 70 | 71 | TFOutput a = g.Const(2); 72 | TFOutput b = g.Const(3); 73 | Console.WriteLine("a=2 b=3"); 74 | 75 | // Add two constants 76 | TFOutput tfadd = g.Add(a, b); 77 | TFTensor results = s.GetRunner().Run(tfadd); 78 | Console.WriteLine($"a+b={results.GetValue()}"); 79 | 80 | // Multiply two constants 81 | TFOutput tfmult = g.Mul(a, b); 82 | results = s.GetRunner().Run(tfmult); 83 | Console.WriteLine($"a*b={results.GetValue()}"); 84 | } 85 | ``` 86 | -------------------------------------------------------------------------------- /tensorflow/2_tf_basics.workbook: --------------------------------------------------------------------------------
1 | --- 2 | uti: com.xamarin.workbook 3 | platforms: 4 | - Console 5 | packages: 6 | - id: System.ValueTuple 7 | version: 4.3.1 8 | --- 9 | 10 | Tensorflow Example 11 | 12 | Please see the 1\_tf\_setup.workbook for setup instructions. 13 | 14 | ```csharp 15 | #r "System.ValueTuple" 16 | #r "TensorFlowSharp" 17 | ``` 18 | 19 | ```csharp 20 | using System.IO; 21 | using TensorFlow; 22 | 23 | var envpaths = new List<string> { @"C:\ProgramData\Anaconda3\envs\py35" } 24 | .Union(Environment.GetEnvironmentVariable("PATH").Split(new char[] { ';' }, StringSplitOptions.RemoveEmptyEntries)); 25 | Environment.SetEnvironmentVariable("PATH", string.Join(";", envpaths)); 26 | 27 | System.Console.WriteLine($"Version: {TFCore.Version} at {typeof(TFCore).Assembly.CodeBase}") 28 | ``` 29 | 30 | From https://github.com/aymericdamien/TensorFlow-Examples 31 | 32 | Getting Started, Tensor Addition and Multiply 33 | 34 | ```csharp 35 | using (var g = new TFGraph ()) 36 | { 37 | var s = new TFSession (g); 38 | 39 | TFOutput a = g.Const(2); 40 | TFOutput b = g.Const(3); 41 | Console.WriteLine("a=2 b=3"); 42 | 43 | // Add two constants 44 | TFOutput tfadd = g.Add(a, b); 45 | TFTensor results = s.GetRunner().Run(tfadd); 46 | Console.WriteLine($"a+b={results.GetValue()}"); 47 | 48 | // Multiply two constants 49 | TFOutput tfmult = g.Mul(a, b); 50 | results = s.GetRunner().Run(tfmult); 51 | Console.WriteLine($"a*b={results.GetValue()}"); 52 | } 53 | ``` 54 | 55 | Shows how to use placeholders to pass values 56 | 57 | ```csharp 58 | using (var g = new TFGraph ()) 59 | { 60 | var s = new TFSession (g); 61 | 62 | // We use "shorts" here, so notice the casting to short to get the 63 | // tensor with the right data type.
64 | 65 | var var_a = g.Placeholder (TFDataType.Int16); 66 | var var_b = g.Placeholder (TFDataType.Int16); 67 | 68 | var add = g.Add (var_a, var_b); 69 | var mul = g.Mul (var_a, var_b); 70 | 71 | var runner = s.GetRunner (); 72 | runner.AddInput (var_a, new TFTensor ((short)3)); 73 | runner.AddInput (var_b, new TFTensor ((short)2)); 74 | Console.WriteLine ("a+b={0}", runner.Run (add).GetValue ()); 75 | 76 | runner = s.GetRunner (); 77 | runner.AddInput (var_a, new TFTensor ((short)3)); 78 | runner.AddInput (var_b, new TFTensor ((short)2)); 79 | 80 | Console.WriteLine ("a*b={0}", runner.Run (mul).GetValue ()); 81 | } 82 | ``` 83 | 84 | Shows the use of Variable 85 | 86 | ```csharp 87 | var status = new TFStatus (); 88 | using (var g = new TFGraph ()) 89 | { 90 | var s = new TFSession (g); 91 | 92 | var initValue = g.Const (1.5); 93 | var increment = g.Const (0.5); 94 | TFOperation init; 95 | TFOutput value; 96 | var handle = g.Variable (initValue, out init, out value); 97 | 98 | // Add 0.5 and assign to the variable. 99 | // Perhaps using op.AssignAddVariable would be better, 100 | // but demonstrating with Add and Assign for now. 101 | var update = g.AssignVariableOp (handle, g.Add (value, increment)); 102 | 103 | // Must first initialize all the variables. 104 | s.GetRunner ().AddTarget (init).Run (status); 105 | 106 | // Now print the value, run the update op and repeat 107 | // Ignore errors. 
108 | for (int i = 0; i < 5; i++) { 109 | // Read and update 110 | var result = s.GetRunner ().Fetch (value).AddTarget (update).Run (); 111 | 112 | Console.WriteLine ("Result of variable read {0} -> {1}", i, result [0].GetValue ()); 113 | } 114 | } 115 | ``` 116 | 117 | Basic Multidimensional Array 118 | 119 | ```csharp 120 | private static string RowOrderJoin(int[,,] array) => string.Join (", ", array.Cast<int> ()); 121 | 122 | using (var g = new TFGraph ()) 123 | { 124 | var s = new TFSession (g); 125 | 126 | var var_a = g.Placeholder (TFDataType.Int32); 127 | var mul = g.Mul (var_a, g.Const (2)); 128 | 129 | var a = new int[,,] { { { 0, 1 } , { 2, 3 } } , { { 4, 5 }, { 6, 7 } } }; 130 | var result = s.GetRunner ().AddInput (var_a, a).Fetch (mul).Run () [0]; 131 | 132 | var actual = (int[,,])result.GetValue (); 133 | var expected = new int[,,] { { { 0, 2 } , { 4, 6 } } , { { 8, 10 }, { 12, 14 } } }; 134 | 135 | Console.WriteLine ("Actual: " + RowOrderJoin (actual)); 136 | Console.WriteLine ("Expected: " + RowOrderJoin (expected)); 137 | } 138 | ``` 139 | 140 | Basic matrix multiply 141 | 142 | ```csharp 143 | using (var g = new TFGraph ()) 144 | { 145 | var s = new TFSession (g); 146 | // 1x2 matrix 147 | var matrix1 = g.Const (new double [,] { { 3, 3 } }); 148 | // 2x1 matrix 149 | var matrix2 = g.Const (new double [,] { { 2 }, { 2 } }); 150 | // multiply 151 | var product = g.MatMul (matrix1, matrix2); 152 | 153 | var results = s.GetRunner ().Run (product); 154 | 155 | Console.WriteLine ($"Tensor ToString={results}"); 156 | Console.WriteLine ($"Value [0,0]={((double[,])results.GetValue())[0,0]}"); 157 | } 158 | ``` 159 | 160 | Linear Regression 161 | 162 | Port of https://github.com/aymericdamien/TensorFlow-Examples/blob/master/examples/2\_BasicModels/linear\_regression.py 163 | 164 | ```csharp 165 | var learning_rate = 0.01; 166 | var training_epochs = 1000; 167 | var display_step = 50; 168 | 169 | // Training data 170 | var train_x = new double [] { 171 | 3.3,
4.4, 5.5, 6.71, 6.93, 4.168, 9.779, 6.182, 7.59, 2.167,7.042, 10.791, 5.313, 7.997, 5.654, 9.27, 3.1 172 | }; 173 | var train_y = new double [] { 174 | 1.7,2.76,2.09,3.19,1.694,1.573,3.366,2.596,2.53,1.221,2.827,3.465,1.65,2.904,2.42,2.94,1.3 175 | }; 176 | var n_samples = train_x.Length; 177 | using (var g = new TFGraph ()) 178 | { 179 | var s = new TFSession (g); 180 | var rng = new Random (); 181 | // tf Graph Input 182 | 183 | var X = g.Placeholder (TFDataType.Float); 184 | var Y = g.Placeholder (TFDataType.Float); 185 | var W = g.Variable (g.Const (rng.Next ()), operName: "weight"); 186 | var b = g.Variable (g.Const (rng.Next ()), operName: "bias"); 187 | var pred = g.Add (g.Mul (X, W), b); 188 | 189 | var cost = g.Div (g.ReduceSum (g.Pow (g.Sub (pred, Y), g.Const (2))), g.Mul (g.Const (2), g.Const (n_samples))); 190 | 191 | // Stuck here: TensorFlow bindings need to surface gradient support 192 | // waiting on Google for this 193 | // https://github.com/migueldeicaza/TensorFlowSharp/issues/25 194 | } 195 | ``` -------------------------------------------------------------------------------- /tensorflow/2_tf_getting_started.workbook: -------------------------------------------------------------------------------- 1 | --- 2 | uti: com.xamarin.workbook 3 | platforms: 4 | - Console 5 | packages: 6 | - id: System.ValueTuple 7 | version: 4.3.1 8 | --- 9 | 10 | Tensorflow Example 11 | 12 | Please see the 1\_tf\_setup.workbook for setup instructions.
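If the `#r "TensorFlowSharp"` reference below fails at runtime, it is usually because the native `libtensorflow.dll` cannot be found. As a quick sanity check — a sketch only, the Anaconda path is the one assumed by the setup workbook and may differ on your machine — you can probe the candidate folders before loading anything:

```csharp
using System;
using System.IO;

// Hypothetical pre-flight check: look for libtensorflow.dll in the folders
// the setup workbook uses (the current workbook folder and the py35 env).
var candidates = new[] { Environment.CurrentDirectory, @"C:\ProgramData\Anaconda3\envs\py35" };
foreach (var dir in candidates)
{
    var dll = Path.Combine(dir, "libtensorflow.dll");
    Console.WriteLine($"{dll}: {(File.Exists(dll) ? "found" : "missing")}");
}
```

If every candidate reports "missing", revisit steps 4–6 of 1\_tf\_setup.workbook before continuing.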
13 | 14 | ```csharp 15 | #r "System.ValueTuple" 16 | #r "TensorFlowSharp" 17 | ``` 18 | 19 | ```csharp 20 | using System.IO; 21 | using TensorFlow; 22 | 23 | var envpaths = new List<string> { @"C:\ProgramData\Anaconda3\envs\py35" } 24 | .Union(Environment.GetEnvironmentVariable("PATH").Split(new char[] { ';' }, StringSplitOptions.RemoveEmptyEntries)); 25 | Environment.SetEnvironmentVariable("PATH", string.Join(";", envpaths)); 26 | 27 | System.Console.WriteLine($"Version: {TFCore.Version} at {typeof(TFCore).Assembly.CodeBase}") 28 | ``` 29 | 30 | Basic tensors 31 | 32 | ```csharp 33 | var g = new TFGraph (); 34 | var s = new TFSession (g); 35 | var runner = s.GetRunner(); 36 | 37 | TFTensor rank0 = 3;//shape[] 38 | TFTensor rank1 = new double[] { 1.0, 2.0, 3.0 };//shape [3] 39 | TFTensor rank2 = new double[,] { { 1, 2, 3 }, { 4, 5, 6 } };//shape[2,3] 40 | TFTensor rank3 = new double[,,] 41 | { 42 | { { 1, 2, 3 } }, 43 | { { 4, 5, 6 } } 44 | };//shape [2, 1, 3] 45 | ``` 46 | 47 | ### The Computational Graph 48 | 49 | You might think of TensorFlow Core programs as consisting of two discrete sections: 50 | 51 | 1. Building the computational graph. 52 | 53 | 2. Running the computational graph. 54 | 55 | A **computational graph** is a series of TensorFlow operations arranged into a graph of nodes. Let's build a simple computational graph. Each node takes zero or more tensors as inputs and produces a tensor as an output. One type of node is a constant. Like all TensorFlow constants, it takes no inputs, and it outputs a value it stores internally. We can create two floating point Tensors `node1` and `node2` as follows: 56 | 57 | Notice that printing the nodes does not output the values `3.0` and `4.0` as you might expect. Instead, they are nodes that, when evaluated, would produce 3.0 and 4.0, respectively. To actually evaluate the nodes, we must run the computational graph within a **session**. A session encapsulates the control and state of the TensorFlow runtime.
58 | 59 | The following code creates a `Session` object and then invokes its `run` method to run enough of the computational graph to evaluate `node1` and `node2`, by running the computational graph in a session as follows: 60 | 61 | ```csharp 62 | TFOutput node1 = g.Const(3.0F, TFDataType.Float); 63 | TFOutput node2 = g.Const(4.0F); 64 | 65 | TFTensor[] results = runner.Run(new TFOutput[] { node1, node2 }); 66 | ``` 67 | 68 | We can build more complicated computations by combining `Tensor` nodes with operations (operations are also nodes). For example, we can add our two constant nodes and produce a new graph as follows: 69 | 70 | ```csharp 71 | TFOutput node3 = g.Add(node1, node2); 72 | TFTensor result = runner.Run(node3); 73 | result.GetValue(); 74 | ``` 75 | 76 | TensorFlow provides a utility called TensorBoard that can display a picture of the computational graph. Here is a screenshot showing how TensorBoard visualizes the graph: 77 | 78 | ![TensorBoard screenshot](https://www.tensorflow.org/images/getting_started_add.png) 79 | 80 | As it stands, this graph is not especially interesting because it always produces a constant result. A graph can be parameterized to accept external inputs, known as **placeholders**. A **placeholder** is a promise to provide a value later. 81 | 82 | ```csharp 83 | 84 | TFOutput a = g.Placeholder(TFDataType.Float); 85 | TFOutput b = g.Placeholder(TFDataType.Float); 86 | 87 | TFOutput adder_node = g.Add(a, b); 88 | 89 | TFTensor result = runner 90 | .AddInput(a, 3.0F) 91 | .AddInput(b, 4.5F) 92 | .Run(adder_node); 93 | 94 | result.GetValue(); 95 | ``` 96 | 97 | In TensorBoard, the graph looks like this: 98 | 99 | ![TensorBoard screenshot](https://www.tensorflow.org/images/getting_started_adder.png) 100 | 101 | We can make the computational graph more complex by adding another operation.
For example, 102 | 103 | ```csharp 104 | TFOutput a = g.Placeholder(TFDataType.Float); 105 | TFOutput b = g.Placeholder(TFDataType.Float); 106 | 107 | TFOutput adder_node = g.Add(a, b); 108 | TFOutput add_and_triple = g.Mul(g.Const(3.0F, TFDataType.Float), adder_node); 109 | 110 | TFTensor result = s.GetRunner() 111 | .AddInput(a, 3.0F) 112 | .AddInput(b, 4.5F) 113 | .Run(add_and_triple); 114 | 115 | result.GetValue(); 116 | ``` 117 | 118 | The preceding computational graph would look as follows in TensorBoard: 119 | 120 | ![TensorBoard screenshot](https://www.tensorflow.org/images/getting_started_triple.png) 121 | 122 | In machine learning we will typically want a model that can take arbitrary inputs, such as the one above.  To make the model trainable, we need to be able to modify the graph to get new outputs with the same input.  **Variables** allow us to add trainable parameters to a graph.  They are constructed with a type and initial value: 123 | 124 | Constants are initialized when you call `tf.constant`, and their value can never change. By contrast, variables are not initialized when you call `tf.Variable`. 
To initialize all the variables in a TensorFlow program, you must explicitly call a special operation as follows: 125 | 126 | ```csharp 127 | TFTensor[] results; 128 | { 129 | TFStatus status = new TFStatus(); 130 | var runner = s.GetRunner(); 131 | 132 | TFOutput vW, vb, vlinearmodel; 133 | var x = g.Placeholder(TFDataType.Float); 134 | var hW = g.Variable(g.Const(0.3F, TFDataType.Float), out vW); 135 | var hb = g.Variable(g.Const(-0.3F, TFDataType.Float), out vb); 136 | var hlinearmodel = g.Variable(g.Const(0.0F, TFDataType.Float), out vlinearmodel); 137 | 138 | var linearmodel = g.Add(g.Mul(vW, x), vb); 139 | 140 | var hoplm = g.AssignVariableOp(hlinearmodel, linearmodel); 141 | 142 | //init all variable 143 | runner 144 | .AddTarget(g.GetGlobalVariablesInitializer()) 145 | .AddTarget(hoplm) 146 | .AddInput(x, new float[] { 1F, 2F, 3F, 4F }) 147 | .Run(status); 148 | 149 | //now get actual value 150 | results = s.GetRunner() 151 | .Fetch(vlinearmodel) 152 | .Run(); 153 | } 154 | 155 | results[0].GetValue(); 156 | ``` 157 | 158 | We've created a model, but we don't know how good it is yet. To evaluate the model on training data, we need a `y` placeholder to provide the desired values, and we need to write a loss function. 159 | 160 | A loss function measures how far apart the current model is from the provided data. We'll use a standard loss model for linear regression, which sums the squares of the deltas between the current model and the provided data. `linear_model - y` creates a vector where each element is the corresponding example's error delta. We call `tf.square` to square that error. 
Then, we sum all the squared errors to create a single scalar that abstracts the error of all examples using `tf.reduce_sum`: 161 | 162 | ```csharp 163 | { 164 | TFStatus status = new TFStatus(); 165 | 166 | var runner = s.GetRunner(); 167 | 168 | TFOutput vW, vb, vloss; 169 | var x = g.Placeholder(TFDataType.Float); 170 | var y = g.Placeholder(TFDataType.Float); 171 | var hW = g.Variable(g.Const(0.3F, TFDataType.Float), out vW); 172 | var hb = g.Variable(g.Const(-0.3F, TFDataType.Float), out vb); 173 | var hloss = g.Variable(g.Const(0.0F, TFDataType.Float), out vloss); 174 | 175 | var linearmodel = g.Add(g.Mul(vW, x), vb); 176 | var squared_deltas = g.Square(g.Add(linearmodel, g.Neg(y))); 177 | var loss = g.ReduceSum(squared_deltas, axis: g.Const(0)); 178 | 179 | var hoploss = g.AssignVariableOp(hloss, loss); 180 | 181 | runner 182 | .AddTarget(g.GetGlobalVariablesInitializer()) 183 | .AddTarget(hoploss) 184 | .AddInput(x, new float[] { 1F, 2F, 3F, 4F }) 185 | .AddInput(y, new float[] { 0F, -1F, -2F, -3F }) 186 | .Run(status); 187 | 188 | results = s.GetRunner() 189 | .Fetch(vloss) 190 | .Run(); 191 | } 192 | results[0].GetValue() 193 | ``` 194 | 195 | We could improve this manually by reassigning the values of `W` and `b` to the perfect values of -1 and 1. A variable is initialized to the value provided to `tf.Variable` but can be changed using operations like `tf.assign`. For example, `W=-1` and `b=1` are the optimal parameters for our model. 
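The numbers TensorFlow reports here are easy to sanity-check with a few lines of plain C#. This hand-rolled version of the same loss (a sketch added for illustration, not part of the original workbook) shows that the starting parameters give a loss of about 23.66, while `W = -1`, `b = 1` drive it to zero:

```csharp
// Hand-computed squared-error loss, mirroring the graph above.
float SquaredErrorLoss(float W, float b, float[] xs, float[] ys)
{
    float loss = 0F;
    for (int i = 0; i < xs.Length; i++)
    {
        float delta = (W * xs[i] + b) - ys[i];   // linear_model - y
        loss += delta * delta;                   // square and accumulate
    }
    return loss;
}

float[] xs = { 1F, 2F, 3F, 4F };
float[] ys = { 0F, -1F, -2F, -3F };

SquaredErrorLoss(0.3F, -0.3F, xs, ys)   // ≈ 23.66, matching the graph
// SquaredErrorLoss(-1F, 1F, xs, ys) returns 0
```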
We can change `W` and `b` accordingly: 196 | 197 | ```csharp 198 | { 199 | TFStatus status = new TFStatus(); 200 | 201 | var runner = s.GetRunner(); 202 | 203 | TFOutput vW, vb, vloss; 204 | var x = g.Placeholder(TFDataType.Float); 205 | var y = g.Placeholder(TFDataType.Float); 206 | var hW = g.Variable(g.Const(0.3F, TFDataType.Float), out vW); 207 | var hb = g.Variable(g.Const(-0.3F, TFDataType.Float), out vb); 208 | var hloss = g.Variable(g.Const(0.0F, TFDataType.Float), out vloss); 209 | 210 | var linearmodel = g.Add(g.Mul(vW, x), vb); 211 | var squared_deltas = g.Square(g.Add(linearmodel, g.Neg(y))); 212 | var loss = g.ReduceSum(squared_deltas, axis: g.Const(0)); 213 | 214 | var hoploss = g.AssignVariableOp(hloss, loss); 215 | 216 | Console.WriteLine("Let's go!!!"); 217 | 218 | runner 219 | .AddTarget(g.GetGlobalVariablesInitializer()) 220 | .AddTarget(hoploss) 221 | .AddInput(x, new float[] { 1F, 2F, 3F, 4F }) 222 | .AddInput(y, new float[] { 0F, -1F, -2F, -3F }) 223 | .Run(status); 224 | 225 | Console.WriteLine(status); 226 | 227 | results = s.GetRunner() 228 | .Fetch(vloss) 229 | .Run(); 230 | 231 | Console.WriteLine($"Result: {results[0]}"); 232 | 233 | // W and b are Float variables, so assign Float-typed constants 234 | var fixWop = g.AssignVariableOp(hW, g.Const(-1F, TFDataType.Float)); 235 | var fixbop = g.AssignVariableOp(hb, g.Const(1F, TFDataType.Float)); 236 | 237 | s.GetRunner() 238 | .AddTarget(fixWop) 239 | .AddTarget(fixbop) 240 | .Run(status); 241 | 242 | Console.WriteLine(status); 243 | 244 | // Recompute the loss with the corrected parameters before reading it back 245 | s.GetRunner() 246 | .AddTarget(hoploss) 247 | .AddInput(x, new float[] { 1F, 2F, 3F, 4F }) 248 | .AddInput(y, new float[] { 0F, -1F, -2F, -3F }) 249 | .Run(status); 250 | 251 | results = s.GetRunner() 252 | .Fetch(vloss) 253 | .Run(); 254 | 255 | Console.WriteLine($"Result: {results[0]}"); 256 | 257 | } 258 | results[0].GetValue() 259 | ``` -------------------------------------------------------------------------------- /tensorflow/3_tf_mnist.workbook: 1 | --- 2 | uti: com.xamarin.workbook 3 | platforms: 4 | - Console 5 | packages: 6 | - id: Accord.DataSets 7 | version: 3.5.0 8 | - id: Accord.Neuro 9 |
version: 3.5.0 10 | - id: Accord.Audio 11 | version: 3.5.0 12 | - id: Accord.Vision 13 | version: 3.5.0 14 | - id: Accord.Statistics 15 | version: 3.5.0 16 | - id: System.ValueTuple 17 | version: 4.3.1 18 | - id: Accord 19 | version: 3.5.0 20 | - id: Accord.MachineLearning 21 | version: 3.5.0 22 | - id: Accord.IO 23 | version: 3.5.0 24 | - id: Accord.Video 25 | version: 3.5.0 26 | - id: Accord.Imaging 27 | version: 3.5.0 28 | - id: SharpZipLib 29 | version: 0.86.0 30 | - id: Accord.Math 31 | version: 3.5.0 32 | --- 33 | 34 | *This tutorial is intended for readers who are new to both machine learning and TensorFlow. If you already know what MNIST is, and what softmax (multinomial logistic) regression is, you might prefer this [faster paced tutorial](https://www.tensorflow.org/get_started/mnist/pros). Be sure to [install TensorFlow](https://www.tensorflow.org/install/index) before starting either tutorial.* 35 | 36 | When one learns how to program, there's a tradition that the first thing you do is print "Hello World." Just like programming has Hello World, machine learning has MNIST. 37 | 38 | MNIST is a simple computer vision dataset. It consists of images of handwritten digits like these: 39 | 40 | ![](https://www.tensorflow.org/images/MNIST.png) 41 | 42 | It also includes labels for each image, telling us which digit it is. For example, the labels for the above images are 5, 0, 4, and 1. 43 | 44 | In this tutorial, we're going to train a model to look at images and predict what digits they are. Our goal isn't to train a really elaborate model that achieves state-of-the-art performance -- although we'll give you code to do that later! -- but rather to dip a toe into using TensorFlow. As such, we're going to start with a very simple model, called a Softmax Regression. 45 | 46 | The actual code for this tutorial is very short, and all the interesting stuff happens in just three lines.
However, it is very important to understand the ideas behind it: both how TensorFlow works and the core machine learning concepts. Because of this, we are going to very carefully work through the code. 47 | 48 | ```csharp 49 | #r "TensorFlowSharp" 50 | #r "System.IO.Compression" 51 | #r "System.IO.Compression.FileSystem" 52 | #r "System.Numerics" 53 | #r "Accord" 54 | #r "Accord.IO" 55 | #r "Accord.Math" 56 | #r "Accord.Statistics" 57 | #r "Accord.MachineLearning" 58 | #r "Accord.Neuro" 59 | #r "Accord.Imaging" 60 | #r "Accord.Vision" 61 | #r "Accord.Audio" 62 | #r "Accord.Audition" 63 | #r "Accord.DataSets" 64 | ``` 65 | 66 | ```csharp 67 | using System.Net; 68 | using System.IO; 69 | using System.IO.Compression; 70 | 71 | using TensorFlow; 72 | 73 | var envpaths = new List<string> { @"C:\ProgramData\Anaconda3\envs\py35" } 74 | .Union(Environment.GetEnvironmentVariable("PATH").Split(new char[] { ';' }, StringSplitOptions.RemoveEmptyEntries)); 75 | Environment.SetEnvironmentVariable("PATH", string.Join(";", envpaths)); 76 | 77 | System.Console.WriteLine($"Location: {typeof(TFCore).Assembly.CodeBase}"); 78 | TFCore.Version 79 | ``` 80 | 81 | The MNIST data is hosted on [Yann LeCun's website](http://yann.lecun.com/exdb/mnist/).
If you are copying and pasting in the code from this tutorial, start here with these two lines of code which will download and read in the data automatically: 82 | 83 | ```csharp 84 | string dir = @".\tmp"; 85 | 86 | var mnistdataset = new Accord.DataSets.MNIST(dir); 87 | Console.WriteLine($"Training Set {mnistdataset.Training.Item1.Count()} vs Testing {mnistdataset.Testing.Item1.Count()}"); 88 | mnistdataset 89 | ``` 90 | 91 | ```csharp 92 | TFTensor result; 93 | using (var g = new TFGraph ()) 94 | { 95 | var s = new TFSession (g); 96 | 97 | var x = g.Placeholder(TFDataType.Float, new TFShape (-1, 784)); 98 | 99 | var W = g.Variable(g.Zeros(new TFShape(784, 10))); 100 | var b = g.Variable(g.Zeros(new TFShape(10))); 101 | 102 | // x is [batch, 784] and W is [784, 10], so this is a matrix multiply, not an element-wise Mul 103 | var y = g.Add(g.MatMul(x, W), b); 104 | 105 | var ymax = g.Softmax(y); 106 | 107 | // Variables must be initialized before the graph is run 108 | s.GetRunner().AddTarget(g.GetGlobalVariablesInitializer()).Run(); 109 | 110 | // Evaluate y for a single all-zero "image", just to exercise the graph 111 | result = s.GetRunner() 112 | .AddInput(x, new TFTensor(new float[1, 784])) 113 | .Fetch(y) 114 | .Run()[0]; 115 | } 116 | result 117 | ``` -------------------------------------------------------------------------------- /tensorflow/4_tf_image_recognistion.workbook: 1 | --- 2 | uti: com.xamarin.workbook 3 | platforms: 4 | - Console 5 | packages: 6 | - id: System.ValueTuple 7 | version: 4.3.1 8 | --- 9 | 10 | TensorFlow Example 11 | 12 | An example of using the TensorFlow C# API for image recognition using a pre-trained inception model (). 13 | 14 | The pre-trained model takes input in the form of a 4-dimensional tensor with shape \[ BATCH\_SIZE, IMAGE\_HEIGHT, IMAGE\_WIDTH, 3 ],\ 15 | where:\\ 16 | 17 | * BATCH\_SIZE allows for inference of multiple images in one pass through the graph\\ 18 | 19 | * IMAGE\_HEIGHT is the height of the images on which the model was trained\\ 20 | 21 | * IMAGE\_WIDTH is the width of the images on which the model was trained\\ 22 | 23 | * 3 is the (R, G, B) values of the pixel colors represented as a float. 24 | 25 | And produces as output a vector with shape \[ NUM\_LABELS ].
\ 26 | output\[i] is the probability that the input image was recognized as having the i-th label. 27 | 28 | A separate file contains a list of string labels corresponding to the integer indices of the output. 29 | 30 | This example:\\ 31 | 32 | * Loads the serialized representation of the pre-trained model into a Graph\\ 33 | 34 | * Creates a Session to execute operations on the Graph\\ 35 | 36 | * Converts an image file to a Tensor to provide as input to a Session run\\ 37 | 38 | * Executes the Session and prints out the label with the highest probability 39 | 40 | To convert an image file to a Tensor suitable for input to the Inception model,\ 41 | this example:\\ 42 | 43 | * Constructs another TensorFlow graph to normalize the image into a form suitable for the model (for example, resizing the image)\\ 44 | 45 | * Creates and executes a Session to obtain a Tensor in this normalized form. 46 | 47 | Install TensorFlowSharp from NuGet 48 | 49 | ```csharp 50 | #r "TensorFlowSharp" 51 | #r "System.IO.Compression" 52 | #r "System.IO.Compression.FileSystem" 53 | #r "System.Numerics" 54 | ``` 55 | 56 | ```csharp 57 | using System.IO; 58 | using System.IO.Compression; 59 | using System.Net; 60 | 61 | using TensorFlow; 62 | 63 | var envpaths = new List<string> { @"C:\ProgramData\Anaconda3\envs\py35" } 64 | .Union(Environment.GetEnvironmentVariable("PATH").Split(new char[] { ';' }, StringSplitOptions.RemoveEmptyEntries)); 65 | Environment.SetEnvironmentVariable("PATH", string.Join(";", envpaths)); 66 | 67 | System.Console.WriteLine($"Version: {TFCore.Version} at {typeof(TFCore).Assembly.CodeBase}") 68 | ``` 69 | 70 | Downloads the inception graph and labels 71 | 72 | ```csharp 73 | string dir = @".\tmp"; 74 | string modelFile = Path.Combine (dir, "tensorflow_inception_graph.pb"); 75 | string labelsFile = Path.Combine (dir, "imagenet_comp_graph_label_strings.txt"); 76 | string zipfile = Path.Combine (dir, "inception5h.zip"); 77 | string url =
"https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip"; 78 | 79 | if (!(File.Exists (modelFile) && File.Exists (labelsFile))) 80 | { 81 | Directory.CreateDirectory (dir); 82 | var wc = new WebClient (); 83 | wc.DownloadFile (url, zipfile); 84 | ZipFile.ExtractToDirectory (zipfile, dir); 85 | File.Delete (zipfile); 86 | } 87 | 88 | var model = File.ReadAllBytes (modelFile); 89 | ``` 90 | 91 | The inception model takes as input the image described by a Tensor in a very specific normalized format (a particular image size, shape of the input tensor, normalized pixel values etc.).\ 92 | This function constructs a graph of TensorFlow operations which takes as input a JPEG-encoded string and returns a tensor suitable as input to the inception model. 93 | 94 | ```csharp 95 | 96 | static void ConstructGraphToNormalizeImage (out TFGraph graph, out TFOutput input, out TFOutput output) 97 | { 98 | // Some constants specific to the pre-trained model at: 99 | // https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip 100 | // 101 | // - The model was trained with images scaled to 224x224 pixels. 102 | // - The colors, represented as R, G, B in 1-byte each were converted to 103 | // float using (value - Mean)/Scale. 104 | 105 | const int W = 224; 106 | const int H = 224; 107 | const float Mean = 117; 108 | const float Scale = 1; 109 | 110 | graph = new TFGraph (); 111 | input = graph.Placeholder (TFDataType.String); 112 | 113 | output = graph.Div ( 114 | x: graph.Sub ( 115 | x: graph.ResizeBilinear ( 116 | images: graph.ExpandDims ( 117 | input: graph.Cast ( 118 | graph.DecodeJpeg (contents: input, channels: 3), DstT: TFDataType.Float), 119 | dim: graph.Const (0, "make_batch")), 120 | size: graph.Const (new int [] { W, H }, "size")), 121 | y: graph.Const (Mean, "mean")), 122 | y: graph.Const (Scale, "scale")); 123 | } 124 | ``` 125 | 126 | Convert the image in filename to a Tensor suitable as input to the Inception model.
127 | 128 | ```csharp 129 | static TFTensor CreateTensorFromImageFile (string file) 130 | { 131 | var contents = File.ReadAllBytes (file); 132 | 133 | // DecodeJpeg uses a scalar String-valued tensor as input. 134 | var tensor = TFTensor.CreateString (contents); 135 | 136 | TFGraph graph; 137 | TFOutput input, output; 138 | 139 | // Construct a graph to normalize the image 140 | ConstructGraphToNormalizeImage (out graph, out input, out output); 141 | 142 | // Execute that graph to normalize this one image 143 | using (var session = new TFSession (graph)) { 144 | var normalized = session.Run ( 145 | inputs: new [] { input }, 146 | inputValues: new [] { tensor }, 147 | outputs: new [] { output }); 148 | 149 | return normalized [0]; 150 | } 151 | } 152 | ``` 153 | 154 | ```csharp 155 | var graph = new TFGraph (); 156 | graph.Import (model, ""); 157 | 158 | var files = System.IO.Directory.GetFiles(@".\demofiles", "*.jpg"); 159 | 160 | using (var session = new TFSession (graph)) { 161 | var labels = File.ReadAllLines (labelsFile); 162 | 163 | foreach (var file in files) { 164 | Console.WriteLine ($"Filename: {file}"); 165 | // Run inference on the image files 166 | // For multiple images, session.Run() can be called in a loop (and 167 | // concurrently). Alternatively, images can be batched since the model 168 | // accepts batches of image data as input. 169 | var tensor = CreateTensorFromImageFile (file); 170 | 171 | var runner = session.GetRunner (); 172 | runner.AddInput (graph ["input"] [0], tensor).Fetch (graph ["output"] [0]); 173 | var output = runner.Run (); 174 | // output[0].Value() is a vector containing probabilities of 175 | // labels for each image in the "batch". The batch size was 1. 176 | // Find the most probable label index.
177 | 178 | var result = output [0]; 179 | var rshape = result.Shape; 180 | if (result.NumDims != 2 || rshape [0] != 1) { 181 | var shape = ""; 182 | foreach (var d in rshape) { 183 | shape += $"{d} "; 184 | } 185 | shape = shape.Trim (); 186 | Console.WriteLine ($"Error: expected to produce a [1 N] shaped tensor where N is the number of labels, instead it produced one with shape [{shape}]"); 187 | Environment.Exit (1); 188 | } 189 | 190 | // The data can be read either as a multi-dimensional array or as jagged 191 | // arrays of arrays; pick whichever makes the downstream code easier 192 | // to read. 193 | bool jagged = true; 194 | 195 | var bestIdx = 0; 196 | float best = 0; 197 | 198 | if (jagged) { 199 | var probabilities = ((float [] [])result.GetValue (jagged: true)) [0]; 200 | for (int i = 0; i < probabilities.Length; i++) { 201 | if (probabilities [i] > best) { 202 | bestIdx = i; 203 | best = probabilities [i]; 204 | } 205 | } 206 | 207 | } else { 208 | var val = (float [,])result.GetValue (jagged: false); 209 | 210 | // Result is [1,N], flatten array 211 | for (int i = 0; i < val.GetLength (1); i++) { 212 | if (val [0, i] > best) { 213 | bestIdx = i; 214 | best = val [0, i]; 215 | } 216 | } 217 | } 218 | 219 | Console.WriteLine ($"{file} best match: [{bestIdx}] {best * 100.0}% {labels [bestIdx]}"); 220 | } 221 | } 222 | ``` -------------------------------------------------------------------------------- /tensorflow/README.md: 1 | # Introduction 2 | 3 | This is a list of Xamarin Workbooks for getting started with TF# 4 | 5 | ## Xamarin Workbooks 6 | Download 7 | [Xamarin Workbooks](https://developer.xamarin.com/guides/cross-platform/workbooks/) 8 | 9 | ## NuGet 10 | - Each workbook may require packages available on NuGet www.nuget.org 11 | - The workbooks will have comments and references to guide you on the packages needed 12 | 13 | ### Tensorflow 14
| 15 | Some screenshots 16 | 17 | ![Screenshot.Tf.Gs.A](screenshot.tf.gs.a.png) 18 | 19 | ![Screenshot.Tf.Gs.B](screenshot.tf.gs.b.png) 20 | 21 | -------------------------------------------------------------------------------- /tensorflow/demofiles/example.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/tensorflow/demofiles/example.png -------------------------------------------------------------------------------- /tensorflow/demofiles/rio.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/tensorflow/demofiles/rio.jpg -------------------------------------------------------------------------------- /tensorflow/screenshot.tf.gs.a.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/tensorflow/screenshot.tf.gs.a.png -------------------------------------------------------------------------------- /tensorflow/screenshot.tf.gs.b.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/svsnux/cognibooks/6741906b2a40a105e836feb0a11516d75c8160ec/tensorflow/screenshot.tf.gs.b.png -------------------------------------------------------------------------------- /xplot/xplot.mt.bruno.txt: -------------------------------------------------------------------------------- 1 | 27.80985, 49.61936, 83.08067, 116.6632, 130.414, 150.7206, 220.1871, 156.1536, 148.6416, 203.7845, 206.0386, 107.1618, 68.36975, 45.3359, 49.96142, 21.89279, 17.02552, 11.74317, 14.75226, 13.6671, 5.677561, 3.31234, 1.156517, -0.147662 2 | 27.71966, 48.55022, 65.21374, 95.27666, 116.9964, 133.9056, 152.3412, 151.934, 160.1139, 179.5327, 147.6184, 170.3943, 121.8194, 
52.58537, 33.08871, 38.40972, 44.24843, 69.5786, 4.019351, 3.050024, 3.039719, 2.996142, 2.967954, 1.999594 3 | 30.4267, 33.47752, 44.80953, 62.47495, 77.43523, 104.2153, 102.7393, 137.0004, 186.0706, 219.3173, 181.7615, 120.9154, 143.1835, 82.40501, 48.47132, 74.71461, 60.0909, 7.073525, 6.089851, 6.53745, 6.666096, 7.306965, 5.73684, 3.625628 4 | 16.66549, 30.1086, 39.96952, 44.12225, 59.57512, 77.56929, 106.8925, 166.5539, 175.2381, 185.2815, 154.5056, 83.0433, 62.61732, 62.33167, 60.55916, 55.92124, 15.17284, 8.248324, 36.68087, 61.93413, 20.26867, 68.58819, 46.49812, 0.2360095 5 | 8.815617, 18.3516, 8.658275, 27.5859, 48.62691, 60.18013, 91.3286, 145.7109, 116.0653, 106.2662, 68.69447, 53.10596, 37.92797, 47.95942, 47.42691, 69.20731, 44.95468, 29.17197, 17.91674, 16.25515, 14.65559, 17.26048, 31.22245, 46.71704 6 | 6.628881, 10.41339, 24.81939, 26.08952, 30.1605, 52.30802, 64.71007, 76.30823, 84.63686, 99.4324, 62.52132, 46.81647, 55.76606, 82.4099, 140.2647, 81.26501, 56.45756, 30.42164, 17.28782, 8.302431, 2.981626, 2.698536, 5.886086, 5.268358 7 | 21.83975, 6.63927, 18.97085, 32.89204, 43.15014, 62.86014, 104.6657, 130.2294, 114.8494, 106.9873, 61.89647, 55.55682, 86.80986, 89.27802, 122.4221, 123.9698, 109.0952, 98.41956, 77.61374, 32.49031, 14.67344, 7.370775, 0.03711011, 0.6423392 8 | 53.34303, 26.79797, 6.63927, 10.88787, 17.2044, 56.18116, 79.70141, 90.8453, 98.27675, 80.87243, 74.7931, 75.54661, 73.4373, 74.11694, 68.1749, 46.24076, 39.93857, 31.21653, 36.88335, 40.02525, 117.4297, 12.70328, 1.729771, 0.0 9 | 25.66785, 63.05717, 22.1414, 17.074, 41.74483, 60.27227, 81.42432, 114.444, 102.3234, 101.7878, 111.031, 119.2309, 114.0777, 110.5296, 59.19355, 42.47175, 14.63598, 6.944074, 6.944075, 27.74936, 0.0, 0.0, 0.09449376, 0.07732264 10 | 12.827, 69.20554, 46.76293, 13.96517, 33.88744, 61.82613, 84.74799, 121.122, 145.2741, 153.1797, 204.786, 227.9242, 236.3038, 228.3655, 79.34425, 25.93483, 6.944074, 6.944074, 6.944075, 7.553681, 0.0, 0.0, 0.0, 0.0 
11 | 0.0, 68.66396, 59.0435, 33.35762, 47.45282, 57.8355, 78.91689, 107.8275, 168.0053, 130.9597, 212.5541, 165.8122, 210.2429, 181.1713, 189.7617, 137.3378, 84.65395, 8.677168, 6.956576, 8.468093, 0.0, 0.0, 0.0, 0.0 12 | 0.0, 95.17499, 80.03818, 59.89862, 39.58476, 50.28058, 63.81641, 80.61302, 66.37824, 198.7651, 244.3467, 294.2474, 264.3517, 176.4082, 60.21857, 77.41475, 53.16981, 56.16393, 6.949235, 7.531059, 3.780177, 0.0, 0.0, 0.0 13 | 0.0, 134.9879, 130.3696, 96.86325, 75.70494, 58.86466, 57.20374, 55.18837, 78.128, 108.5582, 154.3774, 319.1686, 372.8826, 275.4655, 130.2632, 54.93822, 25.49719, 8.047439, 8.084393, 5.115252, 5.678269, 0.0, 0.0, 0.0 14 | 0.0, 48.08919, 142.5558, 140.3777, 154.7261, 87.9361, 58.11092, 52.83869, 67.14822, 83.66798, 118.9242, 150.0681, 272.9709, 341.1366, 238.664, 190.2, 116.8943, 91.48672, 14.0157, 42.29277, 5.115252, 0.0, 0.0, 0.0 15 | 0.0, 54.1941, 146.3839, 99.48143, 96.19411, 102.9473, 76.14089, 57.7844, 47.0402, 64.36799, 84.23767, 162.7181, 121.3275, 213.1646, 328.482, 285.4489, 283.8319, 212.815, 164.549, 92.29631, 7.244015, 1.167, 0.0, 0.0 16 | 0.0, 6.919659, 195.1709, 132.5253, 135.2341, 89.85069, 89.45549, 60.29967, 50.33806, 39.17583, 59.06854, 74.52159, 84.93402, 187.1219, 123.9673, 103.7027, 128.986, 165.1283, 249.7054, 95.39966, 10.00284, 2.39255, 0.0, 0.0 17 | 0.0, 21.73871, 123.1339, 176.7414, 158.2698, 137.235, 105.3089, 86.63255, 53.11591, 29.03865, 30.40539, 39.04902, 49.23405, 63.27853, 111.4215, 101.1956, 40.00962, 59.84565, 74.51253, 17.06316, 2.435141, 2.287471, -0.0003636982, 0.0 18 | 0.0, 0.0, 62.04672, 136.3122, 201.7952, 168.1343, 95.2046, 58.90624, 46.94091, 49.27053, 37.10416, 17.97011, 30.93697, 33.39257, 44.03077, 55.64542, 78.22423, 14.42782, 9.954997, 7.768213, 13.0254, 21.73166, 2.156372, 0.5317867 19 | 0.0, 0.0, 79.62993, 139.6978, 173.167, 192.8718, 196.3499, 144.6611, 106.5424, 57.16653, 41.16107, 32.12764, 13.8566, 10.91772, 12.07177, 22.38254, 24.72105, 6.803666, 4.200841, 16.46857, 
15.70744, 33.96221, 7.575688, -0.04880907 20 | 0.0, 0.0, 33.2664, 57.53643, 167.2241, 196.4833, 194.7966, 182.1884, 119.6961, 73.02113, 48.36549, 33.74652, 26.2379, 16.3578, 6.811293, 6.63927, 6.639271, 8.468093, 6.194273, 3.591233, 3.81486, 8.600739, 5.21889, 0.0 21 | 0.0, 0.0, 29.77937, 54.97282, 144.7995, 207.4904, 165.3432, 171.4047, 174.9216, 100.2733, 61.46441, 50.19171, 26.08209, 17.18218, 8.468093, 6.63927, 6.334467, 6.334467, 5.666687, 4.272203, 0.0, 0.0, 0.0, 0.0 22 | 0.0, 0.0, 31.409, 132.7418, 185.5796, 121.8299, 185.3841, 160.6566, 116.1478, 118.1078, 141.7946, 65.56351, 48.84066, 23.13864, 18.12932, 10.28531, 6.029663, 6.044627, 5.694764, 3.739085, 3.896037, 0.0, 0.0, 0.0 23 | 0.0, 0.0, 19.58994, 42.30355, 96.26777, 187.1207, 179.6626, 221.3898, 154.2617, 142.1604, 148.5737, 67.17937, 40.69044, 39.74512, 26.10166, 14.48469, 8.65873, 3.896037, 3.571392, 3.896037, 3.896037, 3.896037, 1.077756, 0.0 24 | 0.001229679, 3.008948, 5.909858, 33.50574, 104.3341, 152.2165, 198.1988, 191.841, 228.7349, 168.1041, 144.2759, 110.7436, 57.65214, 42.63504, 27.91891, 15.41052, 8.056102, 3.90283, 3.879774, 3.936718, 3.968634, 0.1236256, 3.985531, -0.1835741 25 | 0.0, 5.626141, 7.676256, 63.16226, 45.99762, 79.56688, 227.311, 203.9287, 172.5618, 177.1462, 140.4554, 123.9905, 110.346, 65.12319, 34.31887, 24.5278, 9.561069, 3.334991, 5.590495, 5.487353, 5.909499, 5.868994, 5.833817, 3.568177 -------------------------------------------------------------------------------- /xplot/xplot.workbook: -------------------------------------------------------------------------------- 1 | --- 2 | uti: com.xamarin.workbook 3 | platforms: 4 | - Console 5 | packages: 6 | - id: XPlot.Plotly 7 | version: 1.4.2 8 | - id: Newtonsoft.Json 9 | version: 9.0.1 10 | - id: System.ValueTuple 11 | version: 4.3.0 12 | - id: FSharp.Core 13 | version: 4.1.17 14 | - id: MatplotlibCS 15 | version: 1.0.45 16 | - id: NLog 17 | version: 4.4.9 18 | --- 19 | 20 | ```csharp 21 | //Firt add the FSharp.Core and 
XPlot.Plotly package from NuGet 22 | #r "FSharp.Core" 23 | #r "XPlot.Plotly" 24 | ``` 25 | 26 | ```csharp 27 | using XPlot.Plotly; 28 | using System.IO; 29 | using System.Linq; 30 | ``` 31 | 32 | ```csharp 33 | 34 | string[] text = System.IO.File.ReadAllLines("xplot.mt.bruno.txt"); 35 | float[][] z = text 36 | .ToList() 37 | .ConvertAll(t => t.Split(',').ToList().ConvertAll(r => Convert.ToSingle(r)).ToArray()) 38 | .ToArray(); 39 | 40 | var layout = new Layout.Layout() { title = "Mt Bruno Elevation", autosize = true, margin = new Graph.Margin() { l = 65, r = 50, b = 65, t = 90 }}; 41 | var surfaceplot = Chart.Plot(new Graph.Surface() { z = z }, layout); 42 | surfaceplot.WithWidth(700); 43 | surfaceplot.WithHeight(500); 44 | surfaceplot.GetHtml().AsHtml(); 45 | ``` 46 | 47 | ```csharp 48 | string[] text = System.IO.File.ReadAllLines("xplot.3DLineData.txt"); 49 | float[][] data = text 50 | .ToList() 51 | .ConvertAll(t => t.Split(',').ToList().ConvertAll(r => Convert.ToSingle(r)).ToArray()) 52 | .ToArray(); 53 | 54 | float[] x1 = data[0], y1 = data[1], z1 = data[2]; 55 | float[] x2 = data[3], y2 = data[4], z2 = data[5]; 56 | float[] x3 = data[6], y3 = data[7], z3 = data[8]; 57 | 58 | var trace1 = new Graph.Scatter3d() { x = x1, y = y1, z = z1, mode = "lines", 59 | marker = new Graph.Marker() { color = "#1f77b4", size = 12.0, symbol = "circle", 60 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 61 | }, 62 | line = new Graph.Line() { color = "#1f77b4", width = 1.0} 63 | }; 64 | 65 | var trace2 = new Graph.Scatter3d() { x = x2, y = y2, z = z2, mode = "lines", 66 | marker = new Graph.Marker() { color = "#9467bd", size = 12.0, symbol = "circle", 67 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 68 | }, 69 | line = new Graph.Line() { color = "rgb(44, 160, 44)", width = 1.0} 70 | }; 71 | 72 | var trace3 = new Graph.Scatter3d() { x = x3, y = y3, z = z3, mode = "lines", 73 | marker = new Graph.Marker() { color = "#bcbd22", size = 12.0, symbol = 
"circle", 74 | line = new Graph.Line() { color = "rgb(0,0,0)", width = 0.0 } 75 | }, 76 | line = new Graph.Line() { color = "#bcbd22", width = 1.0} 77 | }; 78 | 79 | var layout = new Layout.Layout() { title = "3D Random Walk", autosize = false, margin = new Graph.Margin() { l = 0, r = 0, b = 0, t = 65 }}; 80 | 81 | var scatterplot = Chart.Plot(new Graph.Scatter3d[] {trace1, trace2, trace3}, layout); 82 | scatterplot.WithWidth(700); 83 | scatterplot.WithHeight(500); 84 | scatterplot.GetHtml().AsHtml(); 85 | ``` 86 | 87 | ```csharp 88 | var bar = new Graph.Bar() 89 | { 90 | x = new int[] {20,14, 23}, 91 | y = new string[] { "giraffes", "orangutans", "monkeys"}, 92 | orientation = "h" 93 | }; 94 | Chart.Plot(bar).GetHtml().AsHtml() 95 | ``` --------------------------------------------------------------------------------
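All three XPlot cells above use the same `ReadAllLines`/`Split(',')`/`Convert.ToSingle` pattern to turn a text file into a `float[][]`. A minimal, self-contained version of that parsing (the inline rows here are made-up sample data, added for illustration) shows what it produces:

```csharp
using System;
using System.Linq;

// The same comma-separated parsing used for xplot.mt.bruno.txt and
// xplot.3DLineData.txt, applied to inline rows instead of a file.
string[] rows = { "1.0, 2.5, 3.0", "4.0, 5.5, 6.0" };

float[][] z = rows
    .ToList()
    .ConvertAll(t => t.Split(',').ToList().ConvertAll(r => Convert.ToSingle(r)).ToArray())
    .ToArray();

// z[1][1] is 5.5F
```

Note that `Convert.ToSingle` honours the current culture; on systems whose decimal separator is a comma, `float.Parse(r, System.Globalization.CultureInfo.InvariantCulture)` is the safer choice for data files like these.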