├── LICENSE.txt
├── assembleMobileNetv2.m
├── images
│   └── mobilenetv2_deepNetworkDesigner.PNG
├── mobilenetv2Example.m
├── mobilenetv2Layers.m
└── readme.md

/LICENSE.txt:
--------------------------------------------------------------------------------
Copyright (c) 2019, The MathWorks, Inc.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--------------------------------------------------------------------------------
/assembleMobileNetv2.m:
--------------------------------------------------------------------------------
function net = assembleMobileNetv2()
% assembleMobileNetv2 Assemble MobileNet-v2 network
%
% net = assembleMobileNetv2 creates a MobileNet-v2 network with weights
% trained on ImageNet. You can load the same MobileNet-v2 network by
% installing the Deep Learning Toolbox Model for MobileNet-v2 Network
% support package from the Add-On Explorer and then using the mobilenetv2
% function.

% Copyright 2019 The MathWorks, Inc.

% Download the network parameters. If they have already been downloaded,
% this step is skipped.
%
% The parameters are saved to a file "mobilenetv2Params.mat" in a
% directory "MobileNetv2" inside the system's temporary directory.
dataDir = fullfile(tempdir, "MobileNetv2");
paramFile = fullfile(dataDir, "mobilenetv2Params.mat");
downloadUrl = "http://www.mathworks.com/supportfiles/nnet/data/networks/mobilenetv2Params.mat";

if ~exist(dataDir, "dir")
    mkdir(dataDir);
end

if ~exist(paramFile, "file")
    disp("Downloading pretrained parameters file (13 MB).");
    disp("This may take several minutes...");
    websave(paramFile, downloadUrl);
    disp("Download finished.");
else
    disp("Skipping download, parameter file already exists.");
end

% Load the network parameters from the file mobilenetv2Params.mat.
s = load(paramFile);
params = s.params;

% Create a layer graph with the network architecture of MobileNet-v2.
lgraph = mobilenetv2Layers;

% Create a cell array containing the layer names.
layerNames = {lgraph.Layers(:).Name}';

% Loop over layers and add parameters.
for i = 1:numel(layerNames)
    name = layerNames{i};
    layer = lgraph.Layers(i);
    
    % Assign layer parameters.
    layerParams = params.(name);
    if ~isempty(layerParams)
        paramNames = fieldnames(layerParams);
        for j = 1:numel(paramNames)
            layer.(paramNames{j}) = layerParams.(paramNames{j});
        end
        
        % Put the updated layer back into the layer graph.
        lgraph = replaceLayer(lgraph,name,layer);
    end
end

% Assemble the network.
net = assembleNetwork(lgraph);

end
--------------------------------------------------------------------------------
/images/mobilenetv2_deepNetworkDesigner.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/matlab-deep-learning/mobilenet-v2/af29df3a81f894c1be2086fca7c6c4b06e54902c/images/mobilenetv2_deepNetworkDesigner.PNG
--------------------------------------------------------------------------------
/mobilenetv2Example.m:
--------------------------------------------------------------------------------
%% Classify Image Using MobileNet-v2
% This example shows how to classify an image using the MobileNet-v2
% pretrained convolutional neural network.

% Copyright 2019 The MathWorks, Inc.

% Read an example image.
img = imread("peppers.png");

% The image that you want to classify must have the same size as the input
% size of the network. Resize the image to 224-by-224 pixels, the input
% size of MobileNet-v2.
img = imresize(img,[224 224]);

% Assemble the pretrained MobileNet-v2 network. Alternatively, you can
% create a pretrained MobileNet-v2 network by installing the Deep Learning
% Toolbox Model for MobileNet-v2 Network support package from the Add-On
% Explorer and then using the mobilenetv2 function.
net = assembleMobileNetv2;

% Analyze the network architecture.
analyzeNetwork(net)

% Classify the image using the network.
label = classify(net,img);

% Display the image together with the predicted label.
figure
imshow(img)
title(string(label))
--------------------------------------------------------------------------------
/mobilenetv2Layers.m:
--------------------------------------------------------------------------------
function lgraph = mobilenetv2Layers()
% mobilenetv2Layers MobileNet-v2 layer graph
%
% lgraph = mobilenetv2Layers creates a layer graph with the network
% architecture of MobileNet-v2. The layer graph contains no weights.

lgraph = layerGraph();

%% Add Layer Branches
% Add the branches of the network to the layer graph. Each branch is a linear
% array of layers.

tempLayers = [
    imageInputLayer([224 224 3],"Name","input_1","Normalization","zscore")
    convolution2dLayer([3 3],32,"Name","Conv1","Padding","same","Stride",[2 2])
    batchNormalizationLayer("Name","bn_Conv1","Epsilon",0.001)
    clippedReluLayer(6,"Name","Conv1_relu")
    groupedConvolution2dLayer([3 3],1,32,"Name","expanded_conv_depthwise","Padding","same")
    batchNormalizationLayer("Name","expanded_conv_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","expanded_conv_depthwise_relu")
    convolution2dLayer([1 1],16,"Name","expanded_conv_project","Padding","same")
    batchNormalizationLayer("Name","expanded_conv_project_BN","Epsilon",0.001)
    convolution2dLayer([1 1],96,"Name","block_1_expand","Padding","same")
    batchNormalizationLayer("Name","block_1_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_1_expand_relu")
    groupedConvolution2dLayer([3 3],1,96,"Name","block_1_depthwise","Padding","same","Stride",[2 2])
    batchNormalizationLayer("Name","block_1_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_1_depthwise_relu")
    convolution2dLayer([1 1],24,"Name","block_1_project","Padding","same")
    batchNormalizationLayer("Name","block_1_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],144,"Name","block_2_expand","Padding","same")
    batchNormalizationLayer("Name","block_2_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_2_expand_relu")
    groupedConvolution2dLayer([3 3],1,144,"Name","block_2_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_2_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_2_depthwise_relu")
    convolution2dLayer([1 1],24,"Name","block_2_project","Padding","same")
    batchNormalizationLayer("Name","block_2_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","block_2_add")
    convolution2dLayer([1 1],144,"Name","block_3_expand","Padding","same")
    batchNormalizationLayer("Name","block_3_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_3_expand_relu")
    groupedConvolution2dLayer([3 3],1,144,"Name","block_3_depthwise","Padding","same","Stride",[2 2])
    batchNormalizationLayer("Name","block_3_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_3_depthwise_relu")
    convolution2dLayer([1 1],32,"Name","block_3_project","Padding","same")
    batchNormalizationLayer("Name","block_3_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],192,"Name","block_4_expand","Padding","same")
    batchNormalizationLayer("Name","block_4_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_4_expand_relu")
    groupedConvolution2dLayer([3 3],1,192,"Name","block_4_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_4_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_4_depthwise_relu")
    convolution2dLayer([1 1],32,"Name","block_4_project","Padding","same")
    batchNormalizationLayer("Name","block_4_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = additionLayer(2,"Name","block_4_add");
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],192,"Name","block_5_expand","Padding","same")
    batchNormalizationLayer("Name","block_5_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_5_expand_relu")
    groupedConvolution2dLayer([3 3],1,192,"Name","block_5_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_5_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_5_depthwise_relu")
    convolution2dLayer([1 1],32,"Name","block_5_project","Padding","same")
    batchNormalizationLayer("Name","block_5_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","block_5_add")
    convolution2dLayer([1 1],192,"Name","block_6_expand","Padding","same")
    batchNormalizationLayer("Name","block_6_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_6_expand_relu")
    groupedConvolution2dLayer([3 3],1,192,"Name","block_6_depthwise","Padding","same","Stride",[2 2])
    batchNormalizationLayer("Name","block_6_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_6_depthwise_relu")
    convolution2dLayer([1 1],64,"Name","block_6_project","Padding","same")
    batchNormalizationLayer("Name","block_6_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],384,"Name","block_7_expand","Padding","same")
    batchNormalizationLayer("Name","block_7_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_7_expand_relu")
    groupedConvolution2dLayer([3 3],1,384,"Name","block_7_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_7_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_7_depthwise_relu")
    convolution2dLayer([1 1],64,"Name","block_7_project","Padding","same")
    batchNormalizationLayer("Name","block_7_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = additionLayer(2,"Name","block_7_add");
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],384,"Name","block_8_expand","Padding","same")
    batchNormalizationLayer("Name","block_8_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_8_expand_relu")
    groupedConvolution2dLayer([3 3],1,384,"Name","block_8_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_8_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_8_depthwise_relu")
    convolution2dLayer([1 1],64,"Name","block_8_project","Padding","same")
    batchNormalizationLayer("Name","block_8_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = additionLayer(2,"Name","block_8_add");
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],384,"Name","block_9_expand","Padding","same")
    batchNormalizationLayer("Name","block_9_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_9_expand_relu")
    groupedConvolution2dLayer([3 3],1,384,"Name","block_9_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_9_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_9_depthwise_relu")
    convolution2dLayer([1 1],64,"Name","block_9_project","Padding","same")
    batchNormalizationLayer("Name","block_9_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","block_9_add")
    convolution2dLayer([1 1],384,"Name","block_10_expand","Padding","same")
    batchNormalizationLayer("Name","block_10_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_10_expand_relu")
    groupedConvolution2dLayer([3 3],1,384,"Name","block_10_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_10_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_10_depthwise_relu")
    convolution2dLayer([1 1],96,"Name","block_10_project","Padding","same")
    batchNormalizationLayer("Name","block_10_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],576,"Name","block_11_expand","Padding","same")
    batchNormalizationLayer("Name","block_11_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_11_expand_relu")
    groupedConvolution2dLayer([3 3],1,576,"Name","block_11_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_11_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_11_depthwise_relu")
    convolution2dLayer([1 1],96,"Name","block_11_project","Padding","same")
    batchNormalizationLayer("Name","block_11_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = additionLayer(2,"Name","block_11_add");
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],576,"Name","block_12_expand","Padding","same")
    batchNormalizationLayer("Name","block_12_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_12_expand_relu")
    groupedConvolution2dLayer([3 3],1,576,"Name","block_12_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_12_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_12_depthwise_relu")
    convolution2dLayer([1 1],96,"Name","block_12_project","Padding","same")
    batchNormalizationLayer("Name","block_12_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","block_12_add")
    convolution2dLayer([1 1],576,"Name","block_13_expand","Padding","same")
    batchNormalizationLayer("Name","block_13_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_13_expand_relu")
    groupedConvolution2dLayer([3 3],1,576,"Name","block_13_depthwise","Padding","same","Stride",[2 2])
    batchNormalizationLayer("Name","block_13_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_13_depthwise_relu")
    convolution2dLayer([1 1],160,"Name","block_13_project","Padding","same")
    batchNormalizationLayer("Name","block_13_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],960,"Name","block_14_expand","Padding","same")
    batchNormalizationLayer("Name","block_14_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_14_expand_relu")
    groupedConvolution2dLayer([3 3],1,960,"Name","block_14_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_14_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_14_depthwise_relu")
    convolution2dLayer([1 1],160,"Name","block_14_project","Padding","same")
    batchNormalizationLayer("Name","block_14_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = additionLayer(2,"Name","block_14_add");
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    convolution2dLayer([1 1],960,"Name","block_15_expand","Padding","same")
    batchNormalizationLayer("Name","block_15_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_15_expand_relu")
    groupedConvolution2dLayer([3 3],1,960,"Name","block_15_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_15_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_15_depthwise_relu")
    convolution2dLayer([1 1],160,"Name","block_15_project","Padding","same")
    batchNormalizationLayer("Name","block_15_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","block_15_add")
    convolution2dLayer([1 1],960,"Name","block_16_expand","Padding","same")
    batchNormalizationLayer("Name","block_16_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_16_expand_relu")
    groupedConvolution2dLayer([3 3],1,960,"Name","block_16_depthwise","Padding","same")
    batchNormalizationLayer("Name","block_16_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name","block_16_depthwise_relu")
    convolution2dLayer([1 1],320,"Name","block_16_project","Padding","same")
    batchNormalizationLayer("Name","block_16_project_BN","Epsilon",0.001)
    convolution2dLayer([1 1],1280,"Name","Conv_1")
    batchNormalizationLayer("Name","Conv_1_bn","Epsilon",0.001)
    clippedReluLayer(6,"Name","out_relu")
    globalAveragePooling2dLayer("Name","global_average_pooling2d_1")
    fullyConnectedLayer(1000,"Name","Logits")
    softmaxLayer("Name","Logits_softmax")
    classificationLayer("Name","ClassificationLayer_Logits")];
lgraph = addLayers(lgraph,tempLayers);

%% Connect Layer Branches
% Connect all the branches of the network to create the network graph.

lgraph = connectLayers(lgraph,"block_1_project_BN","block_2_expand");
lgraph = connectLayers(lgraph,"block_1_project_BN","block_2_add/in2");
lgraph = connectLayers(lgraph,"block_2_project_BN","block_2_add/in1");
lgraph = connectLayers(lgraph,"block_3_project_BN","block_4_expand");
lgraph = connectLayers(lgraph,"block_3_project_BN","block_4_add/in2");
lgraph = connectLayers(lgraph,"block_4_project_BN","block_4_add/in1");
lgraph = connectLayers(lgraph,"block_4_add","block_5_expand");
lgraph = connectLayers(lgraph,"block_4_add","block_5_add/in2");
lgraph = connectLayers(lgraph,"block_5_project_BN","block_5_add/in1");
lgraph = connectLayers(lgraph,"block_6_project_BN","block_7_expand");
lgraph = connectLayers(lgraph,"block_6_project_BN","block_7_add/in2");
lgraph = connectLayers(lgraph,"block_7_project_BN","block_7_add/in1");
lgraph = connectLayers(lgraph,"block_7_add","block_8_expand");
lgraph = connectLayers(lgraph,"block_7_add","block_8_add/in2");
lgraph = connectLayers(lgraph,"block_8_project_BN","block_8_add/in1");
lgraph = connectLayers(lgraph,"block_8_add","block_9_expand");
lgraph = connectLayers(lgraph,"block_8_add","block_9_add/in2");
lgraph = connectLayers(lgraph,"block_9_project_BN","block_9_add/in1");
lgraph = connectLayers(lgraph,"block_10_project_BN","block_11_expand");
lgraph = connectLayers(lgraph,"block_10_project_BN","block_11_add/in2");
lgraph = connectLayers(lgraph,"block_11_project_BN","block_11_add/in1");
lgraph = connectLayers(lgraph,"block_11_add","block_12_expand");
lgraph = connectLayers(lgraph,"block_11_add","block_12_add/in2");
lgraph = connectLayers(lgraph,"block_12_project_BN","block_12_add/in1");
lgraph = connectLayers(lgraph,"block_13_project_BN","block_14_expand");
lgraph = connectLayers(lgraph,"block_13_project_BN","block_14_add/in2");
lgraph = connectLayers(lgraph,"block_14_project_BN","block_14_add/in1");
lgraph = connectLayers(lgraph,"block_14_add","block_15_expand");
lgraph = connectLayers(lgraph,"block_14_add","block_15_add/in2");
lgraph = connectLayers(lgraph,"block_15_project_BN","block_15_add/in1");

end
--------------------------------------------------------------------------------
/readme.md:
--------------------------------------------------------------------------------
# Overview

MobileNet-v2 is a convolutional neural network that has been trained on a subset of the ImageNet database. As a result, the network has learned rich feature representations for a wide range of images. The network can classify images into 1000 object categories, such as keyboard, mouse, pencil, and many animals. The network is designed to have a low memory footprint, making it well suited for deployment to memory-constrained hardware.

The network has an image input size of 224-by-224-by-3.

# Usage

This repository requires [MATLAB](https://www.mathworks.com/products/matlab.html) (R2019a or later) and the [Deep Learning Toolbox](https://www.mathworks.com/products/deep-learning.html).

This repository provides three functions:
- `mobilenetv2Layers`: Creates an untrained network with the network architecture of MobileNet-v2
- `assembleMobileNetv2`: Creates a MobileNet-v2 network with weights trained on ImageNet data
- `mobilenetv2Example`: Demonstrates how to classify an image using a trained MobileNet-v2 network

To construct an untrained MobileNet-v2 network to train from scratch, type the following at the MATLAB command line:
```matlab
lgraph = mobilenetv2Layers;
```
The untrained network is returned as a `layerGraph` object.
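
For example, to adapt the untrained architecture to a classification task with a different number of classes, you can replace the final fully connected layer and classification layer before training. This is a sketch: `numClasses` is a placeholder for your own class count, while the layer names `"Logits"` and `"ClassificationLayer_Logits"` are the ones used in `mobilenetv2Layers`:
```matlab
numClasses = 10;  % placeholder: number of classes in your data set
lgraph = mobilenetv2Layers;

% Replace the 1000-way ImageNet head with a head sized for numClasses.
lgraph = replaceLayer(lgraph,"Logits", ...
    fullyConnectedLayer(numClasses,"Name","Logits"));
lgraph = replaceLayer(lgraph,"ClassificationLayer_Logits", ...
    classificationLayer("Name","ClassificationLayer_Logits"));
```
The modified layer graph can then be passed to `trainNetwork` together with your training data and a `trainingOptions` object.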

To construct a trained MobileNet-v2 network suitable for use in image classification, type the following at the MATLAB command line:
```matlab
net = assembleMobileNetv2;
```
The trained network is returned as a `DAGNetwork` object.

To classify an image with the network:
```matlab
img = imresize(imread("peppers.png"),[224 224]);
predLabel = classify(net, img);
imshow(img);
title(string(predLabel));
```

# Documentation

For more information about the MobileNet-v2 pretrained model, see the [mobilenetv2](https://www.mathworks.com/help/deeplearning/ref/mobilenetv2.html) function page in the [MATLAB Deep Learning Toolbox documentation](https://www.mathworks.com/help/deeplearning/index.html).

# Architecture

MobileNet-v2 is a residual network. A residual network is a type of DAG network with residual (or shortcut) connections that bypass the main network layers. Residual connections enable parameter gradients to propagate more easily from the output layer to the earlier layers of the network, which makes it possible to train deeper networks. This increased network depth can result in higher accuracy on more difficult tasks.

You can explore and edit the network architecture using [Deep Network Designer](https://www.mathworks.com/help/deeplearning/ug/build-networks-with-deep-network-designer.html).

![MobileNet-v2 in Deep Network Designer](images/mobilenetv2_deepNetworkDesigner.PNG "MobileNet-v2 in Deep Network Designer")

# MobileNet-v2 in MATLAB

This repository demonstrates how to construct a residual deep neural network from scratch in MATLAB. You can use the code in this repository as a foundation for building residual networks with different numbers of residual blocks.
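
As a sketch of that idea, the repeated expand-depthwise-project pattern in `mobilenetv2Layers.m` can be factored into a helper that stamps out one inverted residual block. The helper name, its arguments, and any calling code are hypothetical; the layer types and the naming scheme follow the repository code:
```matlab
function lgraph = addInvertedResidualBlock(lgraph,prefix,expand,project,stride)
% addInvertedResidualBlock  Append one MobileNet-v2 style block (sketch).
%   prefix is the block name as a string, e.g. "block_2"; expand and
%   project are the 1-by-1 convolution filter counts; stride applies to
%   the depthwise convolution.
layers = [
    convolution2dLayer([1 1],expand,"Name",prefix+"_expand","Padding","same")
    batchNormalizationLayer("Name",prefix+"_expand_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name",prefix+"_expand_relu")
    groupedConvolution2dLayer([3 3],1,expand,"Name",prefix+"_depthwise", ...
        "Padding","same","Stride",stride)
    batchNormalizationLayer("Name",prefix+"_depthwise_BN","Epsilon",0.001)
    clippedReluLayer(6,"Name",prefix+"_depthwise_relu")
    convolution2dLayer([1 1],project,"Name",prefix+"_project","Padding","same")
    batchNormalizationLayer("Name",prefix+"_project_BN","Epsilon",0.001)];
lgraph = addLayers(lgraph,layers);
end
```
The caller is still responsible for connecting the block input and, for blocks with a shortcut, wiring both branches into an `additionLayer`, as done in the Connect Layer Branches section of `mobilenetv2Layers.m`.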

You can also create a trained MobileNet-v2 network from inside MATLAB by installing the Deep Learning Toolbox Model for MobileNet-v2 Network support package. Type `mobilenetv2` at the command line. If the support package is not installed, the function provides a link to it in the Add-On Explorer. To install the support package, click the link, and then click Install.

Alternatively, you can download the MobileNet-v2 pretrained model from MathWorks File Exchange: [Deep Learning Toolbox Model for MobileNet-v2 Network](https://www.mathworks.com/matlabcentral/fileexchange/70986-deep-learning-toolbox-model-for-mobilenet-v2-network).

You can create an untrained MobileNet-v2 network from inside MATLAB by importing a trained MobileNet-v2 network into the Deep Network Designer app and selecting Export > Generate Code. The exported code generates an untrained network with the network architecture of MobileNet-v2.
--------------------------------------------------------------------------------