├── Articles used in this project
├── Explanation of article codes
├── 1** data loading
├── LICENSE
├── README.md
├── 4** VGG16
├── 5** ResNet
├── 2** AlexNet
└── 3** GoogleNet

/Articles used in this project:
--------------------------------------------------------------------------------
Articles used in this project
https://drive.google.com/drive/folders/1dJqc6O_08QVEHtkWHuuih3Gc3TcTbX9D?usp=drive_link
--------------------------------------------------------------------------------
/Explanation of article codes:
--------------------------------------------------------------------------------
Explanation of article codes
https://drive.google.com/drive/folders/19yFtPJKLFFeJnM0QUb9MmlA5w3DJtTub?usp=drive_link
--------------------------------------------------------------------------------
/1** data loading:
--------------------------------------------------------------------------------
clc
clear
close all

for s = 1:3064

    %% Load data
    clc
    disp(['data ' num2str(s) ' (' num2str(s/3064*100) '%)'])
    load(['./data/mat/' num2str(s) '.mat']);

    img = cjdata.image;
    img1 = imadjust(img);              % contrast enhancement
    img = imresize(img1,[224,224]);
    img = im2double(img);

    %% Data Normalization
    featureSet = img./max(img(:));
    featureSet = uint8(255*featureSet);

    Data(:,:,:,s) = featureSet;

    label(s) = cjdata.label;

end

label = categorical(label);

save('MRIdata.mat','Data','label')
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2024 mahdieslaminet

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# brain_tumors_machine_learning_1
The correct diagnosis of brain tumors can save many lives. In this project, we contribute to improving the diagnosis of these tumors by using four artificial neural networks. The data can be obtained here: https://figshare.com/articles/dataset/brain_tumor_dataset/1512427. The entire study is implemented in MATLAB.

1) First, we load each numbered MAT file and extract the MRI image it contains.

2) Next, we enhance the contrast of the images and normalize them.

3) Finally, we pass the images through pretrained neural networks, then use SVM, KNN, and Softmax classifiers to assess their accuracy in diagnosing three specific types of brain tumor.
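The three steps above can be sketched end to end as follows. This is a minimal illustration rather than one of the repository scripts: it assumes the Deep Learning Toolbox with the AlexNet support package, and an `MRIdata.mat` file already produced by `1** data loading`.

```matlab
% Minimal sketch of the pipeline (assumes Deep Learning Toolbox + AlexNet
% support package, and MRIdata.mat produced by the 1** data loading script).
load('MRIdata.mat');                 % Data (HxWx1xN uint8), label (categorical row)

% Replicate the grayscale channel so a pretrained RGB network accepts the images
imgs = repmat(imresize(Data,[227 227]),[1 1 3 1]);

net   = alexnet;                     % pretrained CNN used as a feature extractor
feats = activations(net,imgs,'relu7','OutputAs','rows');

% 75/25 train/test split, then a KNN classifier on the deep features
n  = numel(label);  idx = randperm(n);  ntr = round(0.75*n);
tr = idx(1:ntr);  ts = idx(ntr+1:end);
mdl = fitcknn(feats(tr,:),label(tr),'Standardize',true,'NumNeighbors',3);
acc = mean(predict(mdl,feats(ts,:)) == label(ts)')*100
```

The scripts below do the same thing network by network, with small per-network differences (input size, extraction layer, SVM kernel).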

To understand how the project works, please watch the video below:
https://drive.google.com/drive/folders/19yFtPJKLFFeJnM0QUb9MmlA5w3DJtTub?usp=drive_link

To run the files more easily and effectively, please execute them in the following order:

1** Data Loading
2** AlexNet
3** GoogleNet
4** VGG16
5** ResNet

Proposal and article used in this project:
https://drive.google.com/drive/folders/1gQ7UMsqw2JYSv4bk4OaC7S2ZWl2cBkNi

If you have any questions or encounter issues with the implementation of the code, feel free to send an email to the following address: sanagolmarzialasl@gmail.com.
--------------------------------------------------------------------------------
/4**VGG16:
--------------------------------------------------------------------------------
clc
clear
close all

%% Load Data

load('MRIdata.mat');

imageSize = [224 224];
N_data = numel(label);

% Resize and replicate the grayscale images into three channels for the RGB network
for i = 1:N_data
    img = Data(:,:,:,i);
    img2(:,:,1) = imresize(img,imageSize);
    img2(:,:,2) = imresize(img,imageSize);
    img2(:,:,3) = imresize(img,imageSize);
    data1(:,:,:,i) = img2;
end

clear Data

%% Shuffling Data
N_data = size(data1,4);

ind_tr = round(0.75*N_data);

idx = randperm(N_data);

X = data1(:,:,:,idx);
T = label(idx);

%% Train and Test Data Separation
X_Tr = X(:,:,:,1:ind_tr);
X_Ts = X(:,:,:,1+ind_tr:end);

T_Tr = T(1:ind_tr);
T_Ts = T(1+ind_tr:end);

clear data1;

%% VGG16
net = vgg16;

%% VGG16 feature extractor
layer = 'fc7';

for i = 1:N_data
    features(i,:) = activations(net,X(:,:,:,i),layer,'OutputAs','rows');
end

%% Train KNN
KNN_classifier = fitcknn(features(1:ind_tr,:),T(1:ind_tr),'Standardize',true,...
    'NumNeighbors',3,'Distance','euclidean');

Y_Tr_KNN = predict(KNN_classifier,features(1:ind_tr,:));
Y_Ts_KNN = predict(KNN_classifier,features(1+ind_tr:end,:));

acc_tr = sum(Y_Tr_KNN == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_KNN == T_Ts')./numel(T_Ts)*100;

KNN = [acc_tr acc_ts]';

%% Train SVM
template = templateSVM('Standardize',true,'KernelFunction','polynomial');
SVM_classifier = fitcecoc(features(1:ind_tr,:),T(1:ind_tr),'Learners',template);

Y_Tr_SVM = predict(SVM_classifier,features(1:ind_tr,:));
Y_Ts_SVM = predict(SVM_classifier,features(1+ind_tr:end,:));

acc_tr = sum(Y_Tr_SVM == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_SVM == T_Ts')./numel(T_Ts)*100;

SVM = [acc_tr acc_ts]';

%% Train Softmax

NumFeatures = size(features,2);

layers = [
    featureInputLayer(NumFeatures)

    fullyConnectedLayer(50)
    dropoutLayer(0.05)

    fullyConnectedLayer(3)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MaxEpochs',20,...
    'LearnRateSchedule','piecewise', ...
    'InitialLearnRate',1e-4, ...
    'LearnRateDropFactor', 0.75, ...
    'LearnRateDropPeriod', 4, ...
    'Verbose',false, ...
    'Plots','training-progress');

Net = trainNetwork(features(1:ind_tr,:),T(1:ind_tr),layers,options);

Y_Tr_SFT = classify(Net,features(1:ind_tr,:));
Y_Ts_SFT = classify(Net,features(1+ind_tr:end,:));
Y = classify(Net,features);

acc_tr = sum(Y_Tr_SFT == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_SFT == T_Ts')./numel(T_Ts)*100;

Softmax = [acc_tr acc_ts]';

Result = table(KNN, SVM, Softmax,'RowNames',{'Train','Test'})
save('./Result/VGG16/Result.mat','Result')

Result = [KNN, SVM, Softmax]';
x = categorical({'KNN','SVM','Softmax'});
x = reordercats(x,{'KNN','SVM','Softmax'});
bar(x,Result)
ylim([0 110])
grid minor
savefig('./Result/VGG16/BarPlot.fig')

plotconfusion(T_Tr',Y_Tr_KNN,'KNN Train',T_Tr',Y_Tr_SVM,'SVM Train',T_Tr',Y_Tr_SFT,'SoftMax Train',...
    T_Ts',Y_Ts_KNN,'KNN Test',T_Ts',Y_Ts_SVM,'SVM Test',T_Ts',Y_Ts_SFT,'SoftMax Test')
savefig('./Result/VGG16/ConfusionMatrix.fig')
--------------------------------------------------------------------------------
/5** ResNet:
--------------------------------------------------------------------------------
clc
clear
close all

%% Load Data

load('MRIdata.mat');

imageSize = [224 224];
N_data = numel(label);

% Resize and replicate the grayscale images into three channels for the RGB network
for i = 1:N_data
    img = Data(:,:,:,i);
    img2(:,:,1) = imresize(img,imageSize);
    img2(:,:,2) = imresize(img,imageSize);
    img2(:,:,3) = imresize(img,imageSize);
    data1(:,:,:,i) = img2;
end

clear Data

%% Shuffling Data
N_data = size(data1,4);

ind_tr = round(0.75*N_data);

idx = randperm(N_data);

X = data1(:,:,:,idx);
T = label(idx);

%% Train and Test Data Separation
X_Tr = X(:,:,:,1:ind_tr);
X_Ts = X(:,:,:,1+ind_tr:end);

T_Tr = T(1:ind_tr);
T_Ts = T(1+ind_tr:end);

clear data1;

%% ResNet 18
net = resnet18;

%% ResNet 18 feature extractor
layer = 'pool5';

for i = 1:N_data
    features(i,:) = activations(net,X(:,:,:,i),layer,'OutputAs','rows');
end

%% Train KNN
KNN_classifier = fitcknn(features(1:ind_tr,:),T(1:ind_tr),'Standardize',true,...
    'NumNeighbors',10,'Distance','euclidean');

Y_Tr_KNN = predict(KNN_classifier,features(1:ind_tr,:));
Y_Ts_KNN = predict(KNN_classifier,features(1+ind_tr:end,:));

acc_tr = sum(Y_Tr_KNN == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_KNN == T_Ts')./numel(T_Ts)*100;

KNN = [acc_tr acc_ts]';

%% Train SVM
template = templateSVM('Standardize',true,'KernelFunction','linear');
SVM_classifier = fitcecoc(features(1:ind_tr,:),T(1:ind_tr),'Learners',template);

Y_Tr_SVM = predict(SVM_classifier,features(1:ind_tr,:));
Y_Ts_SVM = predict(SVM_classifier,features(1+ind_tr:end,:));

acc_tr = sum(Y_Tr_SVM == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_SVM == T_Ts')./numel(T_Ts)*100;

SVM = [acc_tr acc_ts]';

%% Train Softmax

NumFeatures = size(features,2);

layers = [
    featureInputLayer(NumFeatures)

    fullyConnectedLayer(50)
    dropoutLayer(0.05)

    fullyConnectedLayer(3)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MaxEpochs',20,...
    'LearnRateSchedule','piecewise', ...
    'InitialLearnRate',1e-4, ...
    'LearnRateDropFactor', 0.75, ...
    'LearnRateDropPeriod', 4, ...
    'Verbose',false, ...
    'Plots','training-progress');

Net = trainNetwork(features(1:ind_tr,:),T(1:ind_tr),layers,options);

Y_Tr_SFT = classify(Net,features(1:ind_tr,:));
Y_Ts_SFT = classify(Net,features(1+ind_tr:end,:));
Y = classify(Net,features);

acc_tr = sum(Y_Tr_SFT == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_SFT == T_Ts')./numel(T_Ts)*100;

Softmax = [acc_tr acc_ts]';

Result = table(KNN, SVM, Softmax,'RowNames',{'Train','Test'})
save('./Result/ResNet18/Result.mat','Result')

Result = [KNN, SVM, Softmax]';
x = categorical({'KNN','SVM','Softmax'});
x = reordercats(x,{'KNN','SVM','Softmax'});
bar(x,Result)
ylim([0 110])
grid minor
savefig('./Result/ResNet18/BarPlot.fig')

plotconfusion(T_Tr',Y_Tr_KNN,'KNN Train',T_Tr',Y_Tr_SVM,'SVM Train',T_Tr',Y_Tr_SFT,'SoftMax Train',...
    T_Ts',Y_Ts_KNN,'KNN Test',T_Ts',Y_Ts_SVM,'SVM Test',T_Ts',Y_Ts_SFT,'SoftMax Test')
savefig('./Result/ResNet18/ConfusionMatrix.fig')
--------------------------------------------------------------------------------
/2** AlexNet:
--------------------------------------------------------------------------------
clc
clear
close all

%% Load Data

load('MRIdata.mat');

imageSize = [227 227];
N_data = numel(label);

% Resize and replicate the grayscale images into three channels for the RGB network
for i = 1:N_data
    img = Data(:,:,:,i);
    img2(:,:,1) = imresize(img,imageSize);
    img2(:,:,2) = imresize(img,imageSize);
    img2(:,:,3) = imresize(img,imageSize);
    data1(:,:,:,i) = img2;
end

clear Data

%% Shuffling Data
N_data = size(data1,4);

ind_tr = round(0.75*N_data);

idx = randperm(N_data);

X = data1(:,:,:,idx);
T = label(idx);

%% Train and Test Data Separation
X_Tr = X(:,:,:,1:ind_tr);
X_Ts = X(:,:,:,1+ind_tr:end);

T_Tr = T(1:ind_tr);
T_Ts = T(1+ind_tr:end);

clear data1;

%% AlexNet
net = alexnet;

%% AlexNet feature extractor
layer = 'relu7';

for i = 1:N_data
    features(i,:) = activations(net,X(:,:,:,i),layer,'OutputAs','rows');
end

%% Train KNN
KNN_classifier = fitcknn(features(1:ind_tr,:),T(1:ind_tr),'Standardize',true,...
    'NumNeighbors',3,'Distance','euclidean');

Y_Tr_KNN = predict(KNN_classifier,features(1:ind_tr,:));
Y_Ts_KNN = predict(KNN_classifier,features(1+ind_tr:end,:));

acc_tr = sum(Y_Tr_KNN == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_KNN == T_Ts')./numel(T_Ts)*100;

KNN = [acc_tr acc_ts]';

%% Train SVM
template = templateSVM('Standardize',true,'KernelFunction','polynomial');
SVM_classifier = fitcecoc(features(1:ind_tr,:),T(1:ind_tr),'Learners',template);

Y_Tr_SVM = predict(SVM_classifier,features(1:ind_tr,:));
Y_Ts_SVM = predict(SVM_classifier,features(1+ind_tr:end,:));

acc_tr = sum(Y_Tr_SVM == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_SVM == T_Ts')./numel(T_Ts)*100;

SVM = [acc_tr acc_ts]';

%% Train Softmax

NumFeatures = size(features,2);

layers = [
    featureInputLayer(NumFeatures)

    fullyConnectedLayer(50)
    dropoutLayer(0.05)

    fullyConnectedLayer(3)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MaxEpochs',20,...
    'LearnRateSchedule','piecewise', ...
    'InitialLearnRate',1e-4, ...
    'LearnRateDropFactor', 0.75, ...
    'LearnRateDropPeriod', 4, ...
    'Verbose',false, ...
    'Plots','training-progress');

Net = trainNetwork(features(1:ind_tr,:),T(1:ind_tr),layers,options);

Y_Tr_SFT = classify(Net,features(1:ind_tr,:));
Y_Ts_SFT = classify(Net,features(1+ind_tr:end,:));
Y = classify(Net,features);

acc_tr = sum(Y_Tr_SFT == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_SFT == T_Ts')./numel(T_Ts)*100;

Softmax = [acc_tr acc_ts]';

Result = table(KNN, SVM, Softmax,'RowNames',{'Train','Test'})
save('./Result/AlexNet/Result.mat','Result')

Result = [KNN, SVM, Softmax]';
x = categorical({'KNN','SVM','Softmax'});
x = reordercats(x,{'KNN','SVM','Softmax'});
bar(x,Result)
ylim([0 110])
grid minor
savefig('./Result/AlexNet/BarPlot.fig')

plotconfusion(T_Tr',Y_Tr_KNN,'KNN Train',T_Tr',Y_Tr_SVM,'SVM Train',T_Tr',Y_Tr_SFT,'SoftMax Train',...
    T_Ts',Y_Ts_KNN,'KNN Test',T_Ts',Y_Ts_SVM,'SVM Test',T_Ts',Y_Ts_SFT,'SoftMax Test')
savefig('./Result/AlexNet/ConfusionMatrix.fig')
--------------------------------------------------------------------------------
/3** GoogleNet:
--------------------------------------------------------------------------------
clc
clear
close all

%% Load Data

load('MRIdata.mat');

imageSize = [224 224];
N_data = numel(label);

% Resize and replicate the grayscale images into three channels for the RGB network
for i = 1:N_data
    img = Data(:,:,:,i);
    img2(:,:,1) = imresize(img,imageSize);
    img2(:,:,2) = imresize(img,imageSize);
    img2(:,:,3) = imresize(img,imageSize);
    data1(:,:,:,i) = img2;
end

clear Data

%% Shuffling Data
N_data = size(data1,4);

ind_tr = round(0.75*N_data);

idx = randperm(N_data);

X = data1(:,:,:,idx);
T = label(idx);

%% Train and Test Data Separation
X_Tr = X(:,:,:,1:ind_tr);
X_Ts = X(:,:,:,1+ind_tr:end);

T_Tr = T(1:ind_tr);
T_Ts = T(1+ind_tr:end);

clear data1;

%% GoogleNet
net = googlenet;

%% GoogleNet feature extractor
layer = 'pool5-7x7_s1';

for i = 1:N_data
    features(i,:) = activations(net,X(:,:,:,i),layer,'OutputAs','rows');
end

%% Train KNN
KNN_classifier = fitcknn(features(1:ind_tr,:),T(1:ind_tr),'Standardize',true,...
    'NumNeighbors',3,'Distance','euclidean');

Y_Tr_KNN = predict(KNN_classifier,features(1:ind_tr,:));
Y_Ts_KNN = predict(KNN_classifier,features(1+ind_tr:end,:));

acc_tr = sum(Y_Tr_KNN == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_KNN == T_Ts')./numel(T_Ts)*100;

KNN = [acc_tr acc_ts]';

%% Train SVM
template = templateSVM('Standardize',true,'KernelFunction','polynomial');
SVM_classifier = fitcecoc(features(1:ind_tr,:),T(1:ind_tr),'Learners',template);

Y_Tr_SVM = predict(SVM_classifier,features(1:ind_tr,:));
Y_Ts_SVM = predict(SVM_classifier,features(1+ind_tr:end,:));

acc_tr = sum(Y_Tr_SVM == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_SVM == T_Ts')./numel(T_Ts)*100;

SVM = [acc_tr acc_ts]';

%% Train Softmax

NumFeatures = size(features,2);

layers = [
    featureInputLayer(NumFeatures)

    fullyConnectedLayer(50)
    dropoutLayer(0.05)

    fullyConnectedLayer(3)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MaxEpochs',20,...
    'LearnRateSchedule','piecewise', ...
    'InitialLearnRate',1e-4, ...
    'LearnRateDropFactor', 0.75, ...
    'LearnRateDropPeriod', 4, ...
    'Verbose',false, ...
    'Plots','training-progress');

Net = trainNetwork(features(1:ind_tr,:),T(1:ind_tr),layers,options);

Y_Tr_SFT = classify(Net,features(1:ind_tr,:));
Y_Ts_SFT = classify(Net,features(1+ind_tr:end,:));
Y = classify(Net,features);

acc_tr = sum(Y_Tr_SFT == T_Tr')./numel(T_Tr)*100;
acc_ts = sum(Y_Ts_SFT == T_Ts')./numel(T_Ts)*100;

Softmax = [acc_tr acc_ts]';

Result = table(KNN, SVM, Softmax,'RowNames',{'Train','Test'})
save('./Result/GoogleNet/Result.mat','Result')

Result = [KNN, SVM, Softmax]';
x = categorical({'KNN','SVM','Softmax'});
x = reordercats(x,{'KNN','SVM','Softmax'});
bar(x,Result)
ylim([0 110])
grid minor
savefig('./Result/GoogleNet/BarPlot.fig')

plotconfusion(T_Tr',Y_Tr_KNN,'KNN Train',T_Tr',Y_Tr_SVM,'SVM Train',T_Tr',Y_Tr_SFT,'SoftMax Train',...
    T_Ts',Y_Ts_KNN,'KNN Test',T_Ts',Y_Ts_SVM,'SVM Test',T_Ts',Y_Ts_SFT,'SoftMax Test')
savefig('./Result/GoogleNet/ConfusionMatrix.fig')

%% Notes on GoogleNet
% GoogleNet, also known as Inception, is a deep convolutional neural network
% architecture designed for image classification and object detection. While
% it is not specifically tailored for medical imaging, it can be adapted to
% the diagnosis of brain tumors through transfer learning.
%
% Transfer learning takes a pre-trained network, like GoogleNet, which has
% learned features from a large general-purpose image dataset, and fine-tunes
% it on a smaller dataset for a specific task, in this case brain tumor
% detection. This leverages the knowledge gained from vast datasets and
% accelerates training on the smaller, task-specific dataset.
%
% In this context, GoogleNet serves as a feature extractor that captures
% meaningful patterns from medical images such as MRI scans, learning the
% characteristics that distinguish healthy brain tissue from areas affected
% by tumors. Its functions include:
%
% Feature Extraction: the inception modules capture hierarchical, abstract
% features, such as shapes, textures, and patterns indicative of
% abnormalities like tumors.
%
% Pattern Recognition: the trained network identifies complex patterns in
% medical images, aiding the detection of abnormal regions within the brain.
%
% Classification: through transfer learning, the network can be fine-tuned to
% classify image patches or segments as tumor or non-tumor regions, providing
% valuable information to radiologists and healthcare professionals.
%
% Localization: the network can also assist in localizing tumors within the
% brain, which is crucial for treatment planning.
%
% Note that medical imaging tasks often require specialized architectures for
% segmentation; networks such as U-Net and its variants are better suited for
% precisely delineating tumor boundaries in medical images.
--------------------------------------------------------------------------------
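The notes above describe fine-tuning GoogleNet, while the script uses it only as a fixed feature extractor. A minimal fine-tuning sketch follows; it assumes the Deep Learning Toolbox with the GoogLeNet support package, and the `X_Tr`, `T_Tr`, `X_Ts`, `T_Ts` arrays prepared earlier in the script. The names 'loss3-classifier' and 'output' are the layer names MATLAB assigns to googlenet's final layers.

```matlab
% Hedged sketch: fine-tune GoogleNet end to end for the 3 tumor classes,
% instead of training separate classifiers on fixed deep features.
net    = googlenet;
lgraph = layerGraph(net);

% Swap the 1000-class ImageNet head for a 3-class head; boost its learn rates
newFC = fullyConnectedLayer(3,'Name','fc3', ...
    'WeightLearnRateFactor',10,'BiasLearnRateFactor',10);
lgraph = replaceLayer(lgraph,'loss3-classifier',newFC);
lgraph = replaceLayer(lgraph,'output',classificationLayer('Name','output3'));

options = trainingOptions('adam', ...
    'MaxEpochs',10, ...
    'InitialLearnRate',1e-4, ...
    'MiniBatchSize',32, ...
    'Shuffle','every-epoch', ...
    'Verbose',false);

NetFT = trainNetwork(X_Tr,T_Tr',lgraph,options);

Y_Ts   = classify(NetFT,X_Ts);
acc_ts = mean(Y_Ts == T_Ts')*100
```

Fine-tuning is slower than feature extraction but lets the convolutional filters themselves adapt to MRI data; the epoch count and learning rates here are illustrative starting points, not tuned values.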