├── Code
│   ├── AMSGrad.m
│   ├── AMSGradloss.mat
│   ├── AMSGradtrack.mat
│   ├── AdaDelta.m
│   ├── AdaDeltaloss.mat
│   ├── AdaDeltatrack.mat
│   ├── Adagrad.m
│   ├── Adagradloss.mat
│   ├── Adagradtrack.mat
│   ├── Adam.m
│   ├── Adama.m
│   ├── Adamaloss.mat
│   ├── Adamatrack.mat
│   ├── Adamloss.mat
│   ├── Adamtrack.mat
│   ├── GD1.m
│   ├── GD2.m
│   ├── SGD.m
│   ├── SGDM.m
│   ├── SGDMloss.mat
│   ├── SGDMtrack.mat
│   ├── SGDloss.mat
│   ├── SGDtrack.mat
│   ├── equal_high.m
│   ├── ex1.m
│   ├── lossduibi.m
│   ├── mydata.mat
│   ├── mydata1.mat
│   ├── myfun.m
│   ├── num_grad.m
│   ├── objfun1.m
│   ├── objfun2.m
│   ├── objfun3.m
│   ├── test.m
│   └── track.m
└── README.md

/Code/AMSGrad.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/AMSGrad.m
--------------------------------------------------------------------------------
/Code/AMSGradloss.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/AMSGradloss.mat
--------------------------------------------------------------------------------
/Code/AMSGradtrack.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/AMSGradtrack.mat
--------------------------------------------------------------------------------
/Code/AdaDelta.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/AdaDelta.m
--------------------------------------------------------------------------------
/Code/AdaDeltaloss.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/AdaDeltaloss.mat
--------------------------------------------------------------------------------
/Code/AdaDeltatrack.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/AdaDeltatrack.mat
--------------------------------------------------------------------------------
/Code/Adagrad.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/Adagrad.m
--------------------------------------------------------------------------------
/Code/Adagradloss.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/Adagradloss.mat
--------------------------------------------------------------------------------
/Code/Adagradtrack.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/Adagradtrack.mat
--------------------------------------------------------------------------------
/Code/Adam.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/Adam.m
--------------------------------------------------------------------------------
/Code/Adama.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/Adama.m
--------------------------------------------------------------------------------
/Code/Adamaloss.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/Adamaloss.mat
--------------------------------------------------------------------------------
/Code/Adamatrack.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/Adamatrack.mat
--------------------------------------------------------------------------------
/Code/Adamloss.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/Adamloss.mat
--------------------------------------------------------------------------------
/Code/Adamtrack.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/Adamtrack.mat
--------------------------------------------------------------------------------
/Code/GD1.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/GD1.m
--------------------------------------------------------------------------------
/Code/GD2.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/GD2.m
--------------------------------------------------------------------------------
/Code/SGD.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/SGD.m
--------------------------------------------------------------------------------
/Code/SGDM.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/SGDM.m
--------------------------------------------------------------------------------
/Code/SGDMloss.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/SGDMloss.mat
--------------------------------------------------------------------------------
/Code/SGDMtrack.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/SGDMtrack.mat
--------------------------------------------------------------------------------
/Code/SGDloss.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/SGDloss.mat
--------------------------------------------------------------------------------
/Code/SGDtrack.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/SGDtrack.mat
--------------------------------------------------------------------------------
/Code/equal_high.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/equal_high.m
--------------------------------------------------------------------------------
/Code/ex1.m:
--------------------------------------------------------------------------------
% ex1.m - generate a synthetic data set for the curve-fitting experiments
clear all
clc
% a = 133.35;
% b = 5.20;
% c = 0.78;
a = 100;                           % true model parameters
b = 5;
% c = 1;
x = 0:0.1:12;                      % sample locations
y = a./(1+exp(b-x));               % noise-free logistic-shaped curve
y1 = y + 0.4*randn(1,length(y));   % add Gaussian noise to the outputs
plot(x,y,'r-',x,y1,'o')            % clean curve vs. noisy samples
data = [x;y1];                     % 2 x N matrix: first row x, second row noisy y
save mydata1 data                  % written to mydata1.mat
--------------------------------------------------------------------------------
/Code/lossduibi.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/lossduibi.m
--------------------------------------------------------------------------------
/Code/mydata.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/mydata.mat
--------------------------------------------------------------------------------
/Code/mydata1.mat:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/mydata1.mat
--------------------------------------------------------------------------------
/Code/myfun.m:
--------------------------------------------------------------------------------
function y = myfun(x,theta)
% MYFUN  Model being fitted: y = a ./ (1 + exp(b - x)), with theta = [a; b].
a = theta(1);
b = theta(2);
% c = theta(3);
y = a./(1+exp(b-x));
--------------------------------------------------------------------------------
/Code/num_grad.m:
--------------------------------------------------------------------------------
function df = num_grad(func, X)
% NUM_GRAD  Central-difference numerical gradient of func evaluated at X.
h = 1e-5;
df = zeros(length(X), 1);

% for each dimension of the objective function
for i = 1:length(X)
    % vary variable i by a small amount (left and right)
    x1 = X;
    x2 = X;
    x1(i) = X(i) - h;
    x2(i) = X(i) + h;

    % evaluate the objective function at the left and right points
    y1 = feval(func,x1);
    y2 = feval(func,x2);

    % calculate the slope (rise/run) for dimension i
    df(i) = (y2 - y1) / (2*h);
end
--------------------------------------------------------------------------------
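Note: num_grad calls the supplied objective with a single argument, so the repository's loss functions have to be wrapped (for example in an anonymous function) before their gradient can be taken. The snippet below is a minimal usage sketch, not a file from the repository; the starting point theta0 = [90; 4] is an arbitrary value chosen only for illustration.

% Illustrative only - not part of the repository.
load mydata1                                    % restores the 2 x N matrix "data" written by ex1.m
theta0 = [90; 4];                               % hypothetical initial guess for [a; b]
g = num_grad(@(t) objfun1(t, data), theta0);    % central-difference gradient of the batch loss
disp(g)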
/Code/objfun1.m:
--------------------------------------------------------------------------------
function loss = objfun1(theta,data)
% OBJFUN1  Mean squared-error loss of myfun over the whole data set (batch loss).
x = data(1,:);
y = data(2,:);
N = length(x);
loss = 0;
for i = 1:N
    loss = loss + 0.5*(myfun(x(i),theta) - y(i))^2;
end
loss = loss/N;
--------------------------------------------------------------------------------
/Code/objfun2.m:
--------------------------------------------------------------------------------
function loss = objfun2(theta,sample)
% OBJFUN2  Squared-error loss of myfun on a single sample [x; y].
x = sample(1);
y = sample(2);
loss = 0.5*(myfun(x,theta) - y)^2;
--------------------------------------------------------------------------------
/Code/objfun3.m:
--------------------------------------------------------------------------------
function loss = objfun3(theta,data1)
% OBJFUN3  Same batch mean squared-error loss as objfun1, taking data1 as input.
x = data1(1,:);
y = data1(2,:);
N = length(x);
loss = 0;
for i = 1:N
    loss = loss + 0.5*(myfun(x(i),theta) - y(i))^2;
end
loss = loss/N;
--------------------------------------------------------------------------------
/Code/test.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/test.m
--------------------------------------------------------------------------------
/Code/track.m:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wuyinwuxian/Neural_Network_optimization_method/39416abdaff0a3d83de68ead2f4eba1ba7847881/Code/track.m
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# Neural_Network_optimization_method
This is MATLAB code comparing five common neural network optimization algorithms: SGD, SGDM, Adagrad, AdaDelta, and Adam. All of the updates are batch-based, and test.m is the entry point for all of the methods.
--------------------------------------------------------------------------------
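The optimizer scripts themselves (GD1.m, SGD.m, Adam.m, and so on) are linked above rather than inlined, so their exact implementations are not reproduced here. As a rough sketch of how the inlined pieces fit together - myfun as the model, objfun1 as the batch loss, num_grad for the gradients - a plain gradient-descent loop might look like the following. This is an assumption-laden illustration, not the code in test.m or GD1.m; the learning rate, iteration count, and initial theta are made-up values.

% Illustrative sketch only - not the repository's GD1.m or test.m.
load mydata1                          % data generated by ex1.m
theta = [90; 4];                      % assumed initial guess for [a; b]
lr = 0.001;                           % assumed learning rate
iters = 5000;                         % assumed number of iterations
loss_hist = zeros(iters, 1);
for k = 1:iters
    g = num_grad(@(t) objfun1(t, data), theta);   % batch gradient via central differences
    theta = theta - lr * g;                       % vanilla gradient-descent update
    loss_hist(k) = objfun1(theta, data);          % record the batch loss
end
plot(loss_hist); xlabel('iteration'); ylabel('batch loss');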