![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhizxdSL5xp3o-gINHFw4lS162s73sK850wAlcDDzw5ePw5qS82xrNDQ1kYJcntmwrnZnZk_v_Z47DBJMyRnwkLKXePtFPfhTLRF1NM02gJhGuEHeX2mBJPTu8gJERUCFHZx5cNrZDS3fM/s320/four_1.jpg)
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhO6knDAJy34MsFxRmvOerbXSB5y1TOvcKdEyclg-FDu2kRqaFzaK69j9My-NmLlS7jDy1LLEv-8y2CNB4kkLI46REONI0yGQehAN7fqr0XVZBp4A7kGm0Fd75RtYRkERHJUhuNQZ85uI4/s320/four_2.jpg)
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiAvIN4nPABePgbO9FR-EfzCjanFsKprRfgNjPkv9LVD6a8DbHLKJS1k4T0FGtUfx_q3WMmseRMPRpvZUKkgV0SKEe7VnlYpZg5yei7jJrv1hQPVsFQ752pNR-NDhAc1ITi5xYFOd1QcQ8/s320/four_3.jpg)
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiKTDLTWXk9gPNHz317r7vM_pkVtDMO_403zDD48D3NQoy3RilomrWxYVX6jsHmD9FjuB0abqVmg9GTLiuaRLpZQ-dsChBjCrmqOWu_iFsIYM04V4ADsDxFWxC4JLQLUdqASbcQLmN-0zM/s320/four_4.jpg)
More details can be found in my published book:
《MATLAB编程基础与典型应用》 (MATLAB Programming Fundamentals and Typical Applications)
Beijing: Posts & Telecom Press, 2008
ISBN: 978-7-115-17932-6/TP
Please contact me by email: lhd06@mails.tsinghua.edu.cn
Sharing learning resources and source code for the MATLAB mathematical/engineering software, with in-depth discussion of MATLAB programming, Simulink simulation, and MATLAB interface programming, plus exchange of learning methods and experience with engineering tools such as Protel and Cadence!
Create a BP network trained with traingdm (gradient descent with momentum) and inspect its default training parameters:
net=newff(minmax(P),[5,1],{'logsig','purelin'},'traingdm');
net.trainParam
ans =
epochs: 100
goal: 0
lr: 0.0100
max_fail: 5
mc: 0.9000
min_grad: 1.0000e-010
show: 25
time: Inf
net=newff(minmax(P),[5,1],{'logsig','purelin'},'traingdm');
net.trainParam.show = 50;
net.trainParam.lr = 0.1;
net.trainParam.mc = 0.9;
net.trainParam.epochs = 300;
net.trainParam.goal = 0.01;
[net,tr]=train(net,P,T);
%resubstitution check on the training samples
A=sim(net,P);
%check on the test samples
a=sim(net,p);
a =
0.6104 0.5641 0.4307
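The update rule behind traingdm combines the steepest-descent step with a fraction mc of the previous weight change. As a minimal Python sketch of this momentum rule (the lr, mc, and epochs names mirror trainParam; the quadratic objective is only illustrative, not the network's MSE):

```python
import numpy as np

def gd_momentum(grad, w0, lr=0.1, mc=0.9, epochs=300):
    """Gradient descent with momentum, as in traingdm:
    dw = mc*dw_prev - lr*grad(w);  w = w + dw."""
    w = np.asarray(w0, dtype=float)
    dw = np.zeros_like(w)
    for _ in range(epochs):
        dw = mc * dw - lr * grad(w)
        w = w + dw
    return w

# Toy objective f(w) = 0.5*||w||^2, gradient = w; minimum at the origin.
w = gd_momentum(lambda w: w, [1.0, -2.0])
```

The momentum term mc*dw lets the iterate keep moving through shallow regions of the error surface instead of following each local gradient exactly.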
x = quadprog(H,f,A,b)
x = quadprog(H,f,A,b,Aeq,beq)
x = quadprog(H,f,A,b,Aeq,beq,lb,ub)
x = quadprog(H,f,A,b,Aeq,beq,lb,ub,x0)
x = quadprog(H,f,A,b,Aeq,beq,lb,ub,x0,options)
[x,fval] = quadprog(...)
[x,fval,exitflag] = quadprog(...)
[x,fval,exitflag,output] = quadprog(...)
[x,fval,exitflag,output,lambda] = quadprog(...)
H=[2 -1;-1 2]; %H matrix after transformation to standard QP form
f=[-10;4]; %f vector after transformation to standard QP form
lb=zeros(2,1); %lower-bound constraints
A=[-1 -1]; %linear inequality coefficient matrix
b=-8; %right-hand side of the linear inequality
[x,fval,exitflag,output,lambda] = quadprog(H,f,A,b,[],[],lb) %solve the quadratic program
x =
6.3333
1.6667
fval =
-24.3333
exitflag =
1
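At the reported optimum only the constraint x1+x2 = 8 is active (the lower bounds are slack), so the quadprog result can be cross-checked by solving the KKT system for that single equality constraint with plain NumPy. This is a verification sketch, not a general QP solver:

```python
import numpy as np

H = np.array([[2.0, -1.0], [-1.0, 2.0]])
f = np.array([-10.0, 4.0])
a = np.array([1.0, 1.0])   # active constraint: x1 + x2 = 8

# KKT system for min 0.5*x'Hx + f'x  s.t. a'x = 8:
#   [H a; a' 0] [x; lam] = [-f; 8]
K = np.block([[H, a.reshape(2, 1)],
              [a.reshape(1, 2), np.zeros((1, 1))]])
sol = np.linalg.solve(K, np.concatenate([-f, [8.0]]))
x = sol[:2]                          # -> [6.3333, 1.6667]
fval = 0.5 * x @ H @ x + f @ x       # -> -24.3333, matching quadprog
```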
A=[-1 -1];
b=[-8];
lb=[0 0];
x0=[0;0];
fun=@(x)(x(1)^2-10*x(1)-x(1)*x(2)+x(2)^2-4*x(2)); %anonymous function handle
options=optimset('Display','iter','MaxFunEvals',1e5);
[x,fval,exitflag,output,lambda,grad,hessian]=fmincon(fun,x0,A,b,[],[],lb,[],[],options)
                                      max    Line search   Directional   First-order
 Iter F-count        f(x)      constraint    steplength     derivative    optimality   Procedure
    0       3           0               8                                              Infeasible start point
    1       6         -40              -4             1             36             6
    2      10    -50.7721            -6.9           0.5          0.864          1.57
    3      13    -51.9816          -5.743             1          0.337         0.204
    4      16         -52              -6             1      4.26e-008     4.45e-006
x =
8.0000
6.0000
fval =
-52.0000
exitflag =
5
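At the reported solution (8, 6) the inequality x1+x2 >= 8 and the lower bounds are all slack (8+6 = 14 > 8), so the point must be an unconstrained stationary point of the objective. Because the gradient of f is linear, setting it to zero gives a 2x2 system that NumPy solves directly, cross-checking the fmincon result:

```python
import numpy as np

# grad f(x) = [2*x1 - x2 - 10, -x1 + 2*x2 - 4]; setting it to zero:
G = np.array([[2.0, -1.0], [-1.0, 2.0]])
x = np.linalg.solve(G, np.array([10.0, 4.0]))              # -> [8., 6.]
fval = x[0]**2 - 10*x[0] - x[0]*x[1] + x[1]**2 - 4*x[1]    # -> -52.0
```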
>> which('slblocks.m', '-all')
>> open('D:\MATLAB\R2006a\toolbox\Simulink\blocks\slblocks.m')
function blkStruct = slblocks
%SLBLOCKS Defines a block library.
% Library's name. The name appears in the Library Browser's
% contents pane.
blkStruct.Name = ['own Definition' sprintf('\n') 'Library']; % display name of the block library
% The function that will be called when the user double-clicks on
% the library's name. ;
blkStruct.OpenFcn = 'own_definition'; % name of the user-defined library model
% The argument to be set as the Mask Display for the subsystem. You
% may comment this line out if no specific mask is desired.
% Example: blkStruct.MaskDisplay = 'plot([0:2*pi],sin([0:2*pi]));';
% No display for now.
% blkStruct.MaskDisplay = '';
% End of blocks
net=newff(minmax(P),[5,1],{'logsig','purelin'},'traingdx');
Enter the following in the MATLAB command window:
net.trainParam
ans =
epochs: 100
goal: 0
lr: 0.0100 % base learning rate
lr_dec: 0.7000 % learning-rate decrease factor
lr_inc: 1.0500 % learning-rate increase factor
max_fail: 5
max_perf_inc: 1.0400
mc: 0.9000 % momentum factor
min_grad: 1.0000e-006
show: 25
time: Inf
net=newff(minmax(P),[5,1],{'logsig','purelin'},'traingdx');
The figure below shows the training-process curve of the BP network trained with traingdx; the output for the test samples is:
net.trainParam.show = 50;
net.trainParam.lr = 0.1;
net.trainParam.lr_inc = 1.05;
net.trainParam.lr_dec = 0.85;
net.trainParam.mc = 0.9;
net.trainParam.epochs = 300;
net.trainParam.goal = 0.01;
[net,tr]=train(net,P,T);
%resubstitution check on the training samples
A=sim(net,P);
%check on the test samples
a=sim(net,p);
TRAINGDX-calcgrad, Epoch 0/300, MSE 5.70591/0.01, Gradient 7.94678/1e-006
TRAINGDX-calcgrad, Epoch 50/300, MSE 0.0185869/0.01, Gradient 0.0617651/1e-006
TRAINGDX-calcgrad, Epoch 72/300, MSE 0.00997184/0.01, Gradient 0.0194848/1e-006
TRAINGDX, Performance goal met.
a =
0.5880 0.6223 0.5236
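The adaptive-learning-rate logic behind traingdx (traingda uses the same rule without the momentum term) tentatively applies a step: if the new error exceeds max_perf_inc times the old error, the step is discarded and lr is multiplied by lr_dec; otherwise the step is kept and lr is multiplied by lr_inc. A Python sketch of this rule on a toy quadratic (parameter names mirror trainParam; the loss function is only illustrative):

```python
import numpy as np

def traingdx_like(loss, grad, w0, lr=0.1, mc=0.9,
                  lr_inc=1.05, lr_dec=0.7, max_perf_inc=1.04, epochs=300):
    """Gradient descent with momentum and an adaptive learning rate."""
    w = np.asarray(w0, dtype=float)
    dw = np.zeros_like(w)
    err = loss(w)
    for _ in range(epochs):
        dw_new = mc * dw - lr * grad(w)
        w_new = w + dw_new
        err_new = loss(w_new)
        if err_new > max_perf_inc * err:
            lr *= lr_dec                       # reject the step, shrink lr
        else:
            w, dw, err = w_new, dw_new, err_new
            lr *= lr_inc                       # accept the step, grow lr
    return w

# Toy loss f(w) = 0.5*||w||^2 starting from (2, -1).
w = traingdx_like(lambda w: 0.5 * w @ w, lambda w: w, [2.0, -1.0])
```

Growing lr while progress is good and cutting it back on failed steps is what lets traingdx typically reach the goal in fewer epochs than plain traingdm, as the epoch counts in the logs above suggest.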
net=newff(minmax(P),[5,1],{'logsig','purelin'},'traingda');
Enter the following program segment in the MATLAB command window:
net.trainParam
ans =
epochs: 100
goal: 0
lr: 0.0100 % base learning rate
lr_inc: 1.0500 % learning-rate increase factor (1.05)
lr_dec: 0.7000 % learning-rate decrease factor (0.7)
max_fail: 5
max_perf_inc: 1.0400
min_grad: 1.0000e-006
show: 25
time: Inf
net=newff(minmax(P),[5,1],{'logsig','purelin'},'traingda');
The figure below shows the training-process curve of the BP network trained with traingda; the network's output for the test samples is:
net.trainParam.show = 50;
net.trainParam.lr = 0.1;
net.trainParam.lr_inc = 1.05;
net.trainParam.lr_dec = 0.85;
net.trainParam.epochs = 300;
net.trainParam.goal = 0.01;
[net,tr]=train(net,P,T);
%resubstitution check on the training samples
A=sim(net,P);
%check on the test samples
a=sim(net,p);
TRAINGDA-calcgrad, Epoch 0/300, MSE 0.784702/0.01, Gradient 1.98321/1e-006
TRAINGDA-calcgrad, Epoch 36/300, MSE 0.00993732/0.01, Gradient 0.0156508/1e-006
TRAINGDA, Performance goal met.
a =
0.4192 0.5750 0.7746