The Dwarf Mongoose Optimization Algorithm (DMOA) is an intelligent optimization algorithm published in 2022 in the journal "Computer Methods in Applied Mechanics and Engineering" (IF 7.3).
01. Introduction
The Dwarf Mongoose Optimization Algorithm (DMOA) mimics the foraging behavior of the dwarf mongoose. The restrictive way in which mongooses hunt has strongly shaped their social behavior and ecological adaptations, which compensate so that the family can still feed itself efficiently. These compensatory behavioral adaptations concern prey size, space utilization, group size, and food provisioning. The proposed algorithm models three social groups of the dwarf mongoose: the alpha group, the babysitters, and the scouts. The whole family forages as a unit; the alpha female initiates foraging and decides the foraging path, the distance covered, and the sleeping mound to rest at. A certain number of mongooses, usually a mix of males and females, act as babysitters. They stay with the young until the group returns at midday or in the evening, and the babysitters are then exchanged so that they can forage with the group (exploitation phase). Dwarf mongooses do not build nests for their young; they move them from one sleeping mound to another and do not return to previously visited foraging sites. The dwarf mongoose adopts a semi-nomadic way of life over a territory large enough to support the whole group (exploration phase). This nomadic behavior prevents over-exploitation of any particular area, and it also ensures that the whole territory is explored, because previously visited sleeping mounds are not returned to.
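To make the two phases concrete, the core update rules can be summarized as follows. This is a brief summary read off the code listing in Section 04 below; the notation follows the code and may differ slightly from the paper. For the alpha-group foraging step,

$$X_{\text{new}} = X_i + \phi \odot (X_i - X_k), \qquad \phi = \frac{peep}{2}\,U(-1,1),$$

where $X_k$ is a randomly chosen family member and $U(-1,1)$ is a uniform random vector. Each candidate also yields a sleeping-mound value and its family average,

$$sm_i = \frac{f(X_{\text{new}}) - f(X_i)}{\max\{f(X_{\text{new}}),\ f(X_i)\}}, \qquad \tau = \frac{1}{n}\sum_{i=1}^{n} sm_i,$$

and in the scout (migration) step the family moves toward or away from the movement vector $M$ depending on how the new average compares with the previous one:

$$X_{\text{new}} = \begin{cases} X_i - CF\,\phi\, r\,(X_i - M), & \tau_{\text{new}} > \tau,\\ X_i + CF\,\phi\, r\,(X_i - M), & \text{otherwise},\end{cases} \qquad CF = \left(1 - \frac{it}{MaxIt}\right)^{2\,it/MaxIt},$$

with $r \sim U(0,1)$.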
02. Algorithm Workflow
03. Algorithm Comparison Figures from the Paper
04. Partial Code
function [BEF,BEP,BestCost]=DMOA(nPop,MaxIt,VarMin,VarMax,nVar,F_obj)
%nVar=5; % Number of Decision Variables
VarSize=[1 nVar]; % Decision Variables Matrix Size
%VarMin=-10; % Decision Variables Lower Bound
%VarMax= 10; % Decision Variables Upper Bound
%% DMOA Settings
% MaxIt=1000; % Maximum Number of Iterations
% nPop=100; % Population Size (Family Size)
nBabysitter= 3; % Number of babysitters
nAlphaGroup=nPop-nBabysitter; % Number of Alpha group
nScout=nAlphaGroup; % Number of Scouts
L=round(0.6*nVar*nBabysitter); % Babysitter Exchange Parameter
peep=2; % Alpha female's vocalization
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Empty Mongoose Structure
empty_mongoose.Position=[];
empty_mongoose.Cost=[];
% Initialize Population Array
pop=repmat(empty_mongoose,nAlphaGroup,1);
% Initialize Best Solution Ever Found
BestSol.Cost=inf;
tau=inf; % Average sleeping-mound value from the previous migration check
Iter=1;
sm=inf(nAlphaGroup,1); % Sleeping-mound values of the scout group
% Create Initial Population
for i=1:nAlphaGroup
pop(i).Position=unifrnd(VarMin,VarMax,VarSize);
pop(i).Cost=F_obj(pop(i).Position);
if pop(i).Cost<=BestSol.Cost
BestSol=pop(i);
end
end
% Abandonment Counter
C=zeros(nAlphaGroup,1);
CF=(1-Iter/MaxIt)^(2*Iter/MaxIt); % Iteration-dependent movement factor used in the scout migration step
% Array to Hold Best Cost Values
BestCost=zeros(MaxIt,1);
%% DMOA Main Loop
for it=1:MaxIt
% Alpha group
F=zeros(nAlphaGroup,1);
MeanCost = mean([pop.Cost]);
for i=1:nAlphaGroup
% Calculate Fitness Values and Selection of Alpha
F(i) = exp(-pop(i).Cost/MeanCost); % Convert Cost to Fitness
end
P=F/sum(F);
% Foraging led by Alpha female
for m=1:nAlphaGroup
% Select Alpha female
i=RouletteWheelSelection(P);
% Choose k randomly, not equal to Alpha
K=[1:i-1 i+1:nAlphaGroup];
k=K(randi([1 numel(K)]));
% Define Vocalization Coeff.
phi=(peep/2)*unifrnd(-1,+1,VarSize);
% New Mongoose Position
newpop.Position=pop(i).Position+phi.*(pop(i).Position-pop(k).Position);
% Evaluation
newpop.Cost=F_obj(newpop.Position);
% Comparison
if newpop.Cost<=pop(i).Cost
pop(i)=newpop;
else
C(i)=C(i)+1;
end
end
% Scout group
for i=1:nScout
% Choose k randomly, not equal to i
K=[1:i-1 i+1:nAlphaGroup];
k=K(randi([1 numel(K)]));
% Define Vocalization Coeff.
phi=(peep/2)*unifrnd(-1,+1,VarSize);
% New Mongoose Position
newpop.Position=pop(i).Position+phi.*(pop(i).Position-pop(k).Position);
% Evaluation
newpop.Cost=F_obj(newpop.Position);
% Sleeping mound
sm(i)=(newpop.Cost-pop(i).Cost)/max(newpop.Cost,pop(i).Cost);
% Comparison
if newpop.Cost<=pop(i).Cost
pop(i)=newpop;
else
C(i)=C(i)+1;
end
end
% Babysitters
for i=1:nBabysitter
if C(i)>=L
pop(i).Position=unifrnd(VarMin,VarMax,VarSize);
pop(i).Cost=F_obj(pop(i).Position);
C(i)=0;
end
end
% Update Best Solution Ever Found
for i=1:nAlphaGroup
if pop(i).Cost<=BestSol.Cost
BestSol=pop(i);
end
end
% Next Mongoose Position
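% (Scout/migration step: a candidate position toward or away from the movement
%  vector M is computed, depending on how the average sleeping-mound value
%  newtau compares with the previous value tau.)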
newtau=mean(sm);
for i=1:nScout
M=(pop(i).Position.*sm(i))/pop(i).Position; % Movement vector determining migration to the next sleeping mound
if newtau>tau
newpop.Position=pop(i).Position-CF*phi*rand.*(pop(i).Position-M);
else
newpop.Position=pop(i).Position+CF*phi*rand.*(pop(i).Position-M);
end
tau=newtau;
end
% Update Best Solution Ever Found
for i=1:nAlphaGroup
if pop(i).Cost<=BestSol.Cost
BestSol=pop(i);
end
end
% Store Best Cost Ever Found
BestCost(it)=BestSol.Cost;
BEF=BestSol.Cost;
BEP=BestSol.Position;
% Display Iteration Information
disp(['Iteration ' num2str(it) ': Best Cost = ' num2str(BestCost(it))]);
end
end
function i=RouletteWheelSelection(P)
r=rand;
C=cumsum(P);
i=find(r<=C,1,'first');
end
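For reference, a minimal call to the function above might look like the following; the sphere objective, bounds, population size, and iteration count are illustrative placeholders only, not settings taken from the paper.
% Illustrative driver script (all settings below are placeholder assumptions)
F_obj = @(x) sum(x.^2); % test objective: sphere function
nPop = 50; % population size (family size)
MaxIt = 500; % maximum number of iterations
nVar = 10; % number of decision variables
VarMin = -10; % lower bound of the decision variables
VarMax = 10; % upper bound of the decision variables
[BEF,BEP,BestCost] = DMOA(nPop,MaxIt,VarMin,VarMax,nVar,F_obj);
figure; semilogy(BestCost,'LineWidth',1.5); grid on;
xlabel('Iteration'); ylabel('Best Cost'); title('DMOA Convergence');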
05. Result Figures of This Code
To get the code, follow the personal WeChat official account "MATLAB科研小白" (see the QR code at the bottom of this article) and reply with the keyword "智能优化算法" (intelligent optimization algorithm). This account is dedicated to solving two problems: code that is hard to find and code that is daunting to write. If you urgently need any particular code, feel free to leave a message in the backend. Posts on research tips are published from time to time, and you are welcome to discuss research, writing, literature, code, and other academic topics with us so that we can make progress together.