We often get messages asking:
Can the improved algorithms from this account be used in papers?
The answer is: of course! And no citation is required!
If these articles help you write your papers, that is the author's pleasure.
The Cuckoo Search algorithm is a classic optimization algorithm, and researchers are still proposing improvements to it today. This post presents an improvement developed by 小淘: the Cauchy-Adaptive-Levy Cuckoo Search (CALCS), which combines Cauchy mutation with an adaptive Levy flight. As with the previously improved algorithms on this account, it does not increase the complexity of the original algorithm, so its running time does not shoot up the way it does for variants that add opposition-based learning, greedy strategies, and similar mechanisms.
Principle Explanation
Basic Cuckoo Search algorithm:
The basic Cuckoo Search algorithm (CS) finds the optimum by simulating the way cuckoos search for nests and lay their eggs. The procedure is as follows:
1) Set the population size, the nest-abandonment probability, the problem bounds, and the maximum number of iterations.
2) Randomly generate a population within the problem domain, evaluate the objective function of every individual, and record the current best value and best solution.
3) Check whether the iteration counter has reached the maximum number of iterations. If so, exit the loop and output the best value and best solution; otherwise go to step 4).
4) Update the nests according to the following equation:
nest_i^(t+1) = nest_i^t + α · step ⊗ (nest_i^t − best)
where nest_i^(t+1) and nest_i^t are the positions of nest i at iterations t+1 and t; α = 0.01R with R ~ N(0,1); best is the best position among the current nests; and step is the step length generated by the Levy flight:
step = φ · μ / |ν|^(1/β)
where β = 1.5; ν, μ ~ N(0,1); and φ is computed as:
φ = [ Γ(1+β) · sin(πβ/2) / ( Γ((1+β)/2) · β · 2^((β−1)/2) ) ]^(1/β)
5) Evaluate the objective function of the new nests. If a new nest is better than its predecessor, replace the function value and the corresponding nest, and record the best solution.
6) Generate a random number in [0,1] and compare it with the abandonment probability. If the random number is smaller, keep the nest; otherwise generate a new nest according to:
nest_i^(t+1) = nest_i^t + r · (nest_a^t − nest_c^t)
where a and c are distinct random integers at iteration t, and r is a random number in [0,1].
7) Re-evaluate the objective function of the new nests. If a new nest is better than its predecessor, replace the function value and the corresponding nest, and record the best solution.
8) Compare the function value with the best value; if it is smaller, replace the best value and best solution. Return to step 3) for the next iteration.
Improved Cuckoo Search algorithm:
Improvement 1: on top of the basic Cuckoo Search algorithm, use a Tent chaotic map to initialize the population and increase its diversity (a sketch is given below).
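The post does not give the exact Tent formulation used, so the sketch below assumes the classic Tent map z → 2z (z < 0.5), 2(1−z) (z ≥ 0.5); the helper name tent_initialization is illustrative and would stand in for the call to initialization in the CS code further down.

```matlab
% Sketch of improvement 1: Tent chaotic initialization of the population.
% The classic Tent map is assumed here; the author's exact formulation may differ.
function nest = tent_initialization(n, nd, Ub, Lb)
% n: population size, nd: dimension, Ub/Lb: 1-by-nd bound vectors
z = rand(1, nd);                                   % random seed of the chaotic sequence
nest = zeros(n, nd);
for i = 1:n
    % Tent map: z -> 2z when z < 0.5, 2(1-z) otherwise
    z = (z < 0.5).*(2*z) + (z >= 0.5).*(2*(1 - z));
    nest(i, :) = Lb + z.*(Ub - Lb);                % map the chaotic values into [Lb, Ub]
end
end
```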
Improvement 2: use a Levy flight with an adaptive step size in the nest-update equation.
To address the slow convergence and low accuracy of the standard Cuckoo Search algorithm, the global search uses an adaptive Levy flight mechanism: the Levy step size decreases as the iterations proceed. The improved algorithm has a large step factor in the early stage, which enlarges the search space and strengthens global exploration; as the search continues, the step size shrinks and local search improves. The fixed factor 0.01 of the standard algorithm is replaced by α = 0.5*exp(-t/tmax). Many similar schedules exist online, so feel free to substitute your own formula. A sketch of the modified update follows this paragraph.
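The routine below is the standard get_cuckoos step from the CS code shown later, with only the step factor changed from the fixed 0.01 to 0.5*exp(-t/tmax). Passing the current iteration t and the iteration limit tmax into the function is an assumed interface, not necessarily the author's.

```matlab
% Sketch of improvement 2: Levy-flight nest update with an adaptive step factor.
% Identical to get_cuckoos below except that 0.01 is replaced by 0.5*exp(-t/tmax).
function nest = get_cuckoos_adaptive(nest, best, Lb, Ub, t, tmax)
beta = 3/2;
sigma = (gamma(1+beta)*sin(pi*beta/2)/(gamma((1+beta)/2)*beta*2^((beta-1)/2)))^(1/beta);
alpha = 0.5*exp(-t/tmax);                   % large in early iterations (global search), small later (local search)
for j = 1:size(nest,1)
    s = nest(j,:);
    u = randn(size(s))*sigma;               % Mantegna's algorithm for the Levy step
    v = randn(size(s));
    step = u./abs(v).^(1/beta);
    stepsize = alpha*step.*(s - best);      % standard CS uses 0.01*step.*(s-best)
    s = s + stepsize.*randn(size(s));
    nest(j,:) = simplebounds(s, Lb, Ub);    % simplebounds as defined in the CS code below
end
end
```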
Improvement 3: before the random walk, update the nest positions with a Cauchy mutation. The Cauchy distribution is a continuous probability distribution similar to the standard normal distribution, but its peak at the origin is lower, its tails are longer and flatter, and it approaches zero more slowly, so it produces larger perturbations than the previous random-walk strategy. Applying Cauchy mutation to the nest positions therefore widens the search of the Cuckoo algorithm and improves its ability to escape local optima (a sketch is given below).
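The exact Cauchy-mutation formula is not shown in this post, so the sketch below uses one common form, x_new = x + x·Cauchy(0,1), with standard Cauchy numbers generated through the inverse CDF tan(π(U − 1/2)); the helper name cauchy_mutation is illustrative. In the CS loop it would be applied to the nests just before empty_nests is called.

```matlab
% Sketch of improvement 3: Cauchy-mutation perturbation of the nests.
% One common formulation is assumed, not necessarily the author's exact update.
function new_nest = cauchy_mutation(nest, Lb, Ub)
% standard Cauchy random numbers via the inverse CDF: tan(pi*(U - 1/2)), U ~ Uniform(0,1)
C = tan(pi*(rand(size(nest)) - 0.5));
new_nest = nest + nest.*C;                              % heavy-tailed perturbation around each nest
for j = 1:size(new_nest,1)
    new_nest(j,:) = simplebounds(new_nest(j,:), Lb, Ub); % clamp to the search bounds
end
end
```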
Results
The algorithms were tested on the CEC2005 function set. In the results below, CALCS is the improved Cuckoo Search algorithm proposed in this post, CS is the original Cuckoo Search algorithm, GWO is the Grey Wolf Optimizer, and PSO is Particle Swarm Optimization.
Each algorithm was run for 1000 iterations with a population size of 30.
[Figures: function landscapes and convergence curves of CS, GWO, PSO, and CALCS on the CEC2005 test functions.]
Result analysis: on both the unimodal and multimodal test functions, the Cuckoo Search algorithm fused with Cauchy mutation and adaptive Levy flight performs very well.
Code
%%
clear
clc
close all
number='F1';   % select the test function to optimize, F1~F23
[lower_bound,upper_bound,variables_no,fobj]=CEC2005(number); % [lb,ub,D,y]: lower bound, upper bound, dimension, objective function handle
pop_size=30;   % population members
max_iter=1000; % maximum number of iterations
%% GWO
[fMin , bestX, GWO_convergence_curve] = GWO(pop_size,max_iter,lower_bound,upper_bound,variables_no,fobj);
display(['The best optimal value of the objective function found by GWO for ' [num2str(number)],' is : ', num2str(fMin)]);
fprintf('Best solution obtained by GWO: %s\n', num2str(bestX,'%e '));

%% PSO
[Best_pos,Best_score, PSO_convergence_curve] = PSO(pop_size,max_iter,lower_bound,upper_bound,variables_no,fobj); % Call PSO
fprintf('Best solution obtained by PSO: %s\n', num2str(Best_score,'%e '));
display(['The best optimal value of the objective function found by PSO for ' [num2str(number)],' is : ', num2str(Best_pos)]);

%% CS
[CS_Score,CSbestx,CS_convergence_curve] = CS(pop_size,max_iter,lower_bound,upper_bound,variables_no,fobj);
display(['The best optimal value of the objective function found by CS for ' [num2str(number)],' is : ', num2str(CSbestx)]);
fprintf('Best solution obtained by CS: %s\n', num2str(CS_Score,'%e '));

%% CALCS
[CALCS_Score,CALCSbestx,CALCS_convergence_curve] = CALCS(pop_size,max_iter,lower_bound,upper_bound,variables_no,fobj);
display(['The best optimal value of the objective function found by CALCS for ' [num2str(number)],' is : ', num2str(CALCSbestx)]);
fprintf('Best solution obtained by CALCS: %s\n', num2str(CALCS_Score,'%e '));
%% Figure
figure1 = figure('Color',[1 1 1]);
G1 = subplot(1,2,1,'Parent',figure1);
func_plot(number)   % surface plot of the selected test function
title(number)
xlabel('x')
ylabel('y')
zlabel('z')
G2 = subplot(1,2,2,'Parent',figure1);
CNT = 20;
k = round(linspace(1,max_iter,CNT)); % sample CNT points along the convergence curves
% Note: if the markers on the convergence curves look too sparse, increase CNT
% (e.g. 100, 200, 300, ...); if they look too dense, decrease CNT.
iter = 1:1:max_iter;
if ~strcmp(number,'F16') && ~strcmp(number,'F9') && ~strcmp(number,'F11')
    % F9, F11 and F16 converge too quickly for a log-scale axis, so they are drawn with plot instead of semilogy
    semilogy(iter(k),CS_convergence_curve(k),'m-^','linewidth',1);
    hold on
    semilogy(iter(k),GWO_convergence_curve(k),'b-*','linewidth',1);
    hold on
    semilogy(iter(k),PSO_convergence_curve(k),'r-o','linewidth',1);
    hold on
    semilogy(iter(k),CALCS_convergence_curve(k),'g-p','linewidth',1);
else
    plot(iter(k),CS_convergence_curve(k),'m-^','linewidth',1);
    hold on
    plot(iter(k),GWO_convergence_curve(k),'b-*','linewidth',1);
    hold on
    plot(iter(k),PSO_convergence_curve(k),'r-o','linewidth',1);
    hold on
    plot(iter(k),CALCS_convergence_curve(k),'g-p','linewidth',1);
end
grid on;
title('Convergence curve')
xlabel('Iteration')
ylabel('Fitness value')
box on
legend('CS','GWO','PSO','CALCS')
set(gcf,'position',[300,300,800,330])
Basic Cuckoo Search algorithm:
function [bestnest,fmin,lhy]=CS(n,N_IterTotal,lb,ub,nd,fobj)
pa=0.25; % Discovery rate of alien eggs/solutions
%% Simple bounds of the search domain
Lb=lb.*ones(1,nd); % Lower bounds
Ub=ub.*ones(1,nd); % Upper bounds
% Random initial solutions
nest=initialization(n,nd,Ub,Lb);
% Get the current best of the initial population
fitness=10^10*ones(n,1);
[fmin,bestnest,nest,fitness]=get_best_nest(nest,nest,fitness,fobj);
%% Starting iterations
for iter=1:N_IterTotal
    % Generate new solutions (but keep the current best)
    new_nest=get_cuckoos(nest,bestnest,Lb,Ub);
    [fnew,best,nest,fitness]=get_best_nest(nest,new_nest,fitness,fobj);
    % Discovery and randomization
    new_nest=empty_nests(nest,Lb,Ub,pa);
    % Evaluate this set of solutions
    [fnew,best,nest,fitness]=get_best_nest(nest,new_nest,fitness,fobj);
    % Find the best objective so far
    if fnew<fmin
        fmin=fnew;
        bestnest=best;
    end
    lhy(iter)=fmin; % record the convergence curve
end %% End of iterations

%% --------------- All subfunctions are listed below ------------------
%% Get cuckoos by random walk
function nest=get_cuckoos(nest,best,Lb,Ub)
% Levy flights
n=size(nest,1);
% For details about Levy flights, please read Chapter 3 of the book:
% X. S. Yang, Nature-Inspired Optimization Algorithms, Elsevier, (2014).
beta=3/2;
sigma=(gamma(1+beta)*sin(pi*beta/2)/(gamma((1+beta)/2)*beta*2^((beta-1)/2)))^(1/beta);
for j=1:n
    s=nest(j,:);
    % This is a simple way of implementing Levy flights
    % For standard random walks, use step=1;
    %% Levy flights by Mantegna's algorithm
    u=randn(size(s))*sigma;
    v=randn(size(s));
    step=u./abs(v).^(1/beta);
    % In the next equation, the difference factor (s-best) means that
    % when the solution is the best solution, it remains unchanged.
    stepsize=0.01*step.*(s-best);
    % Here the factor 0.01 comes from the fact that L/100 should be the
    % typical step size for walks/flights where L is the problem scale;
    % otherwise, Levy flights may become too aggressive/efficient,
    % which makes new solutions (even) jump outside of the design domain
    % (and thus waste evaluations).
    % Now the actual random walks or flights
    s=s+stepsize.*randn(size(s));
    % Apply simple bounds/limits
    nest(j,:)=simplebounds(s,Lb,Ub);
end
%% Find the current best solution/nest among the population
function [fmin,best,nest,fitness]=get_best_nest(nest,newnest,fitness,fobj)
% Evaluating all new solutions
for j=1:size(nest,1)
    fnew=fobj(newnest(j,:));
    if fnew<=fitness(j)
        fitness(j)=fnew;
        nest(j,:)=newnest(j,:);
    end
end
% Find the current best
[fmin,K]=min(fitness);
best=nest(K,:);
%% Replace some not-so-good nests by constructing new solutions/nests
function new_nest=empty_nests(nest,Lb,Ub,pa)
% A fraction of worse nests are discovered with a probability pa
n=size(nest,1);
% Discovered or not -- a status vector
K=rand(size(nest))>pa;
% Notes: In the real world, if a cuckoo's egg is very similar to
% a host's eggs, then this cuckoo's egg is less likely to be discovered.
% So the fitness should be related to the difference in solutions.
% Therefore, it is a good idea to do a random walk in a biased way
% with some random step sizes.
%% New solution by biased/selective random walks
stepsize=rand*(nest(randperm(n),:)-nest(randperm(n),:));
new_nest=nest+stepsize.*K;
for j=1:size(new_nest,1)
    s=new_nest(j,:);
    new_nest(j,:)=simplebounds(s,Lb,Ub);
end
%% Application of simple bounds/constraints
function s=simplebounds(s,Lb,Ub)
% Apply the lower bounds
ns_tmp=s;
I=ns_tmp<Lb;
ns_tmp(I)=Lb(I);
% Apply the upper bounds
J=ns_tmp>Ub;
ns_tmp(J)=Ub(J);
% Update this new move
s=ns_tmp;
%% You can replace the following objective function
%% by your own functions (also update the Lb and Ub)