Paper topic: SCA is a novel algorithm for solving single-objective optimization problems
This algorithm was proposed by the Australia-based scholar Seyedali Mirjalili. He is remarkably prolific in this family of optimization algorithms: the Whale Optimization Algorithm, the Multi-Verse Optimizer, and this Sine Cosine Algorithm were all proposed by him. My impression is that most algorithms of this type split the search into two parts: local exploitation and global exploration. The two complement yet conflict with each other. Global exploration quickly locates the region containing the optimum, while local exploitation refines the search within that region, and the two must reach a dynamic balance. With too much global exploration, the run becomes very slow, especially in software like MATLAB that does not handle loops well; with too much local exploitation, the search easily gets stuck in a local optimum. Of course, in practical engineering the global optimum may genuinely be hard to find, especially for non-convex problems, and a local optimum is often good enough.
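In SCA this dynamic balance is realized by a single control parameter, r1, which decays linearly over the iterations (Eq. 3.4 in the code below). The following is a small illustration of my own, not part of Mirjalili's code, showing how that decay shifts the search from exploration to exploitation:

a = 2; Max_iteration = 500;
t  = 1:Max_iteration;
r1 = a - t*(a/Max_iteration);   % Eq. (3.4): decays linearly from a toward 0
plot(t, r1), xlabel('iteration'), ylabel('r1')
% Roughly speaking: while r1 > 1 the step r1*sin(r2)*|...| can carry a candidate
% beyond the current best solution (global exploration); once r1 < 1 the steps
% shrink and candidates refine the region around the best (local exploitation).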
Citation:
Seyedali Mirjalili (2022). SCA: A Sine Cosine Algorithm (https://www.mathworks.com/matlabcentral/fileexchange/54948-sca-a-sine-cosine-algorithm), MATLAB Central File Exchange. Retrieved July 28, 2022.
Without further ado, here is the algorithm source code:
% Sine Cosine Algorithm (SCA)
%
% Source codes demo version 1.0
%
% Developed in MATLAB R2011b(7.13)
%
% Author and programmer: Seyedali Mirjalili
%
% e-Mail: ali.mirjalili@gmail.com
%         seyedali.mirjalili@griffithuni.edu.au
%
% Homepage: http://www.alimirjalili.com
%
% Main paper:
% S. Mirjalili, SCA: A Sine Cosine Algorithm for solving optimization problems
% Knowledge-Based Systems, DOI: http://dx.doi.org/10.1016/j.knosys.2015.12.022
%_______________________________________________________________________________________________
% You can simply define your cost function in a separate file and load its handle to fobj
% The initial parameters that you need are:
%__________________________________________
% fobj = @YourCostFunction
% dim = number of your variables
% Max_iteration = maximum number of iterations
% SearchAgents_no = number of search agents
% lb=[lb1,lb2,...,lbn] where lbn is the lower bound of variable n
% ub=[ub1,ub2,...,ubn] where ubn is the upper bound of variable n
% If all the variables have equal lower bound you can just
% define lb and ub as two single numbers

% To run SCA: [Best_score,Best_pos,cg_curve]=SCA(SearchAgents_no,Max_iteration,lb,ub,dim,fobj)
%______________________________________________________________________________________________


function [Destination_fitness,Destination_position,Convergence_curve]=SCA(N,Max_iteration,lb,ub,dim,fobj)

display('SCA is optimizing your problem');

% Initialize the set of random solutions
X=initialization(N,dim,ub,lb);

Destination_position=zeros(1,dim);
Destination_fitness=inf;

Convergence_curve=zeros(1,Max_iteration);
Objective_values=zeros(1,size(X,1));

% Calculate the fitness of the first set and find the best one
for i=1:size(X,1)
    Objective_values(1,i)=fobj(X(i,:));
    if i==1
        Destination_position=X(i,:);
        Destination_fitness=Objective_values(1,i);
    elseif Objective_values(1,i)<Destination_fitness
        Destination_position=X(i,:);
        Destination_fitness=Objective_values(1,i);
    end

    All_objective_values(1,i)=Objective_values(1,i);
end

% Main loop
t=2; % start from the second iteration since the first iteration was dedicated to calculating the fitness
while t<=Max_iteration

    % Eq. (3.4)
    a=2;
    r1=a-t*((a)/Max_iteration); % r1 decreases linearly from a to 0

    % Update the position of solutions with respect to destination
    for i=1:size(X,1) % in i-th solution
        for j=1:size(X,2) % in j-th dimension

            % Update r2, r3, and r4 for Eq. (3.3)
            r2=(2*pi)*rand();
            r3=2*rand();
            r4=rand();

            % Eq. (3.3)
            if r4<0.5
                % Eq. (3.1)
                X(i,j)=X(i,j)+(r1*sin(r2)*abs(r3*Destination_position(j)-X(i,j)));
            else
                % Eq. (3.2)
                X(i,j)=X(i,j)+(r1*cos(r2)*abs(r3*Destination_position(j)-X(i,j)));
            end

        end
    end

    for i=1:size(X,1)

        % Check if solutions go outside the search space and bring them back
        Flag4ub=X(i,:)>ub;
        Flag4lb=X(i,:)<lb;
        X(i,:)=(X(i,:).*(~(Flag4ub+Flag4lb)))+ub.*Flag4ub+lb.*Flag4lb;

        % Calculate the objective values
        Objective_values(1,i)=fobj(X(i,:));

        % Update the destination if there is a better solution
        if Objective_values(1,i)<Destination_fitness
            Destination_position=X(i,:);
            Destination_fitness=Objective_values(1,i);
        end
    end

    Convergence_curve(t)=Destination_fitness;

    % Display the iteration and best optimum obtained so far
    if mod(t,50)==0
        display(['At iteration ', num2str(t), ' the optimum is ', num2str(Destination_fitness)]);
    end

    % Increase the iteration counter
    t=t+1;
end
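The listing above calls initialization(N,dim,ub,lb), which ships as a separate file in the same File Exchange package and is not reproduced in this post. A minimal re-implementation consistent with how SCA uses it (uniform random points inside [lb, ub], supporting both scalar and per-variable bounds) could look like this:

% initialization.m -- minimal sketch of the helper used by SCA above
function X = initialization(N, dim, ub, lb)
    if isscalar(ub)                      % same bounds for every variable
        X = rand(N, dim) .* (ub - lb) + lb;
    else                                 % per-variable bounds lb(j), ub(j)
        X = zeros(N, dim);
        for j = 1:dim
            X(:, j) = rand(N, 1) .* (ub(j) - lb(j)) + lb(j);
        end
    end
end

As a usage example (my own test setup, not from the original post), here is how the parameters described in the header comments can be filled in to minimize the 10-dimensional sphere function:

fobj = @(x) sum(x.^2);       % cost function handle
dim  = 10;                   % number of variables
lb   = -100;                 % lower bound (scalar: same for all variables)
ub   = 100;                  % upper bound
SearchAgents_no = 30;        % number of search agents
Max_iteration   = 500;       % iteration budget

[Best_score, Best_pos, cg_curve] = SCA(SearchAgents_no, Max_iteration, lb, ub, dim, fobj);
plot(cg_curve), xlabel('iteration'), ylabel('best fitness so far')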
The SCA creates multiple initial random candidate solutions and requires them to fluctuate outwards or towards the best solution using a mathematical model based on sine and cosine functions. Several random and adaptive variables are also integrated into the algorithm to emphasize exploration and exploitation of the search space at different milestones of the optimization.
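Written out, the per-dimension position update implemented in the code above (Eqs. 3.1-3.3 of the paper) is

$$
X_{i,j}^{t+1}=
\begin{cases}
X_{i,j}^{t}+r_1\,\sin(r_2)\,\bigl|r_3\,P_j^{t}-X_{i,j}^{t}\bigr|, & r_4<0.5\\[4pt]
X_{i,j}^{t}+r_1\,\cos(r_2)\,\bigl|r_3\,P_j^{t}-X_{i,j}^{t}\bigr|, & r_4\ge 0.5
\end{cases}
$$

where $P^{t}$ is the destination (best solution found so far), $r_1=a-t\,a/T$ with $a=2$ and $T$ the maximum number of iterations (Eq. 3.4), and $r_2\sim U(0,2\pi)$, $r_3\sim U(0,2)$, $r_4\sim U(0,1)$ are redrawn for every solution and dimension.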
This is the source code of the paper:
S. Mirjalili, SCA: A Sine Cosine Algorithm for solving optimization problems, Knowledge-Based Systems, in press, 2015, DOI: http://dx.doi.org/10.1016/j.knosys.2015.12.022
Link: http://www.sciencedirect.com/science/article/pii/S0950705115005043
A MATLAB toolbox for this algorithm ("Sine Cosine Algorithm Toolbox") is also available on the MATLAB Central File Exchange.
If you have no access to the paper, please drop me an email at ali.mirjalili@gmail.com and I will send you the paper.
All of the source codes and extra information as well as more optimization techniques can be found in my personal website at http://www.alimirjalili.com
Original article: https://blog.csdn.net/Vertira/article/details/126031198