40+ Intelligent Optimization Algorithms, Including PSO, GA, DE, ACO, GSA, and More (Matlab Code Implementation)


💥💥💥💞💞💞Welcome to this blog!❤️❤️❤️💥💥💥



🏆About the blogger: 🌞🌞🌞Posts here aim to be carefully reasoned and clearly structured, for the reader's convenience.



⛳️Motto: On a journey of a hundred miles, ninety is but the halfway point.


Contents

💥1 Overview

📚2 Results

🌈3 Matlab Code Implementation


💥1 Overview

This post collects 40 optimization algorithms (implemented as wrapper feature-selection methods), then takes three of them as examples for a visual comparison; any of the others can be run simply by changing the method name, as shown in the sketch after the list below. First, the full list of algorithms:

%---List of available wrapper FS methods------------------------------

% 2020
'mpa' : Marine Predators Algorithm;
'gndo' : Generalized Normal Distribution Optimization;
'sma' : Slime Mould Algorithm;
'eo' : Equilibrium Optimizer;
'mrfo' : Manta Ray Foraging Optimization;
% 2019
'aso' : Atom Search Optimization;
'hho' : Harris Hawks Optimization;
'hgso' : Henry Gas Solubility Optimization;
'pfa' : Path Finder Algorithm;
'pro' : Poor And Rich Optimization;
% 2018
'boa' : Butterfly Optimization Algorithm;
'epo' : Emperor Penguin Optimizer;
'tga' : Tree Growth Algorithm;
% 2017
'abo' : Artificial Butterfly Optimization;
'ssa' : Salp Swarm Algorithm;
'sbo' : Satin Bower Bird Optimization;
'wsa' : Weighted Superposition Attraction;
% 2016
'ja' : Jaya Algorithm;
'csa' : Crow Search Algorithm;
'sca' : Sine Cosine Algorithm;
'woa' : Whale Optimization Algorithm;
% 2015
'alo' : Ant Lion Optimizer;
'hlo' : Human Learning Optimization;
'mbo' : Monarch Butterfly Optimization;
'mfo' : Moth Flame Optimization;
'mvo' : Multi Verse Optimizer;
'tsa' : Tree Seed Algorithm;
% 2014
'gwo' : Grey Wolf Optimizer;
'sos' : Symbiotic Organisms Search;
% 2012
'fpa' : Flower Pollination Algorithm;
'foa' : Fruit Fly Optimization Algorithm;
% 2009 - 2010
'ba' : Bat Algorithm;
'fa' : Firefly Algorithm;
'cs' : Cuckoo Search Algorithm;
'gsa' : Gravitational Search Algorithm;
% Traditional
'abc' : Artificial Bee Colony;
'hs' : Harmony Search;
'de' : Differential Evolution;
'aco' : Ant Colony Optimization;
'acs' : Ant Colony System;
'pso' : Particle Swarm Optimization;
'gat' : Genetic Algorithm (Tournament);
'ga' : Genetic Algorithm (Roulette Wheel);
'sa' : Simulated Annealing;
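
Since every method shares the same jfs interface, switching algorithms only requires changing the name string. Below is a minimal sketch of that idea (it assumes the toolbox's jfs.m and the ionosphere.mat dataset used in the examples later in this post; the loop itself is illustrative, not part of the toolbox):

%% Sketch: run several methods through the one jfs interface
clear, clc;
load ionosphere.mat;                         % provides feat and label
opts.k = 5;                                  % k for the k-NN fitness evaluation
opts.N = 10;                                 % population size (all methods)
opts.T = 100;                                % maximum iterations (all methods)
opts.Model = cvpartition(label,'HoldOut',0.2);
methods = {'pso','gwo','woa','sma'};         % any names from the list above
for i = 1:numel(methods)
    FS = jfs(methods{i},feat,label,opts);    % same call, different name
    fprintf('%-4s : %d features selected\n', methods{i}, FS.nf);
end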

📚2 Results

Three of the simpler algorithms are then taken as examples for a visual comparison:

%---Usage-------------------------------------------------------------
% If you wish to use 'PSO' (see example 1) then you write
% FS = jfs('pso',feat,label,opts);

% If you want to use 'SMA' (see example 2) then you write
% FS = jfs('sma',feat,label,opts);

% * Each method has its own calling name (refer to List_Method.m)


%---Input-------------------------------------------------------------
% feat : Feature vector matrix (Instances x Features)
% label : Label matrix (Instances x 1)
% opts : Parameter settings
% opts.N : Number of solutions / population size (* for all methods)
% opts.T : Maximum number of iterations (* for all methods)
% opts.k : Number of neighbors k in the k-nearest-neighbor classifier

% Some methods have method-specific parameters (e.g., PSO, GA, DE);
% if you do not set them, default values are used
% * you may open the corresponding m-file to view or change the parameters
% * you may use 'opts' to set a method's parameters (see example 1)
% * you may also modify jFitnessFunction.m


%---Output------------------------------------------------------------
% FS : Feature selection model (It contains several results)
% FS.sf : Index of selected features
% FS.ff : Selected features
% FS.nf : Number of selected features
% FS.c : Convergence curve
% Acc : Accuracy of validation model
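
For illustration, once any of the examples below has run, these documented outputs can be reported directly. A minimal sketch using only the fields listed above (FS.nf, FS.sf, and Acc):

% Report the documented outputs after a run of jfs/jknn
fprintf('Selected %d features: %s\n', FS.nf, mat2str(FS.sf));
fprintf('Validation accuracy : %.4f\n', Acc);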


%% Example 1: Particle Swarm Optimization (PSO)
clear, clc, close;
% Number of k in K-nearest neighbor
opts.k = 5;
% Ratio of validation data
ho = 0.2;
% Common parameter settings
opts.N = 10; % number of solutions
opts.T = 100; % maximum number of iterations
% Parameters of PSO
opts.c1 = 2;
opts.c2 = 2;
opts.w = 0.9;
% Load dataset
load ionosphere.mat;
% Divide data into training and validation sets
HO = cvpartition(label,'HoldOut',ho);
opts.Model = HO;
% Perform feature selection
FS = jfs('pso',feat,label,opts);
% Define index of selected features
sf_idx = FS.sf;
% Accuracy
Acc = jknn(feat(:,sf_idx),label,opts);
% Plot convergence
figure(1)
plot(FS.c); grid on;
xlabel('Number of Iterations');
ylabel('Fitness Value');
title('PSO');


%% Example 2: Slime Mould Algorithm (SMA)
%clear, clc, close;
% Number of k in K-nearest neighbor
opts.k = 5;
% Ratio of validation data
ho = 0.2;
% Common parameter settings
opts.N = 10; % number of solutions
opts.T = 100; % maximum number of iterations
% Load dataset
load ionosphere.mat;
% Divide data into training and validation sets
HO = cvpartition(label,'HoldOut',ho);
opts.Model = HO;
% Perform feature selection
FS = jfs('sma',feat,label,opts);
% Define index of selected features
sf_idx = FS.sf;
% Accuracy
Acc = jknn(feat(:,sf_idx),label,opts);
% Plot convergence
figure(2)
plot(FS.c); grid on;
xlabel('Number of Iterations');
ylabel('Fitness Value');
title('SMA');


%% Example 3: Whale Optimization Algorithm (WOA)
%clear, clc, close;
% Number of k in K-nearest neighbor
opts.k = 5;
% Ratio of validation data
ho = 0.2;
% Common parameter settings
opts.N = 10; % number of solutions
opts.T = 100; % maximum number of iterations
% Parameter of WOA
opts.b = 1;
% Load dataset
load ionosphere.mat;
% Divide data into training and validation sets
HO = cvpartition(label,'HoldOut',ho);
opts.Model = HO;
% Perform feature selection
FS = jfs('woa',feat,label,opts);
% Define index of selected features
sf_idx = FS.sf;
% Accuracy
Acc = jknn(feat(:,sf_idx),label,opts);
% Plot convergence
figure(3)
plot(FS.c); grid on;
xlabel('Number of Iterations');
ylabel('Fitness Value');
title('WOA');
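
The three examples above plot into separate figures. For a direct visual comparison, the convergence curves can also be overlaid on one axes. A minimal sketch, assuming each curve was saved before the next example overwrote FS (e.g., curves.pso = FS.c; after example 1, and likewise curves.sma and curves.woa; these names are illustrative):

%% Sketch: overlay the three convergence curves
figure(4)
plot(curves.pso); hold on;
plot(curves.sma);
plot(curves.woa); hold off;
grid on;
xlabel('Number of Iterations');
ylabel('Fitness Value');
legend('PSO','SMA','WOA');
title('Convergence Comparison: PSO vs SMA vs WOA');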

[Figures: convergence curves produced by the three examples above (PSO, SMA, WOA)]


🌈3 Matlab Code Implementation

