• Training a conventional autoencoder with PSO (Matlab implementation)


     💥💥💞💞Welcome to this blog❤️❤️💥💥

    🏆Blogger's strength: 🌞🌞🌞This blog strives for rigorous thinking and clear logic, for the reader's convenience.

    ⛳️Motto: in a journey of a hundred miles, ninety is only the halfway point.

    Contents

    💥1 Overview

    📚2 Results

    🎉3 References

    🌈4 Matlab implementation

    💥1 Overview

    This post trains a conventional autoencoder with particle swarm optimization (PSO), one of the best-known population-based stochastic search algorithms in optimization: each particle tracks its own best position and is pulled toward the swarm's global best. A detailed introduction to PSO is beyond the scope of this post.
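    For readers unfamiliar with PSO, the update rule described above can be sketched in a few lines. This is a minimal, generic global-best PSO in Python/numpy, not the `PSO` routine shipped with the Matlab code; the inertia and acceleration constants (`w`, `c1`, `c2`) are common textbook defaults chosen here for illustration.

```python
import numpy as np

def pso(f, dim, lb, ub, n_particles=20, n_iter=100,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over the box [lb, ub]^dim with a basic global-best PSO."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest = x.copy()                              # personal best positions
    pbest_f = np.apply_along_axis(f, 1, x)        # personal best values
    g = pbest[np.argmin(pbest_f)].copy()          # global best position
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # pull each particle toward its personal best and the global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lb, ub)                # keep particles in bounds
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# Minimize the sphere function; its optimum is 0 at the origin.
best, best_f = pso(lambda z: np.sum(z**2), dim=5, lb=-10.0, ub=10.0)
```

    In the autoencoder below, the "position" of each particle is the flattened weight matrix and the objective is the reconstruction error.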

    📚2 Results

    Partial code:

    clear all;
    clc;
    addpath('NEW_PSO','AE');
    %% data preparation
    original=imread('tu_pian.png');
    original=imresize(original,[150,90]);
    x=rgb2gray(original);
    Inputs=double(x);
    %% network initialization
    number_neurons=89;% number of hidden neurons
    LB=-10;           % lower bound of the weights
    UB=10;            % upper bound of the weights
    n=10;             % population size
    %% training process
    [net]=PSO_AE(Inputs,number_neurons,LB,UB,n);
    %% Illustration
    regenerated=net.code*pinv(net.B');
    subplot(121)
    imagesc(regenerated);
    colormap(gray);
    Tc=num2str(net.prefomance);
    Tc= ['RMSE = ' Tc];
    xlabel('regenerated image')
    title(Tc)
    subplot(122)
    plot(smooth(net.errors,52),'LineWidth',2);
    xlabel('iterations')
    ylabel('RMSE')
    title('loss function behavior')
    axis([0 length(net.errors) min(net.errors) max(net.errors)])
    grid
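    The companion function `PSO_AE` below first normalizes the input with `scaledata(Inputs,0,1)`, a helper whose source is not shown in the post. Assuming it performs an ordinary min-max rescaling of the whole array to [0, 1], a numpy equivalent looks like this:

```python
import numpy as np

def scaledata(x, lo=0.0, hi=1.0):
    # Min-max rescaling of the whole array to [lo, hi]; assumed to
    # mirror the behavior of the Matlab scaledata helper used by PSO_AE.
    x = np.asarray(x, dtype=float)
    span = x.max() - x.min()
    if span == 0:                      # constant input: map everything to lo
        return np.full_like(x, lo)
    return lo + (hi - lo) * (x - x.min()) / span

img = np.array([[0, 128], [255, 64]])  # grayscale pixel values
scaled = scaledata(img)                # all values now lie in [0, 1]
```

    Normalizing to [0, 1] keeps the reconstruction error on the same scale regardless of the input image's dynamic range.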

    function [net]=PSO_AE(Inputs,number_neurons,LB,UB,n)
    % PSO_AE: trains an autoencoder using a random-search tool (PSO).
    % Inputs: the training set.
    % number_neurons: number of neurons in the hidden layer.
    % LB: lower bound constraints for the weights.
    % UB: upper bound constraints for the weights.
    % n: population size (in PSO).
    % net: structure holding the main characteristics of the trained network.
    Inputs = scaledata(Inputs,0,1);% data normalization to [0,1]
    alpha=size(Inputs);
    % Initialize the PSO parameters
    m=number_neurons*alpha(2);
    LB=ones(1,m)*(LB);
    UB=ones(1,m)*(UB);
    % Solving Optimization problem based on random search
    [Fvalue,B,nb_iterations,fit_behavior]=PSO(m,n,LB,UB,Inputs,number_neurons);%
    % prepare the problem solution 
    B=reshape(B,number_neurons,alpha(2));
    % calculate Inputs_hat: unlike other networks, the AE uses the same weight
    % matrix B both as the input weights for coding and as the output weights
    % for decoding; the old input weights are no longer used.
    Hnew=Inputs*B';          % the hidden layer
    Inputs_hat=Hnew*pinv(B');% the estimated Input
    % store the network characteristics
    net.errors=fit_behavior;% loss-function behavior during training
    net.prefomance=sqrt(mse(Inputs-Inputs_hat));% training performance (RMSE)
    net.B=B;% the reconstruction weights
    net.code=Hnew;% the hidden-layer activations (the code)
    end 
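    The tied-weight decoding step above (`Hnew=Inputs*B'` followed by `Inputs_hat=Hnew*pinv(B')`) can be checked in a few lines of numpy. The matrix shapes match the script's settings (a 150×90 grayscale image, 89 hidden neurons); `B` here is random rather than PSO-trained, so this only verifies the linear-algebra pipeline, not the training quality.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((150, 90))             # normalized inputs, as in PSO_AE
B = rng.standard_normal((89, 90))     # hidden weights (found by PSO in the post)

H = X @ B.T                           # hidden layer: Hnew = Inputs*B'
X_hat = H @ np.linalg.pinv(B.T)       # decode with the same (tied) weights
rmse = np.sqrt(np.mean((X - X_hat) ** 2))
```

    Because `B.T` has 89 columns, `B.T @ pinv(B.T)` is a rank-89 projection of the 90-dimensional rows, so the reconstruction is close but not exact; PSO's job is to pick `B` so that the projection preserves the image.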

    🎉3 References

    Some of the theory comes from online sources; in case of infringement, please contact us for removal.

    [1] M. N. Alam, “Particle Swarm Optimization: Algorithm and its Codes in MATLAB,” March 2016.
    [2] Y. Liu, B. He, D. Dong, Y. Shen, and T. Yan, “ROS-ELM: A Robust Online Sequential Extreme Learning Machine for Big Data Analytics,” Proc. ELM-2014, Vol. 1: Algorithms and Theories, pp. 325–344, 2015.
    [3] H. Zhou, G.-B. Huang, Z. Lin, H. Wang, and Y. C. Soh, “Stacked Extreme Learning Machines,” IEEE Trans. Cybern., vol. PP, no. 99, p. 1, 2014.

    [4] Lu Q., Teng J., Li J., Ling L., Ding C., Huang J. “A Hybrid Model for Time-Series Prediction Based on Autoencoders,” Computer Systems & Applications, 2022, 31(7): 55–65.

    🌈4 Matlab implementation

  • Original article: https://blog.csdn.net/weixin_46039719/article/details/127831604