"MATLAB Neural Network: Analysis of 43 Cases": Chapter 40, Research on Dynamic Neural Network Time Series Prediction -- a NARX Implementation Based on MATLAB
2022-07-01 12:27:00 【mozun2020】
"MATLAB Neural Network: Analysis of 43 Cases", Chapter 40: Research on Dynamic Neural Network Time Series Prediction -- a NARX Implementation Based on MATLAB
1. Preface
"MATLAB Neural Network: Analysis of 43 Cases" was planned by the MATLAB Technology Forum (www.matlabsky.com), led by teacher Wang Xiaochuan, and published by Beijing University of Aeronautics and Astronautics Press in 2013. It is an example-driven MATLAB teaching book, revised and extended from "MATLAB Neural Network: Analysis of 30 Cases". Following the book's "theory explanation -- case analysis -- application extension" approach, it helps readers learn neural networks more intuitively and vividly.
The book has 43 chapters. It covers common neural networks (BP, RBF, SOM, Hopfield, Elman, LVQ, Kohonen, GRNN, NARX, etc.) and related intelligent algorithms (SVM, decision trees, random forests, extreme learning machines, etc.). Some chapters also combine common optimization algorithms (genetic algorithms, ant colony algorithms, etc.) with neural networks. In addition, the book introduces new functions and features of the neural network toolbox in MATLAB R2012b, such as parallel computing for neural networks, custom neural networks, and efficient neural network programming.
In recent years, with the rise of artificial intelligence research, neural networks have seen another wave of interest. Because of their outstanding performance in signal processing, neural network methods are being applied widely in speech and image tasks. This series works through the cases in the book and reproduces them in simulation. For me it is a form of relearning; I hope that reviewing the old will teach me something new and strengthen my understanding and practice of neural network applications. The goal is to introduce the source-code examples chapter by chapter. The simulations in this series were run on MATLAB 2015b (32-bit). This post covers the dynamic neural network time series prediction example from Chapter 40 of the book. Without further ado, let's start!
2. MATLAB Simulation Example
Open MATLAB, click "Home", then "Open", and locate the example file.
Select chapter40.m and click "Open".
The source code of chapter40.m is as follows:
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Function: Dynamic neural network time series prediction - a NARX implementation based on MATLAB
% Environment: Win7, MATLAB 2015b
% Modi: C.S
% Date: 2022-06-21
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% MATLAB Neural Network: Analysis of 43 Cases
% Dynamic neural network time series prediction - a NARX implementation based on MATLAB
% by Wang Xiaochuan (@ Wang Xiaochuan _matlab)
% http://www.matlabsky.com
% Email:[email protected]163.com
% http://weibo.com/hgsz2003
%% Clear environment variables
clear
clc
tic
%% Load data
% load phdata
[phInputs,phTargets] = ph_dataset;
inputSeries = phInputs;
targetSeries = phTargets;
%% Build the nonlinear autoregressive model with exogenous input (NARX)
inputDelays = 1:2;
feedbackDelays = 1:2;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
%% Set the network data preprocessing functions
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
%% Prepare the time series data
[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);
%% Divide the data into training, validation and test sets
net.divideFcn = 'dividerand';
net.divideMode = 'value';
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
%% Set the training function
net.trainFcn = 'trainlm'; % Levenberg-Marquardt
%% Set the performance (error) function
net.performFcn = 'mse'; % Mean squared error
%% Set the plotting functions
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
    'ploterrcorr','plotinerrcorr'};
%% Train the network
[net,tr] = train(net,inputs,targets,inputStates,layerStates);
%% Test the network
outputs = net(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs)
%% Compute the training, validation and test set errors
trainTargets = gmultiply(targets,tr.trainMask);
valTargets = gmultiply(targets,tr.valMask);
testTargets = gmultiply(targets,tr.testMask);
trainPerformance = perform(net,trainTargets,outputs)
valPerformance = perform(net,valTargets,outputs)
testPerformance = perform(net,testTargets,outputs)
%% Visualize the training results
figure, plotperform(tr)
figure, plottrainstate(tr)
figure, plotregression(targets,outputs)
figure, plotresponse(targets,outputs)
figure, ploterrcorr(errors)
figure, plotinerrcorr(inputs,errors)
%% Closed-loop mode
% Convert the NARX network from open-loop to closed-loop mode
narx_net_closed = closeloop(net);
view(net)
view(narx_net_closed)
% Evaluate the fit on points 1500-2000
phInputs_c = phInputs(1500:2000);
PhTargets_c = phTargets(1500:2000);
[p1,Pi1,Ai1,t1] = preparets(narx_net_closed,phInputs_c,{},PhTargets_c);
% Simulate the network
yp1 = narx_net_closed(p1,Pi1,Ai1);
plot([cell2mat(yp1)' cell2mat(t1)'])
toc
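To make the role of `preparets` concrete: with `inputDelays = 1:2` and `feedbackDelays = 1:2`, each training sample pairs the target y(t) with the delayed exogenous inputs u(t-1), u(t-2) and the delayed targets y(t-1), y(t-2). The following is a minimal conceptual sketch in Python, not the toolbox implementation; the function name `prepare_narx_data` and the regressor ordering are my own illustrative choices.

```python
import numpy as np

def prepare_narx_data(u, y, input_delays=(1, 2), feedback_delays=(1, 2)):
    """Build open-loop NARX regressors: each row pairs the delayed inputs
    u(t-1), u(t-2) and delayed targets y(t-1), y(t-2) with the target y(t)."""
    max_delay = max(max(input_delays), max(feedback_delays))
    X, T = [], []
    for t in range(max_delay, len(y)):
        row = [u[t - d] for d in input_delays] + [y[t - d] for d in feedback_delays]
        X.append(row)
        T.append(y[t])
    return np.array(X), np.array(T)

# Toy series: y depends on its own past value and the past input
u = np.arange(10, dtype=float)
y = np.zeros(10)
for t in range(2, 10):
    y[t] = 0.5 * y[t - 1] + 0.3 * u[t - 1]

X, T = prepare_narx_data(u, y)
print(X.shape)  # (8, 4): 10 samples minus the max delay of 2, 4 regressors each
```

The first `max_delay` samples cannot be predicted and become the initial delay states, which is why `preparets` returns `inputStates` and `layerStates` alongside the shifted `inputs` and `targets`.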
After the code is in place, click "Run" to start the simulation. The output is as follows:
performance =
0.0188
trainPerformance =
0.0180
valPerformance =
0.0164
testPerformance =
0.0252
Elapsed time is 7.422134 seconds.

(Click Performance, Training State, Time-Series Response, Error Autocorrelation, and Input-Error Cross-correlation in turn to view the corresponding figures.)
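The `closeloop` step at the end of the script matters for real forecasting: in open-loop (series-parallel) mode the feedback taps hold *measured* targets, while in closed-loop mode the network feeds its own *predictions* back, so it can run many steps ahead without ground truth. Here is a hedged Python sketch of that mechanism; `closed_loop_predict` is my own illustrative helper, and the "model" is simply the linear recurrence that generated the toy data, so the closed-loop output reproduces the series exactly.

```python
import numpy as np

def closed_loop_predict(model, u, y_init, input_delays=(1, 2), feedback_delays=(1, 2)):
    """Multi-step NARX prediction: past *predictions* replace measured
    targets in the feedback delay line (the closed-loop behaviour)."""
    max_delay = max(max(input_delays), max(feedback_delays))
    y_pred = list(y_init[:max_delay])  # seed the feedback taps with known values
    for t in range(max_delay, len(u)):
        row = [u[t - d] for d in input_delays] + [y_pred[t - d] for d in feedback_delays]
        y_pred.append(model(np.array(row)))
    return np.array(y_pred)

# Stand-in "model": the same linear recurrence used to generate the data
# (r = [u(t-1), u(t-2), y(t-1), y(t-2)])
model = lambda r: 0.5 * r[2] + 0.3 * r[0]
u = np.arange(10, dtype=float)
y = np.zeros(10)
for t in range(2, 10):
    y[t] = 0.5 * y[t - 1] + 0.3 * u[t - 1]

y_hat = closed_loop_predict(model, u, y)
print(np.allclose(y_hat, y))  # True
```

With a trained network instead of an exact model, closed-loop errors compound over the horizon, which is why the script compares the closed-loop fit against the targets over points 1500-2000.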
3. Summary
Dynamic neural networks have become a new research topic in deep learning. Compared with static models (fixed computation graph, fixed parameters), a dynamic network can adjust its own structure or parameters according to the input, which brings significant advantages in accuracy, computational efficiency, and adaptability. If you are interested in this chapter or want to study it thoroughly, I suggest reading Chapter 40 of the book (a learning link is attached at the end of the original post). Some of these knowledge points will be supplemented later based on my own understanding. You are welcome to study and exchange ideas together.