Comparing MLP, GRNN, SVM, and Deep Learning Neural Networks for Identifying Human Health and Non-Health Data
2022-08-01 03:44:00 【fpga and matlab】
I. Theoretical Basis
The MLP (multi-layer perceptron) neural network consists of an input layer, one or more hidden layers, and an output layer. Its block diagram is shown below:

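As a minimal illustration of this structure (a sketch only, not the func_machine_Learing_method routine used later, whose internals are not listed here), a one-hidden-layer MLP classifier can be built in MATLAB with patternnet from the Deep Learning Toolbox. The hidden-layer size and the synthetic data below are assumptions for demonstration.

% Minimal MLP sketch (synthetic data and layer size are assumed)
X = rand(8, 100);                  % 8 features x 100 samples
T = double(sum(X) > 4);            % dummy binary labels
T = [T; 1 - T];                    % one-hot targets, 2 x 100
net = patternnet(10);              % one hidden layer with 10 neurons (assumed size)
net.trainParam.showWindow = false; % suppress the training GUI
net = train(net, X, T);            % backpropagation training
scores = net(X);                   % class scores for each sample
[~, pred] = max(scores, [], 1);    % predicted class index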
The network structure of the GRNN (generalized regression neural network) is shown in the figure below. The whole network contains four layers of neurons: the first layer is the input layer, the second is the pattern layer, the third is the summation layer, and the fourth is the output layer.

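For reference, a GRNN with exactly this four-layer organization can be created with newgrnn from MATLAB's neural network toolbox. The spread (smoothing) value and the synthetic data below are assumptions, not values taken from this project.

% Minimal GRNN sketch (spread value and data are assumed)
P = rand(8, 60);                   % training inputs: 8 features x 60 samples
T = double(sum(P) > 4);            % dummy 0/1 health labels
spread = 0.5;                      % smoothing factor of the radial basis functions
net = newgrnn(P, T, spread);       % pattern, summation and output layers are built internally
Ptest = rand(8, 10);               % unseen samples
Ypred = sim(net, Ptest) > 0.5;     % threshold the regression output to get class labels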
The SVM (support vector machine) is a machine learning method proposed by Vapnik et al. on the basis of statistical learning theory. Its basic idea is: when the two classes are linearly separable, find the optimal separating hyperplane in the original space; when they are not, introduce slack variables and map the low-dimensional input space into a high-dimensional feature space through a nonlinear mapping so that the problem becomes linear, which allows a linear algorithm to analyze the nonlinearity of the samples in that high-dimensional space and to find the optimal separating hyperplane there. Following the principle of structural risk minimization, the SVM constructs the optimal separating hyperplane in the feature space, so that the classifier reaches a global optimum and the expected risk over the whole sample space satisfies a certain upper bound with a given probability.

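A compact way to reproduce this idea in MATLAB is fitcsvm from the Statistics and Machine Learning Toolbox: the RBF kernel plays the role of the nonlinear mapping and BoxConstraint controls the penalty on the slack variables. The data and parameter values below are assumptions for illustration.

% Minimal soft-margin SVM sketch (data and parameters are assumed)
X = rand(100, 8);                          % 100 samples x 8 features
y = double(sum(X, 2) > 4);                 % dummy 0/1 labels
mdl = fitcsvm(X, y, ...
    'KernelFunction', 'rbf', ...           % nonlinear mapping to a high-dimensional space
    'BoxConstraint', 1, ...                % soft margin: weight of the slack-variable penalty
    'Standardize', true);                  % normalize each feature before training
ypred = predict(mdl, rand(10, 8));         % classify unseen samples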
At present, most neural network techniques are built on shallow architectures. Learning models with a shallow structure are simple, and their intermediate learning process is not observable. Used alone, such techniques struggle with complex real-world applications involving natural signals such as human speech, images, and vision, and cannot achieve good learning performance on them. Humans, by contrast, process this kind of complex information by extracting its internal structure through a deep architecture and building internal representations from rich sensory input. Studying deep neural network architectures therefore facilitates the learning, training, and testing of complex signals. Deep learning neural networks were developed against this background; they mainly include network structures based on restricted Boltzmann machines and network structures based on convolution operations. The overall structure of the deep learning neural network is shown in the figure below:
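Restricted-Boltzmann-machine pretraining is not part of core MATLAB, so the sketch below uses a plain deeper fully connected classifier as a rough stand-in for the deep network described above; the layer sizes and data are assumptions for illustration only.

% Deeper fully connected classifier sketch (a stand-in, not an RBM-based deep belief network)
X = rand(64, 200);                 % 64 features x 200 samples
T = double(sum(X) > 32);
T = [T; 1 - T];                    % one-hot targets
net = patternnet([64 32 16]);      % three hidden layers (assumed sizes)
net.trainParam.showWindow = false;
net = train(net, X, T);
scores = net(X);                   % per-class scores for each sample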
II. Case Background
1. Problem Description
Feature selection is a difficult problem in machine learning and is essentially a combinatorial optimization problem. The most straightforward way to solve a combinatorial optimization problem is search: in theory, an exhaustive search can enumerate all possible feature combinations and output the feature subset that optimizes the evaluation criterion, but the amount of computation grows exponentially with the number of features. Feature selection therefore has to rely on a specific search strategy over the feature set; the basic steps are shown in the following block diagram:

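One concrete instance of such a search strategy is sequential forward selection, available in MATLAB as sequentialfs in the Statistics and Machine Learning Toolbox. The sketch below only illustrates the idea; the criterion function and data are assumed and are not the func_feature_selection0 routine used in the code later.

% Sequential forward selection sketch (criterion and data are assumed)
X = rand(100, 64);                         % 100 samples x 64 candidate features
y = double(sum(X(:, 1:4), 2) > 2);         % dummy labels driven by the first few features
crit = @(XT, yT, Xt, yt) ...               % criterion: misclassifications on the held-out fold
    sum(yt ~= predict(fitcsvm(XT, yT), Xt));
[selected, history] = sequentialfs(crit, X, y, 'cv', 5);
find(selected)                             % indices of the retained features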
2. Approach and Workflow
Machine learning is an important part of computer intelligence algorithms. Machine learning research draws on physiology, cognitive science, and related fields to understand the mechanisms of human learning, builds computational or cognitive models of the human learning process, develops learning theories and methods, studies general learning algorithms and analyzes them theoretically, and builds task-oriented learning systems for specific applications. This work compares the recognition performance of four commonly used machine learning algorithms: the MLP neural network, the GRNN neural network, the SVM, and a deep learning neural network. The feature data are a set of physical characteristics of healthy and non-healthy people. For these data, a feature selection method is also proposed to extract the most effective features from a large feature set and use them to discriminate between the healthy and non-healthy populations. Finally, the four algorithms are tested in MATLAB; the simulation results show that, after feature selection, the recognition algorithm based on the deep neural network achieves a recognition rate above 96%.
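The MATLAB listings in the next section consume a fold index vector `indices` through func_crossvalidation. A sketch of how such indices could be built (an assumption, since that part of the script is not shown) is:

% Fold-index sketch (an assumption about how 'indices' might be created)
k = 5;                                 % number of folds (assumed)
N = size(data_random, 1);              % total number of samples
indices = crossvalind('Kfold', N, k);  % Bioinformatics Toolbox: assigns each sample a fold 1..k
% In iteration i of the loops below, samples with indices == i form the test fold
% and the remaining samples form the training fold.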
III. Partial MATLAB Code
MLP
for i = 1:k
    % Split the shuffled data into training and test folds for fold i
    [Traindata,Trainaim,Testdata,Testaim] = func_crossvalidation(data_random,indices,i,row,col);
    Testdata  = [Testdata;Testdata(30:end,:)];   % pad the test fold with its rows from 30 onward
    Testaim   = [Testaim;Testaim(30:end,:)];
    Traindata = Traindata(1:30,:);               % keep the first 30 rows for training
    Trainaim  = Trainaim(1:30,:);
    %%
    % No feature selection (baseline feature set)
    [Ftrain,Ftest] = func_feature_selection0(Traindata,Testdata);
    %%
    % MLP recognition
    [Preaim,Preaim2,Rate] = func_machine_Learing_method(Ftrain,Trainaim,Ftest,Testaim);
    % ROC & PR curves
    % draw_prc(2*Testaim'-1, 2*Preaim2'-1,2);
end
GRNN
for i = 1:k
    % Split the shuffled data into training and test folds for fold i
    [Traindata,Trainaim,Testdata,Testaim] = func_crossvalidation(data_random,indices,i,row,col);
    Testdata  = [Testdata;Testdata(30:end,:)];   % pad the test fold with its rows from 30 onward
    Testaim   = [Testaim;Testaim(30:end,:)];
    Traindata = Traindata(1:30,:);               % keep the first 30 rows for training
    Trainaim  = Trainaim(1:30,:);
    %%
    % No feature selection (baseline feature set)
    [Ftrain,Ftest] = func_feature_selection0(Traindata,Testdata);
    %%
    % GRNN recognition
    [Preaim,Preaim2,Rate] = func_machine_Learing_method(Ftrain,Trainaim,Ftest,Testaim);
    % ROC & PR curves
    % draw_prc(2*Testaim'-1, 2*Preaim2'-1,2);
end
SVM
for i = 1:k
    % Split the shuffled data into training and test folds for fold i
    [Traindata,Trainaim,Testdata,Testaim] = func_crossvalidation(data_random,indices,i,row,col);
    Testdata  = [Testdata;Testdata(30:end,:)];   % pad the test fold with its rows from 30 onward
    Testaim   = [Testaim;Testaim(30:end,:)];
    Traindata = Traindata(1:30,:);               % keep the first 30 rows for training
    Trainaim  = Trainaim(1:30,:);
    %%
    % No feature selection (baseline feature set)
    [Ftrain,Ftest] = func_feature_selection0(Traindata,Testdata);
    %%
    % SVM recognition
    [Preaim,Preaim2,Rate] = func_machine_Learing_method(Ftrain,Trainaim,Ftest,Testaim);
    % ROC & PR curves
    % draw_prc(2*Testaim'-1, 2*Preaim2'-1,2);
end
Deep learning
for i = 1:k
    % Split the shuffled data into training and test folds for fold i
    [Traindata,Trainaim,Testdata,Testaim] = func_crossvalidation(data_random,indices,i,row,col);
    Testdata  = [Testdata;Testdata(30:end,:)];   % pad the test fold with its rows from 30 onward
    Testaim   = [Testaim;Testaim(30:end,:)];
    Traindata = Traindata(1:30,:);               % keep the first 30 rows for training
    Trainaim  = Trainaim(1:30,:);
    %%
    % No feature selection (baseline feature set)
    [Ftrain,Ftest] = func_feature_selection0(Traindata,Testdata);
    %%
    % Deep learning recognition
    [Preaim,Preaim2,Rate] = func_machine_Learing_method(Ftrain,Trainaim,Ftest,Testaim);
    % ROC & PR curves
    % draw_prc(2*Testaim'-1, 2*Preaim2'-1,2);
end
IV. Simulation Results and Analysis
MLP

Comparing the ROC and PR curves shows that, after feature selection, the recognition accuracy improves. Finally, comparing the Forward and Backward feature selection simulations presented in this paper, their recognition rates are 93.6709% and 69.6203%, respectively.
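The commented-out draw_prc call in the listings suggests the ROC/PR curves are computed from the ±1-recoded targets and scores. A minimal sketch with perfcurve (assuming Preaim2 holds continuous scores, which is an assumption about its format) would be:

% ROC / PR curve sketch from one fold (score format is assumed)
labels = 2*Testaim' - 1;                         % recode 0/1 targets as -1/+1
scores = 2*Preaim2' - 1;                         % recode scores the same way
[Xroc, Yroc, ~, AUC] = perfcurve(labels, scores, 1);                 % ROC curve and AUC
[Xpr,  Ypr ]         = perfcurve(labels, scores, 1, ...
                                 'XCrit', 'reca', 'YCrit', 'prec');  % PR curve
figure; plot(Xroc, Yroc); xlabel('False positive rate'); ylabel('True positive rate');
figure; plot(Xpr,  Ypr ); xlabel('Recall');              ylabel('Precision');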
GRNN

Comparing the ROC and PR curves shows that, after feature selection, the recognition accuracy improves. Finally, comparing the Forward and Backward feature selection simulations presented in this paper, their recognition rates are 89.8734% and 74.6835%, respectively.
SVM

Comparing the ROC and PR curves shows that, after feature selection, the recognition accuracy improves. Finally, comparing the Forward and Backward feature selection simulations presented in this paper, their recognition rates are 88.6076% and 71.3241%, respectively.
Deep learning:

The ROC and PR curves show that, after feature selection, the recognition accuracy improves.
Comparing the final results of the four algorithms above, the final recognition rates are, respectively:

The comparison of the four algorithms shows that, among the feature data, the 3rd, 23rd, 19th, and 64th features have strong discriminative power; using these four features, a higher recognition rate can be obtained. In addition, in terms of performance, the deep learning neural network outperforms the GRNN neural network, which outperforms the MLP neural network, which outperforms the SVM neural network.
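Assuming the samples are stored row-wise with one column per feature (an assumption about the data layout), restricting the data to these four discriminative features is straightforward:

% Keep only the four most discriminative features (row-wise sample layout is assumed)
keep = [3 19 23 64];                 % feature indices reported above
Ftrain = Traindata(:, keep);
Ftest  = Testdata(:,  keep);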