A brief history of neural networks
2020-11-06 01:28:00 [Artificial intelligence meets pioneer]
Author: SANYA4 | Compiled by: VK | Source: Analytics Vidhya
Introduction
Neural networks are now everywhere. Companies are spending heavily on hardware and talent to make sure they can build the most complex neural networks and deliver the best deep learning solutions.
Although deep learning is a fairly old subset of machine learning, it did not get the recognition it deserves until the early 2010s. Today it is known around the world and has captured the public's attention.
In this article, I want to take a slightly different approach to neural networks and look at how they came to be.
The origin of neural networks
The earliest work in the field of neural networks dates back to the 1940s, when Warren McCulloch and Walter Pitts tried to build a simple neural network out of electrical circuits.
The figure below shows an MCP neuron. If you have studied high school physics, you will notice that it looks like a simple NOR gate.
Their paper demonstrated the basic idea of working with signals and of making decisions by transforming the inputs provided.
The McCulloch and Pitts paper provided a way to describe brain functions in abstract terms, and showed that simple elements connected in a network can have immense computational power.
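The idea can be sketched in a few lines of code. This is a minimal illustration of an MCP-style threshold unit, not the authors' original formalism; the weights and threshold below are assumptions chosen so that the unit behaves as the NOR gate mentioned above.

```python
def mcp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fire (1) if the weighted input sum reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# With inhibitory weights of -1 and a threshold of 0, the unit fires
# only when neither input is active, i.e. it computes NOR:
# (0, 0) -> 1, (0, 1) -> 0, (1, 0) -> 0, (1, 1) -> 0
nor_table = {(a, b): mcp_neuron((a, b), (-1, -1), 0) for a in (0, 1) for b in (0, 1)}
```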
Despite its pioneering significance, the paper attracted little attention until about six years later, when Donald Hebb (pictured below) published a paper emphasizing that neural pathways are strengthened each time they are used.
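Hebb's idea, often summarized as "neurons that fire together wire together", can be sketched as a simple update rule: a connection weight grows whenever its input and the neuron's output are active at the same time. The learning rate and the list-based representation below are illustrative assumptions, not details from Hebb's paper.

```python
def hebb_update(weights, inputs, output, lr=0.1):
    """One Hebbian step: strengthen each weight by lr * input * output."""
    return [w + lr * x * output for w, x in zip(weights, inputs)]

w = [0.0, 0.0]
w = hebb_update(w, inputs=[1, 0], output=1)  # only the active pathway is strengthened
```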
Please remember that computers were still in their infancy; IBM introduced its first PC, the IBM 5150, in 1981.
Fast forward to the 1990s, and many studies on artificial neural networks had been published. Rosenblatt had invented the first perceptron in the 1950s, and in 1989 Yann LeCun successfully implemented the backpropagation algorithm at Bell Labs. By the 1990s, the U.S. Postal Service was already able to read the zip codes on envelopes.
The LSTM we know today was invented in 1997.
If so much of the groundwork had been laid by the 1990s, why did it take until 2012 before we could use neural networks for deep learning tasks?
Hardware and the rise of the Internet
One of the main challenges in deep learning research was the lack of reproducible research. Until then, progress had been largely theory-driven, because reliable data was scarce and hardware resources were limited.
Over the past 20 years, enormous progress has been made in hardware and the Internet. In the 1990s, an IBM personal computer had 16KB of RAM; by 2010, the average was around 4GB!
Today we can train a small model on our own computers, something unimaginable in the 1990s.
The gaming market also played an important role in this revolution: companies like NVIDIA and AMD invested heavily in supercomputing to deliver high-end virtual experiences.
With the growth of the Internet, creating and distributing datasets for machine learning tasks became much easier.
It also became easier to learn from, and collect images on, Wikipedia.
The 2010s: our era of deep learning
ImageNet: The modern deep learning era began in 2009, when Fei-Fei Li of Stanford University created ImageNet, a large visual dataset that has been hailed as the project that sparked the AI revolution around the world.
Back in 2006, Li was a new professor at the University of Illinois at Urbana-Champaign. Her colleagues would constantly discuss new algorithms for making better decisions. She, however, saw the flaw in their plans.
Even the best algorithm would not work well if the data it was trained on did not reflect the real world. ImageNet consists of 14 million images in more than 20,000 categories, and to this day it remains a cornerstone of object recognition technology.
Open competitions: In 2009, Netflix ran an open competition, the Netflix Prize, to predict user ratings of movies. On September 21, 2009, the BellKor's Pragmatic Chaos team beat Netflix's own algorithm by 10.06% and won the $1,000,000 prize.
Kaggle, founded in 2010, is a platform that hosts machine learning competitions for participants all over the world. It allows researchers, engineers, and self-taught programmers to push the limits of complex data tasks.
Before the AI boom, investment in AI was about $20 million. By 2014, it had grown twentyfold, with market leaders such as Google, Facebook, and Amazon setting aside funds for further research into future AI products. This new wave of investment increased headcount in deep learning from hundreds to tens of thousands.
Conclusion
Despite its slow start, deep learning has become an inescapable part of our lives. From Netflix and YouTube recommendations to language translation engines, from facial recognition and medical diagnosis to self-driving cars, there is no field deep learning has not touched.
These advances broaden the future scope and application of neural networks in improving our quality of life.
Artificial intelligence is not our future; it is our present, and it has only just begun!
Original link: https://www.analyticsvidhya.com/blog/2020/10/how-does-the-gradient-descent-algorithm-work-in-machine-learning/
Welcome to the AI blog site: http://panchuang.net/
sklearn machine learning Chinese official documentation: http://sklearn123.com/
Welcome to follow the Panchuang blog resource hub: http://docs.panchuang.net/
Copyright notice
This article was created by [Artificial intelligence meets pioneer]. Please include a link to the original when reprinting. Thank you.