A brief history of neural networks
2020-11-06 01:28:00 【Artificial intelligence meets pioneer】
Author: SANYA4 | Compiled by: VK | Source: Analytics Vidhya
Introduction
Neural networks are everywhere these days. Companies are spending lavishly on hardware and talent to make sure they can build the most complex neural networks and deliver the best deep learning solutions.
Although deep learning is a fairly old subfield of machine learning, it did not get the recognition it deserves until the 2010s. Today it is a worldwide phenomenon that has captured the public's attention.
In this article, I want to take a slightly different approach to neural networks and look at how they came to be.
The origin of neural networks
The earliest work in the field of neural networks dates back to the 1940s, when Warren McCulloch and Walter Pitts tried to build a simple neural network using electrical circuits.
The figure below shows an MCP neuron. If you have studied high-school physics, you will notice that it looks like a simple NOR gate.
The paper demonstrated the basic idea of working with signals and making decisions by transforming the inputs provided.
McCulloch and Pitts's paper provided a way to describe brain functions in abstract terms, and it showed that the computational power of neural networks could be enormous.
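To make the NOR-gate analogy concrete, here is a minimal sketch of an MCP-style threshold unit in Python; the weights and threshold are illustrative choices of mine, not values taken from the original paper:

```python
def mcp_nor(x1, x2):
    """A McCulloch-Pitts-style threshold unit behaving as a NOR gate.

    Both inputs are treated as inhibitory (weight -1) and the firing
    threshold is 0, so the neuron fires (outputs 1) only when the
    weighted sum reaches the threshold -- i.e. when both inputs are 0.
    """
    weighted_sum = -1 * x1 + -1 * x2
    return 1 if weighted_sum >= 0 else 0

# The truth table matches NOR: only (0, 0) fires.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", mcp_nor(a, b))
```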
Despite its pioneering significance, the paper attracted little attention until about six years later, when Donald Hebb (pictured below) published a paper emphasizing that neural pathways are strengthened each time they are used.
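Hebb's principle is often summarized as "cells that fire together wire together", and it can be written as a one-line weight update. A minimal sketch, with a learning rate chosen purely for illustration:

```python
def hebbian_update(weights, inputs, output, lr=0.1):
    """Hebbian learning: a connection is strengthened whenever its
    input and the output unit are active at the same time."""
    return [w + lr * x * output for w, x in zip(weights, inputs)]

# Only the weight of the co-active input is strengthened.
print(hebbian_update([0.0, 0.0], [1, 0], 1))  # -> [0.1, 0.0]
```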
Remember, computers were still in their infancy; IBM did not introduce its first PC, the IBM 5150, until 1981.
Fast forward to the 1990s, and many studies on artificial neural networks had been published. Rosenblatt invented the first perceptron in the 1950s, and in 1989 Yann LeCun successfully implemented the backpropagation algorithm at Bell Labs. By the 1990s, the U.S. Postal Service was able to read the zip codes on envelopes.
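As a refresher on what a perceptron actually does, here is a minimal sketch of Rosenblatt's learning rule in Python, trained on the linearly separable OR function; the learning rate and epoch count are illustrative values of my own:

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Rosenblatt's perceptron rule: nudge the weights on every
    misclassified example until a linear boundary separates the data."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            error = target - pred  # 0 when correct, +/-1 when wrong
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

# Learns OR: output 1 unless both inputs are 0.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
print(train_perceptron(data))
```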
The LSTM as we know it today was invented in 1997.
If so many of the foundations were laid in the 1990s, why did we have to wait until 2012 before neural networks could be used for deep learning tasks?
Hardware and the rise of the Internet
One of the main challenges in deep learning research was the lack of reproducible research. Until then, these developments were largely theory-driven, because reliable data was scarce and hardware resources were limited.
Over the past two decades, hardware and the Internet have made great strides. In the 1990s, an IBM personal computer had 16KB of RAM; in 2010, the average was around 4GB!
Today we can train a small model on our own computers, something unimaginable in the 1990s.
The gaming market also played an important role in this revolution: companies like NVIDIA and AMD invested heavily in supercomputing to deliver high-end virtual experiences.
With the development of the Internet, it became much easier to create and distribute datasets for machine learning tasks, and to collect images and learning material from sources such as Wikipedia.
The 2010s: our era of deep learning
ImageNet: The modern deep learning era began in 2009, when Fei-Fei Li of Stanford University created ImageNet, a large visual dataset that has been hailed as the project that sparked the worldwide AI revolution.
As early as 2006, Li was a new professor at the University of Illinois at Urbana-Champaign, where her colleagues constantly discussed new algorithms for making better decisions. However, she saw a flaw in their plans.
Even the best algorithm would not work well if the data it was trained on did not reflect the real world. ImageNet consists of 14 million images across more than 20,000 categories, and to this day it remains a cornerstone of object recognition technology.
Open competitions: In 2006, Netflix launched an open competition called the Netflix Prize to predict user ratings for films. On September 21, 2009, the BellKor's Pragmatic Chaos team beat Netflix's own algorithm by a margin of 10.06% and won the $1 million prize.
Kaggle, founded in 2010, is a platform that hosts machine learning competitions open to everyone around the world. It enables researchers, engineers, and self-taught programmers to push the limits of complex data tasks.
Before the AI boom, investment in AI was about $20 million. By 2014, this investment had grown twentyfold, with market leaders such as Google, Facebook, and Amazon setting aside funds for further research into future AI products. This new wave of investment raised hiring in deep learning from hundreds to tens of thousands of people.
Conclusion
Despite its slow start, deep learning has become an inescapable part of our lives. From Netflix and YouTube recommendations to language translation engines, from facial recognition and medical diagnosis to self-driving cars, there is no field deep learning has not touched.
These advances keep broadening the future scope and applications of neural networks in improving our quality of life.
Artificial intelligence is not our future; it is our present, and it is only just beginning!
Link to the original text :https://www.analyticsvidhya.com/blog/2020/10/how-does-the-gradient-descent-algorithm-work-in-machine-learning/