[头歌] Machine Learning Reborn: Linear Regression
2022-07-27 06:06:00 【Collapse an old face】
About the [头歌] linear regression theory and hands-on levels: the remaining levels are visible to fans only, so follow me if you want more learning resources.
Creating this wasn't easy. Before you use it as a reference, please give it a like, bookmark it, and follow. Thanks, everyone!
Level 1: Data loading and analysis
Task description
Our task: write a small program that loads the data used for linear regression.
Programming requirements
The data in this exercise is univariate. Use pandas to read in the data file and label the two columns Population and Profit.
data = pd.read_csv(path, header= , names=[ ' ', ' ' ])
import os
import pandas as pd

if __name__ == "__main__":
    path = os.getcwd() + '/ex1data1.txt'
    # Use pandas to read the data and name the two columns 'Population' and 'Profit'
    #********* begin *********#
    data = pd.read_csv(path, header=None, names=['Population', 'Profit'])
    #********* end *********#
    print(data.shape)

Level 2: Calculate the loss function
Programming requirements
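The formula referred to below is presumably the standard least-squares cost for univariate linear regression; this matches the code in this level:

    J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2, \qquad h_\theta(x) = \theta_0 + \theta_1 x

where m is the number of samples.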

Based on the formula above, implement the loss-function computation computeCost(X, y, theta) and return cost.
X: the univariate data matrix, i.e. the Population data; y: the target data, i.e. the Profit data; theta: the model parameters; cost: the value of the loss function.
Test instructions
Test input: none
Test output: the cost is: 32.0727338775
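The level does not show the driver code that produces this output. Below is a minimal sketch of the usual setup for this dataset; the intercept column named 'Ones' and the zero-initialized theta are my assumptions, not taken from the level itself:

import os
import numpy as np
import pandas as pd

path = os.getcwd() + '/ex1data1.txt'
data = pd.read_csv(path, header=None, names=['Population', 'Profit'])
data.insert(0, 'Ones', 1)                        # intercept column x0 = 1 (assumed setup)
cols = data.shape[1]
X = np.matrix(data.iloc[:, 0:cols-1].values)     # features: Ones, Population
y = np.matrix(data.iloc[:, cols-1:cols].values)  # target: Profit
theta = np.matrix(np.array([0, 0]))              # parameters initialized to zeros (assumed)
print('the cost is:', computeCost(X, y, theta))  # should print 32.0727338775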
import numpy as np

def computeCost(X, y, theta):
    # Compute the least-squares cost according to the formula above
    #********* begin *********#
    inner = np.power((X * theta.T) - y, 2)
    cost = np.sum(inner) / (2 * len(X))
    cost = round(cost, 10)
    #********* end *********#
    return cost

Level 3: Perform gradient descent to obtain a linear model
Programming requirements
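The update formula referred to below is presumably the standard batch gradient-descent rule; it matches the code in this level:

    \theta_j := \theta_j - \frac{\alpha}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}

Each iteration updates every \theta_j simultaneously, using the residuals computed from the previous parameters.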

Based on the formula above, implement the gradient-descent function gradientDescent(X, y, theta, alpha, iters) and return theta and cost.
X: the univariate data matrix, i.e. the Population data; y: the target data, i.e. the Profit data; theta: the model parameters; m: the number of samples; α: the learning rate.
Test instructions
Test input: none
Test output: The model parameters are: [[-3.24140214  1.1272942]]
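The level does not show the call that produces this output either. A minimal sketch, reusing X, y and theta from the Level 2 setup above; the learning rate and iteration count are my assumptions (common values for this exercise), not given in the level:

alpha = 0.01   # learning rate (assumed)
iters = 1000   # number of iterations (assumed)
g, cost = gradientDescent(X, y, theta, alpha, iters)
print('The model parameters are:', g)   # roughly [[-3.24140214  1.1272942]]
print('final cost:', cost[-1])          # loss value after the last iteration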
def gradientDescent(X, y, theta, alpha, iters):
    temp = np.matrix(np.zeros(theta.shape))
    parameters = int(theta.ravel().shape[1])
    cost = np.zeros(iters)
    for i in range(iters):
        error = (X * theta.T) - y
        for j in range(parameters):
            #********* begin *********#
            term = np.multiply(error, X[:, j])
            temp[0, j] = theta[0, j] - ((alpha / len(X)) * np.sum(term))
            #********* end *********#
        theta = temp
        cost[i] = computeCost(X, y, theta)
    return theta, cost

Note: this content is for reference and sharing only. Do not redistribute it without permission; infringing material will be removed on request.