【头歌】Machine Learning of Rebirth: Linear Regression
2022-07-27 06:06:00 【Collapse an old face】
About the 【头歌】(TouGe) linear regression theory and hands-on series: some posts are visible to followers only, so follow me if you want more learning resources.
Creating content is not easy. If this post helps you, please leave a like, bookmark it, and follow!
Level 1: Data Loading and Analysis
Task description
Our task: write a small program that loads the linear regression data.
Programming requirements
The data in this exercise is univariate. Use pandas to read the data file and name the two columns Population and Profit respectively.
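For a first look at what the loaded DataFrame holds, a minimal sketch is shown below. The exercise's real ex1data1.txt is not reproduced here, so the snippet substitutes a tiny inline sample with the same two-column, headerless CSV layout (the sample values are illustrative only):

```python
import io

import pandas as pd

# Inline stand-in for ex1data1.txt: two comma-separated columns, no header row.
sample = io.StringIO("6.1101,17.592\n5.5277,9.1302\n8.5186,13.662\n")

data = pd.read_csv(sample, header=None, names=['Population', 'Profit'])
print(data.shape)       # (3, 2)
print(data.describe())  # quick summary statistics for both columns
```

`header=None` tells pandas the file has no header row, and `names=` attaches the column labels the later levels rely on.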
data = pd.read_csv(path, header= , names=[ ' ', ' ' ])
import os

import pandas as pd

if __name__ == "__main__":
    path = os.getcwd() + '/ex1data1.txt'
    # Use pandas to read the data and name the columns 'Population' and 'Profit'
    #********* begin *********#
    data = pd.read_csv(path, header=None, names=['Population', 'Profit'])
    #********* end *********#
    print(data.shape)

Level 2: Computing the Loss Function
Programming requirements
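The formula image referenced below did not survive extraction. Judging from the solution code in this level, it is the standard squared-error cost for a univariate linear hypothesis:

$$h_\theta(x) = \theta_0 + \theta_1 x,\qquad J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2$$

where m is the number of training samples.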

Following the formula above, implement the loss function computeCost(X, y, theta) and return cost.
X: the input data matrix, i.e. the Population data; y: the target data, i.e. the Profit data; theta: the model parameters; cost: the value of the loss function.
Test instructions
Test input: none
Test output :the cost is: 32.0727338775
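computeCost expects X as a matrix whose first column is all ones (the bias term), y as a column matrix, and theta as a 1×2 row matrix. The grading harness that builds these is not shown in the original post; a plausible setup, sketched on a tiny hypothetical DataFrame rather than the real ex1data1.txt, is:

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the loaded exercise data (illustrative values only).
data = pd.DataFrame({'Population': [6.1101, 5.5277, 8.5186],
                     'Profit': [17.592, 9.1302, 13.662]})
data.insert(0, 'Ones', 1)  # bias column of ones for the intercept term

cols = data.shape[1]
X = np.matrix(data.iloc[:, :cols - 1].values)  # (m, 2): ones + Population
y = np.matrix(data.iloc[:, cols - 1:].values)  # (m, 1): Profit
theta = np.matrix(np.zeros((1, 2)))            # initial parameters [[0., 0.]]

print(X.shape, y.shape, theta.shape)  # (3, 2) (3, 1) (1, 2)
```

With theta initialized to zeros, the prediction X * theta.T is all zeros, so the initial error is simply -y; on the real dataset this is what yields the expected cost of 32.0727338775.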
import numpy as np

def computeCost(X, y, theta):
    # Compute the loss according to the formula
    #********* begin *********#
    inner = np.power((X * theta.T) - y, 2)
    cost = np.sum(inner) / (2 * len(X))
    cost = round(cost, 10)
    #********* end *********#
    return cost

Level 3: Gradient Descent for a Linear Model
Programming requirements
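The update-rule image referenced below also did not survive extraction. Matching the solution code in this level, the batch gradient descent update for each parameter is:

$$\theta_j := \theta_j - \frac{\alpha}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\,x_j^{(i)}$$

All parameters are updated simultaneously, which is why the code below accumulates the new values in a temp matrix before assigning them back to theta.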

Following the formula above, implement gradientDescent(X, y, theta, alpha, iters) and return theta and cost.
X: the input data matrix, i.e. the Population data; y: the target data, i.e. the Profit data; theta: the model parameters; m: the number of samples; α (alpha): the learning rate.
Test instructions
Test input: none
Test output: the model parameters are: [[-3.24140214  1.1272942]]
def gradientDescent(X, y, theta, alpha, iters):
    temp = np.matrix(np.zeros(theta.shape))
    parameters = int(theta.ravel().shape[1])
    cost = np.zeros(iters)
    for i in range(iters):
        error = (X * theta.T) - y
        for j in range(parameters):
            #********* begin *********#
            term = np.multiply(error, X[:, j])
            temp[0, j] = theta[0, j] - ((alpha / len(X)) * np.sum(term))
            #********* end *********#
        theta = temp
        cost[i] = computeCost(X, y, theta)
    return theta, cost

Note: this content is shared for reference only. Do not redistribute without permission; it will be removed on request in case of infringement.
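As an end-to-end sanity check (not part of the original exercise), the two functions can be run on synthetic noise-free data generated from y = 2 + 3x; gradient descent should then recover an intercept near 2 and a slope near 3:

```python
import numpy as np

def computeCost(X, y, theta):
    # Squared-error cost J(theta) = sum((X*theta.T - y)^2) / (2m)
    inner = np.power((X * theta.T) - y, 2)
    return np.sum(inner) / (2 * len(X))

def gradientDescent(X, y, theta, alpha, iters):
    temp = np.matrix(np.zeros(theta.shape))
    parameters = int(theta.ravel().shape[1])
    cost = np.zeros(iters)
    for i in range(iters):
        error = (X * theta.T) - y
        for j in range(parameters):
            term = np.multiply(error, X[:, j])
            temp[0, j] = theta[0, j] - ((alpha / len(X)) * np.sum(term))
        theta = temp
        cost[i] = computeCost(X, y, theta)
    return theta, cost

# Synthetic data: y = 2 + 3x with no noise, so the fit should recover [2, 3].
xs = np.linspace(0, 1, 50)
X = np.matrix(np.column_stack([np.ones_like(xs), xs]))  # bias column + feature
y = np.matrix((2 + 3 * xs).reshape(-1, 1))
theta, cost = gradientDescent(X, y, np.matrix(np.zeros((1, 2))), alpha=0.5, iters=2000)
print(theta)     # ≈ [[2. 3.]]
print(cost[-1])  # near zero
```

The learning rate 0.5 and 2000 iterations are chosen for this toy problem; the exercise's real dataset uses its own alpha and iteration count, which the original post does not show.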