Data Analysis and Machine Learning: Linear Regression and Logistic Regression (Part 6)
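As a quick recap of the formulas the code below implements: the hypothesis is a straight line in theta0 and theta1, and the cost is the mean squared error (the conventional 1/(2m) scaling is assumed here; it is consistent with the 1/m factor that appears in the code's gradients):

```latex
h(x_i) = \theta_0 + \theta_1 x_i, \qquad
J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \bigl( h(x_i) - y_i \bigr)^2
```

Differentiating J with respect to each parameter gives the two partial derivatives the functions below compute:

```latex
\frac{\partial J}{\partial \theta_0} = \frac{1}{m} \sum_{i=1}^{m} \bigl( h(x_i) - y_i \bigr), \qquad
\frac{\partial J}{\partial \theta_1} = \frac{1}{m} \sum_{i=1}^{m} \bigl( h(x_i) - y_i \bigr)\, x_i
```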
# How gradient descent works: compute the partial derivatives of the cost with respect to theta1 and theta0

# Partial derivative with respect to theta1
def partial_cost_theta1(theta0, theta1, x, y):
    h = theta0 + theta1 * x            # prediction function
    diff = (h - y) * x                 # partial derivative w.r.t. theta1: (h(x) - y) * x
    partial = diff.sum() / x.shape[0]  # sum over the samples and divide by the sample count
    return partial

partial1 = partial_cost_theta1(0, 5, pga.distance, pga.accuracy)
print(partial1)

# Partial derivative with respect to theta0
def partial_cost_theta0(theta0, theta1, x, y):
    h = theta0 + theta1 * x            # prediction function
    diff = h - y                       # partial derivative w.r.t. theta0: (h(x) - y)
    partial = diff.sum() / x.shape[0]  # sum over the samples and divide by the sample count
    return partial

partial0 = partial_cost_theta0(1, 1, pga.distance, pga.accuracy)
print(partial0)
# Output
5.5791338540719
1.0000000000000104
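The two partial-derivative functions above can be plugged into a full gradient-descent loop that repeatedly updates both parameters at once. A minimal runnable sketch, using synthetic NumPy data in place of the pga dataset (the learning rate alpha, the iteration count, and the synthetic line y = 2x + 1 are all assumptions for illustration):

```python
import numpy as np

def partial_cost_theta1(theta0, theta1, x, y):
    h = theta0 + theta1 * x
    return ((h - y) * x).sum() / x.shape[0]

def partial_cost_theta0(theta0, theta1, x, y):
    h = theta0 + theta1 * x
    return (h - y).sum() / x.shape[0]

def gradient_descent(x, y, alpha=0.1, theta0=0.0, theta1=0.0, iters=2000):
    for _ in range(iters):
        # Compute both gradients from the current parameters first,
        # then update the parameters simultaneously.
        grad0 = partial_cost_theta0(theta0, theta1, x, y)
        grad1 = partial_cost_theta1(theta0, theta1, x, y)
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Synthetic noiseless data on the line y = 2x + 1,
# so the loop should recover theta0 close to 1 and theta1 close to 2.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)
y = 2 * x + 1

theta0, theta1 = gradient_descent(x, y)
print(theta0, theta1)
```

Note that both gradients are evaluated before either parameter is updated; updating theta0 first and then using the new value inside the theta1 gradient is a common bug that changes the descent direction.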