
GD: Feature Scaling & Mean Normalization

Author: jianshuqwerty | Published 2017-12-25 13:24

We can speed up gradient descent by bringing each of our input features into roughly the same range of values. This is because θ descends quickly on small ranges and slowly on large ranges, so it oscillates inefficiently on its way down to the optimum when the features are very uneven in scale.
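To make the effect concrete, here is a minimal sketch (not from the original post) of batch gradient descent for linear regression in NumPy, run once on raw features and once on mean-normalized features. The data, learning rates, and the helper name gradient_descent are illustrative assumptions only.

```python
import numpy as np

def gradient_descent(X, y, alpha, iters=2000):
    """Plain batch gradient descent for linear regression (MSE/2 cost)."""
    m, n = X.shape
    Xb = np.c_[np.ones(m), X]          # prepend a bias column of ones
    theta = np.zeros(n + 1)
    for _ in range(iters):
        grad = Xb.T @ (Xb @ theta - y) / m
        theta -= alpha * grad
    cost = np.mean((Xb @ theta - y) ** 2) / 2
    return theta, cost

# Illustrative data: feature 1 spans thousands, feature 2 spans single digits.
X = np.array([[2104.0, 3.0], [1600.0, 3.0], [2400.0, 4.0], [1416.0, 2.0]])
y = 100 + 0.1 * X[:, 0] + 20 * X[:, 1]     # an exactly linear target

# Mean normalization: subtract the mean, divide by the range of each feature.
X_scaled = (X - X.mean(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# On the raw features the learning rate must be tiny or the updates diverge;
# on the scaled features a much larger rate converges in the same iterations.
_, cost_raw = gradient_descent(X, y, alpha=1e-7)
_, cost_scaled = gradient_descent(X_scaled, y, alpha=0.1)
print(cost_raw, cost_scaled)    # the scaled run ends at a far lower cost
```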

Feature scaling: divide each input value by the range of its feature, i.e. x := x / (max − min), so that the new values of every feature span a range of roughly 1.

Mean normalization: subtract the mean of each feature and then divide by s, i.e. x := (x − μ) / s, where μ is the mean value of that feature.

Here s can be either the range (max − min) of the feature or its standard deviation.
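A small sketch of both variants, assuming NumPy and made-up housing-style data; the function name mean_normalize and the use_std switch are my own illustration, not from the post:

```python
import numpy as np

def mean_normalize(X, use_std=False):
    """Mean-normalize each column: x := (x - mean) / s, where s is the
    range (max - min) by default, or the standard deviation if use_std=True."""
    mu = X.mean(axis=0)
    s = X.std(axis=0) if use_std else (X.max(axis=0) - X.min(axis=0))
    return (X - mu) / s

# Example: house size (large range) next to bedroom count (small range).
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0],
              [1416.0, 2.0]])
print(mean_normalize(X))                 # columns now lie roughly in [-0.5, 0.5]
print(mean_normalize(X, use_std=True))   # same idea, scaled by std instead
```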
