
java - Gradient descent linear regression in Java


This is a bit of a long shot, but I'm wondering if someone could take a look at this. Am I doing batch gradient descent for linear regression correctly here? It gives the expected answer for a single independent variable plus intercept, but not for multiple independent variables.

/**
 * Performs one batch gradient descent update for linear regression.
 * (using the Colt Matrix library)
 * @param alpha       Learning rate
 * @param thetas      Current thetas
 * @param independent Matrix of training examples (M rows x N features)
 * @param dependent   Vector of target values (length M)
 * @return new Thetas
 */
public DoubleMatrix1D descent(double alpha,
                              DoubleMatrix1D thetas,
                              DoubleMatrix2D independent,
                              DoubleMatrix1D dependent) {
    Algebra algebra = new Algebra();

    // ALPHA*(1/M) in one.
    double modifier = alpha / (double) independent.rows();

    // I think this can just skip the transpose of theta.
    // This is the result of every Xi run through theta (the hypothesis fn):
    // each Xj feature is multiplied by its theta to get the hypothesis values.
    DoubleMatrix1D hypotheses = algebra.mult(independent, thetas);

    // hypothesis - Y
    // Now we have, for each Xi, the difference between the value predicted
    // by the hypothesis and the actual Yi.
    hypotheses.assign(dependent, Functions.minus);

    // Transpose examples (MxN) to NxM so we can matrix-multiply by the hypothesis (Mx1)
    DoubleMatrix2D transposed = algebra.transpose(independent);

    DoubleMatrix1D deltas = algebra.mult(transposed, hypotheses);

    // Scale the deltas by 1/m and the learning rate alpha (alpha/m).
    deltas.assign(Functions.mult(modifier));

    // Theta = Theta - Deltas
    thetas.assign(deltas, Functions.minus);

    return thetas;
}

Best Answer

There is nothing wrong with your implementation. Judging from your comments, the problem is the collinearity you introduced when generating x2: if x2 is (close to) a linear function of x1, the two coefficients cannot be identified separately, which is problematic in regression estimation.

To test your algorithm, generate two independent columns of random numbers. Pick values for w0, w1, and w2, i.e. the intercept and the coefficients of x1 and x2 respectively, and compute the dependent values y.

Then see whether your stochastic/batch gradient descent algorithm can recover w0, w1, and w2.
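For illustration, here is a minimal sketch of such a test using the Colt library. It is not part of the original answer; the class name DescentTest, the chosen values of w0, w1, w2, the learning rate, and the iteration count are all arbitrary, and the descent method from the question is assumed to be pasted into the same class as a static method.

import java.util.Random;

import cern.colt.matrix.DoubleMatrix1D;
import cern.colt.matrix.DoubleMatrix2D;
import cern.colt.matrix.impl.DenseDoubleMatrix1D;
import cern.colt.matrix.impl.DenseDoubleMatrix2D;

public class DescentTest {

    public static void main(String[] args) {
        int m = 1000;                            // number of training examples
        double w0 = 2.0, w1 = -3.0, w2 = 0.5;    // "true" intercept and coefficients (arbitrary)
        Random rng = new Random(42);

        // Design matrix [1, x1, x2] with x1 and x2 drawn independently (no collinearity),
        // and dependent vector y = w0 + w1*x1 + w2*x2.
        DoubleMatrix2D independent = new DenseDoubleMatrix2D(m, 3);
        DoubleMatrix1D dependent = new DenseDoubleMatrix1D(m);
        for (int i = 0; i < m; i++) {
            double x1 = rng.nextGaussian();
            double x2 = rng.nextGaussian();
            independent.set(i, 0, 1.0);
            independent.set(i, 1, x1);
            independent.set(i, 2, x2);
            dependent.set(i, w0 + w1 * x1 + w2 * x2);
        }

        // Start from all-zero thetas and run batch gradient descent.
        DoubleMatrix1D thetas = new DenseDoubleMatrix1D(3);
        double alpha = 0.1;
        for (int iter = 0; iter < 5000; iter++) {
            thetas = descent(alpha, thetas, independent, dependent);
        }

        // Should print values close to (2.0, -3.0, 0.5).
        System.out.println("recovered thetas: " + thetas);
    }

    // descent(...) from the question goes here, declared static for this sketch.
}

If the recovered thetas come out close to (w0, w1, w2), the descent step itself is working, and the earlier failure was indeed caused by how the inputs were generated.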

Regarding gradient descent linear regression in Java, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/14948503/
