MathPHP\Statistics\Regression\Methods\LeastSquares::leastSquares PHP Method

leastSquares() public method

Generalizing from a straight line (first degree polynomial) to a kᵗʰ degree polynomial:

    y = a₀ + a₁x + ⋯ + aₖxᵏ

Leads to equations in matrix form:

    [n     Σxᵢ     ⋯  Σxᵢᵏ  ] [a₀]   [Σyᵢ   ]
    [Σxᵢ   Σxᵢ²    ⋯  Σxᵢᵏ⁺¹] [a₁]   [Σxᵢyᵢ ]
    [ ⋮     ⋮      ⋱    ⋮   ] [ ⋮ ] = [  ⋮   ]
    [Σxᵢᵏ  Σxᵢᵏ⁺¹  ⋯  Σxᵢ²ᵏ ] [aₖ]   [Σxᵢᵏyᵢ]

This is a Vandermonde matrix:

    [1  x₁  ⋯  x₁ᵏ] [a₀]   [y₁]
    [1  x₂  ⋯  x₂ᵏ] [a₁]   [y₂]
    [⋮   ⋮  ⋱   ⋮ ] [ ⋮ ] = [ ⋮]
    [1  xₙ  ⋯  xₙᵏ] [aₖ]   [yₙ]

This can be written as the matrix equation: y = Xa
Premultiplying by the transpose Xᵀ gives the normal equations: Xᵀy = XᵀXa
Inverting yields the coefficient vector: a = (XᵀX)⁻¹Xᵀy

(http://mathworld.wolfram.com/LeastSquaresFittingPolynomial.html)

For reference, the traditional way to do simple (first-degree) least squares:

    m = (mean(x)·mean(y) − mean(xy)) / (mean(x)² − mean(x²))
    b = mean(y) − m·mean(x)
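The traditional first-degree formulas above can be sketched in plain PHP. This is an illustrative helper (`simpleLeastSquares` is not part of MathPHP), assuming equal-length input arrays:

```php
<?php
// Sketch (not MathPHP code): the "traditional" simple linear regression
// formulas from the docblock above, computed with plain PHP array functions.
//   m = (mean(x)·mean(y) − mean(xy)) / (mean(x)² − mean(x²))
//   b = mean(y) − m·mean(x)
function simpleLeastSquares(array $xs, array $ys): array
{
    $n      = count($xs);
    $meanX  = array_sum($xs) / $n;
    $meanY  = array_sum($ys) / $n;
    $meanXY = array_sum(array_map(fn($x, $y) => $x * $y, $xs, $ys)) / $n;
    $meanX2 = array_sum(array_map(fn($x) => $x * $x, $xs)) / $n;

    $m = ($meanX * $meanY - $meanXY) / ($meanX ** 2 - $meanX2);
    $b = $meanY - $m * $meanX;

    return [$m, $b];  // slope and intercept
}
```

For the points (1,2), (2,4), (3,6), (4,8) this yields m = 2, b = 0, matching the exact line y = 2x.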
public leastSquares ( array $ys, array $xs, integer $order = 1, integer $fit_constant = 1 ) : Matrix
$ys array y values
$xs array x values
$order integer The order of the polynomial: 1 = linear, 2 = quadratic (x²), etc.
$fit_constant integer 1 if a constant (intercept) term is fit in the regression; 0 otherwise.
Returns MathPHP\LinearAlgebra\Matrix [[m], [b]]
    public function leastSquares(array $ys, array $xs, int $order = 1, int $fit_constant = 1): Matrix
    {
        $this->reg_ys = $ys;
        $this->reg_xs = $xs;
        $this->fit_constant = $fit_constant;
        $this->p = $order;
        // Degrees of freedom: ν = n − p − (1 if a constant is fit)
        $this->ν = $this->n - $this->p - $this->fit_constant;
        // y = Xa
        $X = $this->createDesignMatrix($xs);
        $y = new ColumnVector($ys);
        // a = (XᵀX)⁻¹Xᵀy
        $Xᵀ = $X->transpose();
        $this->⟮XᵀX⟯⁻¹ = $Xᵀ->multiply($X)->inverse();
        $temp_matrix = $this->⟮XᵀX⟯⁻¹->multiply($Xᵀ);
        // Projection ("hat") matrix P = X(XᵀX)⁻¹Xᵀ, which maps y onto the fitted values
        $this->reg_P = $X->multiply($temp_matrix);
        // β̂ = (XᵀX)⁻¹Xᵀy — the regression coefficients
        $β_hat = $temp_matrix->multiply($y);
        // Ŷ = Xβ̂ — the fitted values
        $this->reg_Yhat = $X->multiply($β_hat)->getColumn(0);
        return $β_hat;
    }
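The algebra inside leastSquares() can be traced with hand-rolled matrix helpers instead of MathPHP's Matrix class. This is a minimal sketch for a first-degree fit with a constant term (`polyFitOrder1`, `transpose`, `matmul`, and `inverse2x2` are illustrative names, not library functions):

```php
<?php
// Sketch (not MathPHP code): the normal-equation solution a = (XᵀX)⁻¹Xᵀy
// for order = 1 with a fitted constant, using tiny matrix helpers so the
// steps of leastSquares() are visible.

// Transpose a matrix stored as an array of row arrays.
function transpose(array $A): array
{
    return array_map(null, ...$A);
}

// Multiply an m×n matrix by an n×p matrix.
function matmul(array $A, array $B): array
{
    $Bt = transpose($B);
    return array_map(
        fn($row) => array_map(
            fn($col) => array_sum(array_map(fn($a, $b) => $a * $b, $row, $col)),
            $Bt
        ),
        $A
    );
}

// Inverse of a 2×2 matrix — sufficient for order = 1 with a constant.
function inverse2x2(array $A): array
{
    [[$a, $b], [$c, $d]] = $A;
    $det = $a * $d - $b * $c;
    return [[$d / $det, -$b / $det], [-$c / $det, $a / $det]];
}

function polyFitOrder1(array $xs, array $ys): array
{
    $X  = array_map(fn($x) => [1, $x], $xs);  // design (Vandermonde) matrix
    $y  = array_map(fn($v) => [$v], $ys);     // column vector
    $Xᵀ = transpose($X);

    // a = (XᵀX)⁻¹Xᵀy
    $a = matmul(matmul(inverse2x2(matmul($Xᵀ, $X)), $Xᵀ), $y);

    return [$a[0][0], $a[1][0]];              // [a₀ (intercept), a₁ (slope)]
}
```

For the points (1,2), (2,4), (3,6), (4,8) this returns [0, 2]: intercept a₀ = 0 and slope a₁ = 2, the same line as the simple formulas above.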