Class OLSMultipleLinearRegression
- java.lang.Object
  - org.apache.commons.math.stat.regression.AbstractMultipleLinearRegression
    - org.apache.commons.math.stat.regression.OLSMultipleLinearRegression

All Implemented Interfaces:
- MultipleLinearRegression
public class OLSMultipleLinearRegression extends AbstractMultipleLinearRegression
Implements ordinary least squares (OLS) to estimate the parameters of a multiple linear regression model.
The regression coefficients, b, satisfy the normal equations:

    X^T X b = X^T y

To solve the normal equations, this implementation uses QR decomposition of the X matrix. (See QRDecompositionImpl for details on the decomposition algorithm.) The X matrix, also known as the design matrix, has rows corresponding to sample observations and columns corresponding to independent variables. When the model is estimated with an intercept term (i.e. when isNoIntercept is false, as it is by default), the X matrix includes an initial column identically equal to 1. We solve the normal equations as follows:

    X^T X b = X^T y
    (QR)^T (QR) b = (QR)^T y
    R^T (Q^T Q) R b = R^T Q^T y
    R^T R b = R^T Q^T y
    (R^T)^-1 R^T R b = (R^T)^-1 R^T Q^T y
    R b = Q^T y

Given Q and R, the last equation is solved by back-substitution.

- Since:
- 2.0
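
The typical workflow is to construct an instance, load sample data, and then query the estimators inherited from AbstractMultipleLinearRegression. A minimal sketch using only methods listed on this page; the wrapper class and the data values are purely illustrative:

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class OlsExample {
        public static void main(String[] args) {
            // Illustrative sample: 6 observations of y and 2 independent variables.
            double[] y = {11.0, 12.0, 13.0, 14.0, 15.0, 16.0};
            double[][] x = {
                {0.0, 0.0},
                {2.0, 0.0},
                {0.0, 3.0},
                {2.0, 3.0},
                {4.0, 1.0},
                {1.0, 4.0}
            };

            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();
            regression.newSampleData(y, x);  // computes and caches the QR decomposition of X

            // With the default intercept term, beta has x[0].length + 1 entries:
            // beta[0] is the intercept, beta[1..k] are the slopes.
            double[] beta = regression.estimateRegressionParameters();
            double[] residuals = regression.estimateResiduals();
            double sigma = regression.estimateRegressionStandardError();

            System.out.println("intercept = " + beta[0]);
            System.out.println("slopes    = " + beta[1] + ", " + beta[2]);
            System.out.println("residual standard error = " + sigma);
            System.out.println("first residual = " + residuals[0]);
        }
    }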
Constructor Summary
Constructors:
- OLSMultipleLinearRegression()
Method Summary
- double calculateAdjustedRSquared()
  Returns the adjusted R-squared statistic.
- RealMatrix calculateHat()
  Compute the "hat" matrix.
- double calculateResidualSumOfSquares()
  Returns the sum of squared residuals.
- double calculateRSquared()
  Returns the R-Squared statistic.
- double calculateTotalSumOfSquares()
  Returns the sum of squared deviations of Y from its mean.
- void newSampleData(double[] y, double[][] x)
  Loads model x and y sample data, overriding any previous sample.
- void newSampleData(double[] data, int nobs, int nvars)
  Loads model x and y sample data from a flat input array, overriding any previous sample.
Methods inherited from class org.apache.commons.math.stat.regression.AbstractMultipleLinearRegression
estimateErrorVariance, estimateRegressandVariance, estimateRegressionParameters, estimateRegressionParametersStandardErrors, estimateRegressionParametersVariance, estimateRegressionStandardError, estimateResiduals, isNoIntercept, setNoIntercept
Method Detail
newSampleData
public void newSampleData(double[] y, double[][] x)

Loads model x and y sample data, overriding any previous sample. Computes and caches the QR decomposition of the X matrix.

- Parameters:
- y - the [n,1] array representing the y sample
- x - the [n,k] array representing the x sample
- Throws:
- java.lang.IllegalArgumentException - if the x and y array data are not compatible for the regression
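
A brief sketch of how the intercept setting interacts with this method: with the default isNoIntercept() == false, a column of 1's is prepended to X, so one more parameter is estimated than there are columns in x. The wrapper class and data values are illustrative only:

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class InterceptSketch {
        public static void main(String[] args) {
            double[] y = {1.0, 2.0, 3.0, 4.0, 5.0};
            double[][] x = {{1.0}, {2.0}, {3.0}, {4.0}, {5.0}};

            // Default: isNoIntercept() is false, so X gets a leading column of 1's
            // and two parameters (intercept + slope) are estimated.
            OLSMultipleLinearRegression withIntercept = new OLSMultipleLinearRegression();
            withIntercept.newSampleData(y, x);
            System.out.println(withIntercept.estimateRegressionParameters().length);  // 2

            // With setNoIntercept(true), X is used as given and only the slope is estimated.
            OLSMultipleLinearRegression noIntercept = new OLSMultipleLinearRegression();
            noIntercept.setNoIntercept(true);
            noIntercept.newSampleData(y, x);
            System.out.println(noIntercept.estimateRegressionParameters().length);   // 1
        }
    }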
newSampleData
public void newSampleData(double[] data, int nobs, int nvars)

Loads model x and y sample data from a flat input array, overriding any previous sample. Assumes that rows are concatenated with y values first in each row. For example, an input data array containing the sequence of values (1, 2, 3, 4, 5, 6, 7, 8, 9) with nobs = 3 and nvars = 2 creates a regression dataset with two independent variables, as below:

    y   x[0]  x[1]
    ---------------
    1    2     3
    4    5     6
    7    8     9

Note that there is no need to add an initial unitary column (column of 1's) when specifying a model including an intercept term. If AbstractMultipleLinearRegression.isNoIntercept() is true, the X matrix will be created without an initial column of "1"s; otherwise this column will be added.

Throws IllegalArgumentException if any of the following preconditions fail:
- data cannot be null
- data.length = nobs * (nvars + 1)
- nobs > nvars

This implementation computes and caches the QR decomposition of the X matrix.

- Overrides:
- newSampleData in class AbstractMultipleLinearRegression
- Parameters:
- data - input data array
- nobs - number of observations (rows)
- nvars - number of independent variables (columns, not counting y)
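
For example, the flat layout from the table above can be loaded as follows; the wrapper class is illustrative, and the values 1-9 are the ones used in the description:

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class FlatDataSketch {
        public static void main(String[] args) {
            // Rows are concatenated with y first: (y, x[0], x[1]) for each observation.
            double[] data = {1, 2, 3,   // y = 1, x[0] = 2, x[1] = 3
                             4, 5, 6,   // y = 4, x[0] = 5, x[1] = 6
                             7, 8, 9};  // y = 7, x[0] = 8, x[1] = 9
            int nobs = 3;   // number of observations (rows)
            int nvars = 2;  // number of independent variables (not counting y)

            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();
            regression.newSampleData(data, nobs, nvars);  // requires data.length == nobs * (nvars + 1)

            // The call above is equivalent to newSampleData(y, x) with
            // y = {1, 4, 7} and x = {{2, 3}, {5, 6}, {8, 9}}.
            System.out.println("SSTO = " + regression.calculateTotalSumOfSquares());  // 18.0
        }
    }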
calculateHat
public RealMatrix calculateHat()
Compute the "hat" matrix.
The hat matrix is defined in terms of the design matrix X by X (X^T X)^-1 X^T.

The implementation here uses the QR decomposition to compute the hat matrix as Q I_p Q^T, where I_p is the p-dimensional identity matrix augmented by 0's. This computational formula is from "The Hat Matrix in Regression and ANOVA", David C. Hoaglin and Roy E. Welsch, The American Statistician, Vol. 32, No. 1 (Feb., 1978), pp. 17-22.
- Returns:
- the hat matrix
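
A common use of the hat matrix is to read the leverage of each observation off its diagonal; a brief sketch (wrapper class and data values illustrative), using the fact that for a full-rank model the leverages sum to the number of estimated parameters:

    import org.apache.commons.math.linear.RealMatrix;
    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class HatMatrixSketch {
        public static void main(String[] args) {
            double[] y = {2.0, 3.9, 6.1, 8.0, 10.2, 11.9};
            double[][] x = {{1.0}, {2.0}, {3.0}, {4.0}, {5.0}, {6.0}};

            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();
            regression.newSampleData(y, x);

            RealMatrix hat = regression.calculateHat();  // n x n matrix H = X (X^T X)^-1 X^T

            // The diagonal entries h_ii are the leverages; for a full-rank model they
            // sum to p, the number of estimated parameters (here 2: intercept + slope).
            double trace = 0.0;
            for (int i = 0; i < hat.getRowDimension(); i++) {
                System.out.println("leverage of observation " + i + " = " + hat.getEntry(i, i));
                trace += hat.getEntry(i, i);
            }
            System.out.println("sum of leverages = " + trace);  // approximately 2.0
        }
    }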
calculateTotalSumOfSquares
public double calculateTotalSumOfSquares()
Returns the sum of squared deviations of Y from its mean.
If the model has no intercept term, 0 is used for the mean of Y - i.e., what is returned is the sum of the squared Y values. The value returned by this method is the SSTO value used in the R-squared computation.

- Returns:
- SSTO - the total sum of squares
- Since:
- 2.2
- See Also:
AbstractMultipleLinearRegression.isNoIntercept()
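
A brief sketch of the two cases (wrapper class and data values illustrative):

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class TotalSumOfSquaresSketch {
        public static void main(String[] args) {
            double[] y = {1.0, 2.0, 3.0};
            double[][] x = {{1.0}, {2.0}, {3.0}};

            // With the default intercept term, SSTO is the sum of squared deviations
            // of Y from its mean: (1-2)^2 + (2-2)^2 + (3-2)^2 = 2.
            OLSMultipleLinearRegression withIntercept = new OLSMultipleLinearRegression();
            withIntercept.newSampleData(y, x);
            System.out.println(withIntercept.calculateTotalSumOfSquares());  // 2.0

            // With no intercept, 0 is used for the mean, so SSTO is the sum of
            // squared Y values: 1 + 4 + 9 = 14.
            OLSMultipleLinearRegression noIntercept = new OLSMultipleLinearRegression();
            noIntercept.setNoIntercept(true);
            noIntercept.newSampleData(y, x);
            System.out.println(noIntercept.calculateTotalSumOfSquares());  // 14.0
        }
    }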
calculateResidualSumOfSquares
public double calculateResidualSumOfSquares()
Returns the sum of squared residuals.

- Returns:
- residual sum of squares
- Since:
- 2.2
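
This is the sum of the squares of the residuals y - X b, so it can be cross-checked against AbstractMultipleLinearRegression.estimateResiduals(); a brief sketch (wrapper class and data values illustrative):

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class ResidualSumOfSquaresSketch {
        public static void main(String[] args) {
            double[] y = {1.1, 1.9, 3.2, 3.8, 5.1};
            double[][] x = {{1.0}, {2.0}, {3.0}, {4.0}, {5.0}};

            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();
            regression.newSampleData(y, x);

            // Sum the squared residuals by hand ...
            double ssrByHand = 0.0;
            for (double r : regression.estimateResiduals()) {
                ssrByHand += r * r;
            }

            // ... and compare with the value computed by the class (equal up to rounding).
            System.out.println(ssrByHand);
            System.out.println(regression.calculateResidualSumOfSquares());
        }
    }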
calculateRSquared
public double calculateRSquared()
Returns the R-Squared statistic, defined by the formula

    R^2 = 1 - SSR / SSTO

where SSR is the sum of squared residuals and SSTO is the total sum of squares.

- Returns:
- R-square statistic
- Since:
- 2.2
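
The formula can be verified directly from calculateResidualSumOfSquares() and calculateTotalSumOfSquares(); a brief sketch (wrapper class and data values illustrative):

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class RSquaredSketch {
        public static void main(String[] args) {
            double[] y = {2.1, 3.9, 6.2, 7.8, 10.1, 12.2};
            double[][] x = {{1.0}, {2.0}, {3.0}, {4.0}, {5.0}, {6.0}};

            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();
            regression.newSampleData(y, x);

            double ssr  = regression.calculateResidualSumOfSquares();
            double ssto = regression.calculateTotalSumOfSquares();

            // R^2 = 1 - SSR / SSTO
            System.out.println(1.0 - ssr / ssto);
            System.out.println(regression.calculateRSquared());  // same value, up to rounding
        }
    }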
calculateAdjustedRSquared
public double calculateAdjustedRSquared()
Returns the adjusted R-squared statistic, defined by the formula
    R^2_adj = 1 - [SSR (n - 1)] / [SSTO (n - p)]

where SSR is the sum of squared residuals, SSTO is the total sum of squares, n is the number of observations and p is the number of parameters estimated (including the intercept).

If the regression is estimated without an intercept term, what is returned is

    1 - (1 - calculateRSquared()) * (n / (n - p))

- Returns:
- adjusted R-Squared statistic
- Since:
- 2.2
- See Also:
AbstractMultipleLinearRegression.isNoIntercept()
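
Since SSR / SSTO = 1 - R^2, the with-intercept formula above can also be written as 1 - (1 - R^2)(n - 1) / (n - p); a brief check of that identity (wrapper class and data values illustrative):

    import org.apache.commons.math.stat.regression.OLSMultipleLinearRegression;

    public class AdjustedRSquaredSketch {
        public static void main(String[] args) {
            double[] y = {2.1, 3.9, 6.2, 7.8, 10.1, 12.2};
            double[][] x = {{1.0, 2.0}, {2.0, 1.0}, {3.0, 5.0},
                            {4.0, 3.0}, {5.0, 8.0}, {6.0, 4.0}};

            OLSMultipleLinearRegression regression = new OLSMultipleLinearRegression();
            regression.newSampleData(y, x);

            int n = y.length;            // number of observations
            int p = x[0].length + 1;     // parameters estimated, including the intercept

            double r2 = regression.calculateRSquared();
            double expected = 1.0 - (1.0 - r2) * (n - 1.0) / (n - p);

            System.out.println(expected);
            System.out.println(regression.calculateAdjustedRSquared());  // same value, up to rounding
        }
    }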