Tag Archives: multiple linear regression

Calculating prediction interval in multiple regression using Mahalanobis distance

On the TI Nspire, the prediction interval can easily be obtained from the Statistics function Confidence Interval > Multiple Reg Interval.


In the results section, the lower and upper prediction interval bounds are displayed for convenient reference.

Alternatively, using the Mahalanobis distance, this prediction interval can be calculated manually, as shown below.
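The idea can be sketched outside the calculator as well. The following Python version (a sketch, not the Nspire calculation itself; `prediction_interval` is a hypothetical name, and numpy/scipy are assumed) uses the standard identity that the leverage of a new point equals 1/n plus its squared Mahalanobis distance from the predictor centroid divided by n − 1:

```python
import numpy as np
from scipy import stats

def prediction_interval(X, y, x0, alpha=0.05):
    """Prediction interval at x0 via the Mahalanobis-distance form of the leverage."""
    n, k = X.shape
    Xd = np.column_stack([np.ones(n), X])      # design matrix with intercept
    beta = np.linalg.lstsq(Xd, y, rcond=None)[0]
    resid = y - Xd @ beta
    s2 = resid @ resid / (n - k - 1)           # residual variance
    # squared Mahalanobis distance of x0 from the predictor centroid
    S = np.cov(X, rowvar=False)                # sample covariance (n-1 denominator)
    diff = x0 - X.mean(axis=0)
    d2 = diff @ np.linalg.inv(np.atleast_2d(S)) @ diff
    # leverage identity: h0 = 1/n + d2/(n-1)
    h0 = 1.0 / n + d2 / (n - 1)
    yhat = np.concatenate([[1.0], x0]) @ beta
    t = stats.t.ppf(1 - alpha / 2, n - k - 1)
    half = t * np.sqrt(s2 * (1 + h0))
    return yhat - half, yhat + half
```

The leverage identity is what makes the Mahalanobis route equivalent to the usual x₀ᵀ(XᵀX)⁻¹x₀ form.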



White test in TI Nspire and R

The White test is a statistical test for heteroskedasticity in a data set; its null hypothesis is that the errors are homoskedastic. The test is based on the variance of the residual values. The TI Nspire is capable of computing this test even though it is not among the built-in functions, since the residual values can be recalled after running a regression. An example using multiple regression is shown below.

A scatter plot for visual inspection of heteroskedasticity.

The calculation for the data set in spreadsheet mode.

And in R.
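For reference, the auxiliary-regression form of the test can also be sketched in Python (a sketch under the usual textbook formulation, not the R or Nspire code from the post; `white_test` is a hypothetical name and numpy/scipy are assumed): regress the squared OLS residuals on the regressors, their squares, and cross products, then compare LM = n·R² against a chi-squared distribution.

```python
import numpy as np
from itertools import combinations_with_replacement
from scipy import stats

def white_test(X, y):
    """White's test: LM = n * R^2 from regressing squared OLS residuals
    on the regressors, their squares, and their cross products."""
    n, k = X.shape
    Xd = np.column_stack([np.ones(n), X])
    beta = np.linalg.lstsq(Xd, y, rcond=None)[0]
    u2 = (y - Xd @ beta) ** 2                  # squared residuals
    # auxiliary regressors: x_i * x_j for i <= j (squares and cross terms)
    aux = [X[:, i] * X[:, j]
           for i, j in combinations_with_replacement(range(k), 2)]
    Z = np.column_stack([np.ones(n), X] + aux)
    g = np.linalg.lstsq(Z, u2, rcond=None)[0]
    r2 = 1 - np.sum((u2 - Z @ g) ** 2) / np.sum((u2 - u2.mean()) ** 2)
    lm = n * r2
    dof = Z.shape[1] - 1                       # auxiliary regressors, excl. intercept
    return lm, 1 - stats.chi2.cdf(lm, dof)
```

A small p-value rejects homoskedasticity, matching how the result from the spreadsheet or R output is read.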



Multiple linear regression by weighted least squares on the Nspire

When performing data analysis, it is sometimes desirable to assign weights to selected data according to their perceived values. For example, data that are more reliable are assigned a higher weight, or a weight inversely proportional to the variance of that data value. This technique can be applied to multiple linear regression as well. In the more common regression method, ordinary least squares (OLS), all observed data carry the same weight. In weighted least squares (WLS), an arbitrary weight is assigned to each observation. WLS is a special case of the generalized least squares (GLS) method.

The regression analysis in the Nspire supports only the OLS method, so programming is required to use WLS. Fortunately, the Nspire's built-in programming supports accessing data stored in the spreadsheet application as lists and matrices, which the WLS calculation relies on heavily. Needless to say, the Nspire is good at performing matrix operations.

Similar to OLS, the WLS approach is based on minimizing a weighted sum of squared residuals, from which the parameters of the regression equation are obtained. In WLS, the estimate is given by

β = (XᵀΛ⁻¹X)⁻¹XᵀΛ⁻¹Y

where Λ is the covariance matrix used to determine the weights, which can be represented by the piecewise equation


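The β equation maps directly onto a few matrix operations. A minimal Python sketch (numpy assumed; `wls_beta` is a hypothetical name), under the common assumption that Λ is diagonal with entries 1/wᵢ, so that Λ⁻¹ is simply a diagonal matrix of the weights:

```python
import numpy as np

def wls_beta(X, y, w):
    """WLS estimate: beta = (X' Λ⁻¹ X)⁻¹ X' Λ⁻¹ y.

    Assumes Λ = diag(1/w_i), so Λ⁻¹ = diag(w_i): each observation's
    weight is the reciprocal of its error variance."""
    Lam_inv = np.diag(w)
    A = X.T @ Lam_inv @ X
    return np.linalg.solve(A, X.T @ Lam_inv @ y)
```

With all weights equal this reduces to the ordinary OLS estimate, which is a handy sanity check for a WLS program on the Nspire as well.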
The total sum of squares in WLS is given by


and the sum of squares error by


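The original figures with the exact formulas are not reproduced here; under the usual definitions (weighted squared deviations from the weighted mean for the total sum of squares, weighted squared residuals for the error sum of squares), the two quantities can be sketched in Python as follows (numpy assumed; `wls_sums_of_squares` is a hypothetical name):

```python
import numpy as np

def wls_sums_of_squares(X, y, w):
    """Weighted total and error sums of squares for a WLS fit.

    SST sums w_i * (y_i - weighted mean)^2; SSE sums w_i times the
    squared residuals from the WLS fitted values."""
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    ybar_w = np.average(y, weights=w)          # weighted mean of y
    sst = np.sum(w * (y - ybar_w) ** 2)
    sse = np.sum(w * (y - X @ beta) ** 2)
    return sst, sse
```

As in OLS, 1 − SSE/SST then serves as a weighted coefficient of determination for the fit.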
For visualization, the response planes of the regression equations obtained from a sample data set, by OLS with the Nspire's built-in multiple linear regression and by the WLS program respectively, are generated using the 3D function plot.