Simple Linear Regression
02-06-2018, 02:44 AM (This post was last modified: 02-06-2018 02:24 PM by Mike (Stgt).)
Post: #1
Simple Linear Regression
Many (most?) HP calculators have the L.R. function, and I assume almost all in this forum use it or at least know how to use it. The regression line y = a + bx (the model function) is fitted by minimizing the sum of the squared errors ei in the equation yi = a + bxi + ei. That implies (as I understand it) that for the (x, y) pairs you got from sampling, the x values are completely free of error; all measuring errors and stochastic effects are in the y values. (In other words, only the vertical distance from the measured points to the regression line is used to place the best-fit(?) line.)
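For anyone who wants to check against a computer, a minimal sketch of what L.R. computes, i.e. the least-squares line with vertical errors only (Python; the function name is my own, not from any calculator):

```python
# Ordinary least squares for y = a + b*x: only vertical distances
# (errors in y) are minimized, x is assumed error-free.
def linreg(xs, ys):
    n = len(xs)
    mx = sum(xs) / n                # mean of x
    my = sum(ys) / n                # mean of y
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx                   # slope
    a = my - b * mx                 # y-intercept
    return a, b

a, b = linreg([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
```

This is exactly the pair (a, b) the calculator's L.R. key returns.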

Question to the engineers: is this reasonable?

Question to the mathematicians: Why not use the horizontal distance, yi = a + b(xi + ei), or, if x and y are measured with similar error, why not fit using the perpendicular distance to the regression line?

Sceptics who think there is only one best-fit line: just mirror your points about the line y = x (exchange x and y) and do the L.R. once more. Only for correlation |r| = 1 will the mirrored slope be INV(unmirrored slope); otherwise it differs. For all who do not want to enter all pairs again, here is a little routine (HP-10C) to get the "mirrored" slope b' and y-intercept a'.
Code:
01-  42  0  MEAN
02-   45 3  RCL 3 
03-     20  *     
04-     34  x<>y  
05-  42 36  Last X
06-     20  *     
07-   45 4  RCL 4 
08-     30  -     
09-     34  x<>y  
10-   45 5  RCL 5 
11-     30  -     
12-     10  /     
13-   44 6  STO 6 
14-  42  0  MEAN
15-   45 6  RCL 6 
16-     20  *     
17-     30  -     
18-   44 7  STO 7 
19-   45 6  RCL 6 
20-     34  x<>y  
21-  22 00  RTN
(Same procedure as for L.R.: when all points are entered, press R/S; result a' is in X, b' in Y. Press L.R. for the usual a and b, or R/S again if you changed a point or some points.)
This is 'the other' best-fit line. The one with the sum of the squared perpendicular distance minimized is somewhere in between.
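To see the two lines diverge without re-keying all pairs, a sketch of the mirrored fit expressed as a line y = a' + b'x (Python; b' = Syy/Sxy is the standard x-on-y regression slope re-expressed in the original x-y plane, and it equals 1/b only when |r| = 1):

```python
# Regression of x on y, mirrored back into the x-y plane as y = a' + b'x.
# For |r| < 1 this slope b' is steeper than the usual y-on-x slope b.
def mirrored_linreg(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    b2 = syy / sxy          # mirrored slope b'
    a2 = my - b2 * mx       # mirrored intercept a'
    return a2, b2
```

With the same sample points as above, b' comes out larger than b, which is exactly the gap the HP-10C routine makes visible.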

One more question to the mathematicians: this error consideration (whether the error is in x only, in y only, or in both values) does not apply only to the linear model. What about the log, exp, and pwr best-fit models?
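One observation that may partly answer this: calculators typically fit the log, exp, and pwr models by transforming the data to a straight line first, so the vertical-error assumption ends up applying in the transformed coordinates, not the original ones. A sketch for the exp model (function name is my own assumption):

```python
import math

# The usual calculator approach to y = a * e^(b*x): linearize to
# ln y = ln a + b*x, then do ordinary least squares on (x, ln y).
# Note: this minimizes errors in ln y, not in y itself.
def exp_fit(xs, ys):
    lys = [math.log(y) for y in ys]
    n = len(xs)
    mx, ml = sum(xs) / n, sum(lys) / n
    b = (sum((x - mx) * (l - ml) for x, l in zip(xs, lys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(ml - b * mx)
    return a, b
```

So for these models the same question applies one level down: the error is assumed to sit in the transformed y variable.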

Ciao.....Mike

For those who understand German, there is more here; I did not yet find anything similar in English.
EDIT: in the above-mentioned link the formula for the slope for "Regression von X auf Y" is currently wrong; my routine is correct (well... not mathematically derived yet).
/M.
02-06-2018, 12:30 PM
Post: #2
RE: Simple Linear Regression
It is generally known as ‘orthogonal regression’ to statisticians and specifically as ‘Deming regression’ to the clinical chemistry world.
I work for SAS Institute and teach statistics to users of our JMP product. JMP provides this regression with several choices: estimate the variance ratio from data, assume the variance ratio is 1, or assume that all error is in X, not Y.
To answer your first question, there are many real-world cases that violate assumption of error in SLR. That is why orthogonal regression was developed.
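For the curious, a sketch of the Deming slope as I understand the standard formula (delta is the assumed ratio of y-error variance to x-error variance; delta = 1 reduces to orthogonal regression, and this is not JMP's implementation, just the textbook closed form):

```python
import math

# Deming regression: both x and y carry measurement error.
# delta = var(errors in y) / var(errors in x); delta = 1 is the
# orthogonal (perpendicular-distance) fit.
def deming(xs, ys, delta=1.0):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = ((syy - delta * sxx
          + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2))
         / (2 * sxy))
    a = my - b * mx
    return a, b
```

For delta = 1 the resulting slope lies between the y-on-x and x-on-y slopes, as expected.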
The HP calculators can’t possibly provide all of the regression techniques but we can program them for the one we need.
02-06-2018, 02:10 PM
Post: #3
RE: Simple Linear Regression
(02-06-2018 12:30 PM)mark4flies Wrote:  It is generally known as ‘orthogonal regression’ to statisticians and specifically as ‘Deming regression’ to the clinical chemistry world.

Thank you for your explanation. These two terms were what I was missing; now I can get a step or two further.

Ciao.....Mike
02-06-2018, 02:41 PM
Post: #4
RE: Simple Linear Regression
The book "Curve Fitting For Programmable Calculators" (William M. Kolb) presents a method for calculating a linear regression based on minimizing the error distance perpendicular to the regression line, which he refers to as isotonic linear regression. The formulas are simple enough that you could likely implement them on just about any programmable calculator with summation statistics and a few storage registers.
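For reference, a sketch of such a perpendicular-distance fit computed directly from the six summation registers a calculator keeps (the function name and exact form are my own assumptions, not necessarily Kolb's formulas):

```python
import math

# Perpendicular-distance (orthogonal) fit y = a + b*x from the usual
# summation-statistics registers: n, Σx, Σy, Σx², Σy², Σxy.
def perp_fit(n, sx, sy, sxx, syy, sxy):
    A = (syy - sy * sy / n) - (sxx - sx * sx / n)   # Syy - Sxx (central sums)
    B = sxy - sx * sy / n                           # Sxy (central sum)
    b = (A + math.sqrt(A * A + 4 * B * B)) / (2 * B)
    a = (sy - b * sx) / n
    return a, b
```

Since it only needs the six registers and a square root, this is indeed within reach of almost any programmable with summation statistics.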
02-06-2018, 03:04 PM
Post: #5
RE: Simple Linear Regression
(02-06-2018 02:41 PM)Dave Britten Wrote:  The book "Curve Fitting For Programmable Calculators" (William M. Kolb) presents a method for calculating a linear regression based on minimizing error distance perpendicular to the regression line - what he refers to as isotonic linear regression.

Thank you, found it on p. 21 f.
