Simple Linear Regression

02-06-2018, 02:44 AM
(This post was last modified: 02-06-2018 02:24 PM by Mike (Stgt).)
Post: #1




Simple Linear Regression
Many (most?) HP calculators have the L.R. function, and I assume almost all in this forum use it or at least know how to use it. The regression line y = a + bx (the model function) is fitted by minimizing the sum of the squared errors e_i in the equation y_i = a + b*x_i + e_i. That implies (as I understand it) that for the (x, y) pairs you got from sampling, the x values are completely free of error; all measuring errors and stochastic effects are in the y value. (In other words, only the vertical distance from the measured points to the regression line is used to place the best-fit(?) line.)
Question to the engineers: is this reasonable? Question to the mathematicians: why not use the horizontal distance, y_i = a + b(x_i + e_i), or, if x and y are metered with similar error, why not the perpendicular distance to the regression line? Sceptics who think there is only one best-fit line: just mirror your points at the y = x line (exchange x and y) and do the L.R. once more. Only for correlation r = 1 will the mirrored slope be INV(unmirrored slope); otherwise it differs.

For all who do not want to enter all the pairs again, here is a little routine (HP-10C) to get the "mirrored" slope b' and y-intercept a'.

Code:
01 42 0 MEAN

This is 'the other' best-fit line. The one with the sum of the squared perpendicular distances minimized is somewhere in between.

One more question to the mathematicians: this error consideration (whether the error is in x only, y only, or both values) does not only apply to the linear model. What about the log, exp, and pwr best-fit models?

Ciao.....Mike

For those who understand German, more here; I did not yet find anything similar in English.

EDIT: in the above-mentioned link the formula for the slope for "Regression von X auf Y" is currently wrong; my routine is correct (well... not mathematically deduced yet). /M.

02-06-2018, 12:30 PM
Post: #2




RE: Simple Linear Regression
It is generally known as ‘orthogonal regression’ to statisticians and specifically as ‘Deming regression’ to the clinical chemistry world.
I work for SAS Institute and teach statistics to users of our JMP product. JMP provides this regression with several choices: estimate the variance ratio from the data, assume the variance ratio is 1, or assume that all error is in X, not Y. To answer your first question: there are many real-world cases that violate the SLR assumption that all error is in Y. That is why orthogonal regression was developed. The HP calculators can't possibly provide every regression technique, but we can program the one we need.
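For anyone who wants to program it, here is a sketch of Deming regression using the standard closed-form slope. The parameter `delta` is the ratio of the y-error variance to the x-error variance: `delta = 1` gives orthogonal regression, and a very large `delta` (x essentially error-free) recovers the ordinary y-on-x fit. This is a generic textbook formulation, not JMP's implementation.

```python
import math

def deming(x, y, delta=1.0):
    """Deming regression y = a + b*x with errors in both variables.

    delta = var(y errors) / var(x errors).
    delta = 1   -> orthogonal regression (perpendicular distances).
    delta -> oo -> ordinary least squares of y on x.
    """
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((u - mx) ** 2 for u in x) / (n - 1)
    syy = sum((v - my) ** 2 for v in y) / (n - 1)
    sxy = sum((u - mx) * (v - my) for u, v in zip(x, y)) / (n - 1)
    # Closed-form slope; take the root giving the minimizing line.
    b = (syy - delta * sxx
         + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)
         ) / (2 * sxy)
    a = my - b * mx
    return a, b
```

Estimating `delta` from the data requires replicate measurements; with a single (x, y) pair per point you have to assume a ratio, which is exactly the choice JMP exposes.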

02-06-2018, 02:10 PM
Post: #3




RE: Simple Linear Regression  
02-06-2018, 02:41 PM
Post: #4




RE: Simple Linear Regression
The book "Curve Fitting For Programmable Calculators" (William M. Kolb) presents a method for calculating a linear regression based on minimizing the error distance perpendicular to the regression line, which he refers to as isotonic linear regression. The formulas are simple enough that you could likely implement them on just about any programmable calculator with summation statistics and a few storage registers.
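Without Kolb's exact formulas at hand, here is the standard closed form for the perpendicular-distance line, written to use only the six quantities a statistics-capable calculator already accumulates (n, Σx, Σy, Σx², Σy², Σxy), so it fits the "few storage registers" claim:

```python
import math

def perp_fit(n, sx, sy, sxx, syy, sxy):
    """Perpendicular (orthogonal) best-fit line y = a + b*x,
    computed from the six summation registers.

    Standard closed form, not necessarily Kolb's exact formulas.
    """
    # Central second moments from the raw sums.
    cxx = sxx - sx * sx / n
    cyy = syy - sy * sy / n
    cxy = sxy - sx * sy / n
    # The slope solves cxy*b^2 + (cxx - cyy)*b - cxy = 0; the two roots
    # are b and -1/b (perpendicular directions). This root minimizes
    # the perpendicular sum of squares.
    b = (cyy - cxx + math.sqrt((cyy - cxx) ** 2 + 4 * cxy ** 2)) / (2 * cxy)
    a = (sy - b * sx) / n
    return a, b
```

On a calculator you would replace the function arguments with recalls of the statistics registers; everything else is a handful of arithmetic steps and one square root.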


02-06-2018, 03:04 PM
Post: #5




RE: Simple Linear Regression
(02-06-2018 02:41 PM)Dave Britten Wrote: The book "Curve Fitting For Programmable Calculators" (William M. Kolb) presents a method for calculating a linear regression based on minimizing the error distance perpendicular to the regression line, which he refers to as isotonic linear regression.

Thank you, found it on p. 21 f.
