HP Forums

Full Version: Math Challenge: Deming Regression for 2 independent variables
Hi All,

It's easy to find the Deming regression for two variables, where both variables have errors in their measurements. You can even find HP calculator code for the simple Deming regression on TOS.

My challenge for you is to develop the equations for a Deming Multiple Regression.

Good luck to us all!

Namir
Namir,

It has been so long I've forgotten what TOS is.

Have you developed equations for a Deming regression? Do they use summary statistics, or complex variables like the example on Wikipedia?

I derived equations around 1967 when I was sitting around on Air Force time in the wee hours of the morning.

The little routine "Orth" I posted in this thread:

https://www.hpmuseum.org/forum/thread-17...=rosenbaum

uses those equations. Do you have something in mind that does not use summary statistics?
I am looking for a version of the Deming regression for two or more independent variables. What is available online are equations for a single independent variable; I have found nothing on the internet for what I am looking for.

After giving it much thought I have come to accept that multiple regression equations should be able to work with observations that have errors in all variables.

Namir
Hi, Namir,

Please remind me of what TOS is, and give me a link to some calculator code for the single independent variable orthogonal regression case. :-)

I find methods on the web using Principal Components Analysis like this:
https://www.mathworks.com/help/stats/fit...lysis.html

You posed a challenge "to develop the equations for a Deming Multiple Regression". Sometimes when a challenge is posted, the poster already has the solution to the challenge. Do you have the solution already? Do you have any hints?
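The PCA approach in that MathWorks link can be sketched in a few lines of numpy. This is a minimal illustration, not anyone's posted solution: the plane coefficients and noise levels are made up for the example. The idea is that the orthogonal best-fit plane passes through the centroid, and its normal is the direction of least variance (the last right singular vector of the centered data).

```python
import numpy as np

# Hypothetical noisy 3D data: points near the plane z = 2x + 3y + 1,
# with measurement error added to all three coordinates.
rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, 200)
y = rng.uniform(-5, 5, 200)
z = 2 * x + 3 * y + 1
pts = np.column_stack([x, y, z]) + rng.normal(0, 0.2, (200, 3))

# Orthogonal (PCA / total least squares) plane fit:
# center the data, then take the singular vector of least variance
# as the plane normal.
centroid = pts.mean(axis=0)
_, _, vt = np.linalg.svd(pts - centroid)
normal = vt[-1]  # unit normal of the best-fit plane

# Convert n . (p - centroid) = 0 back to z = a*x + b*y + c form.
a = -normal[0] / normal[2]
b = -normal[1] / normal[2]
c = normal @ centroid / normal[2]
```

With the made-up data above, a, b, c land close to the true 2, 3, 1; the sign ambiguity of the singular vector cancels in the ratios.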
(01-02-2023 01:50 AM)Rodger Rosenbaum Wrote: [ -> ]Please remind me of what TOS is...

It's "that other site" which this site does not link to or mention by name. In an earlier discussion we see some backstory and the advice

Quote:Search for HP-41C at Wikipedia and take a look at the External Links section
Based on Deming Regression Calculator, for 2 dimensions.

Line: y = p*x + q

G(p, q) = 1/(1+p²) * ∑(yi - p*xi - q)²

minimize G      → ∂G/∂q = ∂G/∂p = 0

For multiple dimensions, G must be symmetric, and it should reduce to the above in the 2-dimensional case.
My guess is that it is the same formula, except that p and xi are vectors and p*xi is a dot product.

I'd suggest we make up an example and check it ...
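Here is one quick made-up check of the 2-dimensional case in numpy (the data and noise levels are invented for the example): compute the closed-form orthogonal-regression slope from summary statistics, then confirm by perturbation that it minimizes G(p, q) as defined above.

```python
import numpy as np

# Made-up example: points near y = 2x + 1 with error in both coordinates.
rng = np.random.default_rng(42)
t = np.linspace(0, 10, 50)
x = t + rng.normal(0, 0.3, 50)
y = 2.0 * t + 1.0 + rng.normal(0, 0.3, 50)

def G(p, q):
    """Sum of squared perpendicular distances to the line y = p*x + q."""
    return np.sum((y - p * x - q) ** 2) / (1.0 + p * p)

# Closed-form orthogonal-regression slope from summary statistics
# (Deming regression with equal error variances, delta = 1).
sxx, syy = np.var(x), np.var(y)
sxy = np.mean((x - x.mean()) * (y - y.mean()))
p = (syy - sxx + np.hypot(syy - sxx, 2 * sxy)) / (2 * sxy)
q = y.mean() - p * x.mean()

# If (p, q) is the minimizer, perturbing it should never decrease G.
g0 = G(p, q)
assert all(G(p + dp, q + dq) >= g0
           for dp in (-1e-3, 0, 1e-3) for dq in (-1e-3, 0, 1e-3))
```

The recovered slope and intercept come out near the true 2 and 1, and every perturbation of (p, q) increases G, consistent with the minimization above.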
(01-02-2023 01:50 AM)Rodger Rosenbaum Wrote: [ -> ]Please remind me of what TOS is, and give me a link to some calculator code...

Here is the link to Wikipedia. The TOS has HP-41C code for the Deming Regression. I don't think I can post a link on this web site.

Namir
(01-02-2023 01:50 AM)Rodger Rosenbaum Wrote: [ -> ]Please remind me of what TOS is, and give me a link to some calculator code...

Thanks for the Matlab link. I think it gives me what I am looking for. When I posted the challenge I had no solution. I was hoping that someone (like yourself) had a clever path to the solution, and you did! Thank you.

I will be searching the web for "Orthogonal Regression" to look at additional information.

Namir
This is a rather common problem, but the computations are not as simple as in the two-dimensional case where one variable is exact. The problem of "best fit" to curves in higher dimensions seems to be a popular one.

https://www.sciencedirect.com/science/ar...2703004485

https://ieeexplore.ieee.org/stamp/stamp....er=8674762

One can find other discussions looking for "orthogonal least squares."
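For the multiple-regressor case, one standard "errors in all variables" formulation is classical total least squares via the SVD of the augmented matrix [X | y]. The sketch below is illustrative only (the coefficients and noise are made up), and it coincides with the orthogonal fit only under the assumption that every column has equal error variance.

```python
import numpy as np

def tls_fit(X, y):
    """Total least squares for y ~ X @ beta + c with errors in all columns.

    Uses the SVD of the centered augmented matrix [X | y]: the right
    singular vector for the smallest singular value is proportional to
    (beta, -1), which gives the slope vector directly.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    _, _, vt = np.linalg.svd(np.column_stack([Xc, yc]))
    v = vt[-1]                    # direction of least variance
    beta = -v[:-1] / v[-1]        # slopes
    c = y.mean() - X.mean(axis=0) @ beta  # line passes through the centroid
    return beta, c

# Made-up data: y = 2*x1 - 1.5*x2 + 4, with noise in every column.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, (300, 2))
y = X @ np.array([2.0, -1.5]) + 4.0
Xn = X + rng.normal(0, 0.05, X.shape)
yn = y + rng.normal(0, 0.05, 300)
beta, c = tls_fit(Xn, yn)
```

On this example the recovered beta and c land close to the true (2, -1.5) and 4; with unequal error variances the columns would first have to be rescaled, which is where the general Deming weighting comes in.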