(32SII) Curve Fitting
Introduction

The curve fitting program uses the built-in linear regression functions to determine the parameters b ("intercept") and m ("slope") of non-linear curves using the following transformations. Each entry is written as ( transformed x, transformed y, intercept, slope ), where the last two give the curve's parameters in terms of the fitted line's intercept b and slope m (for example, e^b means the curve's intercept is e raised to the fitted intercept):

Logarithmic Regression: y = b + m * ln x
Transformation: ( ln x, y, b, m )

Inverse Regression: y = b + m / x
Transformation: ( 1/x, y, b, m )

Exponential Regression: y = b * e^(m * x)
Transformation: ( x, ln y, e^b, m )

Power Regression: y = b * x^m
Transformation: ( ln x, ln y, e^b, m )

Geometric (Exponent) Regression: y = b * m^x
Transformation: ( x, ln y, e^b, e^m )

Simple Logistic Regression: y = 1 / (b + m * e^(-x))
Transformation: ( e^(-x), 1/y, b, m )
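
As a cross-reference, here is a minimal sketch (in Python, not 32SII code) of the same idea: each model is fitted by running ordinary linear regression on transformed data and then undoing the transformation on the intercept and slope. The names linreg, MODELS, and fit_curve are illustrative only.

Code:
import math

def linreg(xs, ys):
    """Least-squares line y = B + M*x; returns (B, M, r)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    M = sxy / sxx
    return my - M * mx, M, sxy / math.sqrt(sxx * syy)

identity = lambda t: t

# (x-transform, y-transform, recover b from intercept, recover m from slope)
MODELS = {
    "logarithmic": (math.log,               identity,        identity, identity),  # y = b + m ln x
    "inverse":     (lambda x: 1 / x,        identity,        identity, identity),  # y = b + m / x
    "exponential": (identity,               math.log,        math.exp, identity),  # y = b e^(m x)
    "power":       (math.log,               math.log,        math.exp, identity),  # y = b x^m
    "geometric":   (identity,               math.log,        math.exp, math.exp),  # y = b m^x
    "logistic":    (lambda x: math.exp(-x), lambda y: 1 / y, identity, identity),  # y = 1/(b + m e^(-x))
}

def fit_curve(model, xs, ys):
    fx, fy, gb, gm = MODELS[model]
    B, M, r = linreg([fx(x) for x in xs], [fy(y) for y in ys])
    return gb(B), gm(M), r   # curve parameters b, m and the correlation of the linearized data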

HP 32SII Program: Curve Fitting

Note:
1. This can be adapted to the HP 35S under one label; just take note of where the label points are.
2. The total amount of bytes used is 90.
3. Flags 1 and 2 are used. If flag 1 is set, e^m is returned as the slope; if flag 2 is set, e^b is returned as the intercept.

Program:
Code:
// Initialize - LBL X
LBL X
CF 1
CF 2
CLΣ
0
RTN

// Calculation - LBL Y (b, m, and r are the built-in linear regression results)
Code:
LBL Y
b
FS? 2
e^x
STO B
VIEW B
m
FS? 1
e^x
STO M
VIEW M
r
STO R
VIEW R
RTN
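
In Python terms, the flag handling in LBL Y amounts to the following (a sketch only; B, M, and r stand for the linear regression results, and the function name recover is made up for illustration):

Code:
import math

def recover(B, M, r, flag1, flag2):
    # Mirror of LBL Y: flag 2 set -> intercept is e^B, flag 1 set -> slope is e^M
    b = math.exp(B) if flag2 else B
    m = math.exp(M) if flag1 else M
    return b, m, r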

// Logarithmic Regression - LBL L
Code:
LBL L
LN 
R/S
GTO L

// Inverse Regression - LBL I
Code:
LBL I
1/x
R/S
GTO I

// Exponential Regression - LBL E
Code:
LBL E
SF 2
x<>y
LN 
x<>y
R/S
GTO E

// Power Regression - LBL P
Code:
LBL P
SF 2
LN 
x<>y
LN 
x<>y
R/S
GTO P

// Geometric/Exponent Regression - LBL G
Code:
LBL G
SF 1
SF 2
x<>y
LN
x<>y
R/S 
GTO G

// Simple Logistic Regression - LBL S
Code:
LBL S
+/-
e^x
x<>y
1/x 
x<>y
R/S
GTO S

Instructions:
1. Clear the statistics data and flags by pressing [XEQ] X.
2. Enter each data point, run the appropriate label, and press [ Σ+ ] (or [ Σ- ] to remove a point).

For example, for Logarithmic fit:
y_data [ENTER] x_data [XEQ] L [ Σ+ ]

Subsequent Data:
y_data [ENTER] x_data [R/S] [ Σ+ ]

This scheme also allows a data point to be undone:
y_data [ENTER] x_data [XEQ] L [ Σ- ]

3. To calculate the intercept (B), slope (M), and correlation (R), press [XEQ] Y.
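
For readers following along off-calculator, here is a rough Python analogue of this workflow (a sketch using the usual summation-register formulas; the class name SigmaAccumulator is invented for illustration). Each point is transformed first (the job of LBL L, I, E, P, G, or S), accumulated with Σ+, optionally removed with Σ-, and the line is computed from the running sums, as LBL Y does:

Code:
import math

class SigmaAccumulator:
    """Running statistics registers, like the 32SII's [Σ+] / [Σ-]."""
    def __init__(self):
        self.n = self.sx = self.sy = self.sxx = self.syy = self.sxy = 0.0

    def sigma_plus(self, x, y):        # [Σ+]
        self.n += 1
        self.sx += x; self.sy += y
        self.sxx += x*x; self.syy += y*y; self.sxy += x*y

    def sigma_minus(self, x, y):       # [Σ-] undoes a point
        self.n -= 1
        self.sx -= x; self.sy -= y
        self.sxx -= x*x; self.syy -= y*y; self.sxy -= x*y

    def line(self):                    # intercept B, slope M, correlation R
        sxx = self.sxx - self.sx**2 / self.n
        syy = self.syy - self.sy**2 / self.n
        sxy = self.sxy - self.sx * self.sy / self.n
        M = sxy / sxx
        return (self.sy - M * self.sx) / self.n, M, sxy / math.sqrt(sxx * syy)

# Logarithmic fit: transform x with ln (the LBL L step), then accumulate.
acc = SigmaAccumulator()
for x, y in [(33.8, 102.4), (34.6, 103.8), (36.1, 105.1), (37.8, 106.9)]:
    acc.sigma_plus(math.log(x), y)     # y [ENTER] x [XEQ] L [Σ+]
B, M, R = acc.line()                   # [XEQ] Y: B ≈ -33.458, M ≈ 38.650, R ≈ 0.9941 (Example 1 below)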

Examples

All results are rounded.

Example 1: Logarithmic Regression
Data (x,y):
(33.8, 102.4)
(34.6, 103.8)
(36.1, 105.1)
(37.8, 106.9)

Results:
B: -33.4580
M: 38.6498
R: 0.9941

y ≈ -33.4580 + 38.6498 ln x

Example 2: Inverse Regression
Data (x,y):
(100, 425)
(105, 429)
(110, 444)
(115, 480)

Results:
B: 823.80396
M: -40664.72143
R: -0.91195

y ≈ 823.80396 - 40664.72143/x

Example 3: Simple Logistic Regression
Data (x,y):
(1, 11)
(1.3, 9.615)
(1.6, 8.75)
(1.9, 8.158)
(2.6, 7.308)

Results:
B: 0.14675
M: -0.15487
R: -0.99733

y ≈ 1 / (0.14675 - 0.15487*e^(-x))
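
As a quick cross-check of the logistic transformation, the same numbers fall out of Python's standard statistics module (3.10+), fitting a line to ( e^(-x), 1/y ) as in the table above:

Code:
import math, statistics

data = [(1, 11), (1.3, 9.615), (1.6, 8.75), (1.9, 8.158), (2.6, 7.308)]
xs = [math.exp(-x) for x, _ in data]   # x' = e^(-x)
ys = [1 / y for _, y in data]          # y' = 1/y
m, b = statistics.linear_regression(xs, ys)   # slope, intercept of the linearized data
r = statistics.correlation(xs, ys)
# b ≈ 0.14675, m ≈ -0.15487, r ≈ -0.99733  ->  y ≈ 1 / (0.14675 - 0.15487 e^(-x))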


Blog link: https://edspi31415.blogspot.com/2019/09/...tting.html