dttom Posted September 15, 2010 (edited)

I am not sure that calling the topic 'linear regression' is right; it is really about 'least squares analysis' for constructing a best-fit line, but when I googled the phrase 'linear regression' some relevant information popped up. So assume we have a set of data, y values corresponding to x values; plotted as y vs. x they appear to form a straight line, and now we want the best-fit line. The method is to select an m value and a c value, with y = mx + c, to minimize di = sum[(actual y)-(computed y)]^2. I can do this step. What I have difficulty with is finding the standard deviations of m and c.

Sy^2 = di/(n-2)

In a lecture I heard:

Sm^2 = (Sy^2)(n/D), where n is the number of observations and D = n*sum(Xi^2) - (sum(Xi))^2, and

Sc^2 = (Sy^2)(sum(Xi^2)/D)

I do not know how these equations are derived. There is no explanation in the lecture, which simply uses them in applications, while I am interested in the derivation. I googled it, and some sources suggested Sm^2 = (Sy^2)(sum((dm/dyi)^2)) (a partial-differentiation, i.e. error-propagation, argument); a similar formula holds for Sc^2 with dm/dyi replaced by dc/dyi. Again I do not know how these are derived. Could somebody help?

Edited September 15, 2010 by dttom
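For what it's worth, here is a minimal numerical sketch of those formulas (mine, not from the lecture; the data are made up for illustration). The step it checks is the one the question asks about: m is a linear combination of the yi, m = sum(wi*yi) with wi = (n*xi - sum(x))/D, so propagating an equal uncertainty Sy through each yi gives Sm^2 = Sy^2 * sum(wi^2) = Sy^2 * (n/D), and the same pattern gives Sc^2.

import numpy as np

# Toy data (made up for illustration): roughly y = 2x + 1 plus noise
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 10.9])
n = len(x)

# Closed-form least-squares fit of y = m*x + c
D = n * np.sum(x**2) - np.sum(x)**2
m = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / D
c = (np.sum(x**2) * np.sum(y) - np.sum(x) * np.sum(x * y)) / D

# Residual variance with n - 2 degrees of freedom (two fitted parameters)
Sy2 = np.sum((y - (m * x + c))**2) / (n - 2)

# Standard deviations of m and c from the formulas quoted in the post
Sm2 = Sy2 * n / D
Sc2 = Sy2 * np.sum(x**2) / D

# Error-propagation check: m = sum(w_i * y_i) with w_i = (n*x_i - sum(x)) / D,
# so Sm^2 = Sy^2 * sum(w_i^2); likewise c = sum(v_i * y_i) with
# v_i = (sum(x^2) - x_i * sum(x)) / D, so Sc^2 = Sy^2 * sum(v_i^2)
w = (n * x - np.sum(x)) / D
v = (np.sum(x**2) - x * np.sum(x)) / D
assert np.isclose(Sm2, Sy2 * np.sum(w**2))
assert np.isclose(Sc2, Sy2 * np.sum(v**2))

print(f"m = {m:.3f} +/- {np.sqrt(Sm2):.3f}")
print(f"c = {c:.3f} +/- {np.sqrt(Sc2):.3f}")

The asserts pass because sum(w_i^2) works out algebraically to n/D and sum(v_i^2) to sum(x_i^2)/D, which is exactly where the lecture's formulas come from.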
ewmon Posted September 17, 2010

The LSM (Least Squares Method) is a form of linear regression.

minimize di = sum[(actual y)-(computed y)]^2

I think this should say: minimize di = sum{[(actual y)-(computed y)]^2}

I've studied the LSM, and I don't recall a standard deviation of m or c.
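To make ewmon's bracket point concrete, a quick sketch with made-up residuals: the sum of the squared residuals (the quantity least squares minimizes) is not the same as the square of the summed residuals.

import numpy as np

r = np.array([1.0, -2.0, 0.5])        # made-up residuals: (actual y) - (computed y)

sum_of_squares = np.sum(r**2)         # sum{[(actual y)-(computed y)]^2} -> 5.25
square_of_sum = np.sum(r)**2          # [sum((actual y)-(computed y))]^2 -> 0.25

print(sum_of_squares, square_of_sum)  # different quantities; LSM minimizes the first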