Least Squares Regression: Some Notes

  • November 2019


Least-Squares Regression | SHUBLEKA

  • A regression line is a line that describes how a response variable y changes as an explanatory variable x changes. We often use a regression line to predict the value of y for a given value of x.
  • The regression line is a mathematical model for the data, much like a density curve.

y = a + bx

  • y = response variable
  • x = explanatory variable
  • b = slope, the amount by which y changes when x increases by 1 unit
  • a = y-intercept, the value of y when x = 0
  • Correct interpretation of the slope and y-intercept is especially important for every linear regression model.
  • Prediction: we can use the regression model to predict the response y for a specific value of the explanatory variable x.
  • Extrapolation: the use of a regression line for prediction outside the range of values of the explanatory variable x used to obtain the line. Such predictions are often not accurate.
  • Least-squares regression: Error = Observed − Predicted. A least-squares regression line makes the sum of the squared vertical distances from the data points to the line as small as possible. Equation:

ŷ = a + bx,  where  b = r · (s_y / s_x)  and  a = ȳ − b·x̄

  • Note: the slope and intercept of the regression line depend on the units of measurement, so we cannot draw conclusions from their magnitudes alone.
  • The least-squares regression line always passes through the point (x̄, ȳ).
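As a quick illustration of the formulas above (a minimal sketch using NumPy and made-up example data, not data from these notes), the slope and intercept can be computed directly from r, s_x, s_y, x̄, and ȳ, and cross-checked against a library least-squares fit:

```python
import numpy as np

# Hypothetical example data (not from the notes).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

r = np.corrcoef(x, y)[0, 1]             # correlation between x and y
b = r * y.std(ddof=1) / x.std(ddof=1)   # slope: b = r * s_y / s_x
a = y.mean() - b * x.mean()             # intercept: a = ybar - b * xbar

# The least-squares line always passes through (xbar, ybar).
assert np.isclose(a + b * x.mean(), y.mean())

# Cross-check against NumPy's degree-1 least-squares polynomial fit.
b_np, a_np = np.polyfit(x, y, 1)
assert np.isclose(b, b_np) and np.isclose(a, a_np)

# Prediction at x = 3.5 (inside the observed range, so not extrapolation).
y_pred = a + b * 3.5
```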

  • The square of the correlation, r², is the fraction of the variation in the y values that is explained by the least-squares regression of y on x.
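This fact can be verified numerically (again a sketch with hypothetical data, assuming NumPy): r² equals the explained variation divided by the total variation in y.

```python
import numpy as np

# Hypothetical example data (not from the notes).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

b, a = np.polyfit(x, y, 1)   # least-squares slope and intercept
y_hat = a + b * x            # predicted (fitted) values

ss_total = np.sum((y - y.mean()) ** 2)          # total variation in y
ss_explained = np.sum((y_hat - y.mean()) ** 2)  # variation explained by the line

r = np.corrcoef(x, y)[0, 1]
# The square of the correlation equals the explained fraction of variation.
assert np.isclose(r ** 2, ss_explained / ss_total)
```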
