Regression

This post covers regression from Statistics for Engineers and Scientists by William Navidi.

Basic Ideas

  • The Least-Squares Line

    • In this section we will learn how to compute the least-squares line and how it can be used to draw conclusions from data.

    • A spring is hung vertically with the top end fixed, and weights are hung one at a time from the other end.

    • After each weight is hung, the length of the spring is measured. Let $x_1,\ldots , x_n$ represent the weights, and let $l_i$ represent the length of the spring under the load $x_i$. Hooke’s law states that


      \(\begin{align*} l_i &= \beta_0 + \beta_1 x_i \\ \end{align*}\)

      where $\beta_0$ is the length of the spring when unloaded and $\beta_1$ is the spring constant. Let $y_i$ be the measured length of the spring under load $x_i$. Because of measurement error, $y_i$ will differ from the true length $l_i$. We write \(\begin{align*} y_i &= l_i + \epsilon_i\\ \end{align*}\)

      where $\epsilon_i$ is the error in the $i^{th}$ measurement. Combining the prior two equations, we obtain \(\begin{align*} y_i &= \beta_0 + \beta_1 x_i + \epsilon_i\\ \end{align*}\) In this equation, $y_i$ is called the dependent variable, $x_i$ is called the independent variable, $\beta_0$ and $\beta_1$ are the regression coefficients, and $\epsilon_i$ is called the error. This equation is called a linear model.
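The least-squares estimates of $\beta_0$ and $\beta_1$ can be computed directly from the standard formulas $\hat{\beta}_1 = \sum(x_i - \bar{x})(y_i - \bar{y}) / \sum(x_i - \bar{x})^2$ and $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x}$. Here is a minimal sketch for the spring example; the weights and measured lengths are made-up illustrative numbers, not data from the text:

```python
import numpy as np

# Hypothetical data: weights x_i (load) and measured spring lengths y_i.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([5.06, 5.25, 5.61, 5.82, 6.13, 6.38])

xbar, ybar = x.mean(), y.mean()

# Least-squares estimates of the regression coefficients:
# slope beta1_hat = sum((x_i - xbar)(y_i - ybar)) / sum((x_i - xbar)^2)
beta1_hat = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
# intercept beta0_hat = ybar - beta1_hat * xbar
beta0_hat = ybar - beta1_hat * xbar

print(f"Estimated unloaded length beta0_hat: {beta0_hat:.4f}")
print(f"Estimated spring constant beta1_hat: {beta1_hat:.4f}")
```

Here $\hat{\beta}_0$ estimates the unloaded length of the spring and $\hat{\beta}_1$ estimates the spring constant; the same estimates can be obtained with `np.polyfit(x, y, 1)`.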