## Simple linear regression from sum and sum of squares

Let `$(x_i, y_i), \; i = 1, 2, \ldots, n$` be $n$ pairs of observations.

The simple linear regression model of $Y$ on $X$ is

$$y_i=\beta_0 + \beta_1x_i +e_i$$ where,

- $y$ is the dependent variable,
- $x$ is the independent variable,
- $\beta_0$ is the intercept,
- $\beta_1$ is the slope,
- $e$ is the error term.

## Formula

By the method of least squares, the regression coefficients $\beta_0$ (intercept) and $\beta_1$ (slope) can be estimated as

`$\hat{\beta}_1 = \dfrac{n \sum xy - (\sum x)(\sum y)}{n(\sum x^2) -(\sum x)^2}$`

`$\hat{\beta}_0=\overline{y}-\hat{\beta}_1\overline{x}$`

where,

- `$\overline{x}=\dfrac{1}{n}\sum_{i=1}^n x_i$` is the sample mean of $X$,
- `$\overline{y}=\dfrac{1}{n}\sum_{i=1}^n y_i$` is the sample mean of $Y$,
- $n$ is the number of data points.
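These formulas need only the running totals $n$, $\sum x$, $\sum y$, $\sum xy$, and $\sum x^2$, not the raw data. A minimal sketch in plain Python (the function name `linreg_from_sums` is our own, chosen for illustration):

```python
def linreg_from_sums(n, sum_x, sum_y, sum_xy, sum_x2):
    """Estimate intercept (b0) and slope (b1) from running sums only."""
    # Slope: b1 = (n*Σxy - Σx*Σy) / (n*Σx² - (Σx)²)
    b1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    # Intercept: b0 = ȳ - b1*x̄
    b0 = sum_y / n - b1 * (sum_x / n)
    return b0, b1

# Example: data generated exactly from y = 2 + 3x
xs = [1, 2, 3, 4, 5]
ys = [2 + 3 * x for x in xs]
b0, b1 = linreg_from_sums(
    len(xs),
    sum(xs),
    sum(ys),
    sum(x * y for x, y in zip(xs, ys)),
    sum(x * x for x in xs),
)
print(b0, b1)  # → 2.0 3.0
```

Because only five accumulated sums are needed, the estimates can be updated incrementally as observations arrive, without storing the full data set.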

Suggestions and comments will be appreciated.