# Application to linear models




##### Intros

###### Lessons

**Applications to Linear Models Overview:**

__Applying the Least-Squares Problem to Economics__

• Go from $Ax=b$ to $X\beta=y$

• $X$ → design matrix

• $\beta$ → parameter vector

• $y$ → observation vector

__Least-Squares Line__

• Finding the best fit line

• Turning a system of equations into $X\beta =y$

• Using the normal equation $X^T X\beta=X^T y$

• Introduction of the residual vector

__Least-Squares to Other Curves__

• Finding the Best Fit Curve (not a line)

• Using the normal equation $X^T X\beta=X^T y$

__Least-Squares to Multiple Regressions__

• Multiple Regression → multivariable function

• Finding a Best Fit Plane

• Using the normal equation $X^T X\beta=X^T y$
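The whole workflow above — build the design matrix $X$, solve the normal equation $X^T X\beta = X^T y$, and form the residual vector — can be sketched in a few lines of NumPy. This is a minimal illustration using the data points from the first example below, not part of the original lesson:

```python
import numpy as np

# Data points (x_i, y_i) from the least-squares line example
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0, 3.0])

# Design matrix for the line y = b0 + b1*x:
# a column of ones (intercept) and a column of x-values
X = np.column_stack([np.ones_like(x), x])

# Solve the normal equation X^T X beta = X^T y
beta = np.linalg.solve(X.T @ X, X.T @ y)

# Residual vector: observations minus fitted values
residual = y - X @ beta

print("beta =", beta)          # [beta_0, beta_1] = [1.2, 0.7]
print("residual =", residual)
```

For these points the normal equation gives $\beta_0 = 1.2$, $\beta_1 = 0.7$, i.e. the line $y = 1.2 + 0.7x$; note that the residual vector is orthogonal to the columns of $X$, which is exactly what the normal equation enforces.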

##### Examples

###### Lessons

**Finding the Least-Squares Line**

Find the equation $y=\beta_0+\beta_1 x$ of the least-squares line that best fits the given data points:

$(0,1), (1,2), (2,3), (3,3)$

**Finding the Least-Squares of Other Curves**

Suppose the monthly costs of a product depend on seasonal fluctuations. A curve that approximates the cost is

$y = \beta_0 + \beta_1 x + \beta_2 x^2 + \beta_3 \cos\left(\frac{2\pi x}{12}\right)$

Suppose that, to improve the approximation in the future, you want to evaluate the residual error at each data point. Denote these errors by $\epsilon_1,\epsilon_2,\cdots,\epsilon_n$.

Give the design matrix, parameter vector, and residual vector for the model that leads to a least-squares fit of the equation above. Assume the data are $(x_1,y_1),\cdots,(x_n,y_n)$.

An experiment gives the data points $(0,1), (1,3), (2,4), (3,5)$. Suppose we wish to approximate the data using the equation

$y=A+Bx^2$

First find the design matrix, observation vector, and unknown parameter vector; the residual vector is not needed. Then find the least-squares curve for the data.

**Finding the Least-Squares of Multiple Regressions**

When examining a local model of terrain, suppose the data points are $(1,1,3), (2,2,5),$ and $(3,1,3)$. We wish to approximate the data using the equation

$y=\beta_0 u+\beta_1 v$

First find the design matrix, observation vector, and unknown parameter vector; the residual vector is not needed. Then find the least-squares plane for the data.

**Proof Question Relating to Linear Models**

Show that

$\lVert X \hat{\beta} \rVert^2=\hat{\beta}^T X^T y$
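One way to see this: since $\hat{\beta}$ satisfies the normal equation $X^T X\hat{\beta} = X^T y$, we have $\lVert X\hat{\beta} \rVert^2 = (X\hat{\beta})^T (X\hat{\beta}) = \hat{\beta}^T (X^T X \hat{\beta}) = \hat{\beta}^T X^T y$. The NumPy sketch below (using made-up random data, not from the lesson) checks the identity numerically:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design matrix and observation vector for illustration
X = rng.standard_normal((6, 3))
y = rng.standard_normal(6)

# Least-squares solution beta_hat from the normal equation X^T X beta = X^T y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Both sides of the identity ||X beta_hat||^2 = beta_hat^T X^T y
lhs = np.linalg.norm(X @ beta_hat) ** 2
rhs = beta_hat @ X.T @ y

print(lhs, rhs)  # the two values agree up to floating-point error
```

The identity holds only for the least-squares solution $\hat{\beta}$, not for an arbitrary $\beta$, because the derivation uses the normal equation.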