Added notes(lec 15) and assignment (#18)
* Added notes(lec 15) and assignment

* Added notes for 16 and renamed the assignment file

* Make requested changes

* Small typo correction

* Removed line

* Correct typo
kkothari2001 authored Jan 26, 2022
1 parent d53e2db commit e21cb71
Showing 5 changed files with 49 additions and 0 deletions.
7 changes: 7 additions & 0 deletions Assignments/Lec5_MIT-13to16_Assignment.md
# Assignment 5: Implementing Simple Linear Regression

Implement a simple linear regression model from scratch, using only NumPy and other basic libraries (sklearn is allowed only for loading datasets).

Implement it on the following datasets:
1. Iris: regress petal width on petal length (learn to split the data on your own :)) (10 points)
2. (Bonus 10 points) Verify the accuracy of Moore's law from the data at https://en.wikipedia.org/wiki/Transistor_count#Microprocessors (no libraries allowed for the OLS regression).
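A minimal from-scratch sketch of what the assignment asks for, using synthetic data in place of Iris (the coefficients, noise level, and 80/20 split here are illustrative assumptions, not part of the assignment):

```python
import numpy as np

# Synthetic stand-in data; for the assignment, x would be petal length
# and y petal width loaded from the Iris dataset.
rng = np.random.default_rng(0)
x = rng.uniform(1.0, 7.0, size=100)
y = 0.4 * x + 0.2 + rng.normal(0.0, 0.1, size=100)

# Hand-rolled 80/20 train/test split
idx = rng.permutation(len(x))
train, test = idx[:80], idx[80:]

# Closed-form OLS estimates: slope = cov(x, y) / var(x)
x_mean, y_mean = x[train].mean(), y[train].mean()
slope = np.sum((x[train] - x_mean) * (y[train] - y_mean)) / np.sum((x[train] - x_mean) ** 2)
intercept = y_mean - slope * x_mean

# Evaluate on the held-out points
pred = slope * x[test] + intercept
mse = np.mean((pred - y[test]) ** 2)
```

On real Iris data the same two formulas apply unchanged; only the loading and splitting of `x` and `y` differ.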
Binary file added RevisionNotes/Images/MIT_15_basic.png
Binary file added RevisionNotes/Images/MIT_15_projection.png
27 changes: 27 additions & 0 deletions RevisionNotes/MIT_Lec_15.md
# 15 - Projections onto Subspaces

### The basics of projection
![Projection image](./Images/MIT_15_projection.png)

### Deriving a formula for projecting `b` onto the line through a single vector `a`
![Projection derivation example](./Images/MIT_15_basic.png)

- Let `p` be the projection of `b` onto the line through `a`.
- Since `p` is the closest point on the line to `b`, the segment joining them must be perpendicular to the line of multiples of `a`.
- Therefore, if `e = b - p`, then `e` must be perpendicular to `a`.
- Since `p` is a scalar multiple of `a`, let `p = ax` with `x` a scalar.
- Perpendicularity gives a<sup>T</sup>(b - ax) = 0.
- Rearranging, x = a<sup>T</sup>b / a<sup>T</sup>a, so p = a (a<sup>T</sup>b / a<sup>T</sup>a).
- By the associative law, p = (a a<sup>T</sup> / a<sup>T</sup>a) b, with the bracketed matrix known as the projection matrix.
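The derivation above can be checked numerically; this sketch uses hypothetical example vectors `a` and `b`:

```python
import numpy as np

# Illustrative example vectors; any nonzero `a` works.
a = np.array([1.0, 2.0])
b = np.array([3.0, 1.0])

# Scalar coefficient x = (a^T b) / (a^T a)
x = (a @ b) / (a @ a)
p = x * a                      # projection of b onto the line through a

# Equivalently, apply the projection matrix P = (a a^T) / (a^T a)
P = np.outer(a, a) / (a @ a)
assert np.allclose(P @ b, p)

# The error e = b - p is perpendicular to a
e = b - p
assert np.isclose(a @ e, 0.0)
```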

### Applying this formula to approximate `Ax = b` when it has no exact solution

- Instead of solving `Ax = b`, we solve `Ax = p`, where `p` is the projection of `b` onto the column space of `A`.
- So we first find `p`, and the corresponding `x` is the closest answer.
- In this case `b - p`, i.e. `b - Ax`, has to be perpendicular to the plane whose basis vectors are the columns `a1` and `a2` (an example of a 2-dimensional subspace).
- So a1<sup>T</sup>(b - Ax) = 0 and a2<sup>T</sup>(b - Ax) = 0; stacking these vertically gives A<sup>T</sup>(b - Ax) = 0.
- Rearranging, we get A<sup>T</sup>Ax = A<sup>T</sup>b, and substituting back, p = A(A<sup>T</sup>A)<sup>-1</sup>A<sup>T</sup>b.
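A quick numerical check of the same steps (the 3x2 matrix `A` and vector `b` below are an illustrative example, chosen so that `Ax = b` has no exact solution):

```python
import numpy as np

# Columns a1, a2 span a plane in R^3; b lies outside that plane.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Solve A^T A x = A^T b for the best x
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
p = A @ x_hat                  # projection of b onto the column space of A

# The error b - p is perpendicular to every column of A
assert np.allclose(A.T @ (b - p), 0.0)
```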

### Some important properties of `P` (the projection matrix)
- P<sup>T</sup> = P (symmetric)
- P<sup>2</sup> = P (projecting twice changes nothing)
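Both properties can be verified numerically for any such `P` (the matrix `A` below is an arbitrary example with independent columns):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])

# Projection matrix onto the column space of A
P = A @ np.linalg.inv(A.T @ A) @ A.T

assert np.allclose(P.T, P)     # symmetric
assert np.allclose(P @ P, P)   # idempotent
```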
15 changes: 15 additions & 0 deletions RevisionNotes/MIT_Lec_16.md
# 16 - Least Squares Approximations

In this lecture we come to an important application of projections, and meet what is perhaps the most basic formula in statistics for making predictions.

As usual, we try to fit a linear hyperplane to a set of points; this is called **linear regression**.

So, when approximating `Ax = b` here, what are `A`, `x`, and `b`?

A = the matrix of data points to be fit (one row per point, one column per feature)
x = the vector of weights for each feature
b = the vector of the dependent variable (the values to be predicted)

Here, estimating `x` is essentially estimating the weights of the variables.

Solving the normal equations for `x` finishes the job: the weights are estimated, and the points are fitted to a hyperplane.
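As a small worked sketch, fitting a line `y = c + d*t` through three hypothetical points via the normal equations from the previous lecture:

```python
import numpy as np

# Three example points: (1, 1), (2, 2), (3, 2); no line passes through all three.
t = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 2.0])

# A has a column of ones (intercept) and a column of t values
A = np.column_stack([np.ones_like(t), t])

# Solve the normal equations A^T A x = A^T y for the weights
c, d = np.linalg.solve(A.T @ A, A.T @ y)
# For these points, c = 2/3 and d = 1/2, so the best-fit line is y = 2/3 + t/2
```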
