Hypothesis
$$ \begin{align*} h_\theta(x) &= \theta^{T} x \\ x_0 &= 1 \end{align*} $$
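A minimal NumPy sketch of the hypothesis (the function name `h` and the sample values are illustrative, not from the notes):

```python
import numpy as np

def h(theta, x):
    """Hypothesis h_theta(x) = theta^T x, with the bias term x_0 = 1 prepended."""
    x = np.concatenate(([1.0], x))  # x_0 = 1
    return float(theta @ x)

theta = np.array([1.0, 2.0])      # theta_0, theta_1
print(h(theta, np.array([3.0])))  # 1 + 2*3 = 7.0
```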
Cost Function
$$ \begin{align*} J(\theta) &= \dfrac {1}{2m} \sum _{i=1}^m \left (h_\theta (x^{(i)}) - y^{(i)} \right)^2 + \dfrac{\lambda}{2m} \sum _{j=1}^n \theta_j^2 \end{align*} $$
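The same cost as a vectorized sketch (the name `J` and the matrix layout are assumptions): `X` is the \(m \times (n+1)\) design matrix with a leading column of ones, and the regularization sum runs over \(j = 1, \dots, n\), so \(\theta_0\) is excluded.

```python
import numpy as np

def J(theta, X, y, lam):
    """Regularized squared-error cost; theta_0 is not regularized."""
    m = len(y)
    residuals = X @ theta - y
    reg = lam / (2 * m) * np.sum(theta[1:] ** 2)  # skip theta_0
    return (residuals @ residuals) / (2 * m) + reg
```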
Algorithms |
|
Gradient descent with \(m = 6, n = 1, \lambda = 0\) (Desmos)
$$ \begin{align*} h_\theta(x) &= \theta_0 + \theta_1 x \\ J(\theta_0, \theta_1) &= \dfrac {1}{2m} \sum _{i=1}^m \left (h_\theta (x^{(i)}) - y^{(i)} \right)^2 \end{align*} $$ |
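Each gradient descent step updates every parameter simultaneously:

$$ \begin{align*} \theta_j &:= \theta_j - \alpha \dfrac{1}{m} \sum _{i=1}^m \left (h_\theta (x^{(i)}) - y^{(i)} \right) x_j^{(i)} \end{align*} $$

A runnable sketch of batch gradient descent for this \(m = 6, n = 1, \lambda = 0\) setting; the learning rate \(\alpha = 0.1\), iteration count, and sample data are made-up illustrations, not values from the notes.

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, iters=1500):
    """Batch gradient descent on the unregularized cost J(theta_0, theta_1)."""
    m = len(y)
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ theta - y) / m  # partial derivatives dJ/dtheta_j
        theta -= alpha * grad
    return theta

# m = 6 points with a single feature (n = 1); the data values are invented.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X = np.column_stack([np.ones_like(x), x])  # prepend the x_0 = 1 column
y = 2.0 * x + 1.0                          # points lie on an exact line
print(gradient_descent(X, y))             # converges to approx. [1. 2.]
```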