Programming Ocean Academy

AI Algorithms Explainer

$$ y = \sum_{i=1}^{n}w_ix_i + b $$

Linear Regression
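Linear regression fits the weights w and the bias b by minimizing squared error. A minimal NumPy sketch using the closed-form least-squares solution (the toy data here is illustrative, not from the original):

```python
import numpy as np

# Toy data lying exactly on y = 2*x + 1.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])

# Append a column of ones so the bias b is learned as an extra weight.
Xb = np.hstack([X, np.ones((X.shape[0], 1))])

# Ordinary least squares via lstsq (numerically safer than the normal equations).
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

print(w)  # slope w and bias b, approximately [2.0, 1.0]
```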

$$ P(y=k\,|\,x) = \frac{e^{w_k^Tx}}{\sum_{j=1}^{K}e^{w_j^Tx}} $$

Softmax Function
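Softmax turns a vector of logits $w_k^Tx$ into a probability distribution over K classes. A small NumPy sketch (the logit values are arbitrary examples); subtracting the max logit before exponentiating is a standard numerical-stability trick that leaves the result unchanged:

```python
import numpy as np

def softmax(z):
    # Shift by the max logit for numerical stability; the output is unchanged.
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs)        # probabilities over the K = 3 classes
print(probs.sum())  # sums to 1
```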

$$ L = -\frac{1}{N}\sum_{i=1}^{N} \left[ y_i\log(p_i) + (1-y_i)\log(1-p_i) \right] $$

Binary Cross-Entropy Loss
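The loss above translates directly into code. A NumPy sketch (the toy labels and predicted probabilities are illustrative), with clipping so the logarithm stays finite at 0 and 1:

```python
import numpy as np

def binary_cross_entropy(y_true, p_pred, eps=1e-12):
    # Clip predictions away from exactly 0 and 1 so log() stays finite.
    p = np.clip(p_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

y = np.array([1, 0, 1, 1])
p = np.array([0.9, 0.1, 0.8, 0.7])
loss = binary_cross_entropy(y, p)
print(loss)  # small positive loss for mostly-correct predictions
```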

$$ \theta_t = \theta_{t-1} - \frac{\eta}{\sqrt{\hat{v}_t} + \epsilon}\,\hat{m}_t $$

Adam Optimizer Algorithm
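A sketch of one Adam step in NumPy, spelling out the moment estimates $m_t$, $v_t$ and the bias corrections that the update above assumes (the hyperparameter defaults are the ones from the original Adam paper; the quadratic objective is an illustrative toy):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, eta=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Exponential moving averages of the gradient and squared gradient.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    # Bias correction compensates for the zero-initialized moments.
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    theta = theta - eta * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(theta) = theta^2 (gradient 2*theta) starting from theta = 1.0.
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
print(theta)  # close to the minimum at 0
```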

$$ \theta_t = \theta_{t-1} - \eta \nabla_\theta J(\theta) $$

Stochastic Gradient Descent (SGD)
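A NumPy sketch of per-sample SGD applying the update above to a one-parameter least-squares problem (the data, learning rate, and epoch count are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
# Noisy samples from y = 3x; fit the single weight theta with SGD.
X = rng.uniform(-1, 1, size=200)
y = 3.0 * X + rng.normal(0, 0.1, size=200)

theta, eta = 0.0, 0.1
for epoch in range(20):
    # Shuffle each epoch, then step on the gradient of one sample at a time.
    for i in rng.permutation(len(X)):
        grad = 2 * (theta * X[i] - y[i]) * X[i]
        theta -= eta * grad

print(theta)  # close to the true slope 3.0
```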

$$ P(y|x) = \sigma(w^Tx + b) = \frac{1}{1 + e^{-(w^Tx + b)}} $$

Logistic Regression (Binary Classification)
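The sigmoid model above can be trained by gradient descent on the binary cross-entropy loss. A toy NumPy sketch (the 1-D dataset, learning rate, and iteration count are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny 1-D dataset: class 1 for positive x, class 0 for negative x.
X = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
y = np.array([0, 0, 0, 1, 1, 1])

w, b, eta = 0.0, 0.0, 0.5
for _ in range(500):
    p = sigmoid(w * X + b)
    # Gradients of binary cross-entropy with respect to w and b.
    w -= eta * np.mean((p - y) * X)
    b -= eta * np.mean(p - y)

preds = (sigmoid(w * X + b) > 0.5).astype(int)
print(preds)  # [0 0 0 1 1 1]
```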

$$ d(x, x_i) = \sqrt{\sum_{j=1}^n (x_j - x_{ij})^2} $$

K-Nearest Neighbors (KNN) Algorithm
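A NumPy sketch of KNN classification using the Euclidean distance above with a majority vote over the k nearest neighbors (the toy training set and k = 3 are illustrative):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Euclidean distance from the query x to every training point.
    d = np.sqrt(((X_train - x) ** 2).sum(axis=1))
    nearest = np.argsort(d)[:k]
    # Majority vote among the k nearest labels.
    return Counter(y_train[nearest]).most_common(1)[0][0]

X_train = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.5, 0.5])))  # 0
print(knn_predict(X_train, y_train, np.array([5.5, 5.5])))  # 1
```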

$$ f(x) = \text{sign}(w^T \phi(x) + b) $$

Support Vector Classifier (SVC)
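A NumPy sketch of a linear SVC trained with hinge-loss subgradient steps (assumptions: linear kernel, i.e. $\phi(x) = x$; the bias b is omitted because the toy data is separable through the origin; the learning rate and regularization strength are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Linearly separable toy data: label +1 when x1 + x2 > 0, else -1.
X = rng.uniform(-1, 1, size=(100, 2))
y = np.where(X.sum(axis=1) > 0, 1, -1)

w = np.zeros(2)
eta, lam = 0.1, 0.01
for _ in range(200):
    for i in range(len(X)):
        if y[i] * (X[i] @ w) < 1:       # inside the margin: hinge loss active
            w += eta * (y[i] * X[i] - lam * w)
        else:                           # outside the margin: regularize only
            w -= eta * lam * w

preds = np.sign(X @ w)
print((preds == y).mean())  # training accuracy
```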