
Regularization

Factors Leading to Overfitting: the number of samples is too small; too much noise; an excessively complicated model. Weight-decay Regularization: firstly, look at a typical example of overfitting: ...
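The excerpt is cut off here, but as a minimal sketch of what weight-decay (L2) regularization does to linear regression, assuming the usual ridge closed form $\boldsymbol w = (X^{\text T}X + \lambda I)^{-1}X^{\text T}\boldsymbol y$ (the toy data, the $\lambda$ value, and the helper names are illustrative, not from the post):

```python
import numpy as np

def ridge_fit(X, y, lam=0.1):
    """Weight-decay (L2-regularized) least squares: w = (X^T X + lam*I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy data: few noisy samples of a linear target, the setting where overfitting shows up.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=10)

w_plain = np.linalg.lstsq(X, y, rcond=None)[0]   # unregularized fit
w_decay = ridge_fit(X, y, lam=1.0)               # weights shrunk toward zero
print(w_plain, w_decay)
```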

Nonlinear Transformation

Consider a classification problem as follows: obviously, it is not a linearly separable problem, but we can use a circle to reach our goal: the hypothesis of this model can be written as [h(\boldsymb...
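As a rough illustration of the circle idea, assuming a two-dimensional input and the feature transform $\Phi(x_1,x_2)=(1,x_1^2,x_2^2)$ (the truncated hypothesis in the post may differ in detail), a hedged sketch:

```python
import numpy as np

def circle_transform(X):
    """Feature transform Phi(x1, x2) = (1, x1^2, x2^2): a circle in X-space
    becomes a hyperplane (line) in the transformed Z-space."""
    return np.column_stack([np.ones(len(X)), X[:, 0] ** 2, X[:, 1] ** 2])

def hypothesis(X, w):
    """h(x) = sign(w^T Phi(x)): linear in Z-space, circular boundary in X-space."""
    return np.sign(circle_transform(X) @ w)

# Illustrative weights: w = (0.6, -1, -1) labels points inside x1^2 + x2^2 = 0.6 as +1.
w = np.array([0.6, -1.0, -1.0])
points = np.array([[0.1, 0.2], [1.0, 1.0]])
print(hypothesis(points, w))   # [ 1. -1.]
```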

Linear Model for Classification

Summary of Error Functions: all three algorithms we mentioned above have the same linear scoring function $s=\boldsymbol w^{\text T}\boldsymbol x$; now we discuss the similarity of these algori...
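The excerpt stops mid-sentence, but assuming the three algorithms are the perceptron (0/1 error), linear regression (squared error), and logistic regression (cross-entropy error), as the neighboring posts suggest, each error can be written as a function of $ys$ where $s=\boldsymbol w^{\text T}\boldsymbol x$; a small sketch with illustrative sample values:

```python
import numpy as np

def err_01(ys):
    """0/1 error of sign(s) against y in {-1, +1}, as a function of ys = y * s."""
    return (np.sign(ys) != 1).astype(float)

def err_sqr(ys):
    """Squared error (s - y)^2, which equals (ys - 1)^2 when y is in {-1, +1}."""
    return (ys - 1.0) ** 2

def err_ce(ys):
    """Cross-entropy error ln(1 + exp(-ys)) used by logistic regression."""
    return np.log(1.0 + np.exp(-ys))

ys = np.linspace(-3, 3, 7)
print(err_01(ys), err_sqr(ys), err_ce(ys))
```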

Logistic Regression

In some cases, we want to learn the probability of something happening. To meet this goal, we can use a sigmoid function $\sigma$ which maps the result of linear regression to a range bet...
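A minimal sketch of that mapping, assuming the standard logistic function $\sigma(s)=1/(1+e^{-s})$ applied to the linear score $s=\boldsymbol w^{\text T}\boldsymbol x$ (the weights and inputs below are made up for illustration):

```python
import numpy as np

def sigmoid(s):
    """Logistic function sigma(s) = 1 / (1 + exp(-s)), mapping the real line to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-s))

def predict_proba(X, w):
    """Estimated probability P(y = +1 | x) = sigma(w^T x) for each row of X."""
    return sigmoid(X @ w)

w = np.array([0.5, -1.0])
X = np.array([[2.0, 1.0], [0.0, 3.0]])
print(predict_proba(X, w))   # values strictly between 0 and 1
```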

Linear Regression

Given a dataset $\mathcal D = \lbrace(\boldsymbol {x_i},y_i)\rbrace_{i=1}^N$, where $\boldsymbol x \in \mathbb R^d$, we try to get a weight vector $(w_1,w_2,…,w_d)^\text T$ such that [y_i \approx ...
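A hedged sketch of one common way to obtain such a weight vector, the least-squares solution via the pseudo-inverse $\boldsymbol w = X^{+}\boldsymbol y$ (whether the post uses exactly this method is not visible from the truncated excerpt; the toy data are illustrative):

```python
import numpy as np

def linreg_fit(X, y):
    """Least-squares weights w = pinv(X) @ y, minimizing sum_i (w^T x_i - y_i)^2."""
    return np.linalg.pinv(X) @ y

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.05 * rng.normal(size=50)
print(linreg_fit(X, y))   # close to true_w
```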

Perceptron

For $\boldsymbol{x}=(x_1,x_2,…,x_d)$, compute a weighted ‘score’: judge it as positive if $\sum_{i=1}^dw_ix_i>\text{threshold}$ and as negative if $\sum_{i=1}^dw_ix_i<\text{threshold...
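A small sketch of this decision rule together with the usual perceptron learning algorithm update, where the threshold is absorbed into a bias weight $w_0$ (the toy data, iteration cap, and separability assumption are illustrative, not from the post):

```python
import numpy as np

def perceptron_predict(x, w, threshold):
    """Classify +1 if the weighted score exceeds the threshold, -1 otherwise."""
    return 1 if np.dot(w, x) > threshold else -1

def pla_fit(X, y, max_iters=1000):
    """Perceptron learning algorithm on data augmented with x0 = 1, so the
    threshold becomes -w0; assumes the data are linearly separable."""
    X = np.column_stack([np.ones(len(X)), X])   # absorb the threshold into w0
    w = np.zeros(X.shape[1])
    for _ in range(max_iters):
        mistakes = [(xi, yi) for xi, yi in zip(X, y) if np.sign(w @ xi) != yi]
        if not mistakes:
            break
        xi, yi = mistakes[0]
        w += yi * xi                             # correct the first mistake found
    return w

X = np.array([[2.0, 3.0], [1.0, 1.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
print(pla_fit(X, y))
```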

Computational Theory

Components of Machine Learning: input (sample vector) $\boldsymbol{x}\in\mathcal{X}$; output $y\in\mathcal{Y}$; unknown pattern to be learned (target function) $f:\mathcal{X}\to\mathcal{Y}$ ...