Introduction
Backpropagation is widely used in neural networks to let computers learn the weights in each layer. From a mathematical perspective, it is just an application of the chain rule from calculus. Its practical power, however, is that it lets a computer compute derivatives of complex functions (e.g. deep neural networks) without anyone explicitly writing down the formulas.
In this post, I will use a simple example to illustrate how backpropagation computes the derivatives of a function, and implement it in Python.
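To give a feel for the idea before the main example, here is a minimal, self-contained Python sketch: a forward pass that caches intermediate values and a backward pass that applies the chain rule, checked against numerical differentiation. The sigmoid toy function, variable names, and constants below are illustrative choices of mine, not the example developed later in the post.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, b, x):
    # Forward pass: compute and cache the intermediate value z.
    z = w * x + b          # affine step
    a = sigmoid(z)         # nonlinearity
    return a, z

def backward(w, b, x):
    # Backward pass: chain rule, e.g. da/dw = (da/dz) * (dz/dw).
    a, z = forward(w, b, x)
    da_dz = a * (1.0 - a)  # derivative of sigmoid at z
    da_dw = da_dz * x      # since dz/dw = x
    da_db = da_dz * 1.0    # since dz/db = 1
    return da_dw, da_db

if __name__ == "__main__":
    w, b, x = 0.5, -1.0, 2.0
    grad_w, grad_b = backward(w, b, x)

    # Sanity check with central finite differences.
    eps = 1e-6
    num_w = (forward(w + eps, b, x)[0] - forward(w - eps, b, x)[0]) / (2 * eps)
    num_b = (forward(w, b + eps, x)[0] - forward(w, b - eps, x)[0]) / (2 * eps)
    print(grad_w, num_w)   # the two columns should agree closely
    print(grad_b, num_b)
```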
Introduction
In A short note on support vector machine (part 1), we started from the motivation for SVM, formulated it as an optimization problem, and ended with a QP solution to that problem. Picking up from there, we will discuss the dual version of SVM. This post aims to answer the following questions (the primal problem is restated after the list for reference).
What is the dual problem of the original SVM problem, and why do we care about it?
How do we derive the dual problem from the original one?
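For reference, the primal (hard-margin) problem from part 1, written in one common notation (the symbols $\mathbf{w}$, $b$, $(\mathbf{x}_n, y_n)$, and $N$ are my shorthand here), is

$$
\begin{aligned}
\min_{b,\,\mathbf{w}} \quad & \frac{1}{2}\,\mathbf{w}^{\top}\mathbf{w} \\
\text{s.t.} \quad & y_n\left(\mathbf{w}^{\top}\mathbf{x}_n + b\right) \ge 1, \qquad n = 1, \dots, N,
\end{aligned}
$$

and the dual is obtained by introducing one Lagrange multiplier $\alpha_n \ge 0$ per constraint.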
Introduction
This is part of my self-study notes on understanding the derivation of the support vector machine (SVM). This short note aims to answer the following questions:
What motivates the SVM?
How do we formulate that motivation as a mathematical optimization problem?
How do we simplify the original optimization problem and solve it with Quadratic Programming (QP)? (A rough QP sketch appears after this note.)
This short note is based on my understanding of Prof. Hsuan-Tien Lin's video lectures on Machine Learning Techniques.
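As a concrete companion to the last question, here is a rough Python sketch of feeding the hard-margin primal problem to an off-the-shelf QP solver (cvxopt). The helper name, toy data, and the variable layout $u = (b, \mathbf{w})$ are my own illustrative choices, not the formulation used in the lecture.

```python
# Sketch: solve the hard-margin SVM primal with a generic QP solver.
# Requires cvxopt (pip install cvxopt); the toy data is purely illustrative.
import numpy as np
from cvxopt import matrix, solvers

def svm_hard_margin_qp(X, y):
    """Solve  min_{b,w} 1/2 ||w||^2  s.t.  y_n (w^T x_n + b) >= 1."""
    N, d = X.shape
    # Decision variable u = (b, w) in R^{1+d}.
    P = np.zeros((d + 1, d + 1))
    P[1:, 1:] = np.eye(d)                 # penalize only w, not b
    q = np.zeros(d + 1)
    # y_n (w^T x_n + b) >= 1  becomes  G u <= h  with G = -y_n [1, x_n], h = -1.
    G = -y[:, None] * np.hstack([np.ones((N, 1)), X])
    h = -np.ones(N)

    sol = solvers.qp(matrix(P), matrix(q), matrix(G), matrix(h))
    u = np.array(sol["x"]).ravel()
    return u[0], u[1:]                    # b, w

if __name__ == "__main__":
    # Linearly separable toy data: class +1 around (2, 2), class -1 around (-2, -2).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(2, 0.5, (20, 2)), rng.normal(-2, 0.5, (20, 2))])
    y = np.hstack([np.ones(20), -np.ones(20)])
    b, w = svm_hard_margin_qp(X, y)
    print("w =", w, "b =", b)
```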