3.1: Introduction to Session 3 – What is Machine Learning?
Machine learning overview
Common Algorithms (Machine Learning Recipes):
- KNN (k-nearest neighbours) [k近傍法]
- SVM (support vector machine)
- *ANN (artificial neural network)
Types of learning
- Supervised learning (see the sketch after this list)
  - Teach the system with both the inputs and the outputs
  - Have training data
  - Have test data (data excluded from the training data)
  - Have samples (unknown data)
- Unsupervised learning
  - Data that we know nothing about *
    - e.g. patterns in songs
  - Identify the commonalities in the data
- *Reinforcement Learning
  - Reward based
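To make the supervised-learning terms above concrete, here is a minimal sketch of splitting labelled data into training and test sets. The toy data set, split ratio, and variable names are illustrative assumptions, not from the lecture.

```python
import random

# Toy labelled data set: (input, output) pairs -- purely illustrative.
data = [(x, 2 * x + 1) for x in range(100)]

random.shuffle(data)
split = int(0.8 * len(data))

training_data = data[:split]  # used to teach the system (inputs + outputs)
test_data = data[split:]      # excluded from training, used to evaluate
# "Samples" would be brand-new, unlabelled inputs seen only at prediction time.
```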
3.2: Linear Regression with Ordinary Least Squares Part 1 – Intelligence and Learning
Linear Regression「線形回帰」
A linear approach to modelling the relationship between a scalar response and one or more explanatory variables.
Ordinary Least Squares (OLS)「最小二乗法」
A type of linear least squares method for estimating the unknown parameters in a linear regression model.
Linear Model formula: y = mx + b (slope m, y-intercept b)
Formula to find the slope:
m = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)²
(where x̄ and ȳ are the averages of x and y)
Intercept: b = ȳ − m·x̄
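A minimal sketch of those formulas in Python; the function name, toy data, and usage are assumptions for illustration, not from the lecture:

```python
def ols_fit(xs, ys):
    """Fit y = m*x + b by ordinary least squares."""
    n = len(xs)
    x_bar = sum(xs) / n  # average of x
    y_bar = sum(ys) / n  # average of y

    # m = sum((x_i - x_bar) * (y_i - y_bar)) / sum((x_i - x_bar)^2)
    numerator = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    denominator = sum((x - x_bar) ** 2 for x in xs)
    m = numerator / denominator

    # Intercept so the fitted line passes through the point of averages.
    b = y_bar - m * x_bar
    return m, b

# Toy usage
m, b = ols_fit([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
print(m, b)  # slope should be close to 2, intercept close to 0
```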
Issue with different data sets: very different data sets can produce a similar fitted line, so the line alone does not tell you whether the model is appropriate.
When to use linear regression?
ref link: https://kwichmann.github.io/ml_sandbox/linear_regression_diagnostics/
- The residual plot should look random (no visible pattern); see the sketch below
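A quick sketch of that check using matplotlib; the data and the fitted coefficients are made-up placeholders:

```python
import matplotlib.pyplot as plt

# Toy data and a fitted line (m and b would normally come from OLS above).
xs = [1, 2, 3, 4, 5, 6]
ys = [2.0, 4.1, 5.9, 8.2, 9.8, 12.1]
m, b = 2.0, 0.0

residuals = [y - (m * x + b) for x, y in zip(xs, ys)]

# For a good linear fit this scatter should look like random noise around 0,
# with no curve, funnel shape, or trend.
plt.scatter(xs, residuals)
plt.axhline(0, color="gray")
plt.xlabel("x")
plt.ylabel("residual (observed - predicted)")
plt.show()
```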
3.4: Linear Regression with Gradient Descent
Gradient Descent 「最急降下法」
Gradient descent is a first-order iterative optimization algorithm for finding the minimum of a function.
Required Math
- Calculus
- derivative
- partial derivative
Basic steps to implement gradient descent (see the sketch after this list)
- Obtain the correct results from the training data set
- Obtain the predicted results from the program
- Calculate the error (the difference between the program's result and the correct result)
- Adjust the program's parameters according to the learning rate, in the direction that reduces the error (this is where the calculus comes in)
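A minimal sketch of those steps for a y = mx + b model, assuming mean squared error and a fixed learning rate; the toy data and hyperparameters are illustrative, not from the lecture:

```python
# Toy data set, roughly y = 2x + 1 -- purely illustrative.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.1, 4.9, 7.2, 9.1, 10.8]

m, b = 0.0, 0.0        # parameters the program learns
learning_rate = 0.01

for _ in range(1000):
    # Obtain the program's current predictions.
    predictions = [m * x + b for x in xs]

    # Calculate the error against the correct results.
    errors = [pred - y for pred, y in zip(predictions, ys)]

    # Partial derivatives of the mean squared error with respect to m and b.
    n = len(xs)
    dm = (2 / n) * sum(err * x for err, x in zip(errors, xs))
    db = (2 / n) * sum(errors)

    # Adjust the parameters a small step against the gradient.
    m -= learning_rate * dm
    b -= learning_rate * db

print(m, b)  # should end up near the slope and intercept of the toy data
```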