• Basic knowledge

    • norm of a vector

    • eigenvector and eigenvalue: Ax = λx

    • eigendecomposition: A = V D V^T

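A quick numerical check of the decomposition A = V D V^T with NumPy; the symmetric matrix here is a made-up example:

```python
import numpy as np

# Symmetric example matrix (illustrative values)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh handles symmetric matrices: eigenvalues (ascending) and eigenvectors V
eigvals, V = np.linalg.eigh(A)
D = np.diag(eigvals)

# A v = lambda v holds for each eigenpair
for lam, v in zip(eigvals, V.T):
    assert np.allclose(A @ v, lam * v)

# A is reconstructed as V D V^T
assert np.allclose(V @ D @ V.T, A)
```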

  • Least Squares Method (Linear Regression)

    • model: y = ax + b
    • estimate a and b by minimizing the squared error
  • Ordinary (Linear) Least Squares: objective function for y = ax + b, measuring vertical distance


    • limitations: not rotation invariant (cannot fit vertical lines), and sensitive to noise
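The ordinary least-squares fit of y = ax + b can be sketched with NumPy's `lstsq`; the data here is synthetic (true line y = 2x + 1 plus noise):

```python
import numpy as np

# Synthetic noisy samples of y = 2x + 1 (assumed example data)
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=0.1, size=x.size)

# Design matrix [x, 1]; minimize ||X w - y||^2 over w = (a, b)
X = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
print(a, b)  # close to the true (2, 1)
```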
  • Total Least Squares: regression with perpendicular distance

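Total least squares can be sketched via the SVD: the line passes through the centroid, and its normal is the right singular vector with the smallest singular value. The near-vertical data below (x ≈ 3, an assumed example) is exactly the case OLS with y = ax + b cannot represent:

```python
import numpy as np

# Points scattered around the vertical line x = 3 (illustrative data)
rng = np.random.default_rng(1)
y = np.linspace(0, 10, 50)
x = 3 + rng.normal(scale=0.05, size=y.size)
P = np.column_stack([x, y])

# TLS minimizes the sum of squared perpendicular distances; the optimal
# normal is the last right singular vector of the centered point cloud.
c = P.mean(axis=0)
_, _, Vt = np.linalg.svd(P - c)
n = Vt[-1]          # unit normal (nx, ny)
d = n @ c           # line equation: nx*x + ny*y = d
print(n, d)         # normal ~ (±1, 0): the line x ≈ 3 is recovered
```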

  • Error Handling: RANSAC (RANdom SAmple Consensus)

    • choose random points
    • fit a model, find the points close to it, and reject the rest as outliers
    • repeat, then choose the best model
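The three steps above can be sketched for line fitting; the threshold, iteration count, and data are assumed for illustration:

```python
import numpy as np

def ransac_line(points, n_iter=200, thresh=0.2, seed=0):
    """Fit a 2D line n . p = d with RANSAC (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    best_inliers, best_model = None, None
    for _ in range(n_iter):
        # 1) choose two random points; take the line through them
        i, j = rng.choice(len(points), size=2, replace=False)
        direction = points[j] - points[i]
        norm = np.linalg.norm(direction)
        if norm == 0:
            continue
        n = np.array([-direction[1], direction[0]]) / norm  # unit normal
        d = n @ points[i]
        # 2) find close points, reject the rest as outliers
        inliers = np.abs(points @ n - d) < thresh
        # 3) keep the model with the largest consensus set
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model, best_inliers

# 40 inliers near y = x plus three gross outliers
rng = np.random.default_rng(2)
t = np.linspace(0, 10, 40)
pts = np.column_stack([t, t + rng.normal(scale=0.05, size=t.size)])
pts = np.vstack([pts, [[0.0, 9.0], [1.0, 8.0], [9.0, 0.0]]])
(n, d), inliers = ransac_line(pts)
print(inliers.sum())  # most of the 40 true inliers survive
```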
  • Fitting Linear Classifiers: the Perceptron, an algorithm that produces a single output value from multiple inputs

    • Classification: whereas regression finds f(x) = ax + b, classification finds the boundary where f(x) = 1 or -1.

    • Perceptron as a simple linear classifier: (data · weights) + bias


      • Find w, minimizing the error
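A minimal perceptron forward pass, assuming a sign-style activation; the weights and bias are arbitrary example values, not learned ones:

```python
import numpy as np

# Perceptron output: sign of (weights . inputs + bias)
def perceptron(x, w, b):
    return 1 if w @ x + b > 0 else -1

w = np.array([1.0, -1.0])   # example weights (not learned here)
b = 0.5                     # example bias
print(perceptron(np.array([2.0, 0.0]), w, b))  # 1
print(perceptron(np.array([0.0, 3.0]), w, b))  # -1
```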
    • Activation function: determines what value is output once the input crosses a threshold


      • error: e.g. the hinge loss (multiclass SVM loss function)

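A sketch of the multiclass SVM (hinge) loss for one sample, L = Σ_{j≠y} max(0, s_j - s_y + 1); the class scores are made-up values:

```python
import numpy as np

def hinge_loss(scores, y):
    # margin of every class score against the correct class y
    margins = np.maximum(0.0, scores - scores[y] + 1.0)
    margins[y] = 0.0    # the correct class contributes no loss
    return margins.sum()

scores = np.array([3.2, 5.1, -1.7])   # example class scores
print(hinge_loss(scores, y=0))        # max(0, 5.1-3.2+1) + max(0, -1.7-3.2+1) ≈ 2.9
```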

      • Gradient Descent

        • set the initial parameters

        • calculate the slope (gradient of the loss function with respect to the weights)

        • move the weights: w_(t+1) = w_t + Δw, where Δw = -η · (∂L/∂w), with learning rate η

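The steps above, sketched on a toy loss L(w) = (w - 3)^2 with an assumed learning rate η = 0.1:

```python
# Gradient descent: w_{t+1} = w_t - eta * dL/dw
eta = 0.1   # learning rate (assumed)
w = 0.0     # initial parameter

for _ in range(100):
    grad = 2 * (w - 3)   # slope of L(w) = (w - 3)^2 at the current w
    w = w - eta * grad   # step against the gradient

print(w)  # converges to ~3, the minimizer of L
```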