Optim

Let's look at how to find the extrema of Himmelblau's function using Julia's Optim package.

\[ f(x, y) = (x^2+y-11)^2 + (x+y^2-7)^2\]

This function has one local maximum and four local minima.
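All four minima are in fact global minima with \( f = 0 \); their locations are known to be approximately

\[ (3.0,\ 2.0), \quad (-2.805118,\ 3.131312), \quad (-3.779310,\ -3.283186), \quad (3.584428,\ -1.848126) \]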

First, load the required package.

julia> using Optim

Next, define the objective function along with in-place functions for its gradient and Hessian. Note that recent versions of Optim.jl pass the output array as the first argument.

julia> function himmelblau(x::Vector)
  (x[1]^2 + x[2] - 11)^2 + (x[1] + x[2]^2 - 7)^2
end
himmelblau (generic function with 1 method)

julia> function himmelblau_gradient!(gradient::Vector, x::Vector)
  # in-place gradient; recent versions of Optim.jl pass the storage vector first
  gradient[1] = 4 * x[1] * (x[1]^2 + x[2] - 11) + 2 * (x[1] + x[2]^2 - 7)
  gradient[2] = 2 * (x[1]^2 + x[2] - 11) + 4 * x[2] * (x[1] + x[2]^2 - 7)
end
himmelblau_gradient! (generic function with 1 method)

julia> function himmelblau_hessian!(hessian::Matrix, x::Vector)
   # in-place Hessian; storage-first argument order, matching the gradient
   hessian[1, 1] = 4 * (x[1]^2 + x[2] - 11) + 8 * x[1]^2 + 2
   hessian[1, 2] = 4 * x[1] + 4 * x[2]
   hessian[2, 1] = 4 * x[1] + 4 * x[2]
   hessian[2, 2] = 4 * (x[1] + x[2]^2 - 7) + 8 * x[2]^2 + 2
end
himmelblau_hessian! (generic function with 1 method)
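
A hand-coded derivative is easy to get wrong, so a quick central-difference check is worthwhile before running the optimizers. This is a minimal sketch using only Base Julia; the test point and step size are arbitrary choices:

julia> let x = [2.5, 2.5], h = 1e-6
  g = zeros(2)
  himmelblau_gradient!(g, x)  # analytic gradient at the test point
  e(i) = [j == i ? 1.0 : 0.0 for j in 1:2]  # i-th unit vector
  fd = [(himmelblau(x + h * e(i)) - himmelblau(x - h * e(i))) / (2h) for i in 1:2]
  maximum(abs.(g - fd))  # should be tiny, on the order of 1e-6 or smaller
end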

Several algorithms are available for finding the extrema. Let's start with the Nelder-Mead method.

julia> optimize(himmelblau, [2.5, 2.5], NelderMead())
Results of Optimization Algorithm
 * Algorithm: Nelder-Mead
 * Starting Point: [2.5,2.5]
 * Minimizer: [3.0000037281643586,2.000010544994531]
 * Minimum: 3.190899e-09
 * Iterations: 35
 * Convergence: true
   *  √(Σ(yᵢ-ȳ)²)/n < NaN: false
   * Reached Maximum Number of Iterations: false
 * Objective Function Calls: 70
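
The printed summary is convenient, but the returned object can also be queried programmatically. A minimal sketch, assuming a version of Optim that provides the minimizer, minimum, and converged accessors:

julia> res = optimize(himmelblau, [2.5, 2.5], NelderMead())
julia> Optim.minimizer(res)  # ≈ [3.0, 2.0]
julia> Optim.minimum(res)    # ≈ 0.0
julia> Optim.converged(res)  # true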

This time, let's solve it using L-BFGS, the limited-memory variant of the BFGS algorithm.

julia> optimize(himmelblau, himmelblau_gradient!, [2.5, 2.5], LBFGS())
Results of Optimization Algorithm
 * Algorithm: L-BFGS
 * Starting Point: [2.5,2.5]
 * Minimizer: [2.9999999999993854,2.0000000000001963]
 * Minimum: 1.222536e-23
 * Iterations: 6
 * Convergence: true
   * |x - x'| < 1.0e-32: false
   * |f(x) - f(x')| / |f(x)| < 1.0e-32: false
   * |g(x)| < 1.0e-08: true
   * Reached Maximum Number of Iterations: false
 * Objective Function Calls: 25
 * Gradient Calls: 25
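
If deriving the gradient by hand is inconvenient, newer versions of Optim.jl can instead compute it with forward-mode automatic differentiation. A sketch, assuming a version that supports the autodiff keyword:

julia> optimize(himmelblau, [2.5, 2.5], LBFGS(); autodiff = :forward)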

Finally, let's solve it using Newton's method.

julia> optimize(himmelblau, himmelblau_gradient!, himmelblau_hessian!, [2.5, 2.5], Newton())
Results of Optimization Algorithm
 * Algorithm: Newton's Method
 * Starting Point: [2.5,2.5]
 * Minimizer: [3.0,2.0]
 * Minimum: 0.000000e+00
 * Iterations: 5
 * Convergence: true
   * |x - x'| < 1.0e-32: false
   * |f(x) - f(x')| / |f(x)| < 1.0e-32: false
   * |g(x)| < 1.0e-08: true
   * Reached Maximum Number of Iterations: false
 * Objective Function Calls: 19
 * Gradient Calls: 19
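
Which of the four minima an optimizer converges to depends on the starting point. Here is a minimal sketch that tries one start in each quadrant; the specific points are arbitrary, and each run is expected to land in a nearby basin:

julia> for x0 in ([2.5, 2.5], [-2.5, 2.5], [-2.5, -2.5], [2.5, -2.5])
  res = optimize(himmelblau, himmelblau_gradient!, himmelblau_hessian!, x0, Newton())
  println(x0, " -> ", round.(Optim.minimizer(res), digits = 4))
end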