Getting more details from R's optim function

I am not very familiar with the optim function, and I wanted to extract two things from its results: a) how many iterations were needed to reach the result, and b) the sequence of intermediate solutions, i.e. the solution obtained at the end of each iteration.

My code so far looks like this:

f1 <- function(x) {
  x1 <- x[1]
  x2 <- x[2]
  x1^2 + 3*x2^2
}

res <- optim(c(1,1), f1, method="CG")

How can I improve it to get more information?

Thanks in advance

+4
3 answers

You can change your function to save the values passed to it into a list in the global environment.

i <- 0          # evaluation counter
vals <- list()  # points visited, in order
f1 <- function(x) {
  i <<- i + 1
  vals[[i]] <<- x

  x1 <- x[1]
  x2 <- x[2]
  x1^2 + 3*x2^2
}

res <- optim(c(1,1), f1, method="CG")

Afterwards, vals holds every point at which f1 was evaluated. Note that these are function evaluations rather than iterations: optim may call f1 several times per iteration, for example during line searches or when approximating the gradient by finite differences.
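A short sketch of how the recorded list can be turned into a trajectory matrix for inspection; the do.call(rbind, ...) step and the column names are my additions, not part of the original answer:

```r
# Record every point optim evaluates, then examine the trajectory.
i <- 0
vals <- list()
f1 <- function(x) {
  i <<- i + 1
  vals[[i]] <<- x
  x[1]^2 + 3 * x[2]^2
}
res <- optim(c(1, 1), f1, method = "CG")

# Stack the recorded points into a matrix: one row per evaluation.
path <- do.call(rbind, vals)
colnames(path) <- c("x1", "x2")
nrow(path)   # total number of function evaluations recorded
head(path)   # first few points visited
```

Note that nrow(path) can exceed res$counts[["function"]], because counts excludes the extra calls to f1 used for the finite-difference gradient, while vals records them all.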

+5

Passing trace=1 in the control list makes optim print diagnostic output:

res <- optim(c(1,1), f1, method="CG", control=list(trace=1))
# Conjugate gradients function minimizer
# Method: Fletcher Reeves
# tolerance used in gradient test=3.63798e-12
# 0 1 4.000000
# parameters    1.00000    1.00000 
# * i> 1 4 0.480000
# parameters    0.60000   -0.20000 
#   i> 2 6 0.031667
# ......
# * i> 13 34 0.000000
# parameters   -0.00000    0.00000 
# 14 34 0.000000
# parameters   -0.00000    0.00000 
# Exiting from conjugate gradients minimizer
#   34 function evaluations used
#   15 gradient evaluations used

If you want this printed trace in a variable rather than on the console, you can redirect it with sink or capture it with capture.output.
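A minimal sketch of the capture.output approach, assuming the C-level trace from optim's standard methods is printed through R's output stream (so capture.output can intercept it); the grep pattern is my guess based on the trace shown above:

```r
f1 <- function(x) x[1]^2 + 3 * x[2]^2

# Capture everything optim prints with trace = 1 as a character
# vector, one element per printed line.
log <- capture.output(
  res <- optim(c(1, 1), f1, method = "CG", control = list(trace = 1))
)

# The lines containing "parameters" hold the intermediate solutions.
param_lines <- grep("parameters", log, value = TRUE)
```

The param_lines strings can then be parsed (e.g. with strsplit or read.table on a text connection) to recover the numeric values.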

+5

This does not give you the intermediate solutions, but the number of calls made is reported in the $counts component of the result. From ?optim:

 counts: A two-element integer vector giving the number of calls to
          ‘fn’ and ‘gr’ respectively. This excludes those calls needed
          to compute the Hessian, if requested, and any calls to ‘fn’
          to compute a finite-difference approximation to the gradient.


+4

Source: https://habr.com/ru/post/1542786/

