Category Archives: Optimization

Update on CRAN: CEGO Version 2.2.0

A new version (2.2.0) of my R-package CEGO was just uploaded to CRAN.

This update contains various fixes to code and documentation. In addition, the interfacing of C code was reworked, following recent changes to the CRAN checks (registration of entry points to compiled code). This may have yielded a speed-up for some of the distance calculations. The update also contains code that was employed in the article Simulation-based Test Functions for Optimization Algorithms (see Publications for the pre-print PDF), mostly functions for Kriging simulation and for the generation of corresponding test functions.

See also (external link) https://cran.r-project.org/package=CEGO.

Presentation at 15th Workshop on Quality Improvement Methods

Recently, I had the opportunity to give a talk on optimizing the feed material of a biogas plant at the 15th Workshop on Quality Improvement Methods in Dortmund, Germany. The slides can be found in the presentations section of this site.

I really enjoyed the lively discussions at the workshop, so thanks again to the organizers for inviting me. See also the nice photo of the participants here:

(external link:) https://www.statistik.tu-dortmund.de/qim15.html

You can also find the program and abstracts of all the talks via the above link.

Article Update: Model-based methods for continuous and discrete global optimization

Just a small update for the article mentioned in the last post: Model-based methods for continuous and discrete global optimization.

Until April 11, 2017, the following link will direct you to the final version of the article on ScienceDirect (free, without personal or institutional registration).

(external link:) https://authors.elsevier.com/a/1Ub295aecSVmv2

Recent Publication: Model-based Methods for Continuous and Discrete Global Optimization

The accepted manuscript of the article Model-based Methods for Continuous and Discrete Global Optimization by Thomas Bartz-Beielstein and myself has been published in the Elsevier Journal Applied Soft Computing.

External link: http://dx.doi.org/10.1016/j.asoc.2017.01.039

Among other important topics in model-based optimization, the article deals with methods for discrete or combinatorial optimization problems. A tabular survey of work in this field is part of the supplementary material. A PDF of this table will be kept up to date on this website, see:

Survey: Combinatorial Surrogate Models

If you have comments or additions with respect to that collection, please let me know.

Also, you can find an earlier preprint version in the publications section.

PPSN Tutorial 2016: Surrogate Model Optimization

>>>Tutorial PPSN 2016<<<

The PDF attached above contains our slides for the tutorial “Meta-Model Assisted (Evolutionary) Optimization”, held at PPSN 2016.

In addition, you can find below the R code used in the tutorial’s example:


## Introductory comments:
##
## To run this script, you need to install R, the language for statistical computing.
## https://cran.r-project.org/
## For ease of use, you may consider RStudio IDE, but this is not required.
## https://www.rstudio.com/
## For a tutorial/introduction to R, see e.g.:
## https://cran.r-project.org/doc/manuals/r-release/R-intro.html

## Preparation:
##
## You may need to install some of the packages that this example
## for the PPSN 2016 tutorial is using. If they are available,
## the library("...") command used in the following will
## load them successfully. Otherwise, you may receive an error.
## To install the required packages, uncomment the following lines:
# install.packages("SPOT") 
library("SPOT") #load required package: SPOT

## Initialize the random number generator seed for reproducibility.
set.seed(1)

## Main part
## This example demonstrates the use of surrogate modeling
## in optimization. To that end, we first need to define an
## optimization problem to be solved by such methods.
## Clearly, we lack the time to consider a real-world, expensive
## optimization problem. Hence, we use the following simple,
## one-dimensional test function from the book
## A. I. J. Forrester, A. Sobester, A. J. Keane; 
## "Engineering Design via Surrogate Modeling"; 
## Wiley (2008)

objectFun <- function(x){
	(6*x-2)^2 * sin(12*x-4)
}
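## (For reference: the global minimum of this function on [0,1]
## lies at approximately x = 0.757, with a value of about -6.02.)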

## Plot the function:
par(mar=c(4,4,0.5,4),mgp=c(2,1,0))
curve(objectFun(x),0,1)

## Now, let us assume objectFun is expensive.
## First, we create an initial design of
## experiments, which in this case
## is simply a regular grid:
x <- seq(from=0, by=0.3,to=1)
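## (This grid consists of the four points 0, 0.3, 0.6, and 0.9;
## seq() with by=0.3 stops before reaching the upper bound 1.)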

## Evaluate with objective:
y <- sapply(x,objectFun)

## Add to plot:
points(x,y)

## Build a model (here: Kriging, with the SPOT package,
## but plenty of alternatives are available):
fit <- forrBuilder(as.matrix(x),as.matrix(y),
    control=list(uselambda=FALSE #do not use nugget effect (regularization)
    ))

## Evaluate prediction based on model fit
xtest <- seq(from=0, by=0.001,to=1)
pred <- predict(fit,as.matrix(xtest),predictAll=T)
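## pred$f holds the predicted mean, pred$s the corresponding
## uncertainty estimate (predicted standard deviation) of the model: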
ypred <- pred$f
spred <- pred$s

## Plot the prediction of the model:
lines(xtest,ypred,lty=2)

## Plot suggested candidate solution
points(xtest[which.min(ypred)],ypred[which.min(ypred)],col="black",pch=20)

## Calculate expected improvement (EI)
ei <- 10^(-spotInfillExpImp(ypred,spred,min(y)))
par(new = T)
plot(xtest,ei,lty=3, type="l", axes=F, xlab=NA, ylab=NA, 
     ylim=rev(range(ei)))
axis(side = 4); mtext(side = 4, line = 2, 'EI')
## but note: EI is on a different scale
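## For reference: the standard expected improvement criterion for
## minimization, given the current best observed value ymin, the
## predicted mean yhat, and the predicted standard deviation s, is
##   EI = (ymin - yhat) * pnorm(u) + s * dnorm(u),  u = (ymin - yhat)/s.
## A minimal sketch in base R (here, spotInfillExpImp is assumed to
## return a negative log10-scaled version of this quantity, which is
## why the 10^(-...) transformation is applied above):
expImpSketch <- function(ymin, yhat, s){
  u <- (ymin - yhat) / s
  ifelse(s > 0, (ymin - yhat) * pnorm(u) + s * dnorm(u), 0)
}
## Also note that the EI axis is reversed via ylim=rev(range(ei)),
## so that large EI values point downwards in the overlay plot.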


## Plot suggested candidate solution, based on EI
points(xtest[which.max(ei)],ei[which.max(ei)],col="red",pch=20)
newx <- xtest[which.max(ei)]

## Add data
x <- c(x,newx)
y <- c(y,objectFun(newx))

## Now repeat the same as often as necessary:
repeatThis <- expression({
  curve(objectFun(x),0,1)
  points(x,y)
  fit <- forrBuilder(as.matrix(x),as.matrix(y),
                    control=list(uselambda=FALSE
                    ))
  xtest <- seq(from=0, by=0.001,to=1)
  pred <- predict(fit,as.matrix(xtest),predictAll=T)
  ypred <- pred$f
  spred <- pred$s
  lines(xtest,ypred,lty=2)
  points(xtest[which.min(ypred)],ypred[which.min(ypred)],col="black",pch=20)  
  ei <- 10^(-spotInfillExpImp(ypred,spred,min(y)))
  par(new = T)
  plot(xtest,ei,lty=3, type="l", axes=F, xlab=NA, ylab=NA, 
       ylim=rev(range(ei)))
  axis(side = 4); mtext(side = 4, line = 2, 'EI')
  points(xtest[which.max(ei)],ei[which.max(ei)],col="red",pch=20)
  newx <- xtest[which.max(ei)]
  x <- c(x,newx)
  y <- c(y,objectFun(newx)) }) 
eval(repeatThis) 
eval(repeatThis) 
eval(repeatThis) 
eval(repeatThis) 
eval(repeatThis) 
eval(repeatThis) 
eval(repeatThis) 
eval(repeatThis) 
## Observation:
## The EI looks noisy and strange, and the predicted mean has low accuracy.
## Why?
## If this is repeated too often, numerical issues arise:
## the close spacing of candidate points leads to a badly
## conditioned correlation matrix, which is a problem for the Kriging model.
## Potential remedy: use regularization with a nugget and reinterpolation.
## Note: such an issue may also be interpreted as convergence of
## the optimization process, but this is not necessarily correct.

## repeat as often as necessary (but now with regularization):
repeatThis <- expression({
  curve(objectFun(x),0,1)
  points(x,y)
  fit <- forrBuilder(as.matrix(x),as.matrix(y),
              control=list(
              uselambda=TRUE, # Use nugget (parameter lambda)
              reinterpolate=T # Reinterpolation, to fix uncertainty estimates, etc.
                     ))
  xtest <- seq(from=0, by=0.001,to=1)
  pred <- predict(fit,as.matrix(xtest),predictAll=T)
  ypred <- pred$f
  spred <- pred$s
  lines(xtest,ypred,lty=2)
  points(xtest[which.min(ypred)],ypred[which.min(ypred)],col="black",pch=20)  
  ei <- 10^(-spotInfillExpImp(ypred,spred,min(y)))
  par(new = T)
  plot(xtest,ei,lty=3, type="l", axes=F, xlab=NA, ylab=NA, 
       ylim=rev(range(ei)))
  axis(side = 4); mtext(side = 4, line = 2, 'EI')
  points(xtest[which.max(ei)],ei[which.max(ei)],col="red",pch=20)
  newx <- xtest[which.max(ei)]
  x <- c(x,newx)
  y <- c(y,objectFun(newx))
})
eval(repeatThis)
eval(repeatThis)
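
Finally, as a small addition (using only base R, with x, y, and objectFun as defined in the script above), the best solution found so far can be reported after the loop:

## Report the best solution evaluated so far:
best <- which.min(y)
cat("Best x found:", x[best], "with objective value:", y[best], "\n")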