Derivative Free Optimization / Optimisation sans Gradient

 

 (V04 course, 2015/2016 session)

Joint course between the M2 AMS and M2 Optimisation programs (Paris Saclay)

 Anne Auger (INRIA), Laurent Dumas (UVSQ)



 
Optimization problems are encountered in many fields of engineering, and the associated cost function may be of various types: black-box or explicit, with continuous or discrete variables, costly to evaluate or not, etc.

In many cases, the gradient of the cost function is difficult or even impossible to compute, or the function exhibits many local minima; both situations lead to considering Derivative-Free Optimization (DFO) methods.

This course covers a large number of recently developed Derivative-Free Optimization methods, local or global, deterministic or stochastic. It is illustrated with various examples drawn from industrial and medical applications.

 
 
Previous courses:

 

 course 2011/2012, course 2012/2013, course 2013/2014, course 2014/2015

 

 Outline

- PART 1: STOCHASTIC METHODS

 

1) General introduction / Motivation for DFO

2) Stochastic algorithms framework (essentially Evolution Strategies, ES)

3) General principles for step size and covariance matrix adaptation. CMA-ES algorithm.

4) Comparisons between CMA/PSO/NEWUOA/BFGS
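To make topic 3 concrete, here is a minimal Python sketch (an illustration, not course material; all names are chosen for this example) of the simplest member of the ES family: a (1+1)-ES with the classical 1/5th success rule, a conceptual precursor of the step-size control used in CMA-ES.

```python
import math
import random

def one_plus_one_es(f, x0, sigma0=1.0, budget=1000, seed=0):
    """Minimal (1+1)-ES with the 1/5th success rule for step-size
    adaptation (a simplified precursor of CMA-ES's step-size control)."""
    rng = random.Random(seed)
    x, fx, sigma = list(x0), f(x0), sigma0
    for _ in range(budget):
        # Sample one offspring from an isotropic Gaussian around the parent.
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy <= fx:                      # success: accept, enlarge the step
            x, fx = y, fy
            sigma *= math.exp(1.0 / 3.0)
        else:                             # failure: shrink the step
            sigma *= math.exp(-1.0 / 12.0)
        # The factors balance at a 1/5 success rate: (1/5)(1/3) = (4/5)(1/12).
    return x, fx

# Sphere function: global minimum 0 at the origin.
sphere = lambda x: sum(xi * xi for xi in x)
xbest, fbest = one_plus_one_es(sphere, [3.0, -2.0])
```

The rule enlarges the step when more than one offspring in five succeeds and shrinks it otherwise; CMA-ES generalizes this idea by additionally adapting a full covariance matrix.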

 

- PART 2: DETERMINISTIC METHODS

 

1) Local methods: direct search methods (Pattern Search, Nelder-Mead, MDS), trust-region methods (NEWUOA)

2) Global methods: DIRECT, response surface methods (RBF, kriging)
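To illustrate the response-surface idea, here is a hedged Python sketch (not the course's RBF.sci script; function names and the Gaussian kernel choice are assumptions for this example) of a basic RBF interpolant: fit weights so the surrogate reproduces the sampled cost-function values, then evaluate it cheaply anywhere.

```python
import math

def rbf_interpolant(points, values, phi=lambda r: math.exp(-r * r)):
    """Fit s(x) = sum_j lam_j * phi(||x - x_j||) so that s interpolates the
    data: solve the linear system Phi @ lam = values (Gaussian kernel here)."""
    n = len(points)
    # Interpolation matrix Phi[i][j] = phi(distance between sample i and j).
    A = [[phi(math.dist(points[i], points[j])) for j in range(n)] for i in range(n)]
    b = list(values)
    # Gaussian elimination with partial pivoting (fine at this toy scale).
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p], b[k], b[p] = A[p], A[k], b[p], b[k]
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    lam = [0.0] * n
    for i in range(n - 1, -1, -1):
        lam[i] = (b[i] - sum(A[i][j] * lam[j] for j in range(i + 1, n))) / A[i][i]
    return lambda x: sum(l * phi(math.dist(x, q)) for l, q in zip(lam, points))

# Samples of f(x) = x^2; the surrogate reproduces the data at the nodes.
points = [(0.0,), (1.0,), (2.0,), (3.0,)]
values = [0.0, 1.0, 4.0, 9.0]
s = rbf_interpolant(points, values)
```

In a surrogate-based DFO loop, the true (costly) function is evaluated only at a few design points; new candidates are found by optimizing the cheap surrogate, which is then refit. Practical RBF codes also add a polynomial tail and choose the kernel width carefully, which this sketch omits.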

 

 
 
Schedule

 

 

Friday 27 November 2015, 14h00-17h15 (ENSTA, A. Auger)

Friday 04 December 2015, 14h00-17h15 (ENSTA, A. Auger)

Friday 11 December 2015, 14h00-17h15 (ENSTA, A. Auger)

Friday 18 December 2015, 14h00-17h15 (ENSTA, A. Auger)

Friday 08 January 2016, 14h00-17h15 (ENSTA, A. Auger)

Friday 15 January 2016, 14h00-17h15 (ENSTA, L. Dumas)

Friday 22 January 2016, 14h00-17h15 (ENSTA, L. Dumas)

Friday 29 January 2016, 14h00-17h15 (ENSTA, L. Dumas)

Friday 05 February 2016, 14h00-17h15 (ENSTA, L. Dumas)

Friday 12 February 2016, 14h00-17h15 (ENSTA, L. Dumas)

 

Exam: Friday 19 February 2016, 14h00-17h15 (sujet.pdf)

 

 
 
Scilab/Matlab scripts (deterministic part)

 

Pattern Search: pattern.sci

Nelder-Mead: NelderMead.sci

MDS: MDS.sci

Surrogate models: RBF.sci, krige.sce

 

Computer session:

 

For the three-hump camelback function, on [-2,2]² and with a maximum of 100 function evaluations, estimate the global minimum of the function with:

 

-   a DFO method based on a direct search (pattern search, Nelder-Mead, MDS)

-   a DFO method based on a metamodel (RBF, kriging)

-   the DIRECT method
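As a starting point for the session, here is a minimal Python sketch of the first option, coordinate (compass) pattern search applied to the three-hump camelback function, with a seeded random initialization as requested. The course scripts are in Scilab; this Python version and its function names are only an illustration.

```python
import random

def three_hump(x, y):
    """Three-hump camelback function: global minimum f = 0 at (0, 0),
    plus two symmetric local minima with f ~ 0.299."""
    return 2 * x**2 - 1.05 * x**4 + x**6 / 6 + x * y + y**2

def pattern_search(f, x0, step=0.5, budget=100, tol=1e-8):
    """Compass/pattern search: poll +/- step along each coordinate axis,
    accept an improving poll point opportunistically, and halve the step
    when a full poll round brings no improvement."""
    x, fx, evals = list(x0), f(*x0), 1
    while evals < budget and step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                if evals >= budget:
                    break
                y = list(x)
                y[i] += d
                fy = f(*y)
                evals += 1
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break          # keep the improvement, poll the next axis
        if not improved:
            step *= 0.5            # refine the mesh
    return x, fx

rng = random.Random(42)                       # fixed seed, as requested
x0 = [rng.uniform(-2.0, 2.0), rng.uniform(-2.0, 2.0)]
xbest, fbest = pattern_search(three_hump, x0, budget=100)
```

With only 100 evaluations and a single start, a local method like this one may stall in one of the two non-global basins; comparing it against a metamodel-based method and DIRECT is precisely the point of the session.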

 

Email the results obtained by your code for a random initialization with a fixed seed.

 

 

 

 Bibliography (deterministic part)

A. R. Conn, K. Scheinberg and L. N. Vicente, Introduction to Derivative-Free Optimization, SIAM, 2009.

A review article on DFO: Journal of Global Optimization, 2013.

 

An article on the non-convergence of Nelder-Mead: SIAM Journal on Optimization, 1998.

 

An article on the convergence of MDS: PhD thesis, 1989.

Description of the DIRECT method: DIRECT.pdf (Journal of Optimization Theory and Applications, 1993).