# Selecting the Granularity You Want in GlobalSearch or MultiStart

Posted by **Loren Shure**,

I'd like to introduce this week's guest blogger, Alan Weiss. Alan writes documentation for mathematical toolboxes here at MathWorks.

Do you use `GlobalSearch` or `MultiStart` for finding multiple local solutions to your optimization problems? Both of these solvers can report the locations and objective function values of the local solutions they find, as well as the starting points that led to each local solution.

Sometimes, though, the solvers report too many local solutions: their definition of what constitutes "distinct" solutions can differ from yours. This article demonstrates the problem and two solutions. One solution involves setting `GlobalSearch` or `MultiStart` tolerances, but this can be difficult if you don't know beforehand how close solutions can be. The other involves the `clumpthem` function; download it here.


#### A Function with Six Local Minima

For example, look at the six-hump camelback function described in the documentation. Get the `sixhumps` code.

```
sixmin = sixhumps
```

```
sixmin =

    @(x)(4*x(:,1).^2-2.1*x(:,1).^4+x(:,1).^6/3+x(:,1).*x(:,2)-4*x(:,2).^2+4*x(:,2).^4)
```

You can see in the contour plot that there are six local minima. The colored asterisks mark the minima.

#### `MultiStart` Finds 50 Local Minima

Take a look at the number of local minima reported by running `MultiStart` with the `fmincon` local solver and active-set algorithm.

```
ms = MultiStart;
opts = optimoptions(@fmincon,'Algorithm','active-set');
problem = createOptimProblem('fmincon','x0',[-1,2],...
    'objective',sixmin,'lb',[-3,-3],'ub',[3,3],...
    'options',opts);
rng default % for reproducibility
[xminm,fminm,flagm,outptm,manyminsm] = run(ms,problem,50); % run fmincon 50 times
size(manyminsm)
```

```
MultiStart completed the runs from all start points.

All 50 local solver runs converged with a positive local solver exit flag.

ans =

     1    50
```

How could there be 50 separate minima reported when you know that there are only six points that are local minima? The answer is that many of these minima are close to each other. For example, the first three points look the same.

```
manyminsm(1).X
manyminsm(2).X
manyminsm(3).X
```

```
ans =

  -0.089843    0.71266

ans =

  -0.089844    0.71265

ans =

  -0.089834    0.71265
```

Of course, these reported minima aren't really the same.

```
norm(manyminsm(1).X - manyminsm(2).X)
```

```
ans =

   7.6691e-06
```

The point is that the default `MultiStart` tolerances cause it to report two minima as distinct whenever they differ by more than 1e-6 in objective function value or in position. And these reported minima are more than 1e-6 apart.
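You can check those tolerances yourself; the relevant `MultiStart` properties (in the release used here) are `TolX` and `TolFun`:

```
% Inspect the default MultiStart distinctness tolerances
ms = MultiStart;
[ms.TolX, ms.TolFun] % both default to 1e-6
```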

#### Clump the Minima with `MultiStart`

You can loosen the default `MultiStart` tolerances to have `MultiStart` automatically combine similar minima.

```
rng default % for reproducibility
ms.TolX = 1e-3;
ms.TolFun = 1e-4;
[xminloose,fminloose,flagloose,outptloose,manyminsloose] = run(ms,problem,50);
size(manyminsloose)
```

```
MultiStart completed the runs from all start points.

All 50 local solver runs converged with a positive local solver exit flag.

ans =

     1     6
```

Loosening the tolerances caused `MultiStart` to give just the six truly different local minima. But this process involved rerunning the solver.

#### Clump the Minima Programmatically

Wouldn't it be better to clump the minima without running the solver again? That is the point of the `clumpthem` function. It takes `MultiStart` or `GlobalSearch` solutions and clumps them to the tolerances you want. For example:

```
clumped = clumpthem(manyminsm,1e-3,1e-4); % the first tolerance is TolX, then TolFun
size(clumped)
```

```
ans =

     1     6
```

The answers you get either way are identical.

```
isequal(manyminsloose(1).X, clumped(1).X)
```

```
ans =

     1
```

How does `clumpthem` work? It uses the fact that `MultiStart` and `GlobalSearch` return solutions in order, from best (lowest objective function value) to worst. It takes the first solution and compares it to the second, both in terms of objective function values and distance between the solutions. If both comparisons are below the tolerances, `clumpthem` removes the second solution, adding its starting point to the list of `X0` starting points. If the objective function value of the second solution is too high, then `clumpthem` starts a new clump. If the objective function difference is small enough, but the distance between solutions is too large, `clumpthem` proceeds to compare the first solution with the third. It continues in this way until it runs out of solutions to clump.
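The description above can be sketched in a few lines of MATLAB. This is not the shipped `clumpthem` code, just the idea; it assumes the solutions arrive as a vector sorted from lowest to highest `Fval`, with `X`, `Fval`, and `X0` fields like those of the `GlobalOptimSolution` objects that `run` returns (you may need to convert the objects to a struct array before modifying them):

```
function clumped = clumpsketch(sols,tolX,tolFun)
% Sketch of the clumping idea. sols is sorted from best to worst Fval.
clumped = sols(1);
for k = 2:numel(sols)
    merged = false;
    for j = 1:numel(clumped) % compare against each existing clump
        if abs(sols(k).Fval - clumped(j).Fval) <= tolFun && ...
                norm(sols(k).X - clumped(j).X) <= tolX
            % Same clump: drop the duplicate, but keep its start points
            clumped(j).X0 = [clumped(j).X0, sols(k).X0];
            merged = true;
            break
        end
    end
    if ~merged
        clumped(end+1) = sols(k); %#ok<AGROW> % start a new clump
    end
end
end
```

Because the solutions are sorted by objective value, each new solution only needs to be checked against clumps whose `Fval` is within `tolFun` of its own.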

This procedure is quite speedy, as you will see in the next section.

#### Performance

Check the speed of `clumpthem`.

```
tic; clumped = clumpthem(manyminsm,1e-3,1e-4); toc
```

```
Elapsed time is 0.006322 seconds.
```

`MultiStart` is much slower, even with this fairly simple function:

```
rng default % for reproducibility
tic;
[xminloose,fminloose,flagloose,outptloose,manyminsloose] = run(ms,problem,50);
toc
```

```
MultiStart completed the runs from all start points.

All 50 local solver runs converged with a positive local solver exit flag.

Elapsed time is 0.863509 seconds.
```

#### Feedback

Do you use `MultiStart` or `GlobalSearch` to search for multiple local solutions? Have you been annoyed by finding spurious differences in solutions? Tell us about it here.

Get the MATLAB code

Published with MATLAB® R2014a

**Category:** How To, Optimization

## 2 Comments

**1** of 2

I’ve tried working with pretty much every search tool in the GADS toolbox trying to find the best solver for my problem, and I would appreciate any insight you could offer. I’m working in a system with a potentially enormous number of local minima, with very steep slopes. What I’m trying to do is to fit a combination of decaying sinusoids (say 4-8) to irregularly sampled noisy data. i.e.

`y = sum(amp.*exp(-decay.*(t-t0)).*sin(f.*(t-t0)))`, assuming all the individual sinusoids share the same t0, but each has its own amplitude, decay rate, and frequency.

So far the only moderately effective solution I've found is to use a genetic solver with a custom mutation function where I combine adaptive feasible, small-spread Gaussian, and uniform mutation, AND I periodically run lsqnonlin to make sure that I'm at a local minimum. I've customized the crossover function so that I'm combining the contributions of the sinusoids as groups (i.e., keeping a particular frequency coupled with its amplitude and decay rate). I've bounded the problem as tightly as I can, which obviously helps. The problem is that I often get stuck in local minima.

I’ve tried patternsearch, globalsearch, multistart, simulatedannealing, etc. But I’ve not had great luck finding a computation that can reliably even get close. I’ve also tried using multiple populations with migration, in the hopes that each population will find a piece of the problem and then they will combine later.

Any suggestions?

Thanks.

**2** of 2

Dan, it sounds as if you have a very difficult global minimization problem, where nearly any algorithm quickly gets stuck in a local minimum.

I guess that I would try using MultiStart with lsqnonlin in order to sample as many start points as possible. You might want to look at "Isolated Global Minimum" in the documentation for a sort of case study of a related problem.
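A minimal sketch of that MultiStart-plus-lsqnonlin setup might look like the following; the one-sinusoid model, synthetic data, start point, and bounds here are illustrative assumptions, not Dan's actual problem:

```
% Sketch only: fit p = [amp, decay, freq] for a single decaying sinusoid
model = @(p,t) p(1)*exp(-p(2)*t).*sin(p(3)*t);
t = linspace(0,5,200)';                            % sample times (assumed)
ydata = model([1 0.5 4],t) + 0.05*randn(size(t)); % synthetic noisy data
resid = @(p) model(p,t) - ydata;                  % lsqnonlin minimizes sum(resid.^2)
problem = createOptimProblem('lsqnonlin','x0',[0.5 1 3],...
    'objective',resid,'lb',[0 0 0],'ub',[5 5 20]);
ms = MultiStart;
[pbest,resnorm] = run(ms,problem,200);            % 200 random start points
```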

Good luck,

Alan
