Bug fix in trust region subproblem - upgrade to version 1.0.1
lindonroberts committed Feb 20, 2018
1 parent ce35d07 commit dc229d2
Showing 9 changed files with 44 additions and 24 deletions.
8 changes: 5 additions & 3 deletions README.rst
@@ -1,9 +1,11 @@
==================================================================
DFO-LS: Derivative-Free Optimizer for Least-Squares |PyPI Version|
==================================================================
DFO-LS is a flexible package for solving nonlinear least-squares minimisation, without requiring derivatives of the objective.
DFO-LS is a flexible package for solving nonlinear least-squares minimisation, without requiring derivatives of the objective. It is particularly useful when evaluations of the objective function are expensive and/or noisy.

This is an implementation of the algorithm from our paper: C. Cartis, J. Fiala, B. Marteau and L. Roberts, Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers, technical report, University of Oxford, (2018).
This is an implementation of the algorithm from our paper: C. Cartis, J. Fiala, B. Marteau and L. Roberts, Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers, technical report, University of Oxford, (2018). DFO-LS is a more flexible version of `DFO-GN <https://github.com/numericalalgorithmsgroup/dfogn>`_.

If you are interested in solving general optimization problems (without a least-squares structure), you may wish to try `Py-BOBYQA <https://github.com/numericalalgorithmsgroup/pybobyqa>`_, which has many of the same features as DFO-LS.

Documentation
-------------
@@ -23,7 +25,7 @@ Additionally, the following python packages should be installed (these will be i

Installation using pip
----------------------
For easy installation, use *pip* (http://www.pip-installer.org/) as root::
For easy installation, use `pip <http://www.pip-installer.org/>`_ as root::

$ [sudo] pip install DFO-LS

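As a quick illustration of the least-squares interface the README describes, here is a minimal sketch of a curve-fitting call. The residual function, data and starting point are invented for illustration; the solve interface is the one documented in the DFO-LS user guide::

    import numpy as np
    import dfols

    # Illustrative residuals: fit y = exp(a*t) + b to three synthetic data points.
    tdata = np.array([0.0, 1.0, 2.0])
    ydata = np.array([2.0, 3.7, 8.4])

    def residuals(x):
        a, b = x
        return np.exp(a * tdata) + b - ydata

    x0 = np.array([0.5, 0.5])
    soln = dfols.solve(residuals, x0)  # no derivatives of the residuals are needed
    print(soln)                        # prints the "DFO-LS Results" summary block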
9 changes: 6 additions & 3 deletions dfols/tests/test_trust_region.py
@@ -92,7 +92,8 @@ def runTest(self):
s_cauchy, red_cauchy, crvmin_cauchy = cauchy_pt(g, hess, Delta)
self.assertTrue(est_min <= red_cauchy, 'Cauchy reduction not achieved')
self.assertTrue(np.all(gnew == g + hess.vec_mul(d)), 'Wrong gnew')
self.assertAlmostEqual(crvmin, -1.0, 'Wrong crvmin')
print(crvmin)
self.assertAlmostEqual(crvmin, 1.2, 'Wrong crvmin')


class TestUncBdry(unittest.TestCase):
@@ -210,7 +211,8 @@ def runTest(self):
# print(d)
self.assertTrue(est_min <= red_cauchy, 'Cauchy reduction not achieved')
self.assertTrue(np.all(gnew == g + hess.vec_mul(d)), 'Wrong gnew')
self.assertAlmostEqual(crvmin, 1.5, 'Wrong crvmin')
print(crvmin)
self.assertAlmostEqual(crvmin, -1.0, 'Wrong crvmin')


class TestConBdry(unittest.TestCase):
@@ -233,7 +235,8 @@ def runTest(self):
s_cauchy, red_cauchy, crvmin_cauchy = cauchy_pt_box(g, hess, Delta, sl - xopt, su - xopt)
self.assertTrue(est_min <= red_cauchy, 'Cauchy reduction not achieved')
self.assertTrue(np.max(np.abs(gnew - g - hess.vec_mul(d))) < 1e-10, 'Wrong gnew')
self.assertAlmostEqual(crvmin, 1.0, 'Wrong crvmin')
print(crvmin)
self.assertAlmostEqual(crvmin, -1.0, 'Wrong crvmin')
# self.assertAlmostEqual(crvmin, crvmin_cauchy, 'Wrong crvmin')


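The tests above check the computed step against the Cauchy point, i.e. the minimiser of the quadratic model along the steepest-descent direction inside the trust region. For reference, a standalone sketch of that computation with a plain NumPy Hessian is given below; the repository's cauchy_pt helper works with its own Hessian class and may differ in detail, so treat this as illustrative only::

    import numpy as np

    def cauchy_point(g, H, delta):
        """Cauchy step for the model m(s) = g's + 0.5*s'Hs subject to ||s|| <= delta.

        Returns the step, the model reduction it achieves, and the curvature of H
        along the steepest-descent direction (a sketch, not the repository routine).
        """
        gnorm = np.linalg.norm(g)
        if gnorm == 0.0:
            return np.zeros_like(g), 0.0, 0.0
        gHg = g @ H @ g
        crv = gHg / gnorm ** 2                        # curvature along -g
        if gHg > 0.0:
            t = min(gnorm ** 2 / gHg, delta / gnorm)  # interior or boundary minimiser
        else:
            t = delta / gnorm                         # non-positive curvature: go to the boundary
        s = -t * g
        reduction = t * gnorm ** 2 - 0.5 * t ** 2 * gHg
        return s, reduction, crv

The assertions in these tests require the step returned by the trust-region solver to do at least as well as this Cauchy point.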
12 changes: 6 additions & 6 deletions dfols/trust_region.py
@@ -153,7 +153,7 @@ def trsbox(xopt, g, hess, sl, su, delta):

# Reduce STPLEN if necessary in order to preserve the simple bounds,
# letting IACT be the index of the new constrained variable.
iact = -1
iact = None
for i in range(n):
if s[i] != 0.0:
temp = (su[i] - xopt[i] - d[i] if s[i] > 0.0 else sl[i] - xopt[i] - d[i]) / s[i]
@@ -166,8 +166,8 @@
if stplen > 0.0:
iterc += 1
temp = shs / stepsq
if iact == 0 and temp > 0.0:
crvmin = (min(crvmin, temp) if crvmin != -1.0 else temp)
if iact is None and temp > 0.0:
crvmin = min(crvmin, temp) if crvmin != -1.0 else temp
ggsav = gredsq
gnew += stplen * hs
d += stplen * s
@@ -176,7 +176,7 @@
qred += sdec

# Restart the conjugate gradient method if it has hit a new bound.
if iact > -1:
if iact is not None:
nact += 1
xbdi[iact] = (1 if s[iact] >= 0.0 else -1)
delsq = delsq - d[iact] ** 2
@@ -253,7 +253,7 @@ def alt_trust_step(n, xopt, hess, sl, su, d, xbdi, nact, gnew, qred):
# bound, there is a branch back to label 100 after fixing that variable.
free_variable_reached_bound = False
angbd = 1.0
iact = -1
iact = None
for i in range(n):
if xbdi[i] == 0:
tempa = xopt[i] + d[i] - sl[i]
@@ -347,7 +347,7 @@ def alt_trust_step(n, xopt, hess, sl, su, d, xbdi, nact, gnew, qred):
hred = cth * hred + sth * hs

qred += sdec
if iact > -1 and isav == iu - 1:
if iact is not None and isav == iu - 1:
nact += 1
xbdi[iact] = xsav
restart_alt_loop = True
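The substantive change in this file is the switch from the integer sentinel iact = -1 to iact = None for "no variable hit its bound on this step". With the old code the curvature update was guarded by iact == 0, which is true only when variable 0 is the newly constrained one, so crvmin could be updated on the wrong iterations; the fix guards it with iact is None instead. A standalone sketch of the corrected pattern (illustrative, not the repository code)::

    import numpy as np

    def step_to_bounds(xopt, d, s, sl, su, stplen):
        """Shorten a candidate step length so xopt + d + stplen*s stays inside [sl, su].

        Returns the (possibly reduced) step length and the index of the variable that
        becomes active, or None if no bound is hit (a sketch of the pattern only).
        """
        iact = None
        for i in range(len(xopt)):
            if s[i] != 0.0:
                gap = (su[i] if s[i] > 0.0 else sl[i]) - xopt[i] - d[i]
                ratio = gap / s[i]
                if ratio < stplen:
                    stplen = ratio
                    iact = i
        return stplen, iact

    # The curvature estimate is then only updated on steps that stayed strictly
    # inside the bounds, mirroring the fixed guard in trsbox:
    #     if iact is None and temp > 0.0:
    #         crvmin = min(crvmin, temp) if crvmin != -1.0 else temp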
2 changes: 1 addition & 1 deletion dfols/version.py
@@ -22,4 +22,4 @@
"""

__version__ = '1.0'
__version__ = '1.0.1'
12 changes: 12 additions & 0 deletions docs/history.rst
@@ -0,0 +1,12 @@
Version History
===============
This section lists the different versions of DFO-LS and the updates between them.

Version 1.0 (6 Feb 2018)
------------------------
* Initial release of DFO-LS

Version 1.0.1 (20 Feb 2018)
---------------------------
* Minor bug fix to trust region subproblem solver (the output :code:`crvmin` is calculated correctly) - this has minimal impact on the performance of DFO-LS.

5 changes: 4 additions & 1 deletion docs/index.rst
@@ -20,7 +20,9 @@ That is, DFO-LS solves
\min_{x\in\mathbb{R}^n} &\quad f(x) := \sum_{i=1}^{m}r_{i}(x)^2 \\
\text{s.t.} &\quad a \leq x \leq b
Full details of the DFO-LS algorithm are given in our paper: C. Cartis, J. Fiala, B. Marteau and L. Roberts, Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers, technical report, University of Oxford, (2018).
Full details of the DFO-LS algorithm are given in our paper: C. Cartis, J. Fiala, B. Marteau and L. Roberts, Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers, technical report, University of Oxford, (2018). DFO-LS is a more flexible version of `DFO-GN <https://github.com/numericalalgorithmsgroup/dfogn>`_.

If you are interested in solving general optimization problems (without a least-squares structure), you may wish to try `Py-BOBYQA <https://github.com/numericalalgorithmsgroup/pybobyqa>`_, which has many of the same features as DFO-LS.

DFO-LS is released under the GNU General Public License. Please `contact NAG <http://www.nag.com/content/worldwide-contact-information>`_ for alternative licensing.

@@ -33,6 +35,7 @@ DFO-LS is released under the GNU General Public License. Please `contact NAG <ht
userguide
advanced
diagnostic
history

Acknowledgements
----------------
18 changes: 9 additions & 9 deletions docs/userguide.rst
@@ -162,9 +162,9 @@ DFO-LS correctly finds the solution to the constrained problem:
Solution xmin = [ 0.9 0.81]
Residual vector = [ 0. 0.1]
Objective value f(xmin) = 0.01
Needed 64 objective evaluations (at 64 points)
Needed 65 objective evaluations (at 65 points)
Approximate Jacobian = [[ -1.79999998e+01 9.99999990e+00]
[ -1.00000000e+00 -1.26970349e-09]]
[ -9.99999998e-01 -2.53940698e-09]]
Exit flag = 0
Success: rho has reached rhoend
****************************
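For context, the residual vector and approximate Jacobian in the block above are consistent with the Rosenbrock residuals r(x) = (10(x2 - x1^2), 1 - x1) solved with an upper bound that cuts off the unconstrained minimum at (1, 1), which is why the solver stops at x = (0.9, 0.81) with f = 0.01. A sketch of such a setup is below; the exact bounds used in the user guide are not shown in this diff, so the bound values and the bounds keyword usage here are assumptions chosen to reproduce a solution of this form::

    import numpy as np
    import dfols

    # Rosenbrock residuals: f(x) = r_1(x)^2 + r_2(x)^2
    def rosenbrock(x):
        return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

    x0 = np.array([-1.2, 1.0])
    lower = np.array([-10.0, -10.0])   # assumed lower bounds, inactive at the solution
    upper = np.array([0.9, 0.85])      # assumed; the bound on x[0] forces x = [0.9, 0.81]
    soln = dfols.solve(rosenbrock, x0, bounds=(lower, upper))
    print(soln)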
@@ -196,8 +196,8 @@ And we can now see each evaluation of :code:`objfun`:
Function eval 2 at point 2 has f = 14.337296 at x = [-1.08 0.85]
Function eval 3 at point 3 has f = 55.25 at x = [-1.2 0.73]
...
Function eval 63 at point 63 has f = 0.0100000029949496 at x = [ 0.89999999 0.81 ]
Function eval 64 at point 64 has f = 0.00999999999999993 at x = [ 0.9 0.81]
Function eval 64 at point 64 has f = 0.0100000029949496 at x = [ 0.89999999 0.81 ]
Function eval 65 at point 65 has f = 0.00999999999999993 at x = [ 0.9 0.81]
Did a total of 1 run(s)
If we wanted to save this output to a file, we could replace the above call to :code:`logging.basicConfig()` with
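The replacement call itself falls outside the lines shown in this hunk; a typical way to redirect Python logging output to a file looks like the sketch below (the filename and format string are illustrative choices, not taken verbatim from the DFO-LS documentation)::

    import logging

    # Send INFO-level progress messages to a file instead of the console.
    logging.basicConfig(level=logging.INFO,
                        filename='dfols_log.txt',
                        format='%(message)s')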
@@ -453,11 +453,11 @@ The output of this is
****** DFO-LS Results ******
Solution xmin = [ 0.09777309 -2.32510588]
Residual vector = [ -3.16191517e-13 -3.58602037e-12]
Objective value f(xmin) = 1.295951917e-23
Needed 17 objective evaluations (at 17 points)
Approximate Jacobian = [[ 3.32510506 0.9022256 ]
[ 10.22775528 -1.00001417]]
Residual vector = [ 2.89990254e-13 3.31557004e-12]
Objective value f(xmin) = 1.107709904e-23
Needed 18 objective evaluations (at 18 points)
Approximate Jacobian = [[ 3.32510429 0.90222738]
[ 10.22774647 -0.9999939 ]]
Exit flag = 0
Success: Objective is sufficiently small
****************************
Binary file modified manual.pdf
2 changes: 1 addition & 1 deletion setup.py
@@ -35,7 +35,7 @@
author='Lindon Roberts',
author_email='[email protected]',
url='https://github.com/numericalalgorithmsgroup/dfols/',
download_url='https://github.com/numericalalgorithmsgroup/dfols/archive/v1.0.tar.gz',
download_url='https://github.com/numericalalgorithmsgroup/dfols/archive/v1.0.1.tar.gz',
packages=['dfols'],
license='GNU GPL',
keywords = 'mathematics derivative free optimization nonlinear least squares',
