
HGDL


HGDL is an API for HPC distributed constrained function optimization. At its core, the algorithm combines local and global optimization with bump-function-based deflation to produce a growing list of unique optima of a differentiable function. This tackles the common problem of non-uniqueness in optimization, which is especially prevalent in machine learning.
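The idea behind deflation is to modify the gradient field around optima that have already been found so that subsequent local searches are repelled from them and converge to new optima instead. The sketch below is a minimal, self-contained illustration of that idea, not HGDL's internal implementation; the exact bump-function form and the radius parameter are assumptions chosen for readability.

import numpy as np

def bump(x, x_found, radius=1.0):
    # Smooth bump function: equals 1 at a previously found optimum x_found
    # and decays to 0 at distance `radius` (illustrative choice, not HGDL's exact form).
    r2 = np.sum((x - x_found)**2) / radius**2
    return np.exp(1.0 - 1.0 / (1.0 - r2)) if r2 < 1.0 else 0.0

def deflated_gradient(gradient, x, found_optima, radius=1.0):
    # Divide the gradient by (1 - bump) for every known optimum, so the
    # deflated gradient no longer vanishes there and local optimizers
    # are pushed toward optima that have not been found yet.
    g = gradient(x)
    for x_found in found_optima:
        # small offset avoids division by zero exactly at x_found
        g = g / (1.0 - bump(x, x_found, radius) + 1e-12)
    return g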

Usage

The following demonstrates a simple usage of the HGDL API.

import numpy as np
from hgdl.hgdl import HGDL as hgdl
from hgdl.support_functions import *
import dask.distributed as distributed

bounds = np.array([[-500,500],[-500,500]])
#dask_client = distributed.Client("10.0.0.184:8786")
a = hgdl(schwefel, schwefel_gradient, bounds,
        global_optimizer = "genetic",
        local_optimizer = "dNewton", #put in local optimzers from scipy.optimize.minimize
        number_of_optima = 30000,
        num_epochs = 100)

x0 = np.random.uniform(low = bounds[:, 0], high = bounds[:,1],size = (20,2))
a.optimize(x0 = x0)

###the thread is now released, but the work continues in the background

a.get_latest() ##returns the current result whenever queried

a.kill_client() ##stops the execution and returns the result
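The example above uses the Schwefel test function shipped with hgdl.support_functions. To optimize your own function, pass an objective callable and its gradient callable; the sketch below assumes the objective returns a scalar and the gradient returns an array of the same shape as the input (matching the bundled Schwefel helpers), and uses a simple quadratic as a hypothetical objective.

import numpy as np
from hgdl.hgdl import HGDL as hgdl

# Hypothetical user-defined objective and its gradient.
# Assumed signatures: func(x) -> float, grad(x) -> array of shape x.shape.
def my_func(x):
    return np.sum((x - 1.0)**2)

def my_grad(x):
    return 2.0 * (x - 1.0)

bounds = np.array([[-5.0, 5.0], [-5.0, 5.0]])
opt = hgdl(my_func, my_grad, bounds,
        local_optimizer = "L-BFGS-B", #any method accepted by scipy.optimize.minimize
        num_epochs = 10)

opt.optimize(x0 = np.random.uniform(low = bounds[:, 0], high = bounds[:, 1], size = (10, 2)))

result = opt.get_latest() ##query intermediate results while the run continues in the background
opt.kill_client() ##stop the background work and collect the final result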

Credits

Main Developers: Marcus Noack ([email protected]) and David Perryman. Several people across the DOE national labs have provided insights that shaped the code in its current form; see AUTHORS for details. HGDL is based on the HGDN algorithm by Noack and Funke.