Code Documentation

‘base’ Module

This module implements general containers for typical optimization methods.

class base.Base(obj, **kwargs)[source]

This is the base class that serves all the iterative optimization methods. An object derived from Base requires the following parameters:

Parameters:
  • obj – (MANDATORY) a real-valued function to be minimized
  • x0 – an initial guess of the optimal point
  • method – the optimization class which implements the iterate and terminate procedures (default: OptimTemplate, which returns the value of the function at the initial point x0)
  • Verbose – (Boolean) if True, prompts messages at every stage of the iteration as well as at termination
  • kwargs – the rest of the parameters that will be passed to method

The object then passes all other given parameters to the method class for further processing. When a termination condition is satisfied, the object fills the solution attribute, an instance of the Solution class, with the results. The given method class can pass arbitrary pieces of information to the solution by modifying its MetaData dictionary.

When an object optim of type Base is initiated, the optimization process can be invoked by calling the object itself like a function:

optim = Base(f, method=QuasiNewton, x0=init_point)
optim()
print(optim.solution)
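
For a concrete end-to-end sketch, assume the modules are importable as base and QuasiNewton and take the Rosenbrock function as the objective; the import paths and the accepted type for x0 are assumptions and may differ in an actual installation:

from base import Base
from QuasiNewton import QuasiNewton

# Rosenbrock function; its global minimum is at (1, 1).
def rosenbrock(x):
    return (1. - x[0]) ** 2 + 100. * (x[1] - x[0] ** 2) ** 2

optim = Base(rosenbrock, method=QuasiNewton, x0=(0., 0.))
optim()                # runs the iteration until a termination condition holds
print(optim.solution)  # a Solution instance; extra details live in its MetaData
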
class base.OptimTemplate(obj, **kwargs)[source]

Provides a template for an iterative optimization method.

Parameters:
  • obj – a real-valued function (the objective function)
  • x0 – an initial guess for a (local) minimum
  • jac – a vector function calculating the gradient of the objective (optional; if not given, the gradient will be numerically approximated; see the sketch below)
  • difftool – an object to calculate the Gradient and Hessian of the objective (optional, default: NumericDiff.Simple)
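
If an analytic gradient is available, it can be supplied through the jac keyword so that no numerical approximation is needed. A minimal sketch, assuming jac accepts a function that maps a point to the gradient vector (the objective here is illustrative):

from math import sin, cos

def f(x):
    return x[0] ** 2 + sin(x[1]) ** 2

def jac_f(x):
    # analytic gradient of f
    return [2. * x[0], 2. * sin(x[1]) * cos(x[1])]

optim = Base(f, method=QuasiNewton, x0=(1., 1.), jac=jac_f)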
class base.Solution[source]

A class to keep outcome and details of the optimization run.

‘QuasiNewton’ Module

This module contains implementations of variations of unconstrained optimization methods known as Quasi-Newton methods. A Quasi-Newton method is an iterative algorithm that approximates a local minimum of the objective function. Starting with a given initial point x0, each iteration consists of three major subprocedures:

  • Finding a descent direction (via DescentDirection class)
  • Finding the length of descent (using LineSearch class)
  • Checking the termination condition (through the Termination class).

The right strategy for attacking an optimization problem should be determined in advance; otherwise, a default setup is used to solve the problem.
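
Schematically, the loop behind every method of this module looks like the following pseudocode (written in Python syntax for clarity; it is not library code):

x = x0
while not terminate(x):            # Termination
    p = descent_direction(x)       # DescentDirection: steepest descent, BFGS, ...
    alpha = line_search(x, p)      # LineSearch: e.g. backtracking with Armijo
    x = x + alpha * p              # move to the next iterate (vector arithmetic)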

class QuasiNewton.Barrier(QNRef, **kwargs)[source]

Implementation of some barrier functions to be used for constrained optimization problems. Three barrier functions are implemented:

  • Carrol: the standard (inverse) barrier function, defined by \(-\frac{1}{g_i(x)}\)
  • Logarithmic: the logarithmic barrier function, defined by \(-\log(g_i(x))\)
  • Expn: the exponential barrier function, defined by \(e^{-g_i(x)+\epsilon}\)

The only barrier function implemented for equality constraints is the typical Courant function, which is simply \(h_j^2(x)\).

The default barrier function is Carrol and the default penalty factor is \(10^{-5}\). To specify the barrier function and the penalty factor, initiate the optimizer with the keyword br_func, which accepts one of the above three values, and penalty, which must be a positive real number.
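
For instance, a constrained problem could be initiated as below; the sign convention for the inequality constraints is an assumption, and the keywords ineq, br_func and penalty are those of the QuasiNewton class documented later in this module:

# minimize f over the unit disk, using the logarithmic barrier
optim = Base(f, method=QuasiNewton, x0=(0., 0.),
             ineq=[lambda x: 1. - x[0] ** 2 - x[1] ** 2],
             br_func='Logarithmic', penalty=1.e-3)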

class QuasiNewton.DescentDirection(QNRef, **kwargs)[source]

Implements various descent direction methods for Quasi-Newton methods. The descent method can be determined at initiation using the dd_method parameter. The following values are acceptable:

  • ‘Gradient’: (default) The steepest descent direction.
  • ‘Newton’: Newton Conjugate Gradient method.
  • ‘FletcherReeves’: Fletcher-Reeves method.
  • ‘PolakRibiere’: Polak-Ribiere method.
  • ‘HestenesStiefel’: Hestenes-Stiefel method.
  • ‘DaiYuan’: Dai-Yuan method.
  • ‘DFP’: Davidon-Fletcher-Powell formula.
  • ‘BFGS’: Broyden-Fletcher-Goldfarb-Shanno algorithm.
  • ‘Broyden’: Broyden’s method.
  • ‘SR1’: Symmetric rank-one method.

To calculate derivatives, the QuasiNewton class uses the object provided as the value of the difftool variable at initiation.
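
For example, to run BFGS with a custom differentiation tool (assuming, as described in the ‘NumericDiff’ module below, that Simple accepts the Infinitesimal keyword):

from NumericDiff import Simple

optim = Base(f, method=QuasiNewton, x0=init_point,
             dd_method='BFGS', difftool=Simple(Infinitesimal=1.e-8))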

BFGS()[source]
Returns: the descent direction determined by the Broyden-Fletcher-Goldfarb-Shanno algorithm
Broyden()[source]
Returns: the descent direction determined by Broyden’s method
DFP()[source]
Returns: the descent direction determined by the Davidon-Fletcher-Powell formula
DaiYuan()[source]
Returns: the descent direction determined by the Dai-Yuan method
FletcherReeves()[source]
Returns: the descent direction determined by the Fletcher-Reeves method
Gradient()[source]
Returns: the gradient at the current point
HestenesStiefel()[source]
Returns: the descent direction determined by the Hestenes-Stiefel method
Newton()[source]
Returns: the descent direction determined by the Newton Conjugate Gradient method
PolakRibiere()[source]
Returns: the descent direction determined by the Polak-Ribiere method
SR1()[source]
Returns: the descent direction determined by the symmetric rank-one method
class QuasiNewton.LineSearch(QNref, **kwargs)[source]

This class provides the step length at each iteration. The value of ls_method at initiation determines whether to use the ‘BarzilaiBorwein’ method or a variation of ‘Backtrack’ (default). The class accepts a control parameter tau and max_lngth at initiation; tau must be a positive real number less than 1. The variation of the backtrack is then determined by the value of ls_bt_method, which can be selected from the following (see the sketch after this list):

  • ‘Armijo’: indicates the Armijo condition; the parameter c1 can be modified at initiation as well.
  • ‘Wolfe’: indicates the Wolfe condition; the parameters c1 and c2 can be modified at initiation.
  • ‘StrongWolfe’: indicates the strong Wolfe condition; the parameters c1 and c2 can be modified at initiation.
  • ‘Goldstein’: indicates the Goldstein condition; the parameter c1 can be modified at initiation.
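
A sketch of a typical initiation; the keyword names tau, c1 and c2 follow the description above:

# backtracking line search terminated by the strong Wolfe condition
optim = Base(f, method=QuasiNewton, x0=init_point,
             ls_method='Backtrack', ls_bt_method='StrongWolfe',
             tau=.75, c1=1.e-4, c2=.9)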
Armijo(alpha, ft_x, tx)[source]

Implementation of the Armijo condition.

Parameters:
  • alpha – current candidate for step length
  • ft_x – value of the objective at the candidate point
  • tx – the candidate point
Returns: True or False

Backtrack()[source]

A generic implementation of the backtrack line search.

Returns: the step length
BarzilaiBorwein()[source]

Implementation of the Barzilai-Borwein method.

Returns: the step length
Goldstein(alpha, ft_x, tx)[source]

Implementation of the Goldstein condition.

Parameters:
  • alpha – current candidate for step length
  • ft_x – value of the objective at the candidate point
  • tx – the candidate point
Returns: True or False

StrongWolfe(alpha, ft_x, tx)[source]

Implementation of the strong Wolfe condition.

Parameters:
  • alpha – current candidate for step length
  • ft_x – value of the objective at the candidate point
  • tx – the candidate point
Returns: True or False

Wolfe(alpha, ft_x, tx)[source]

Implementation of the Wolfe condition.

Parameters:
  • alpha – current candidate for step length
  • ft_x – value of the objective at the candidate point
  • tx – the candidate point
Returns: True or False

class QuasiNewton.QuasiNewton(obj, **kwargs)[source]
Parameters:
  • obj – the objective function
  • ineq – (optional) list of inequality constraints, default: []
  • eq – (optional) list of equality constraints, default: []
  • ls_method – (optional) the line search strategy, default: Backtrack
  • ls_bt_method – (optional) the backtrack termination condition, default: Armijo
  • dd_method – (optional) the descent direction method, default: Gradient
  • t_method – (optional) the termination condition, default: Cauchy
  • br_func – (optional) the barrier function family, default: Carrol
  • penalty – (optional) the penalty factor for the barrier function, default: 1.e5
  • max_iter – (optional) the maximum number of iterations, default: 100

This class hosts a family of first and second order iterative methods to solve an unconstrained optimization problem. The general scheme consists of the following steps:

  • Given the point \(x\), find a suitable descent direction \(p\).
  • Find a suitable length \(\alpha\) for the direction \(p\) such that \(x+\alpha p\) results in an appropriate decrease in values of the objective.
  • Update \(x\) to \(x+\alpha p\) and repeat the above steps until a termination condition is satisfied.

The initial value for x can be set at initiation by passing x0=init_point to the Base instance. There are various methods to determine the descent direction p at each step, and the DescentDirection class implements a variety of them. To choose one of these methods, pass its known name at initiation simply by setting dd_method=’method name’; this parameter will be passed to the DescentDirection class (see its documentation). Likewise, various options are available to determine a suitable value for \(\alpha\); the LineSearch class is responsible for this computation, and the parameters ls_method and ls_bt_method can be set at initiation to determine the details of the line search. The termination condition can also vary, and the desired condition can be selected by setting t_method at initiation, which will be passed to the Termination class. Each of these classes may accept other parameters that can be set at initiation; see the corresponding documentation to find out about them.

iterate()[source]

This method updates the iterate method of the OptimTemplate by customizing the descent direction method as well as finding the descent step length. These methods can be determined by the user.

Returns: None
terminate()[source]

This method updates the terminate method of the OptimTemplate, which is given by the user.

Returns: True or False
class QuasiNewton.Termination(QNRef, **kwargs)[source]

Implements various termination criteria for the Quasi-Newton loop. A particular termination method can be selected at initiation of the Base object by setting t_method to the name of the method as a string. The following termination criteria are implemented:

  • ‘Cauchy’: Checks whether the changes in the values of the objective function are significant enough or not.
  • ‘ZeroGradient’: Checks if the gradient of the objective is close to zero or not.

The value of the tolerated error is a property of OptimTemplate and hence can be modified as desired.
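
For example, to stop when the gradient, rather than the change in the objective values, becomes small:

optim = Base(f, method=QuasiNewton, x0=init_point, t_method='ZeroGradient')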

Cauchy()[source]

Checks if the values of the objective function form a Cauchy sequence or not.

Returns: True or False
Cauchy_x()[source]

Checks if the sequence of points form a Cauchy sequence or not.

Returns: True or False
ZeroGradient()[source]

Checks if the gradient vector is small enough or not.

Returns: True or False

‘NumericDiff’ Module

This module provides a very basic way to numerically approximate partial derivatives, the gradient, and the Hessian of a function.

class NumericDiff.Simple(**kwargs)[source]

A simple class to calculate partial derivatives of a given function. Passing a value for Infinitesimal forces the calculations to be done according to the infinitesimal value provided by the user; otherwise, the default value (1.e-7) is used.

Diff(f, i=0)[source]
Parameters:
  • f – a real-valued function
  • i – the index of the variable of differentiation
Returns: the partial derivative of \(f\) with respect to the \(i^{th}\) variable, as a function.

Gradient(f)[source]
Parameters: f – a real-valued function
Returns: a vector function that returns the gradient vector of \(f\) at each point
Hessian(f)[source]
Parameters: f – a real-valued function
Returns: the Hessian matrix of \(f\) at each point
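
A small sketch of the intended usage; the calling convention of the returned gradient function is an assumption based on the descriptions above:

from NumericDiff import Simple

difftool = Simple()           # uses the default Infinitesimal, 1.e-7
g = difftool.Gradient(lambda x: x[0] ** 2 + 3. * x[1])
print(g((1., 2.)))            # expected to be approximately (2., 3.)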

‘excpt’ Module

This module provides descriptive exceptions for the other modules.

exception excpt.DiffEror(*args)[source]

Errors that may occur while calculating derivatives of functions.

exception excpt.DirectionError(*args)[source]

Handles the errors caused during the computation of a descent direction.

exception excpt.Error(*args)[source]

Generic errors that may occur in the course of a run.

exception excpt.MaxIterations(*args)[source]

Errors caused by reaching the preset maximum number of iterations.

exception excpt.Undeclared(*args)[source]

Raised when an undeclared function is used.

exception excpt.ValueRange(*args)[source]

Errors resulting from computations over unauthorized regions.
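
Assuming these exceptions propagate to the caller rather than being handled internally, a run could be guarded as follows:

from excpt import MaxIterations

try:
    optim()
except MaxIterations:
    print('Stopped after reaching the preset maximum number of iterations.')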