package scipy

val get_py : string -> Py.Object.t

Get an attribute of this module as a Py.Object.t. This is useful to pass a Python function to another function.
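
A minimal usage sketch (hedged: the module path `Scipy.Optimize.Linesearch` and the need to call `Py.initialize` first are assumptions about the bindings, not stated on this page):

    (* Fetch the underlying Python function as a raw Py.Object.t so it
       can be handed to other Python-level code. *)
    let () =
      Py.initialize ();
      let ls = Scipy.Optimize.Linesearch.get_py "line_search_wolfe2" in
      (* [ls] is an ordinary Python object; e.g. print its repr. *)
      print_endline (Py.Object.to_string ls)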

module LineSearchWarning : sig ... end

val line_search_BFGS : ?args:Py.Object.t -> ?c1:Py.Object.t -> ?alpha0:Py.Object.t -> f:Py.Object.t -> xk:Py.Object.t -> pk:Py.Object.t -> gfk:Py.Object.t -> old_fval:Py.Object.t -> unit -> Py.Object.t

Compatibility wrapper for `line_search_armijo`.

val line_search_armijo : ?args:Py.Object.t -> ?c1:float -> ?alpha0:[ `F of float | `I of int | `Bool of bool | `S of string ] -> f:Py.Object.t -> xk:[> `Ndarray ] Np.Obj.t -> pk:[> `Ndarray ] Np.Obj.t -> gfk:[> `Ndarray ] Np.Obj.t -> old_fval:float -> unit -> Py.Object.t

Minimize the function ``f(xk + alpha*pk)`` over ``alpha``.

Parameters
----------
f : callable
    Function to be minimized.
xk : array_like
    Current point.
pk : array_like
    Search direction.
gfk : array_like
    Gradient of `f` at point `xk`.
old_fval : float
    Value of `f` at point `xk`.
args : tuple, optional
    Optional arguments.
c1 : float, optional
    Value to control stopping criterion.
alpha0 : scalar, optional
    Value of `alpha` at start of the optimization.

Returns
-------
alpha
f_count
f_val_at_alpha

Notes
-----
Uses the interpolation algorithm (Armijo backtracking) as suggested by Wright and Nocedal in 'Numerical Optimization', 1999, pp. 56-57.
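
For reference, the criterion that `c1` controls is the textbook Armijo sufficient-decrease condition (a standard definition, not quoted from this page):

``f(xk + alpha*pk) <= f(xk) + c1*alpha*dot(gfk, pk)``

Backtracking shrinks ``alpha`` from ``alpha0`` until this inequality holds.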

val line_search_wolfe1 : ?gfk:[> `Ndarray ] Np.Obj.t -> ?old_fval:float -> ?old_old_fval:float -> ?args:Py.Object.t -> ?c1:Py.Object.t -> ?c2:Py.Object.t -> ?amax:Py.Object.t -> ?amin:Py.Object.t -> ?xtol:Py.Object.t -> f:Py.Object.t -> fprime:Py.Object.t -> xk:[> `Ndarray ] Np.Obj.t -> pk:[> `Ndarray ] Np.Obj.t -> unit -> [ `ArrayLike | `Ndarray | `Object ] Np.Obj.t

As `scalar_search_wolfe1`, but performs the line search along direction `pk`.

Parameters
----------
f : callable
    Function `f(x)`.
fprime : callable
    Gradient of `f`.
xk : array_like
    Current point.
pk : array_like
    Search direction.

gfk : array_like, optional
    Gradient of `f` at point `xk`.
old_fval : float, optional
    Value of `f` at point `xk`.
old_old_fval : float, optional
    Value of `f` at the point preceding `xk`.

The rest of the parameters are the same as for `scalar_search_wolfe1`.

Returns
-------
stp : float
    Step size.
f_count : int
    Number of function evaluations.
g_count : int
    Number of gradient evaluations.
fval : float
    New function value ``f(xk + stp*pk)``.
old_fval : float
    Function value at the starting point ``f(xk)``.
gval : array
    Gradient of `f` at the final point.

val line_search_wolfe2 : ?gfk:[> `Ndarray ] Np.Obj.t -> ?old_fval:float -> ?old_old_fval:float -> ?args:Py.Object.t -> ?c1:float -> ?c2:float -> ?amax:float -> ?extra_condition:Py.Object.t -> ?maxiter:int -> f:Py.Object.t -> myfprime:Py.Object.t -> xk:[> `Ndarray ] Np.Obj.t -> pk:[> `Ndarray ] Np.Obj.t -> unit -> float option * int * int * float option * float * float option

Find alpha that satisfies strong Wolfe conditions.

Parameters
----------
f : callable f(x,*args)
    Objective function.
myfprime : callable f'(x,*args)
    Objective function gradient.
xk : ndarray
    Starting point.
pk : ndarray
    Search direction.
gfk : ndarray, optional
    Gradient value for x=xk (xk being the current parameter estimate). Will be recomputed if omitted.
old_fval : float, optional
    Function value for x=xk. Will be recomputed if omitted.
old_old_fval : float, optional
    Function value for the point preceding x=xk.
args : tuple, optional
    Additional arguments passed to objective function.
c1 : float, optional
    Parameter for Armijo condition rule.
c2 : float, optional
    Parameter for curvature condition rule.
amax : float, optional
    Maximum step size.
extra_condition : callable, optional
    A callable of the form ``extra_condition(alpha, x, f, g)`` returning a boolean. Arguments are the proposed step ``alpha`` and the corresponding ``x``, ``f`` and ``g`` values. The line search accepts the value of ``alpha`` only if this callable returns ``True``. If the callable returns ``False`` for the step length, the algorithm will continue with new iterates. The callable is only called for iterates satisfying the strong Wolfe conditions.
maxiter : int, optional
    Maximum number of iterations to perform.

Returns
-------
alpha : float or None
    Alpha for which ``x_new = x0 + alpha * pk``, or None if the line search algorithm did not converge.
fc : int
    Number of function evaluations made.
gc : int
    Number of gradient evaluations made.
new_fval : float or None
    New function value ``f(x_new)=f(x0+alpha*pk)``, or None if the line search algorithm did not converge.
old_fval : float
    Old function value ``f(x0)``.
new_slope : float or None
    The local slope along the search direction at the new value ``<myfprime(x_new), pk>``, or None if the line search algorithm did not converge.

Notes
-----
Uses the line search algorithm to enforce strong Wolfe conditions. See Wright and Nocedal, 'Numerical Optimization', 1999, pp. 59-61.
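
For reference, the strong Wolfe conditions enforced here are the textbook pair, with ``0 < c1 < c2 < 1`` (standard definitions, not quoted from this page):

``f(xk + alpha*pk) <= f(xk) + c1*alpha*dot(gfk, pk)``   (sufficient decrease)

``|dot(myfprime(xk + alpha*pk), pk)| <= c2*|dot(gfk, pk)|``   (curvature)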

Examples
--------
>>> import numpy as np
>>> from scipy.optimize import line_search

An objective function and its gradient are defined.

>>> def obj_func(x):
...     return (x[0])**2 + (x[1])**2
>>> def obj_grad(x):
...     return [2*x[0], 2*x[1]]

We can find alpha that satisfies strong Wolfe conditions.

>>> start_point = np.array([1.8, 1.7])
>>> search_gradient = np.array([-1.0, -1.0])
>>> line_search(obj_func, obj_grad, start_point, search_gradient)
(1.0, 2, 1, 1.1300000000000001, 6.13, [1.6, 1.4])
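
A hedged OCaml rendering of the same example, driving the underlying Python function through `get_py` and pyml. The module path `Scipy.Optimize.Linesearch` is an assumption about how this module is reached; the `Py.*` calls are ordinary pyml API:

    let () =
      Py.initialize ();
      let np = Py.import "numpy" in
      let vec l =
        Py.Module.get_function np "array"
          [| Py.List.of_list_map Py.Float.of_float l |]
      in
      (* f(x) = x0^2 + x1^2 and its gradient, exposed as Python callables. *)
      let obj_func =
        Py.Callable.of_function (fun args ->
            let x = Py.Sequence.to_array_map Py.Float.to_float args.(0) in
            Py.Float.of_float ((x.(0) ** 2.) +. (x.(1) ** 2.)))
      in
      let obj_grad =
        Py.Callable.of_function (fun args ->
            let x = Py.Sequence.to_array_map Py.Float.to_float args.(0) in
            Py.List.of_list_map Py.Float.of_float [ 2. *. x.(0); 2. *. x.(1) ])
      in
      let line_search = Scipy.Optimize.Linesearch.get_py "line_search_wolfe2" in
      let result =
        Py.Callable.to_function line_search
          [| obj_func; obj_grad; vec [ 1.8; 1.7 ]; vec [ -1.0; -1.0 ] |]
      in
      print_endline (Py.Object.to_string result)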

val scalar_search_armijo : ?c1:Py.Object.t -> ?alpha0:Py.Object.t -> ?amin:Py.Object.t -> phi:Py.Object.t -> phi0:Py.Object.t -> derphi0:Py.Object.t -> unit -> Py.Object.t

Minimize over alpha, the function ``phi(alpha)``.

Uses the interpolation algorithm (Armijo backtracking) as suggested by Wright and Nocedal in 'Numerical Optimization', 1999, pp. 56-57.

The search direction is assumed to be a descent direction; only step sizes ``alpha > 0`` are considered.

Returns
-------
alpha
phi1
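
For intuition, a self-contained pure-OCaml sketch of the Armijo backtracking idea. It halves the step instead of using the interpolation the SciPy routine applies, so it is a conceptual illustration rather than a port:

    (* Shrink alpha until phi(alpha) <= phi(0) + c1 * alpha * phi'(0).
       Simple halving stands in for SciPy's polynomial interpolation. *)
    let scalar_search_armijo_sketch ?(c1 = 1e-4) ?(alpha0 = 1.0)
        ?(amin = 1e-10) ~phi ~phi0 ~derphi0 () =
      let rec loop alpha =
        if alpha < amin then None          (* step became too small: fail *)
        else
          let phi_a = phi alpha in
          if phi_a <= phi0 +. (c1 *. alpha *. derphi0) then Some (alpha, phi_a)
          else loop (alpha /. 2.)          (* backtrack *)
      in
      loop alpha0

    (* Example: phi(a) = (1.8 - a)^2, so phi(0) = 3.24 and phi'(0) = -3.6;
       the first trial step alpha0 = 1.0 already satisfies the test. *)
    let () =
      let phi a = (1.8 -. a) ** 2. in
      match scalar_search_armijo_sketch ~phi ~phi0:(phi 0.) ~derphi0:(-3.6) () with
      | Some (alpha, phi1) -> Printf.printf "alpha = %g, phi = %g\n" alpha phi1
      | None -> print_endline "no acceptable step"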

val scalar_search_wolfe1 : ?phi0:float -> ?old_phi0:float -> ?derphi0:float -> ?c1:float -> ?c2:float -> ?amax:Py.Object.t -> ?amin:Py.Object.t -> ?xtol:float -> phi:Py.Object.t -> derphi:Py.Object.t -> unit -> float * float * float

Scalar function search for alpha that satisfies strong Wolfe conditions.

The search direction is assumed to be a descent direction; only step sizes ``alpha > 0`` are considered.

Parameters
----------
phi : callable phi(alpha)
    Function at point `alpha`.
derphi : callable phi'(alpha)
    Objective function derivative. Returns a scalar.
phi0 : float, optional
    Value of phi at 0.
old_phi0 : float, optional
    Value of phi at the previous point.
derphi0 : float, optional
    Value of derphi at 0.
c1 : float, optional
    Parameter for Armijo condition rule.
c2 : float, optional
    Parameter for curvature condition rule.
amax, amin : float, optional
    Maximum and minimum step size.
xtol : float, optional
    Relative tolerance for an acceptable step.

Returns
-------
alpha : float
    Step size, or None if no suitable step was found.
phi : float
    Value of `phi` at the new point `alpha`.
phi0 : float
    Value of `phi` at `alpha=0`.

Notes
-----
Uses routine DCSRCH from MINPACK.

val scalar_search_wolfe2 : ?phi0:float -> ?old_phi0:float -> ?derphi0:float -> ?c1:float -> ?c2:float -> ?amax:float -> ?extra_condition:Py.Object.t -> ?maxiter:int -> phi:Py.Object.t -> derphi:Py.Object.t -> unit -> float option * float * float * float option

Find alpha that satisfies strong Wolfe conditions.

The search direction is assumed to be a descent direction; only step sizes ``alpha > 0`` are considered.

Parameters
----------
phi : callable phi(alpha)
    Objective scalar function.
derphi : callable phi'(alpha)
    Objective function derivative. Returns a scalar.
phi0 : float, optional
    Value of phi at 0.
old_phi0 : float, optional
    Value of phi at the previous point.
derphi0 : float, optional
    Value of derphi at 0.
c1 : float, optional
    Parameter for Armijo condition rule.
c2 : float, optional
    Parameter for curvature condition rule.
amax : float, optional
    Maximum step size.
extra_condition : callable, optional
    A callable of the form ``extra_condition(alpha, phi_value)`` returning a boolean. The line search accepts the value of ``alpha`` only if this callable returns ``True``. If the callable returns ``False`` for the step length, the algorithm will continue with new iterates. The callable is only called for iterates satisfying the strong Wolfe conditions.
maxiter : int, optional
    Maximum number of iterations to perform.

Returns
-------
alpha_star : float or None
    Best alpha, or None if the line search algorithm did not converge.
phi_star : float
    phi at alpha_star.
phi0 : float
    phi at 0.
derphi_star : float or None
    derphi at alpha_star, or None if the line search algorithm did not converge.

Notes
-----
Uses the line search algorithm to enforce strong Wolfe conditions. See Wright and Nocedal, 'Numerical Optimization', 1999, pp. 59-61.
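
A hedged sketch of calling this binding: `phi` and `derphi` are `Py.Object.t`, so OCaml closures must first be wrapped as Python callables (pyml's `Py.Callable.of_function`). The module path `Scipy.Optimize.Linesearch` and the toy objective ``phi(a) = (1.8 - a)^2`` are assumptions for illustration:

    let () =
      Py.initialize ();
      let wrap f =
        Py.Callable.of_function (fun args ->
            Py.Float.of_float (f (Py.Float.to_float args.(0))))
      in
      let phi = wrap (fun a -> (1.8 -. a) ** 2.) in
      let derphi = wrap (fun a -> -2. *. (1.8 -. a)) in
      match Scipy.Optimize.Linesearch.scalar_search_wolfe2 ~phi ~derphi () with
      | Some alpha, phi_star, _phi0, _ ->
          Printf.printf "alpha = %g, phi(alpha) = %g\n" alpha phi_star
      | None, _, _, _ -> print_endline "line search did not converge"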

val warn : ?category:Py.Object.t -> ?stacklevel:Py.Object.t -> ?source:Py.Object.t -> message:Py.Object.t -> unit -> Py.Object.t

Issue a warning, or maybe ignore it or raise an exception.
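
A minimal hedged sketch (module path assumed, as above) of forwarding a message to Python's warnings machinery through this binding:

    let () =
      Py.initialize ();
      ignore
        (Scipy.Optimize.Linesearch.warn
           ~message:(Py.String.of_string "line search did not converge") ())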
