package resource_cache


Wrap a resource that does not natively support a has_close_started operation in a simple record to add such tracking.

Parameters

Signature

module Status : Status.S with type Key.t = R.Key.t
type t
val init : config:Config.t -> log_error:(string -> unit) -> R.Common_args.t -> t
val status : t -> Status.t
val config : t -> Config.t
val with_ : ?open_timeout:Core.Time_ns.Span.t -> ?give_up:unit Async_kernel.Deferred.t -> t -> R.Key.t -> f:(R.t -> 'a Async_kernel.Deferred.t) -> 'a Async_kernel.Deferred.Or_error.t

with_ t key ~f calls f resource where resource is either:

1) An existing cached resource that was opened with key' such that R.Key.equal key key'
2) A newly opened resource created by R.open_ key common_args, respecting the limits of t.config

Returns an error if:

  • the cache is closed
  • R.open_ returned an error
  • no resource is obtained before give_up is determined

If f raises, the exception is not caught, but the resource will be closed and the Cache will remain in a working state (no resources are lost).
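To make the checkout discipline concrete, here is a hedged sketch of calling with_. It assumes Cache is an instantiation of this functor for a hypothetical My_resource module (with a Key submodule and a query function), and that t was created with Cache.init; none of those names come from this signature.

```ocaml
open Async

(* Sketch only: [Cache], [My_resource], and [query] are hypothetical. *)
let query_with_cached_resource (t : Cache.t) (key : My_resource.Key.t) =
  (* Reuses an open resource for [key] if one is cached; otherwise opens a
     new one within the limits of [Cache.config t].  Gives up if no
     resource is obtained within five seconds. *)
  Cache.with_ t key
    ~give_up:(Clock_ns.after (Time_ns.Span.of_sec 5.))
    ~f:(fun resource ->
      (* [resource] is checked out only for the duration of [f]. *)
      My_resource.query resource "ping")
```

The resource is returned to the cache (or closed, if f raised) as soon as the deferred returned by f is determined.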

val with_' : ?open_timeout:Core.Time_ns.Span.t -> ?give_up:unit Async_kernel.Deferred.t -> t -> R.Key.t -> f:(R.t -> 'a Async_kernel.Deferred.t) -> [ `Ok of 'a | `Gave_up_waiting_for_resource | `Error_opening_resource of Core.Error.t | `Cache_is_closed ] Async_kernel.Deferred.t

Like with_, but classifies the different errors as a polymorphic variant instead of collapsing them into a single Or_error.t.
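A sketch of dispatching on the classified result (again assuming hypothetical Cache, My_resource, and t values, and ppx_let for match%bind):

```ocaml
open Async

(* Sketch only: handles each failure case of [with_'] separately. *)
let classify_result t key =
  match%bind Cache.with_' t key ~f:(fun r -> My_resource.query r "ping") with
  | `Ok response -> return (Some response)
  | `Gave_up_waiting_for_resource | `Cache_is_closed -> return None
  | `Error_opening_resource err ->
    (* Only open failures carry an [Error.t] payload. *)
    Log.Global.error_s [%sexp (err : Error.t)];
    return None
```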

val with_any : ?open_timeout:Core.Time_ns.Span.t -> ?give_up:unit Async_kernel.Deferred.t -> ?load_balance:bool -> t -> R.Key.t list -> f:(R.t -> 'a Async_kernel.Deferred.t) -> (R.Key.t * 'a) Async_kernel.Deferred.Or_error.t

Like with_ and with_' except f is run on the first matching available resource (or the first resource that has availability to be opened).

Preference is given towards resources earlier in the list, unless ~load_balance:true has been specified, in which case preference is given to ensure that load is approximately balanced. The key with the least number of open connections will be favored.
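For example, a hedged sketch of spreading work across several replica keys with ~load_balance:true (Cache, My_resource, and the replica keys are hypothetical):

```ocaml
open Async

(* Sketch only: with [~load_balance:true] the key with the fewest open
   resources is preferred over list order. *)
let query_any_replica t (replicas : My_resource.Key.t list) =
  match%map
    Cache.with_any t replicas ~load_balance:true
      ~f:(fun resource -> My_resource.query resource "health_check")
  with
  | Ok (key, response) ->
    (* [key] identifies which replica actually served the request. *)
    Ok (key, response)
  | Error _ as err -> err
```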

val with_any' : ?open_timeout:Core.Time_ns.Span.t -> ?give_up:unit Async_kernel.Deferred.t -> ?load_balance:bool -> t -> R.Key.t list -> f:(R.t -> 'a Async_kernel.Deferred.t) -> [ `Ok of R.Key.t * 'a | `Error_opening_resource of R.Key.t * Core.Error.t | `Gave_up_waiting_for_resource | `Cache_is_closed ] Async_kernel.Deferred.t
val with_any_loop : ?open_timeout:Core.Time_ns.Span.t -> ?give_up:unit Async_kernel.Deferred.t -> ?load_balance:bool -> t -> R.Key.t list -> f:(R.t -> 'a Async_kernel.Deferred.t) -> [ `Ok of R.Key.t * 'a | `Error_opening_all_resources of (R.Key.t * Core.Error.t) list | `Gave_up_waiting_for_resource | `Cache_is_closed ] Async_kernel.Deferred.t

Tries with_any' in a loop, removing keys whose open attempts returned an error, until it receives an `Ok, or until it has failed to open a resource for every key in the list.
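The `Error_opening_all_resources case carries one error per key, which is useful for reporting why every candidate failed. A sketch, with the same hypothetical Cache and My_resource as above:

```ocaml
open Async

(* Sketch only: keep retrying across [keys], dropping keys that fail to
   open, until one succeeds or all have failed. *)
let robust_query t keys =
  match%map
    Cache.with_any_loop t keys ~f:(fun r -> My_resource.query r "ping")
  with
  | `Ok (key, response) -> Ok (key, response)
  | `Error_opening_all_resources errors ->
    (* One [(key, error)] pair per key that failed to open. *)
    Or_error.error_s [%sexp (errors : (My_resource.Key.t * Error.t) list)]
  | `Gave_up_waiting_for_resource ->
    Or_error.error_string "gave up waiting for a resource"
  | `Cache_is_closed -> Or_error.error_string "cache is closed"
```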

val close_started : t -> bool
val close_finished : t -> unit Async_kernel.Deferred.t
val close_and_flush : t -> unit Async_kernel.Deferred.t

Close all currently open resources and prevent the creation of new ones. All subsequent calls to with_ immediately fail with `Cache_is_closed. Any jobs that are waiting for a resource will return with `Cache_is_closed. The returned Deferred.t is determined when all jobs have finished running and all resources have been closed.
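Putting the lifecycle functions together, a hedged shutdown sketch (Cache and t hypothetical, as above):

```ocaml
open Async

(* Sketch only: graceful shutdown of a cache. *)
let shutdown (t : Cache.t) =
  let finished = Cache.close_and_flush t in
  (* From this point, [Cache.close_started t] is [true] and new [with_]
     calls fail immediately with [`Cache_is_closed]. *)
  assert (Cache.close_started t);
  (* Determined once all in-flight jobs have finished and every resource
     has been closed; equivalent to waiting on [Cache.close_finished t]. *)
  finished
```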
