leaspy.samplers

Submodules

Attributes

INDIVIDUAL_SAMPLERS

POPULATION_SAMPLERS

Classes

AbstractIndividualSampler

Abstract class for samplers of individual random variables.

AbstractPopulationSampler

Abstract class for samplers of population random variables.

AbstractSampler

Abstract sampler class.

IndividualGibbsSampler

Gibbs sampler for individual variables.

PopulationFastGibbsSampler

Fast Gibbs sampler for population variables.

PopulationGibbsSampler

Gibbs sampler for population variables.

PopulationMetropolisHastingsSampler

Metropolis-Hastings sampler for population variables.

Functions

sampler_factory(sampler, variable_type, **kwargs)

Factory for Samplers.

Package Contents

class AbstractIndividualSampler(name, shape, *, n_patients, acceptation_history_length=25)[source]

Bases: AbstractSampler

Abstract class for samplers of individual random variables.

Parameters:
name : str

The name of the random variable to sample.

shape : tuple of int

The shape of the random variable to sample.

n_patients : int

Number of patients.

acceptation_history_length : int > 0 (default 25)

Depth (number of iterations) of the history kept for computing the mean acceptation rate. (It is the same for population and individual variables.)

Attributes:
name : str

Name of the variable.

shape : tuple of int

Shape of the variable.

n_patients : int

Number of patients.

acceptation_history_length : int

Depth (number of iterations) of the history kept for computing the mean acceptation rate. (It is the same for population and individual variables.)

acceptation_history : torch.Tensor

History of binary acceptations used to compute the mean acceptation rate of the sampler in the MCMC-SAEM algorithm. It keeps the last acceptation_history_length steps.

Parameters:
  • name (str)

  • shape (Tuple[int, ...])

  • n_patients (int)

  • acceptation_history_length (int)

n_patients
class AbstractPopulationSampler(name, shape, *, acceptation_history_length=25, mask=None)[source]

Bases: AbstractSampler

Abstract class for samplers of population random variables.

Parameters:
name : str

The name of the random variable to sample.

shape : tuple of int

The shape of the random variable to sample.

acceptation_history_length : int > 0 (default 25)

Depth (number of iterations) of the history kept for computing the mean acceptation rate. (It is the same for population and individual variables.)

mask : torch.Tensor, optional

A binary (0/1) tensor indicating which elements to sample. Elements with value 1 (True) are included in the sampling; elements with value 0 (False) are excluded.

Attributes:
name : str

Name of the variable.

shape : tuple of int

Shape of the variable.

acceptation_history_length : int

Depth (number of iterations) of the history kept for computing the mean acceptation rate. (It is the same for population and individual variables.)

acceptation_history : torch.Tensor

History of binary acceptations used to compute the mean acceptation rate of the sampler in the MCMC-SAEM algorithm. It keeps the last acceptation_history_length steps.

mask : torch.Tensor of bool, optional

A binary (0/1) tensor indicating which elements to sample. Elements with value 1 (True) are included in the sampling; elements with value 0 (False) are excluded (a small illustration follows this class entry).

Parameters:
  • name (str)

  • shape (Tuple[int, ...])

  • acceptation_history_length (int)

  • mask (Optional[Tensor])

mask = None
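
To make the mask semantics concrete, here is a small illustrative tensor. The shape and values are made up, and how a concrete sampler receives the mask is not shown here; this is only a sketch of the 0/1 convention described above.

    import torch

    # Hypothetical (2, 3)-shaped population variable: only the first two
    # columns should be sampled. 1/True elements are included in sampling,
    # 0/False elements are excluded.
    mask = torch.tensor(
        [[1, 1, 0],
         [1, 1, 0]],
        dtype=torch.bool,
    )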
class AbstractSampler(name, shape, *, acceptation_history_length=25)[source]

Bases: abc.ABC

Abstract sampler class.

Parameters:
name : str

The name of the random variable to sample.

shape : tuple of int

The shape of the random variable to sample.

acceptation_history_length : int > 0 (default 25)

Depth (number of iterations) of the history kept for computing the mean acceptation rate. (It is the same for population and individual variables.)

Attributes:
name : str

Name of the variable.

shape : tuple of int

Shape of the variable.

acceptation_history_length : int

Depth (number of iterations) of the history kept for computing the mean acceptation rate. (It is the same for population and individual variables by default.)

acceptation_history : torch.Tensor

History of binary acceptations used to compute the mean acceptation rate of the sampler in the MCMC-SAEM algorithm. It keeps the last acceptation_history_length steps.

Raises:
LeaspyModelInputError
Parameters:
  • name (str)

  • shape (Tuple[int, ...])

  • acceptation_history_length (int)

name
shape
acceptation_history_length = 25
acceptation_history
property ndim: int

Number of dimensions.

Return type:

int

abstract property shape_acceptation: Tuple[int, ...]

Return the shape of the acceptation tensor for a single iteration.

Returns:
tuple of int

The shape of the acceptation history.

Return type:

Tuple[int, ...]

abstractmethod sample(state, *, temperature_inv)[source]

Apply a sampling step.

<!> This method modifies the internal state in-place, caching all intermediate values needed for efficiency.

Parameters:
state : State

Object containing values for all model variables, including latent variables.

temperature_inv : float > 0

Inverse of the temperature used in tempered MCMC-SAEM.

Return type:

None

INDIVIDUAL_SAMPLERS
POPULATION_SAMPLERS
sampler_factory(sampler, variable_type, **kwargs)[source]

Factory for Samplers.

Parameters:
sampler : AbstractSampler or str

If an instance of a subclass of AbstractSampler is given, that instance is returned. If a string is given, a new instance of the appropriate class is returned (initialized with the optional keyword arguments kwargs).

variable_type : VariableType

The type of random variable that the sampler is supposed to sample.

**kwargs

Optional parameters for initializing the requested sampler (not used if the input is already an instance of a subclass of AbstractSampler).

Returns:
AbstractSampler

The desired sampler.

Raises:
LeaspyAlgoInputError

If the sampler provided is not supported.

Parameters:

sampler (SamplerFactoryInput)

Return type:

AbstractSampler
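
A minimal sketch of the dispatch behaviour described above. The function below is a hypothetical illustration, not Leaspy's actual implementation: it raises a plain ValueError instead of LeaspyAlgoInputError, and `registry` stands for a name-to-class mapping (possibly something like the INDIVIDUAL_SAMPLERS / POPULATION_SAMPLERS attributes listed above, if those are indeed such mappings).

    from leaspy.samplers import AbstractSampler

    def sampler_factory_sketch(sampler, registry, **kwargs):
        # An AbstractSampler instance is returned unchanged.
        if isinstance(sampler, AbstractSampler):
            return sampler
        # A string is looked up in a name-to-class registry and the matching
        # class is instantiated with the remaining keyword arguments.
        if isinstance(sampler, str):
            try:
                klass = registry[sampler]
            except KeyError:
                raise ValueError(f"Sampler {sampler!r} is not supported")
            return klass(**kwargs)
        raise ValueError(f"Unsupported sampler specification: {sampler!r}")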

class IndividualGibbsSampler(name, shape, *, n_patients, scale, random_order_dimension=True, mean_acceptation_rate_target_bounds=(0.2, 0.4), adaptive_std_factor=0.1, **base_sampler_kws)[source]

Bases: GibbsSamplerMixin, leaspy.samplers.base.AbstractIndividualSampler

Gibbs sampler for individual variables.

Individual variables are handled using a grouped Gibbs sampler. Currently, this is the only sampler implemented for individual variables.

Parameters:
name : str

The name of the random variable to sample.

shape : tuple of int

The shape of the random variable to sample.

n_patients : int

Number of patients.

scale : float > 0 or torch.FloatTensor > 0

An approximate scale for the variable, used to scale the initial adaptive std-dev of the sampler. An extra factor will be applied on top of this scale.

Note: If a torch.Tensor is passed, its shape must be compatible with the shape of the variable.

random_order_dimension : bool (default True)

Whether to randomize the order of dimensions during the sampling loop. This only applies to population variables, since grouped sampling is used for individual variables. The reference article gives a rationale for enabling this flag.

mean_acceptation_rate_target_bounds : tuple [lower_bound: float, upper_bound: float] with 0 < lower_bound < upper_bound < 1

Bounds on the mean acceptation rate. If the rate falls outside this range, the adaptive std-dev is updated so as to maintain the target acceptation rate (e.g. ~30%).

adaptive_std_factor : float in (0, 1)

The factor by which the sampler’s std-dev is adapted (see the sketch after this parameter list):

  • If the acceptation rate is too low → decrease the std-dev by a factor 1 - adaptive_std_factor

  • If it is too high → increase it by a factor 1 + adaptive_std_factor

**base_sampler_kws

Additional keyword arguments passed to the AbstractSampler init method. For example, you may pass acceptation_history_length.
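
A minimal sketch of the adaptation rule described for adaptive_std_factor above. This is illustrative only, not Leaspy's exact update; the function name and default values are made up.

    def adapt_std(std, mean_acceptation_rate, bounds=(0.2, 0.4), factor=0.1):
        lower, upper = bounds
        if mean_acceptation_rate < lower:
            # Too few proposals accepted: take smaller steps.
            return std * (1 - factor)
        if mean_acceptation_rate > upper:
            # Too many proposals accepted: take larger steps.
            return std * (1 + factor)
        # Within the target bounds: keep the current std-dev.
        return std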

STD_SCALE_FACTOR = 0.5
validate_scale(scale)[source]

Validate the provided scale.

Parameters:
scale : float or torch.Tensor

The scale to be validated.

Returns:
torch.Tensor

Valid scale.

Parameters:

scale (Union[float, Tensor])

Return type:

Tensor
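
An illustrative call, assuming a sampler has been built; the variable name, shape, number of patients and scale values below are made up.

    import torch
    from leaspy.samplers import IndividualGibbsSampler

    sampler = IndividualGibbsSampler("tau", (1,), n_patients=100, scale=1.0)
    s1 = sampler.validate_scale(0.5)                # float input -> torch.Tensor
    s2 = sampler.validate_scale(torch.tensor(0.5))  # tensor input -> torch.Tensor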

property shape_adapted_std: tuple

Shape of the adaptive variance.

Return type:

tuple

sample(state, *, temperature_inv)[source]

Apply a grouped Metropolis-Hastings (MH) sampling step for individual variables.

For each individual variable:
  • Compute the current attachment and regularity.

  • Propose a new value for the variable.

  • Recompute the attachment and regularity.

  • Accept or reject the proposal using the MH criterion, based on the inverse temperature.

Parameters:
state : State

Object containing values for all model variables, including latent variables.

temperature_inv : float > 0

Inverse of the temperature used in tempered MCMC-SAEM.

Return type:

None

Notes

Population variables remain fixed during this step since we are updating individual variables. However:

  • In fit, population variables might have just been updated, and their effects not yet propagated to the model. In this case, model computations should use the current MCMC state (default behavior).

  • In personalization (mode or mean posterior), population variables are not updated. Therefore, the MCMC state should not be used.
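
The accept/reject step can be summarised schematically as below. This is illustrative only, not Leaspy's exact implementation; the function name and the way attachment and regularity are combined into a single loss are assumptions.

    import torch

    def accept_proposal(old_loss, new_loss, temperature_inv):
        # old_loss / new_loss: tensors holding the summed attachment and
        # regularity terms before and after the proposal; temperature_inv:
        # inverse temperature used in tempered MCMC-SAEM.
        alpha = torch.exp(-(new_loss - old_loss) * temperature_inv).clamp(max=1.0)
        # Accept with probability alpha (element-wise for grouped proposals).
        return torch.rand_like(alpha) < alpha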

class PopulationFastGibbsSampler(name, shape, *, scale, random_order_dimension=True, mean_acceptation_rate_target_bounds=(0.2, 0.4), adaptive_std_factor=0.1, **base_sampler_kws)[source]

Bases: AbstractPopulationGibbsSampler

Fast Gibbs sampler for population variables.

Note

Sampling is batched along all dimensions except the first one, which speeds up the sampling process for 2D parameters.

Parameters:
name : str

The name of the random variable to sample.

shape : tuple of int

The shape of the random variable to sample.

scale : float > 0 or torch.FloatTensor > 0

An approximate scale for the variable, used to scale the initial adaptive std-dev of the sampler. An extra factor will be applied on top of this scale.

Note: If a torch.Tensor is passed, its shape must be compatible with the shape of the variable.

random_order_dimension : bool (default True)

Whether to randomize the order of dimensions during the sampling loop. This only applies to population variables, since grouped sampling is used for individual variables. The reference article gives a rationale for enabling this flag.

mean_acceptation_rate_target_bounds : tuple [lower_bound: float, upper_bound: float] with 0 < lower_bound < upper_bound < 1

Bounds on the mean acceptation rate. If the rate falls outside this range, the adaptive std-dev is updated so as to maintain the target acceptation rate (e.g. ~30%).

adaptive_std_factor : float in (0, 1)

The factor by which the sampler’s std-dev is adapted:

  • If the acceptation rate is too low → decrease the std-dev by a factor 1 - adaptive_std_factor

  • If it is too high → increase it by a factor 1 + adaptive_std_factor

**base_sampler_kws

Additional keyword arguments passed to the AbstractSampler init method. For example, you may pass acceptation_history_length.

property shape_adapted_std: tuple

Shape of the adaptive variance.

Return type:

tuple

class PopulationGibbsSampler(name, shape, *, scale, random_order_dimension=True, mean_acceptation_rate_target_bounds=(0.2, 0.4), adaptive_std_factor=0.1, **base_sampler_kws)[source]

Bases: AbstractPopulationGibbsSampler

Gibbs sampler for population variables.

The sampling is done iteratively for all coordinate values.

Parameters:
name : str

The name of the random variable to sample.

shape : tuple of int

The shape of the random variable to sample.

scale : float > 0 or torch.FloatTensor > 0

An approximate scale for the variable, used to scale the initial adaptive std-dev of the sampler. An extra factor will be applied on top of this scale.

Note: If a torch.Tensor is passed, its shape must be compatible with the shape of the variable.

random_order_dimension : bool (default True)

Whether to randomize the order of dimensions during the sampling loop. This only applies to population variables, since grouped sampling is used for individual variables. The reference article gives a rationale for enabling this flag.

mean_acceptation_rate_target_bounds : tuple [lower_bound: float, upper_bound: float] with 0 < lower_bound < upper_bound < 1

Bounds on the mean acceptation rate. If the rate falls outside this range, the adaptive std-dev is updated so as to maintain the target acceptation rate (e.g. ~30%).

adaptive_std_factor : float in (0, 1)

The factor by which the sampler’s std-dev is adapted:

  • If the acceptation rate is too low → decrease the std-dev by a factor 1 - adaptive_std_factor

  • If it is too high → increase it by a factor 1 + adaptive_std_factor

**base_sampler_kws

Additional keyword arguments passed to the AbstractSampler init method. For example, you may pass acceptation_history_length.

property shape_adapted_std: tuple

Shape of the adaptive variance.

Return type:

tuple

class PopulationMetropolisHastingsSampler(name, shape, *, scale, random_order_dimension=True, mean_acceptation_rate_target_bounds=(0.2, 0.4), adaptive_std_factor=0.1, **base_sampler_kws)[source]

Bases: AbstractPopulationGibbsSampler

Metropolis-Hastings sampler for population variables.

Note

Sampling is performed for all values at once. This accelerates the sampling process but usually requires more iterations.
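
The three population samplers differ mainly in proposal granularity. The sketch below merely counts proposals per sweep for a 2D variable, summarising the notes above; the shape values are made up and this is not actual sampler code.

    d0, d1 = 3, 4  # hypothetical 2D population variable of shape (d0, d1)

    # Gibbs: one coordinate per proposal -> d0 * d1 proposals per sweep.
    gibbs_proposals = d0 * d1

    # Fast Gibbs: batched along every dimension except the first
    # -> one proposal per row, i.e. d0 proposals per sweep.
    fast_gibbs_proposals = d0

    # Metropolis-Hastings: all values proposed at once -> a single proposal.
    mh_proposals = 1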

Parameters:
name : str

The name of the random variable to sample.

shape : tuple of int

The shape of the random variable to sample.

scale : float > 0 or torch.FloatTensor > 0

An approximate scale for the variable, used to scale the initial adaptive std-dev of the sampler. An extra factor will be applied on top of this scale.

Note: If a torch.Tensor is passed, its shape must be compatible with the shape of the variable.

random_order_dimension : bool (default True)

Whether to randomize the order of dimensions during the sampling loop. This only applies to population variables, since grouped sampling is used for individual variables. The reference article gives a rationale for enabling this flag.

mean_acceptation_rate_target_bounds : tuple [lower_bound: float, upper_bound: float] with 0 < lower_bound < upper_bound < 1

Bounds on the mean acceptation rate. If the rate falls outside this range, the adaptive std-dev is updated so as to maintain the target acceptation rate (e.g. ~30%).

adaptive_std_factor : float in (0, 1)

The factor by which the sampler’s std-dev is adapted:

  • If the acceptation rate is too low → decrease the std-dev by a factor 1 - adaptive_std_factor

  • If it is too high → increase it by a factor 1 + adaptive_std_factor

**base_sampler_kws

Additional keyword arguments passed to the AbstractSampler init method. For example, you may pass acceptation_history_length.

property shape_adapted_std: tuple

Shape of the adaptive variance.

Return type:

tuple
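
Finally, an illustrative construction of a population and an individual sampler, based only on the signatures documented above; the variable names, shapes, scales and number of patients are made-up values.

    from leaspy.samplers import IndividualGibbsSampler, PopulationGibbsSampler

    pop_sampler = PopulationGibbsSampler(
        "g",                   # name of the population variable
        (4,),                  # its shape
        scale=0.1,             # approximate scale for the initial adaptive std-dev
        adaptive_std_factor=0.1,
    )

    ind_sampler = IndividualGibbsSampler(
        "xi",
        (1,),
        n_patients=200,
        scale=0.5,
        acceptation_history_length=50,  # forwarded via **base_sampler_kws
    )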