leaspy.samplers
===============

.. py:module:: leaspy.samplers


Submodules
----------

.. toctree::
   :maxdepth: 1

   /reference/api/leaspy/samplers/base/index
   /reference/api/leaspy/samplers/factory/index
   /reference/api/leaspy/samplers/gibbs/index


Attributes
----------

.. autoapisummary::

   leaspy.samplers.INDIVIDUAL_SAMPLERS
   leaspy.samplers.POPULATION_SAMPLERS


Classes
-------

.. autoapisummary::

   leaspy.samplers.AbstractIndividualSampler
   leaspy.samplers.AbstractPopulationSampler
   leaspy.samplers.AbstractSampler
   leaspy.samplers.IndividualGibbsSampler
   leaspy.samplers.PopulationFastGibbsSampler
   leaspy.samplers.PopulationGibbsSampler
   leaspy.samplers.PopulationMetropolisHastingsSampler


Functions
---------

.. autoapisummary::

   leaspy.samplers.sampler_factory


Package Contents
----------------

.. py:class:: AbstractIndividualSampler(name, shape, *, n_patients, acceptation_history_length = 25)

   Bases: :py:obj:`AbstractSampler`

   Abstract class for samplers of individual random variables.

   :Parameters:

       **name** : :obj:`str`
           The name of the random variable to sample.

       **shape** : :obj:`tuple` of :obj:`int`
           The shape of the random variable to sample.

       **n_patients** : :obj:`int`
           Number of patients.

       **acceptation_history_length** : :obj:`int` > 0 (default 25)
           Depth (i.e. number of iterations) of the history kept to compute the mean acceptation rate.
           (It is the same for population and individual variables.)

   :Attributes:

       **name** : :obj:`str`
           Name of the variable.

       **shape** : :obj:`tuple` of :obj:`int`
           Shape of the variable.

       **n_patients** : :obj:`int`
           Number of patients.

       **acceptation_history_length** : :obj:`int`
           Depth (i.e. number of iterations) of the history kept to compute the mean acceptation rate.
           (It is the same for population and individual variables.)

       **acceptation_history** : :class:`torch.Tensor`
           History of binary acceptations used to compute the mean acceptation rate of the sampler in the MCMC-SAEM algorithm.
           It keeps the history of the last `acceptation_history_length` steps.

   ..
       !! processed by numpydoc !!

   .. py:attribute:: n_patients


.. py:class:: AbstractPopulationSampler(name, shape, *, acceptation_history_length = 25, mask = None)

   Bases: :py:obj:`AbstractSampler`

   Abstract class for samplers of population random variables.

   :Parameters:

       **name** : :obj:`str`
           The name of the random variable to sample.

       **shape** : :obj:`tuple` of :obj:`int`
           The shape of the random variable to sample.

       **acceptation_history_length** : :obj:`int` > 0 (default 25)
           Depth (i.e. number of iterations) of the history kept to compute the mean acceptation rate.
           (It is the same for population and individual variables.)

       **mask** : :class:`torch.Tensor`, optional
           A binary (0/1) tensor indicating which elements to sample.
           Elements with value 1 (True) are included in the sampling; elements with value 0 (False) are excluded.

   :Attributes:

       **name** : :obj:`str`
           Name of the variable.

       **shape** : :obj:`tuple` of :obj:`int`
           Shape of the variable.

       **acceptation_history_length** : :obj:`int`
           Depth (i.e. number of iterations) of the history kept to compute the mean acceptation rate.
           (It is the same for population and individual variables.)

       **acceptation_history** : :class:`torch.Tensor`
           History of binary acceptations used to compute the mean acceptation rate of the sampler in the MCMC-SAEM algorithm.
           It keeps the history of the last `acceptation_history_length` steps.

       **mask** : :class:`torch.Tensor` of :obj:`bool`, optional
           A binary (0/1) tensor indicating which elements to sample.
           Elements with value 1 (True) are included in the sampling; elements with value 0 (False) are excluded.

   ..
       !! processed by numpydoc !!

   .. py:attribute:: mask
      :value: None
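To build intuition for the ``mask`` semantics above, here is a minimal, standalone sketch (not leaspy's internal code) of how a binary mask restricts a proposal to a subset of the coordinates of a population variable; the tensor shapes and names are purely illustrative.

.. code-block:: python

   import torch

   # Hypothetical population variable (illustrative shape).
   value = torch.zeros((2, 3))

   # Binary (0/1) mask: entries equal to 1 (True) are sampled, entries equal to 0 (False) are frozen.
   mask = torch.tensor([[1, 1, 0],
                        [1, 0, 0]], dtype=torch.bool)

   # A Metropolis-style proposal is only applied to the masked coordinates.
   proposal = 0.1 * torch.randn_like(value)
   new_value = torch.where(mask, value + proposal, value)

   # Frozen entries are left untouched.
   assert torch.equal(new_value[~mask], value[~mask])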
.. py:class:: AbstractSampler(name, shape, *, acceptation_history_length = 25)

   Bases: :py:obj:`abc.ABC`

   Abstract sampler class.

   :Parameters:

       **name** : :obj:`str`
           The name of the random variable to sample.

       **shape** : :obj:`tuple` of :obj:`int`
           The shape of the random variable to sample.

       **acceptation_history_length** : :obj:`int` > 0 (default 25)
           Depth (i.e. number of iterations) of the history kept to compute the mean acceptation rate.
           (It is the same for population and individual variables.)

   :Attributes:

       **name** : :obj:`str`
           Name of the variable.

       **shape** : :obj:`tuple` of :obj:`int`
           Shape of the variable.

       **acceptation_history_length** : :obj:`int`
           Depth (i.e. number of iterations) of the history kept to compute the mean acceptation rate.
           (It is the same for population and individual variables by default.)

       **acceptation_history** : :class:`torch.Tensor`
           History of binary acceptations used to compute the mean acceptation rate of the sampler in the MCMC-SAEM algorithm.
           It keeps the history of the last `acceptation_history_length` steps.

   :Raises:

       :exc:`.LeaspyModelInputError`
           ..

   ..
       !! processed by numpydoc !!

   .. py:attribute:: name

   .. py:attribute:: shape

   .. py:attribute:: acceptation_history_length
      :value: 25

   .. py:attribute:: acceptation_history

   .. py:property:: ndim
      :type: int

      Number of dimensions.

      ..
          !! processed by numpydoc !!

   .. py:property:: shape_acceptation
      :type: Tuple[int, Ellipsis]
      :abstractmethod:

      Return the shape of the acceptation tensor for a single iteration.

      :Returns:

          :obj:`tuple` of :obj:`int`
              The shape of the acceptation history.

      ..
          !! processed by numpydoc !!

   .. py:method:: sample(state, *, temperature_inv)
      :abstractmethod:

      Apply a sampling step.

      It modifies the internal state in-place, caching all intermediate values needed to be efficient.

      :Parameters:

          **state** : :class:`.State`
              Object containing values for all model variables, including latent variables.

          **temperature_inv** : :obj:`float` > 0
              Inverse of the temperature used in tempered MCMC-SAEM.

      ..
          !! processed by numpydoc !!


.. py:data:: INDIVIDUAL_SAMPLERS

.. py:data:: POPULATION_SAMPLERS

.. py:function:: sampler_factory(sampler, variable_type, **kwargs)

   Factory for samplers.

   :Parameters:

       **sampler** : :class:`.AbstractSampler` or :obj:`str`
           If an instance of a subclass of :class:`.AbstractSampler`, the instance is returned.
           If a string, a new instance of the appropriate class is returned (with optional parameters `kwargs`).

       **variable_type** : :class:`.VariableType`
           The type of random variable that the sampler is supposed to sample.

       **\*\*kwargs**
           Optional parameters for initializing the requested sampler
           (not used if the input is already an instance of a subclass of :class:`.AbstractSampler`).

   :Returns:

       :class:`.AbstractSampler`
           The desired sampler.

   :Raises:

       :exc:`.LeaspyAlgoInputError`
           If the sampler provided is not supported.

   ..
       !! processed by numpydoc !!
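The documented behavior of :func:`sampler_factory` amounts to a registry dispatch over the ``INDIVIDUAL_SAMPLERS`` / ``POPULATION_SAMPLERS`` module attributes. The sketch below illustrates that pattern; it is not leaspy's actual implementation, and it assumes (for illustration only) that the registries are dict-like and keyed by lowercase sampler names, and that ``variable_type`` can be compared to the string ``"individual"``.

.. code-block:: python

   from leaspy.samplers import (
       INDIVIDUAL_SAMPLERS,
       POPULATION_SAMPLERS,
       AbstractSampler,
   )

   def toy_sampler_factory(sampler, variable_type, **kwargs) -> AbstractSampler:
       """Illustrative re-implementation of the documented dispatch rules."""
       if isinstance(sampler, AbstractSampler):
           # Already-built samplers are returned as-is; kwargs are ignored.
           return sampler
       # Pick the registry matching the variable type (keys assumed lowercase here).
       registry = INDIVIDUAL_SAMPLERS if variable_type == "individual" else POPULATION_SAMPLERS
       # Unknown names would raise LeaspyAlgoInputError in the real factory.
       sampler_cls = registry[str(sampler).lower()]
       return sampler_cls(**kwargs)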
.. py:class:: IndividualGibbsSampler(name, shape, *, n_patients, scale, random_order_dimension = True, mean_acceptation_rate_target_bounds = (0.2, 0.4), adaptive_std_factor = 0.1, **base_sampler_kws)

   Bases: :py:obj:`GibbsSamplerMixin`, :py:obj:`leaspy.samplers.base.AbstractIndividualSampler`

   Gibbs sampler for individual variables.

   Individual variables are handled using a grouped Gibbs sampler.
   Currently, this is the only sampler implemented for individual variables.

   :Parameters:

       **name** : :obj:`str`
           The name of the random variable to sample.

       **shape** : :obj:`tuple` of :obj:`int`
           The shape of the random variable to sample.

       **n_patients** : :obj:`int`
           Number of patients.

       **scale** : :obj:`float` > 0 or :class:`torch.FloatTensor` > 0
           An approximate scale for the variable.
           It will be used to scale the initial adaptive std-dev used in the sampler.
           An extra factor will be applied on top of this scale:

           * 1% for population parameters (:attr:`.AbstractPopulationGibbsSampler.STD_SCALE_FACTOR`)
           * 50% for individual parameters (:attr:`.IndividualGibbsSampler.STD_SCALE_FACTOR`)

           Note: If a :class:`torch.Tensor` is passed, its shape must be compatible with the shape of the variable.

       **random_order_dimension** : :obj:`bool` (default True)
           Whether to randomize the order of dimensions during the sampling loop.
           This is applied only to population variables, as group sampling is used for individual variables.
           `Article `_ gives a reason why we should activate this flag.

       **mean_acceptation_rate_target_bounds** : :obj:`tuple` [lower_bound: :obj:`float`, upper_bound: :obj:`float`] with 0 < lower_bound < upper_bound < 1
           Bounds on the mean acceptation rate.
           If the rate is outside this range, the adaptive std-dev will be updated to maintain the target acceptance rate (e.g. ~30%).

       **adaptive_std_factor** : :obj:`float` in (0, 1)
           The factor by which the sampler's std-dev is adapted:

           * if the acceptance rate is too low, the std-dev is multiplied by ``1 - factor``
           * if it is too high, the std-dev is multiplied by ``1 + factor``

       **\*\*base_sampler_kws**
           Additional keyword arguments passed to :class:`~leaspy.samplers.AbstractSampler` init method.
           For example, you may pass `acceptation_history_length`.

   ..
       !! processed by numpydoc !!

   .. py:attribute:: STD_SCALE_FACTOR
      :value: 0.5

   .. py:method:: validate_scale(scale)

      Validate the provided scale.

      :Parameters:

          **scale** : :obj:`float` or :class:`torch.Tensor`
              The scale to be validated.

      :Returns:

          :class:`torch.Tensor`
              The validated scale.

      ..
          !! processed by numpydoc !!

   .. py:property:: shape_adapted_std
      :type: tuple

      Shape of the adaptive std-dev.

      ..
          !! processed by numpydoc !!

   .. py:method:: sample(state, *, temperature_inv)

      Apply a grouped Metropolis-Hastings (MH) sampling step for individual variables.

      For each individual variable:

      - Compute the current attachment and regularity.
      - Propose a new value for the variable.
      - Recompute the attachment and regularity.
      - Accept or reject the proposal using the MH criterion, based on the inverse temperature.

      :Parameters:

          **state** : :class:`.State`
              Object containing values for all model variables, including latent variables.

          **temperature_inv** : :obj:`float` > 0
              The inverse temperature to use.

      .. rubric:: Notes

      Population variables remain fixed during this step since we are updating individual variables. However:

      - In fit, population variables might have just been updated, and their effects not yet propagated to the model.
        In this case, model computations should use the current MCMC state (default behavior).
      - In personalization (mode or mean posterior), population variables are not updated.
        Therefore, the MCMC state should not be used.

      ..
          !! processed by numpydoc !!
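To make the adaptation of the proposal std-dev concrete, here is a minimal, standalone sketch (not leaspy's internal code) of the documented rule: track the binary accept/reject outcomes over the last ``acceptation_history_length`` steps and rescale the std-dev whenever the mean rate leaves the target bounds. All names and the FIFO bookkeeping are illustrative.

.. code-block:: python

   import torch

   class ToyAdaptiveStd:
       """Illustrative multiplicative std-dev adaptation (not leaspy's internal code)."""

       def __init__(self, std=0.5, history_length=25, bounds=(0.2, 0.4), factor=0.1):
           self.std = torch.tensor(std)
           self.bounds = bounds            # mean_acceptation_rate_target_bounds
           self.factor = factor            # adaptive_std_factor
           # Rolling history of binary acceptations (1 = proposal accepted).
           self.history = torch.zeros(history_length)

       def update(self, accepted: bool) -> None:
           # FIFO: drop the oldest outcome, append the newest one.
           self.history = torch.cat([self.history[1:], torch.tensor([float(accepted)])])
           mean_rate = self.history.mean()
           if mean_rate < self.bounds[0]:
               self.std = self.std * (1 - self.factor)   # too few acceptations: shrink proposals
           elif mean_rate > self.bounds[1]:
               self.std = self.std * (1 + self.factor)   # too many acceptations: widen proposals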
.. py:class:: PopulationFastGibbsSampler(name, shape, *, scale, random_order_dimension = True, mean_acceptation_rate_target_bounds = (0.2, 0.4), adaptive_std_factor = 0.1, **base_sampler_kws)

   Bases: :py:obj:`AbstractPopulationGibbsSampler`

   Fast Gibbs sampler for population variables.

   .. note::
      The sampling is batched along all dimensions except the first one.
      This accelerates the sampling process for 2D parameters.

   :Parameters:

       **name** : :obj:`str`
           The name of the random variable to sample.

       **shape** : :obj:`tuple` of :obj:`int`
           The shape of the random variable to sample.

       **scale** : :obj:`float` > 0 or :class:`torch.FloatTensor` > 0
           An approximate scale for the variable.
           It will be used to scale the initial adaptive std-dev used in the sampler.
           An extra factor will be applied on top of this scale:

           * 1% for population parameters (:attr:`.AbstractPopulationGibbsSampler.STD_SCALE_FACTOR`)
           * 50% for individual parameters (:attr:`.IndividualGibbsSampler.STD_SCALE_FACTOR`)

           Note: If a :class:`torch.Tensor` is passed, its shape must be compatible with the shape of the variable.

       **random_order_dimension** : :obj:`bool` (default True)
           Whether to randomize the order of dimensions during the sampling loop.
           This is applied only to population variables, as group sampling is used for individual variables.
           `Article `_ gives a reason why we should activate this flag.

       **mean_acceptation_rate_target_bounds** : :obj:`tuple` [lower_bound: :obj:`float`, upper_bound: :obj:`float`] with 0 < lower_bound < upper_bound < 1
           Bounds on the mean acceptation rate.
           If the rate is outside this range, the adaptive std-dev will be updated to maintain the target acceptance rate (e.g. ~30%).

       **adaptive_std_factor** : :obj:`float` in (0, 1)
           The factor by which the sampler's std-dev is adapted:

           * if the acceptance rate is too low, the std-dev is multiplied by ``1 - factor``
           * if it is too high, the std-dev is multiplied by ``1 + factor``

       **\*\*base_sampler_kws**
           Additional keyword arguments passed to :class:`~leaspy.samplers.AbstractSampler` init method.
           For example, you may pass `acceptation_history_length`.

   ..
       !! processed by numpydoc !!

   .. py:property:: shape_adapted_std
      :type: tuple

      Shape of the adaptive std-dev.

      ..
          !! processed by numpydoc !!
.. py:class:: PopulationGibbsSampler(name, shape, *, scale, random_order_dimension = True, mean_acceptation_rate_target_bounds = (0.2, 0.4), adaptive_std_factor = 0.1, **base_sampler_kws)

   Bases: :py:obj:`AbstractPopulationGibbsSampler`

   Gibbs sampler for population variables.

   The sampling is done iteratively for all coordinate values.

   :Parameters:

       **name** : :obj:`str`
           The name of the random variable to sample.

       **shape** : :obj:`tuple` of :obj:`int`
           The shape of the random variable to sample.

       **scale** : :obj:`float` > 0 or :class:`torch.FloatTensor` > 0
           An approximate scale for the variable.
           It will be used to scale the initial adaptive std-dev used in the sampler.
           An extra factor will be applied on top of this scale:

           * 1% for population parameters (:attr:`.AbstractPopulationGibbsSampler.STD_SCALE_FACTOR`)
           * 50% for individual parameters (:attr:`.IndividualGibbsSampler.STD_SCALE_FACTOR`)

           Note: If a :class:`torch.Tensor` is passed, its shape must be compatible with the shape of the variable.

       **random_order_dimension** : :obj:`bool` (default True)
           Whether to randomize the order of dimensions during the sampling loop.
           This is applied only to population variables, as group sampling is used for individual variables.
           `Article `_ gives a reason why we should activate this flag.

       **mean_acceptation_rate_target_bounds** : :obj:`tuple` [lower_bound: :obj:`float`, upper_bound: :obj:`float`] with 0 < lower_bound < upper_bound < 1
           Bounds on the mean acceptation rate.
           If the rate is outside this range, the adaptive std-dev will be updated to maintain the target acceptance rate (e.g. ~30%).

       **adaptive_std_factor** : :obj:`float` in (0, 1)
           The factor by which the sampler's std-dev is adapted:

           * if the acceptance rate is too low, the std-dev is multiplied by ``1 - factor``
           * if it is too high, the std-dev is multiplied by ``1 + factor``

       **\*\*base_sampler_kws**
           Additional keyword arguments passed to :class:`~leaspy.samplers.AbstractSampler` init method.
           For example, you may pass `acceptation_history_length`.

   ..
       !! processed by numpydoc !!

   .. py:property:: shape_adapted_std
      :type: tuple

      Shape of the adaptive std-dev.

      ..
          !! processed by numpydoc !!


.. py:class:: PopulationMetropolisHastingsSampler(name, shape, *, scale, random_order_dimension = True, mean_acceptation_rate_target_bounds = (0.2, 0.4), adaptive_std_factor = 0.1, **base_sampler_kws)

   Bases: :py:obj:`AbstractPopulationGibbsSampler`

   Metropolis-Hastings sampler for population variables.

   .. note::
      The sampling is done for all values at once.
      This accelerates the sampling process but usually requires more iterations.

   :Parameters:

       **name** : :obj:`str`
           The name of the random variable to sample.

       **shape** : :obj:`tuple` of :obj:`int`
           The shape of the random variable to sample.

       **scale** : :obj:`float` > 0 or :class:`torch.FloatTensor` > 0
           An approximate scale for the variable.
           It will be used to scale the initial adaptive std-dev used in the sampler.
           An extra factor will be applied on top of this scale:

           * 1% for population parameters (:attr:`.AbstractPopulationGibbsSampler.STD_SCALE_FACTOR`)
           * 50% for individual parameters (:attr:`.IndividualGibbsSampler.STD_SCALE_FACTOR`)

           Note: If a :class:`torch.Tensor` is passed, its shape must be compatible with the shape of the variable.

       **random_order_dimension** : :obj:`bool` (default True)
           Whether to randomize the order of dimensions during the sampling loop.
           This is applied only to population variables, as group sampling is used for individual variables.
           `Article `_ gives a reason why we should activate this flag.

       **mean_acceptation_rate_target_bounds** : :obj:`tuple` [lower_bound: :obj:`float`, upper_bound: :obj:`float`] with 0 < lower_bound < upper_bound < 1
           Bounds on the mean acceptation rate.
           If the rate is outside this range, the adaptive std-dev will be updated to maintain the target acceptance rate (e.g. ~30%).

       **adaptive_std_factor** : :obj:`float` in (0, 1)
           The factor by which the sampler's std-dev is adapted:

           * if the acceptance rate is too low, the std-dev is multiplied by ``1 - factor``
           * if it is too high, the std-dev is multiplied by ``1 + factor``

       **\*\*base_sampler_kws**
           Additional keyword arguments passed to :class:`~leaspy.samplers.AbstractSampler` init method.
           For example, you may pass `acceptation_history_length`.

   ..
       !! processed by numpydoc !!

   .. py:property:: shape_adapted_std
      :type: tuple

      Shape of the adaptive std-dev.

      ..
          !! processed by numpydoc !!
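All of the samplers above rely on the tempered Metropolis-Hastings acceptance step described in :meth:`IndividualGibbsSampler.sample`. The sketch below illustrates that criterion in isolation; it is a generic illustration, not leaspy's exact computation. The hypothetical ``old_loss`` / ``new_loss`` values stand for negative log-posteriors (attachment plus regularity), and applying ``temperature_inv`` to the whole difference is a simplification.

.. code-block:: python

   import torch

   def toy_mh_accept(old_loss: torch.Tensor, new_loss: torch.Tensor, temperature_inv: float) -> bool:
       """Tempered Metropolis-Hastings acceptance on negative log-posteriors (illustrative)."""
       # Lower loss is better; an inverse temperature below 1 flattens the target.
       log_alpha = temperature_inv * (old_loss - new_loss)
       # Accept with probability min(1, exp(log_alpha)).
       return bool(torch.rand(()) < torch.exp(log_alpha).clamp(max=1.0))

   # A proposal that improves the loss is always accepted.
   assert toy_mh_accept(torch.tensor(10.0), torch.tensor(5.0), temperature_inv=1.0)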