leaspy.algo.personalize

Submodules

Classes

PersonalizeAlgorithm

Abstract class for personalization algorithms.

ConstantPredictionAlgorithm

ConstantPredictionAlgorithm is an algorithm that provides constant predictions.

LMEPersonalizeAlgorithm

Personalization algorithm associated with the LMEModel.

McmcPersonalizeAlgorithm

Base class for MCMC-based personalization algorithms.

MeanPosteriorAlgorithm

Sampler-based algorithm that derives each individual parameter as the mean of its posterior distribution, estimated from n_iter samplings.

ModePosteriorAlgorithm

Sampler-based algorithm that derives each individual parameter as the mode (most frequent value) of its posterior distribution, estimated from n_iter samplings.

ScipyMinimizeAlgorithm

Gradient-descent-based algorithm to compute individual parameters, i.e. to personalize a model for a given set of subjects.

Package Contents

class PersonalizeAlgorithm(settings)[source]

Bases: leaspy.algo.base.IterativeAlgorithm[leaspy.algo.base.ModelType, leaspy.algo.base.ReturnType]

Abstract class for personalization algorithms.

Estimation of the individual parameters of a given Data object with a frozen model (already estimated, or loaded from known parameters).

Parameters:
settings : AlgorithmSettings

Settings of the algorithm.

Attributes:
name : str

Algorithm’s name.

seed : int, optional

Algorithm’s seed (default None).

algo_parameters : dict

Algorithm’s parameters.

See also

Leaspy.personalize()
family: AlgorithmType
set_output_manager(output_settings)[source]

Set the output manager.

This is currently not implemented for personalize.

Parameters:

output_settings (OutputsSettings)

Return type:

None

class ConstantPredictionAlgorithm(settings)[source]

Bases: leaspy.algo.personalize.base.PersonalizeAlgorithm[leaspy.models.ConstantModel, leaspy.io.outputs.individual_parameters.IndividualParameters]

ConstantPredictionAlgorithm is an algorithm that provides constant predictions.

It is used with the ConstantModel.

Parameters:
settings : AlgorithmSettings

The settings of the constant prediction algorithm. It supports the following prediction_type values (str):
  • 'last': last value, even if NaN,
  • 'last_known': last non-NaN value,
  • 'max': maximum (= worst) value,
  • 'mean': average of values.

Depending on the feature, the last_known / max value may correspond to different visits. For a given feature, the value will be NaN if and only if all values for this feature are NaN.
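The four prediction rules can be sketched in plain NumPy (an illustration of the documented behavior, not Leaspy's internal code; the function name constant_prediction is hypothetical):

```python
import numpy as np

def constant_prediction(values: np.ndarray, prediction_type: str) -> float:
    """Illustrative re-implementation of the four prediction rules,
    applied to one feature's visit values (NaN = missing)."""
    if np.all(np.isnan(values)):
        return float("nan")                 # NaN iff all values are NaN
    if prediction_type == "last":
        return float(values[-1])            # last value, even if NaN
    if prediction_type == "last_known":
        known = values[~np.isnan(values)]
        return float(known[-1])             # last non-NaN value
    if prediction_type == "max":
        return float(np.nanmax(values))     # maximum (= worst) value
    if prediction_type == "mean":
        return float(np.nanmean(values))    # average of non-NaN values
    raise ValueError(f"Unknown prediction_type: {prediction_type!r}")

# Three visits for one feature, last one missing:
visits = np.array([1.0, 2.5, np.nan])
```

Note that 'last' and 'last_known' differ exactly when the last visit is missing, as in the `visits` array above.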

Raises:
LeaspyAlgoInputError

If any setting for the algorithm is invalid.

name: AlgorithmName
deterministic: bool = True
prediction_type: PredictionType
class LMEPersonalizeAlgorithm(settings)[source]

Bases: leaspy.algo.personalize.base.PersonalizeAlgorithm[leaspy.models.LMEModel, leaspy.io.outputs.individual_parameters.IndividualParameters]

Personalization algorithm associated with the LMEModel.

Parameters:
settings : AlgorithmSettings

Algorithm settings (none yet). Most LME parameters are defined within the LME model and the LME fit algorithm.

Attributes:
name : 'lme_personalize'
name: AlgorithmName
deterministic: bool = True
set_output_manager(output_settings)[source]

Not implemented.

class McmcPersonalizeAlgorithm(settings)[source]

Bases: leaspy.algo.algo_with_annealing.AlgorithmWithAnnealingMixin, leaspy.algo.algo_with_samplers.AlgorithmWithSamplersMixin, leaspy.algo.algo_with_device.AlgorithmWithDeviceMixin, leaspy.algo.personalize.base.PersonalizeAlgorithm[leaspy.models.McmcSaemCompatibleModel, leaspy.io.outputs.individual_parameters.IndividualParameters]

Base class for MCMC-based personalization algorithms.

Individual parameters are derived from values of individual variables of the model.

Parameters:
settings : AlgorithmSettings

Settings of the algorithm.

class MeanPosteriorAlgorithm(settings)[source]

Bases: leaspy.algo.personalize.mcmc.McmcPersonalizeAlgorithm

Sampler-based algorithm that derives each individual parameter as the mean of its posterior distribution, estimated from n_iter samplings.

Parameters:
settings : AlgorithmSettings

Settings of the algorithm.

name: AlgorithmName
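The difference between this mean-posterior strategy and the mode-posterior strategy of the sibling ModePosteriorAlgorithm can be illustrated with NumPy on fake posterior draws (an illustrative sketch only; the mode is approximated here by a histogram, which is not necessarily how Leaspy selects it):

```python
import numpy as np

# Fake posterior samples of one individual parameter, as if produced
# by n_iter = 1000 MCMC iterations (true value around 0.5).
rng = np.random.default_rng(0)
samples = rng.normal(loc=0.5, scale=0.1, size=1000)

# Mean-posterior estimate: average of the samples.
mean_estimate = samples.mean()

# Mode-posterior estimate: center of the most populated histogram bin.
counts, edges = np.histogram(samples, bins=30)
mode_bin = counts.argmax()
mode_estimate = 0.5 * (edges[mode_bin] + edges[mode_bin + 1])
```

For a roughly symmetric posterior such as this one, the two estimates nearly coincide; they diverge for skewed or multimodal posteriors.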
class ModePosteriorAlgorithm(settings)[source]

Bases: leaspy.algo.personalize.mcmc.McmcPersonalizeAlgorithm

Sampler-based algorithm that derives each individual parameter as the mode (most frequent value) of its posterior distribution, estimated from n_iter samplings.

TODO: confidence intervals on individual parameters could be derived from this personalization algorithm.

Parameters:
settings : AlgorithmSettings

Settings of the algorithm.

name: AlgorithmName
regularity_factor: float = 1.0

Weighting of regularity term in the final loss to be minimized.
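How such a weighting typically enters the personalization loss can be sketched as follows (names are illustrative, not Leaspy's API):

```python
def personalization_loss(attachment: float, regularity: float,
                         regularity_factor: float = 1.0) -> float:
    """Hypothetical sketch of a regularity-weighted loss.

    attachment: negative log-likelihood of the data given the
        individual parameters (how well the model fits the subject).
    regularity: negative log-density of the individual parameters
        under their priors (how plausible the parameters are).
    """
    return attachment + regularity_factor * regularity
```

With regularity_factor = 1.0 the loss is a standard maximum-a-posteriori objective; a smaller factor trusts the data more, a larger one pulls individual parameters toward the population priors.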

class ScipyMinimizeAlgorithm(settings)[source]

Bases: leaspy.algo.personalize.base.PersonalizeAlgorithm[leaspy.models.McmcSaemCompatibleModel, leaspy.io.outputs.individual_parameters.IndividualParameters]

Gradient-descent-based algorithm to compute individual parameters, i.e. to personalize a model for a given set of subjects.

Parameters:
settings : AlgorithmSettings

Settings for the algorithm, including the custom_scipy_minimize_params parameter, which contains keyword arguments passed to scipy.optimize.minimize().

Attributes:
scipy_minimize_params : dict

Keyword arguments for scipy.optimize.minimize(), with default values depending on the usage of a jacobian (cf. ScipyMinimize.DEFAULT_SCIPY_MINIMIZE_PARAMS_WITH_JACOBIAN and ScipyMinimize.DEFAULT_SCIPY_MINIMIZE_PARAMS_WITHOUT_JACOBIAN). Customization is possible via the custom_scipy_minimize_params in AlgorithmSettings.

format_convergence_issues : str

A format string for displaying convergence issues, which can use the following variables:
  • patient_id: str

  • optimization_result_pformat: str

  • optimization_result_obj: dict-like

The default format is defined in ScipyMinimize.DEFAULT_FORMAT_CONVERGENCE_ISSUES, but it can be customized via the custom_format_convergence_issues parameter.
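A custom format can be exercised with plain str.format; the string below copies the default shown in DEFAULT_FORMAT_CONVERGENCE_ISSUES, and the result dict is fake data for illustration:

```python
from pprint import pformat

# Format string matching the documented default; a custom one may use
# any of the three variables listed above.
fmt = "<!> {patient_id}:\n{optimization_result_pformat}"

# Fake scipy optimization result, for illustration only.
fake_result = {"success": False, "message": "max iterations reached"}

msg = fmt.format(
    patient_id="subject-042",
    optimization_result_pformat=pformat(fake_result),
)
```

A custom format could instead reference optimization_result_obj directly, e.g. to report only its message field.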

logger : None or callable (str) -> None

The function used to display convergence issues returned by scipy.optimize.minimize(). By default, convergence issues are printed only if the BFGS optimization method is not used. This can be customized by setting the logger attribute in AlgorithmSettings.

name: AlgorithmName
DEFAULT_SCIPY_MINIMIZE_PARAMS_WITH_JACOBIAN
DEFAULT_SCIPY_MINIMIZE_PARAMS_WITHOUT_JACOBIAN
DEFAULT_FORMAT_CONVERGENCE_ISSUES = Multiline-String
"""<!> {patient_id}:
{optimization_result_pformat}"""
regularity_factor: float = 1.0
scipy_minimize_params
format_convergence_issues
logger
obj_no_jac(x, state, scaling)[source]

Objective loss function to minimize in order to get patient’s individual parameters.

Parameters:
x : numpy.ndarray

Individual standardized parameters. At initialization, x is full of zeros (mode of priors, scaled by std-dev).

state : State

The cloned model state that is dedicated to the current individual. In particular, individual data variables for the current individual are already loaded into it.

scaling : _AffineScalings1D

The scaling to be used for individual latent variables.

Returns:
objective : float

Value of the loss function (negative log-likelihood).

Return type:

float

abstractmethod obj_with_jac(x, state, scaling)[source]

Objective loss function to minimize in order to get the patient's individual parameters, together with its jacobian w.r.t. each dimension of x.

Parameters:
x : numpy.ndarray

Individual standardized parameters. At initialization, x is full of zeros (mode of priors, scaled by std-dev).

state : State

The cloned model state that is dedicated to the current individual. In particular, individual data variables for the current individual are already loaded into it.

scaling : _AffineScalings1D

The scaling to be used for individual latent variables.

Returns:
2-tuple (as expected by scipy.optimize.minimize() when jac=True)
  • objective : float

  • gradient : array-like[float] with same length as x (= all dimensions of individual latent variables, concatenated)

Return type:

tuple[float, Tensor]
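The (objective, gradient) 2-tuple contract used with scipy.optimize.minimize(..., jac=True) can be illustrated on a toy quadratic standing in for obj_with_jac (the real objective is the model's negative log-likelihood plus regularity; everything below is illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def toy_obj_with_jac(x: np.ndarray) -> tuple[float, np.ndarray]:
    # Stand-in for obj_with_jac: returns the loss AND its gradient
    # w.r.t. x as a 2-tuple, so scipy can be called with jac=True.
    target = np.array([1.0, -2.0])
    diff = x - target
    objective = float(diff @ diff)   # squared distance to the target
    gradient = 2.0 * diff            # analytic gradient of the above
    return objective, gradient

# Like the documented initialization: x starts at zeros (mode of priors).
x0 = np.zeros(2)
result = minimize(toy_obj_with_jac, x0, jac=True, method="BFGS")
```

Passing jac=True tells scipy that the callable returns both the value and the gradient, which avoids finite-difference gradient estimation; this is the benefit of implementing obj_with_jac when the model's jacobian is available.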

is_jacobian_implemented(model)[source]

Check whether the jacobian of the model is implemented.

Parameters:

model (McmcSaemCompatibleModel)

Return type:

bool