leaspy.algo.personalize¶
Submodules¶
Classes¶
- PersonalizeAlgorithm: Abstract class for personalize algorithms.
- ConstantPredictionAlgorithm: Algorithm that provides constant predictions.
- LMEPersonalizeAlgorithm: Personalization algorithm associated to LMEModel.
- McmcPersonalizeAlgorithm: Base class for MCMC-based personalization algorithms.
- MeanPosteriorAlgorithm: Sampler-based algorithm that derives individual parameters as the mean posterior value from n_iter samplings.
- ModePosteriorAlgorithm: Sampler-based algorithm that derives individual parameters as the most frequent (mode) posterior value from n_iter samplings.
- ScipyMinimizeAlgorithm: Gradient descent based algorithm to compute individual parameters, i.e. personalizing a model for a given set of subjects.
Package Contents¶
- class PersonalizeAlgorithm(settings)[source]¶
Bases: leaspy.algo.base.IterativeAlgorithm[leaspy.algo.base.ModelType, leaspy.algo.base.ReturnType]
Abstract class for personalize algorithms.
Estimation of the individual parameters of given Data with a frozen model (already estimated, or loaded from known parameters).
- Parameters:
  - settings (AlgorithmSettings): Settings of the algorithm.
See also
Leaspy.personalize()
- family: AlgorithmType¶
- set_output_manager(output_settings)[source]¶
Set the output manager.
This is currently not implemented for personalize.
- Parameters:
output_settings (OutputsSettings)
- Return type:
None
- class ConstantPredictionAlgorithm(settings)[source]¶
Bases: leaspy.algo.personalize.base.PersonalizeAlgorithm[leaspy.models.ConstantModel, leaspy.io.outputs.individual_parameters.IndividualParameters]
ConstantPredictionAlgorithm is an algorithm that provides constant predictions. It is used with the ConstantModel.
- Parameters:
  - settings (AlgorithmSettings): The settings of the constant prediction algorithm. It supports the following prediction_type values (str):
    - 'last': last value, even if NaN,
    - 'last_known': last non-NaN value,
    - 'max': maximum (= worst) value,
    - 'mean': average of values.
    Depending on the feature, the last_known / max value may correspond to different visits. For a given feature, the value will be NaN if and only if all values for this feature are NaN.
- Raises:
  - LeaspyAlgoInputError: If any setting for the algorithm is invalid.
- name: AlgorithmName¶
- prediction_type: PredictionType¶
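A minimal sketch of the prediction_type semantics described above, in plain Python (constant_prediction is a hypothetical helper written for illustration, not part of the leaspy API):

```python
import math

def constant_prediction(values, prediction_type="last"):
    """Illustrative re-implementation of the documented prediction_type
    semantics for one feature's sequence of visit values (may contain NaN)."""
    if prediction_type == "last":
        return values[-1]  # last value, even if it is NaN
    known = [v for v in values if not math.isnan(v)]
    if not known:
        return math.nan  # NaN if and only if all values for this feature are NaN
    if prediction_type == "last_known":
        return known[-1]  # last non-NaN value
    if prediction_type == "max":
        return max(known)  # maximum (= worst) value
    if prediction_type == "mean":
        return sum(known) / len(known)  # average of non-NaN values
    raise ValueError(f"invalid prediction_type: {prediction_type!r}")

visits = [0.2, 0.5, 0.4, math.nan]
print(constant_prediction(visits, "last_known"))  # 0.4
print(constant_prediction(visits, "max"))         # 0.5
```

Note how 'last' and 'last_known' diverge exactly when the final visit is missing, which is why the two settings exist side by side.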
- class LMEPersonalizeAlgorithm(settings)[source]¶
Bases: leaspy.algo.personalize.base.PersonalizeAlgorithm[leaspy.models.LMEModel, leaspy.io.outputs.individual_parameters.IndividualParameters]
Personalization algorithm associated to LMEModel.
- Parameters:
  - settings (AlgorithmSettings): Algorithm settings (none yet). Most LME parameters are defined within the LME model and the LME fit algorithm.
- Attributes:
  - name: 'lme_personalize'
- name: AlgorithmName¶
- class McmcPersonalizeAlgorithm(settings)[source]¶
Bases: leaspy.algo.algo_with_annealing.AlgorithmWithAnnealingMixin, leaspy.algo.algo_with_samplers.AlgorithmWithSamplersMixin, leaspy.algo.algo_with_device.AlgorithmWithDeviceMixin, leaspy.algo.personalize.base.PersonalizeAlgorithm[leaspy.models.McmcSaemCompatibleModel, leaspy.io.outputs.individual_parameters.IndividualParameters]
Base class for MCMC-based personalization algorithms.
Individual parameters are derived from the values of the individual variables of the model.
- Parameters:
  - settings (AlgorithmSettings): Settings of the algorithm.
- class MeanPosteriorAlgorithm(settings)[source]¶
Bases: leaspy.algo.personalize.mcmc.McmcPersonalizeAlgorithm
Sampler-based algorithm that derives individual parameters as the mean posterior value from n_iter samplings.
- Parameters:
  - settings (AlgorithmSettings): Settings of the algorithm.
- name: AlgorithmName¶
- class ModePosteriorAlgorithm(settings)[source]¶
Bases: leaspy.algo.personalize.mcmc.McmcPersonalizeAlgorithm
Sampler-based algorithm that derives individual parameters as the most frequent (mode) posterior value from n_iter samplings.
TODO? We could derive confidence intervals on individual parameters thanks to this personalization algorithm…
- Parameters:
  - settings (AlgorithmSettings): Settings of the algorithm.
- name: AlgorithmName¶
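The contrast between the two sampler-based summaries can be sketched on toy posterior samples (the values below are made up and discretized so a mode exists; the real algorithms summarize the model's sampled individual variables, not a plain list):

```python
from statistics import fmean, mode

# Made-up posterior samples of one individual parameter, as if drawn
# over n_iter MCMC iterations.
samples = [0.9, 1.0, 1.0, 1.1, 1.0, 0.8, 1.2, 1.0]

mean_estimate = fmean(samples)  # MeanPosteriorAlgorithm-style summary
mode_estimate = mode(samples)   # ModePosteriorAlgorithm-style summary (most frequent value)
print(mean_estimate, mode_estimate)  # 1.0 1.0
```

For a symmetric posterior the two summaries agree, as here; they diverge when the posterior is skewed or multimodal, which is when the choice between the two algorithms matters.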
- class ScipyMinimizeAlgorithm(settings)[source]¶
Bases: leaspy.algo.personalize.base.PersonalizeAlgorithm[leaspy.models.McmcSaemCompatibleModel, leaspy.io.outputs.individual_parameters.IndividualParameters]
Gradient descent based algorithm to compute individual parameters, i.e. personalizing a model for a given set of subjects.
- Parameters:
  - settings (AlgorithmSettings): Settings for the algorithm, including the custom_scipy_minimize_params parameter, which contains keyword arguments passed to scipy.optimize.minimize().
- Attributes:
  - scipy_minimize_params (dict): Keyword arguments for scipy.optimize.minimize(), with default values depending on whether a jacobian is used (cf. ScipyMinimize.DEFAULT_SCIPY_MINIMIZE_PARAMS_WITH_JACOBIAN and ScipyMinimize.DEFAULT_SCIPY_MINIMIZE_PARAMS_WITHOUT_JACOBIAN). Customization is possible via the custom_scipy_minimize_params parameter in AlgorithmSettings.
  - format_convergence_issues (str): A format string for displaying convergence issues; it can use the variables appearing in the default format (patient_id, optimization_result_pformat). The default format is defined in ScipyMinimize.DEFAULT_FORMAT_CONVERGENCE_ISSUES, but it can be customized via the custom_format_convergence_issues parameter.
  - logger (None or callable str -> None): The function used to display convergence issues returned by scipy.optimize.minimize(). By default, convergence issues are printed only if the BFGS optimization method is not used. This can be customized by setting the logger attribute in AlgorithmSettings.
- name: AlgorithmName¶
- DEFAULT_SCIPY_MINIMIZE_PARAMS_WITH_JACOBIAN¶
- DEFAULT_SCIPY_MINIMIZE_PARAMS_WITHOUT_JACOBIAN¶
- DEFAULT_FORMAT_CONVERGENCE_ISSUES = """<!> {patient_id}: {optimization_result_pformat}"""¶
- scipy_minimize_params¶
- format_convergence_issues¶
- logger¶
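The default format above is an ordinary str.format template; a minimal sketch of how it combines with a logger callable (the patient id and result text below are made up for illustration):

```python
DEFAULT_FORMAT_CONVERGENCE_ISSUES = "<!> {patient_id}: {optimization_result_pformat}"

def logger(message: str) -> None:
    # A logger is any callable str -> None; printing is the simplest choice.
    print(message)

message = DEFAULT_FORMAT_CONVERGENCE_ISSUES.format(
    patient_id="S0123",
    optimization_result_pformat="fun: 12.3, message: 'Desired error not necessarily achieved'",
)
logger(message)
```

A custom format string only needs to keep the placeholder names; a custom logger could route the same messages to a file or a logging framework instead of stdout.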
- obj_no_jac(x, state, scaling)[source]¶
Objective loss function to minimize in order to get the patient's individual parameters.
- Parameters:
  - x (numpy.ndarray): Individual standardized parameters. At initialization, x is full of zeros (mode of priors, scaled by their standard deviation).
  - state (State): The cloned model state that is dedicated to the current individual. In particular, individual data variables for the current individual are already loaded into it.
  - scaling (_AffineScalings1D): The scaling to be used for individual latent variables.
- Returns:
  - objective (float): Value of the loss function (negative log-likelihood).
- Return type:
  float
- abstractmethod obj_with_jac(x, state, scaling)[source]¶
Objective loss function to minimize in order to get the patient's individual parameters, together with its jacobian w.r.t. each dimension of x.
- Parameters:
  - x (numpy.ndarray): Individual standardized parameters. At initialization, x is full of zeros (mode of priors, scaled by their standard deviation).
  - state (State): The cloned model state that is dedicated to the current individual. In particular, individual data variables for the current individual are already loaded into it.
  - scaling (_AffineScalings1D): The scaling to be used for individual latent variables.
- Returns:
  - 2-tuple (as expected by scipy.optimize.minimize() when jac=True):
    - objective (float)
    - gradient (array-like[float]): same length as x (= all dimensions of individual latent variables, concatenated)
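The 2-tuple contract above is the one scipy.optimize.minimize() consumes when jac=True. A self-contained sketch with a toy quadratic loss standing in for the model's negative log-likelihood (target and the quadratic form are made up; they are not leaspy code):

```python
import numpy as np
from scipy.optimize import minimize

target = np.array([1.5, -0.5])  # made-up "true" individual parameters

def obj_with_jac(x):
    # Toy stand-in for the negative log-likelihood: 0.5 * ||x - target||^2,
    # returned together with its gradient w.r.t. each dimension of x.
    diff = x - target
    objective = 0.5 * float(diff @ diff)
    gradient = diff
    return objective, gradient

# x starts at zero, as described above (mode of priors, scaled by std-dev).
result = minimize(obj_with_jac, x0=np.zeros(2), jac=True, method="BFGS")
print(result.x)  # close to target
```

Returning the gradient from the same call avoids scipy's finite-difference approximation, which is why the algorithm distinguishes obj_no_jac from obj_with_jac.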
- is_jacobian_implemented(model)[source]¶
Check whether the jacobian of the model is implemented.
- Parameters:
model (McmcSaemCompatibleModel)
- Return type:
  bool