Fit Distributions and Neural Networks to Censored and Truncated Data
Gamma distribution
Pareto distribution
Poisson distribution
Translated distribution
Beta distribution
Binomial distribution
Convert TensorFlow tensors to distribution parameters recursively
Transition functions for blended distributions
Keras callback for adaptive learning rate with weight restoration
Callback to monitor likelihood gradient components
Construct a BDEGP-Family
Blended distribution
Dirac (degenerate point) distribution
Discrete distribution
Empirical distribution
Erlang mixture distribution
Exponential distribution
Generalized Pareto distribution
Log-normal distribution
Mixture distribution
Negative binomial distribution
Normal distribution
Truncated distribution
Uniform distribution
Weibull distribution
Base class for Distributions
Fit a Blended mixture using an ECME algorithm
Find starting values for distribution parameters
The Pareto distribution
Fit a general distribution to observations
Fit an Erlang mixture using an ECME algorithm
Fit a generic mixture using an ECME algorithm
Fit a neural network based distribution model to data
Flatten / Inflate parameter lists / vectors
The Generalized Pareto distribution (GPD)
Adaptive Gauss-Kronrod quadrature for multiple limits
Convex union and intersection of intervals
Intervals
Test if object is a Distribution
Cast to a TensorFlow matrix
Compute weighted quantiles
Plot several distributions
Predict individual distribution parameters
Determine the probability of reporting under a Poisson arrival process
Quantiles of Distributions
Objects exported from other packages
Softmax function
Compute weighted tabulations
Compile a Keras model for truncated data under a given distribution
Initialise model weights to a global parameter fit
Define a set of truncated observations
Truncate claims data subject to reporting delay
Compute weighted moments
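
The entries above cover a basic workflow: define a distribution family, wrap the (possibly censored or truncated) observations, and fit. The sketch below illustrates that workflow under assumed names; the package name reservr and the functions dist_exponential(), trunc_obs() and fit() are inferred from this index and the cited paper rather than stated here, so consult the individual help pages for the exact signatures.

    # Sketch only: all names below are assumptions inferred from the topic
    # titles above, not verified exports. Check the help pages before use.
    library(reservr)   # assumed package name

    set.seed(1)
    x <- rexp(1000, rate = 0.5)            # latent exponential sample
    tau <- runif(1000, min = 1, max = 10)  # individual truncation bounds
    seen <- x <= tau                       # only claims reported before tau are observed

    dist <- dist_exponential()             # family with a free rate parameter
    obs <- trunc_obs(x = x[seen], tmin = 0, tmax = tau[seen])  # truncated observations
    fitted <- fit(dist, obs)               # maximum likelihood fit accounting for truncation
    fitted$params                          # estimated rate (close to 0.5)
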
Define distribution families and fit them to interval-censored and interval-truncated data, where the truncation bounds may depend on the individual observation. The defined distributions feature density, probability, sampling and fitting methods as well as efficient implementations of the log-density log f(x) and log-probability log P(x0 <= X <= x1) for use in 'TensorFlow' neural networks via the 'tensorflow' package. This allows training parametric neural networks on interval-censored and interval-truncated data with flexible parameterization. Applications include claims development in non-life insurance, e.g. modelling reporting delay distributions from incomplete data; see Bücher and Rosenstock (2022) <doi:10.1007/s13385-022-00314-4>.
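
For the neural-network integration described above, the entries on compiling a Keras model for truncated data, fitting a neural-network-based distribution model, and predicting individual distribution parameters suggest roughly the following shape. Again a minimal sketch: tf_compile_model() and its arguments are assumptions based on the topic titles, and the layer and optimizer calls assume the standard R 'keras' interface.

    # Sketch only: tf_compile_model() and its arguments are assumed from the
    # topic titles; verify against the actual help pages.
    library(reservr)   # assumed package name
    library(keras)

    dist <- dist_exponential()                        # rate will depend on a feature
    feat_in <- layer_input(shape = 1L, name = "feat") # one numeric input feature
    hidden <- layer_dense(feat_in, units = 8L, activation = "relu")

    nnet <- tf_compile_model(
      inputs = list(feat_in),
      intermediate_output = hidden,
      dist = dist,
      optimizer = optimizer_adam(),
      censoring = FALSE,    # plain (uncensored) observations
      truncation = TRUE     # keep the truncated log-likelihood
    )

    # With truncated observations `obs` as in the previous sketch and a
    # numeric feature vector `feat` of the same length:
    # fit(nnet, x = list(feat = feat[seen]), y = obs, epochs = 100L)
    # predict(nnet, list(feat = feat[seen]))   # per-observation rate parameters
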
Useful links