Sample Entropy (also known as Kolmogorov-Sinai Entropy)
These functions measure the complexity of the RR time series. Large values of the Sample Entropy indicate high complexity, whereas smaller values characterize more regular signals.
HRVData: Data structure that stores the beats register and information related to it
indexNonLinearAnalysis: Reference to the data structure that will contain the nonlinear analysis
doPlot: Logical value. If TRUE (default), a plot of the correlation sum is shown
regressionRange: Vector with 2 components denoting the range where the function will perform linear regression
useEmbeddings: A numeric vector specifying which embedding dimensions the algorithm should use to compute the sample entropy.
...: Additional plot parameters.
Returns
The CalculateSampleEntropy function returns an HRVData structure containing the sample entropy computations of the RR time series under the NonLinearAnalysis list.
The EstimateSampleEntropy function estimates the sample entropy of the RR time series by performing a linear regression over the range of radius values specified in regressionRange. If doPlot is TRUE, a graphic of the regression over the data is shown. In order to run EstimateSampleEntropy, it is necessary to have performed the sample entropy computations before with CalculateSampleEntropy. The results are returned into the HRVData structure, under the NonLinearAnalysis list.
PlotSampleEntropy shows a graphic of the sample entropy computations.
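For illustration, a minimal sketch of this workflow (not part of the original page), assuming an HRVData object hd that already holds a filtered RR series and the correlation sums computed by CalculateCorrDim; a fuller sketch is given after the Note below.

  # Sketch only: hd is assumed to already contain the correlation sums
  hd <- CalculateSampleEntropy(hd, indexNonLinearAnalysis = 1, doPlot = FALSE)
  PlotSampleEntropy(hd, indexNonLinearAnalysis = 1)  # inspect the entropy curves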
The sample entropy is computed as h_q(m, r) = log( C_q(m, r) / C_q(m + 1, r) ), where m is the embedding dimension and r is the radius of the neighbourhood. When computing the correlation dimension we use the linear regions of the correlation sums to do the estimates; similarly, the sample entropy is estimated over the regions where h_q(m, r) does not change for varying m and r.
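As a sketch of that estimation step, one might restrict the regression to the embedding dimensions and radius range over which the entropy curves look flat; the specific values below are purely illustrative, not recommendations.

  # Illustrative choices: use embeddings 4 to 7 and regress over radii 10-20
  hd <- EstimateSampleEntropy(hd, indexNonLinearAnalysis = 1,
                              useEmbeddings = 4:7,
                              regressionRange = c(10, 20))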
Note
In order to run these functions, it is necessary to have used the CalculateCorrDim function first.
This function is based on the sampleEntropy function from the nonlinearTseries package.
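Examples

  ## A minimal end-to-end sketch (not taken from the original page); "beats.txt"
  ## is a hypothetical beat-position file and every numeric setting below is
  ## illustrative rather than a recommendation.
  library(RHRV)
  hd <- CreateHRVData()
  hd <- LoadBeatAscii(hd, "beats.txt")      # beat positions in seconds
  hd <- BuildNIHR(hd)                       # build the RR / niHR series
  hd <- FilterNIHR(hd)                      # remove artefacts
  hd <- CreateNonLinearAnalysis(hd)         # add a NonLinearAnalysis slot
  # Correlation sums must be available before the sample entropy (see Note)
  hd <- CalculateCorrDim(hd, indexNonLinearAnalysis = 1,
                         minEmbeddingDim = 2, maxEmbeddingDim = 8,
                         timeLag = 1, minRadius = 1, maxRadius = 50,
                         pointsRadius = 20, theilerWindow = 20,
                         doPlot = FALSE)
  hd <- CalculateSampleEntropy(hd, indexNonLinearAnalysis = 1, doPlot = TRUE)
  hd <- EstimateSampleEntropy(hd, indexNonLinearAnalysis = 1,
                              regressionRange = c(10, 20))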