A Fast Algorithm for Estimating Two-Dimensional Sample Entropy Based on an Upper Confidence Bound and Monte Carlo Sampling

Zeheng Zhou et al. Entropy (Basel). 2024 Feb 10;26(2):155. doi: 10.3390/e26020155.

Abstract

The two-dimensional sample entropy marks a significant advance in evaluating the regularity and predictability of images in the information domain. Unlike the direct computation of sample entropy, which incurs a time complexity of O(N²) for a series of length N, the Monte Carlo-based algorithm for computing one-dimensional sample entropy (MCSampEn) markedly reduces computational cost by minimizing the dependence on N. This paper extends MCSampEn to two dimensions, referred to as MCSampEn2D. The new approach substantially accelerates the estimation of two-dimensional sample entropy, outperforming the direct method by more than a thousandfold. Despite this speedup, MCSampEn2D suffers from significant errors and slow convergence. To counter these issues, we incorporate an upper confidence bound (UCB) strategy into MCSampEn2D, assigning varied upper confidence bounds in each iteration of the Monte Carlo experiment to improve the algorithm's speed and accuracy. We evaluated this enhanced approach, dubbed UCBMCSampEn2D, on medical and natural image data sets. The experiments demonstrate that UCBMCSampEn2D achieves a 40% reduction in computational time compared to MCSampEn2D, and its errors are only 30% of those observed with MCSampEn2D, highlighting its improved accuracy and efficiency.

Keywords: Monte Carlo algorithm; sample entropy; upper confidence bound strategy.
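
To make the Monte Carlo idea in the abstract concrete, the sketch below estimates a two-dimensional sample entropy by sampling random pairs of image templates instead of comparing all of them, which is what removes the O(N²) pairwise cost. This is only an illustrative Python sketch of the general approach, not the authors' MCSampEn2D or UCBMCSampEn2D implementation: the function name mc_sampen2d, the parameter names, the edge handling, and the uniform pair-sampling scheme are assumptions, and the UCB allocation described in the paper is not modeled here.

import numpy as np

def mc_sampen2d(image, m=2, r_factor=0.3, n_pairs=100_000, seed=None):
    # Monte Carlo estimate of two-dimensional sample entropy (illustrative sketch).
    # Random pairs of (m+1) x (m+1) templates are drawn; matches at size m and
    # size m+1 are counted on the same pairs, and the entropy is -ln(A/B).
    rng = np.random.default_rng(seed)
    img = np.asarray(image, dtype=float)
    r = r_factor * img.std()                 # tolerance, conventionally r times the image std
    H, W = img.shape

    # Random top-left corners for pairs of templates that fit at size m+1,
    # so the same corners serve both the m- and (m+1)-sized comparisons.
    ys = rng.integers(0, H - m, size=(n_pairs, 2))
    xs = rng.integers(0, W - m, size=(n_pairs, 2))

    matches_m = 0        # B: matching m x m template pairs
    matches_m1 = 0       # A: matching (m+1) x (m+1) template pairs
    for (y1, y2), (x1, x2) in zip(ys, xs):
        t1 = img[y1:y1 + m + 1, x1:x1 + m + 1]
        t2 = img[y2:y2 + m + 1, x2:x2 + m + 1]
        # Chebyshev (maximum absolute difference) distance between the m x m parts.
        if np.max(np.abs(t1[:m, :m] - t2[:m, :m])) < r:
            matches_m += 1
            # Only pairs that already match at size m can also match at size m+1.
            if np.max(np.abs(t1 - t2)) < r:
                matches_m1 += 1

    if matches_m == 0 or matches_m1 == 0:
        return float("inf")                  # entropy is undefined (infinite) with no matches
    return -np.log(matches_m1 / matches_m)

For example, calling mc_sampen2d(img, m=2, r_factor=0.3, n_pairs=200_000) on a grayscale array img returns a fast but noisy estimate whose variance shrinks as more pairs are sampled; managing that accuracy-versus-cost trade-off across Monte Carlo iterations is what the paper's UCB strategy addresses.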


Conflict of interest statement

The authors declare no conflicts of interest.

Figures

Figure 1. Examples of reference images: images (a,b) from the Warwick QU Dataset represent benign and malignant cases, respectively, each with a size of 580×440; image (c) is a natural image with a size of 775×522; and image (d), called the wallpaper, with a size of 3000×3000, is used to verify the method's performance on large-scale data.

Figure 2. (a) The average error variation of MCSampEn2D and UCBMCSampEn2D on the Warwick QU dataset with changing N1, where the parameters are set to m=2 and r=0.3; (b) the average error variation of MCSampEn2D and UCBMCSampEn2D on natural data sets with changing N1, where the parameters are set to m=2, r=0.3, a=5, and b=1. The reward function is the cosine function.

Figure 3. The average error variation for different reward functions R with changing N1 on the Warwick QU dataset, where the parameters are set to m=2 and r=0.3. The cosine reward function uses a=8 and b=0.5, while the normal-distribution reward function uses a=8 and b=2.

Figure 4. The error standard deviation variation for the wallpaper, where N0=128 and N1=300, and the UCB parameters were set to a=8 and b=1.

Figure 5. The mean error variation for the wallpaper, where the UCB parameters were set to a=8 and b=1.

Figure 6. The error standard deviation variation for the wallpaper, where N0=128 and N1=300.


