One of the most pervasive tasks in computing is the generation of random numbers drawn from a given probability distribution, such as the Gaussian (normal) distribution. Such distributions possess statistical properties such as the expected value (mean), variance (standard deviation), p-value, and entropy; among these, entropy is significant because it quantifies the amount of (useful) information that a particular instance of a distribution embodies. This quantification of entropy is valuable as a characterizing metric, determining the amount of randomness/uncertainty and/or redundancy attainable with a particular distribution instance, which is particularly useful for present-day communication, cryptographic, and astronomical applications. In the present work the author introduces an alternate way to calculate an approximate value of the information entropy (a variation on Claude Shannon's formulation of information entropy as known to the scientific community), based on the observation that a Takens embedding of the probability distribution yields a simple measure of the entropy using only four critical/representative points of the embedding. Through comparative experimentation, the author has empirically verified that this alternate formulation is consistently valid. The baseline experiment chosen relates to Discrete Task-Oriented Joint Source Channel Coding (DT-JSCC), which uses entropy computation to perform efficient and reliable task-oriented communication (transmission and reception), as elaborated further in the paper. The comparison was performed by employing the Shannon formulation for entropy computation in the baseline DT-JSCC experiment and then repeating the experiment with the entropy formulation introduced in this work. The results obtained (the data models generated) were almost identical, differing in accuracy by only about 1% overall. Thus, the alternate formulation introduced in this work provides a reliable means of validating the random numbers obtained via the Shannon formulation and potentially serves as a simpler, faster, and more computationally economical method. This is particularly useful in applications with constrained computational resources, such as mobile and other limited devices. The method is also useful for uniquely identifying and characterizing random probability sources, such as those arising from astronomical and/or optical (photonic) phenomena. The author also investigates the impact of incorporating the above notion of entropy into the Mars Rover ICER software and confirms the conclusions of the original article from NASA's Jet Propulsion Laboratory describing the ICER Progressive Wavelet Image Compressor.
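As a point of reference for the quantities discussed above, the following minimal Python sketch computes the standard Shannon entropy of a discretized Gaussian sample and constructs an ordinary delay-coordinate (Takens) embedding. It illustrates only the baseline ingredients, not the author's implementation: the rule for selecting the four critical/representative points of the embedding is not specified in this abstract, and the function names, bin count, and sample size are assumptions made for the example.

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                          # normalize to a valid probability vector
    return float(-np.sum(p * np.log2(p + eps)))

def takens_embedding(x, dim=2, delay=1):
    """Ordinary delay-coordinate (Takens) embedding of a one-dimensional series x."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.normal(size=10_000)          # Gaussian (normal) source
    counts, _ = np.histogram(sample, bins=64) # discretize into 64 bins (assumed)
    probs = counts / counts.sum()
    print("Shannon entropy (bits):", shannon_entropy(probs))

    # The paper's alternate measure is described as using only four
    # critical/representative points of a Takens embedding of the distribution;
    # the selection rule is not given in the abstract, so this only shows the
    # embedding those points would be drawn from.
    embedding = takens_embedding(np.sort(probs), dim=2, delay=1)
    print("Embedding shape:", embedding.shape)
```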
| Published in | American Journal of Mathematical and Computer Modelling (Volume 10, Issue 4) |
| DOI | 10.11648/j.ajmcm.20251004.13 |
| Page(s) | 145-150 |
| Creative Commons | This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited. |
| Copyright | Copyright © The Author(s), 2025. Published by Science Publishing Group |
Shannon Entropy, Alternate Formulation, Golomb Codes, Takens Embedding
| DT-JSCC | Discrete Task-Oriented Joint Source Channel Coding |
| [1] | H. Li, S. Xie, J. Shao, Z. Wang, H. He, S. Song, J. Zhang, and K. B. Letaief, “Mutual Information-Empowered Task-Oriented Communication: Principles, Applications and Challenges.” |
| [2] | S. Xie, S. Ma, M. Ding, Y. Shi, M. Tang, and Y. Wu, “Robust Information Bottleneck for Task-Oriented Communication with Digital Modulation.” |
| [3] | P. Srinivasan, “An Alternate Formulation of the Mutual Information Statistic Which Yields a More Realistic Measure, Leading to a More Precise and Dependable Model of Natural Language Translation Probabilities Among Parallel Corpora,” Beehive Software Solutions. |
| [4] | A. Kiely and M. Klimesh, “The ICER Progressive Wavelet Image Compressor,” IPN Progress Report 42-155, November 15, 2003. |
| [5] | W. B. Pennebaker and J. L. Mitchell, JPEG Still Image Data Compression Standard, Van Nostrand Reinhold, New York, 1993. https://archive.org/details/jpegstillimageda0000penn/page/n9/mode/2up |
| [6] | A. R. Calderbank, I. Daubechies, W. Sweldens, and B.-L. Yeo, “Wavelet Transforms that Map Integers to Integers,” Applied and Computational Harmonic Analysis, vol. 5, pp. 332-369, July 1998. |
| [7] | D. Gündüz, Z. Qin, I. E. Aguerri, H. S. Dhillon, Z. Yang, A. Yener, K. K. Wong, and C.-B. Chae, “Beyond transmitting bits: Context, semantics, and task-oriented communications,” IEEE J. Sel. Areas Commun., vol. 41, no. 1, pp. 5-41, 2022. |
| [8] | C. Cai, X. Yuan, and Y.-J. A. Zhang, “End-to-end learning for task-oriented semantic communications over MIMO channels: An information-theoretic framework,” IEEE J. Sel. Areas Commun., 2025. |
| [9] | C. E. Shannon, “A mathematical theory of communication,” Bell Syst. Tech. J., vol. 27, no. 3, pp. 379-423, 1948. |
| [10] | F. Zhai, Y. Eisenberg, and A. K. Katsaggelos, “Joint source-channel coding for video communications,” Handbook of Image and Video Processing, pp. 1065-1082, 2005. |
| [11] | H. Xie, Z. Qin, G. Y. Li, and B.-H. Juang, “Deep learning enabled semantic communication systems,” IEEE Transactions on Signal Processing, vol. 69, pp. 2663-2675, 2021. |
| [12] | K. Wei, J. Li, C. Ma, M. Ding, C. Chen, S. Jin, Z. Han, and H. V. Poor, “Low-latency federated learning over wireless channels with differential privacy,” IEEE Journal on Selected Areas in Communications, vol. 40, no. 1, pp. 290-307, 2021. |
| [13] | K. Choi, K. Tatwawadi, A. Grover, T. Weissman, and S. Ermon, “Neural joint source-channel coding,” in International Conference on Machine Learning. PMLR, 2019, pp. 1182-1192. |
| [14] | Takens, F. (1981). Detecting strange attractors in turbulence. In: Rand, D., Young, LS. (eds) Dynamical Systems and Turbulence, Warwick 1980. Lecture Notes in Mathematics, vol 898. Springer, Berlin, Heidelberg. |
APA Style
Srinivasan, P. (2025). An Alternate Formulation for Computing/Validating the Shannon Entropy of Probability Distributions. American Journal of Mathematical and Computer Modelling, 10(4), 145-150. https://doi.org/10.11648/j.ajmcm.20251004.13
ACS Style
Srinivasan, P. An Alternate Formulation for Computing/Validating the Shannon Entropy of Probability Distributions. Am. J. Math. Comput. Model. 2025, 10(4), 145-150. doi: 10.11648/j.ajmcm.20251004.13
AMA Style
Srinivasan P. An Alternate Formulation for Computing/Validating the Shannon Entropy of Probability Distributions. Am J Math Comput Model. 2025;10(4):145-150. doi: 10.11648/j.ajmcm.20251004.13
@article{10.11648/j.ajmcm.20251004.13,
author = {Parthasarathy Srinivasan},
title = {An Alternate Formulation for Computing/Validating the Shannon Entropy of Probability Distributions},
journal = {American Journal of Mathematical and Computer Modelling},
volume = {10},
number = {4},
pages = {145-150},
doi = {10.11648/j.ajmcm.20251004.13},
url = {https://doi.org/10.11648/j.ajmcm.20251004.13},
eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ajmcm.20251004.13},
year = {2025}
}
TY - JOUR
T1 - An Alternate Formulation for Computing/Validating the Shannon Entropy of Probability Distributions
AU - Parthasarathy Srinivasan
Y1 - 2025/12/24
PY - 2025
N1 - https://doi.org/10.11648/j.ajmcm.20251004.13
DO - 10.11648/j.ajmcm.20251004.13
T2 - American Journal of Mathematical and Computer Modelling
JF - American Journal of Mathematical and Computer Modelling
JO - American Journal of Mathematical and Computer Modelling
SP - 145
EP - 150
PB - Science Publishing Group
SN - 2578-8280
UR - https://doi.org/10.11648/j.ajmcm.20251004.13
VL - 10
IS - 4
ER -