Academic Journal

Using the Semantic Information G Measure to Explain and Extend Rate-Distortion Functions and Maximum Entropy Distributions

Bibliographic Details
Title: Using the Semantic Information G Measure to Explain and Extend Rate-Distortion Functions and Maximum Entropy Distributions
Authors: Chenguang Lu
Source: Entropy; Volume 23; Issue 8; Pages: 1050
Publisher: Multidisciplinary Digital Publishing Institute
Publication Year: 2021
Collection: MDPI Open Access Publishing
Subject Terms: rate-distortion function, Boltzmann distribution, semantic information measure, machine learning, maximum entropy, minimum mutual information, Bayes’ formula
Description: In the rate-distortion function and the Maximum Entropy (ME) method, Minimum Mutual Information (MMI) distributions and ME distributions are expressed by Bayes-like formulas that include Negative Exponential Functions (NEFs) and partition functions. Why do these non-probability functions appear in Bayes-like formulas? On the other hand, the rate-distortion function has three disadvantages: (1) the distortion function is subjectively defined; (2) the distortion function between instances and labels is often difficult to define; (3) it cannot be used for data compression according to the labels’ semantic meanings. The author has previously proposed the semantic information G measure, which uses both statistical probability and logical probability. We can now explain NEFs as truth functions, partition functions as logical probabilities, Bayes-like formulas as semantic Bayes’ formulas, MMI as Semantic Mutual Information (SMI), and ME as extreme ME minus SMI. To overcome the above disadvantages, this paper sets up the relationship between truth functions and distortion functions, obtains truth functions from samples by machine learning, and constructs constraint conditions with truth functions to extend rate-distortion functions. Two examples are used to help readers understand the MMI iteration and to support the theoretical results. Using truth functions and the semantic information G measure, we can combine machine learning and data compression, including semantic compression. Further studies are needed to explore general data compression and recovery according to semantic meaning.
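For context on the "Bayes-like formulas" and "MMI iteration" mentioned in the abstract: in classical rate-distortion theory, the MMI channel for a given slope parameter s has the form q(y|x) = q(y)·exp(−s·d(x, y)) / Z(x), where exp(−s·d) is the NEF and Z(x) is the partition function, and it is computed by the Blahut–Arimoto-style alternating iteration. A minimal illustrative sketch (the toy distortion matrix and parameter values are assumptions for illustration, not taken from the paper):

```python
import numpy as np

def mmi_iteration(p_x, d, s, n_iter=200):
    """Alternating minimization for the MMI (rate-distortion) channel.

    Iterates the Bayes-like formula q(y|x) = q(y) exp(-s d(x,y)) / Z(x),
    where Z(x) is the partition function, then updates the output
    marginal q(y) = sum_x p(x) q(y|x).
    """
    n_x, n_y = d.shape
    q_y = np.full(n_y, 1.0 / n_y)          # initial output marginal
    for _ in range(n_iter):
        w = q_y * np.exp(-s * d)           # unnormalized q(y|x), shape (n_x, n_y)
        z = w.sum(axis=1, keepdims=True)   # partition function Z(x)
        q_y_x = w / z                      # normalized Bayes-like formula
        q_y = p_x @ q_y_x                  # marginal update
    # mutual information I(X;Y) in nats and expected distortion D at slope s
    I = np.sum(p_x[:, None] * q_y_x * np.log(q_y_x / q_y))
    D = np.sum(p_x[:, None] * q_y_x * d)
    return q_y_x, I, D

# toy example: 3 source instances, 2 labels, hand-made distortion matrix
p_x = np.array([0.5, 0.3, 0.2])
d = np.array([[0.0, 1.0],
              [1.0, 0.0],
              [0.5, 0.5]])
q_y_x, I, D = mmi_iteration(p_x, d, s=2.0)
```

Sweeping s traces out (D, R) pairs on the rate-distortion curve; the paper's extension replaces the subjectively defined d(x, y) with constraints built from learned truth functions.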
Document Type: text
File Description: application/pdf
Language: English
Relation: Information Theory, Probability and Statistics; https://dx.doi.org/10.3390/e23081050
DOI: 10.3390/e23081050
Availability: https://doi.org/10.3390/e23081050
Rights: https://creativecommons.org/licenses/by/4.0/
Accession Number: edsbas.463A9ABF
Database: BASE