Abstract
This paper deals with the divergence of fuzzy variables from an a priori one. Within the framework of credibility theory, a fuzzy cross-entropy is defined to measure this divergence, and some of its mathematical properties are investigated. Furthermore, a minimum cross-entropy principle is proposed, which tells us that, out of all membership functions satisfying given moment constraints, we should choose the one that is closest to the given a priori membership function.
Introduction
Fuzzy entropy provides a quantitative measure of the uncertainty associated with each fuzzy variable. Since Zadeh [1] introduced fuzzy entropy as a weighted Shannon entropy, researchers have given several definitions from different angles, such as those of De Luca and Termini [2], Yager [3], Kaufmann [4], Kosko [5], and Pal and Pal [6]. These definitions characterize the uncertainty resulting primarily from linguistic vagueness rather than from information deficiency, and they vanish when the fuzzy variable is an equipossible one. However, Liu [7] suggested that a fuzzy entropy should meet at least the following three basic requirements: the entropy of a crisp number is zero; the entropy of an equipossible fuzzy variable is maximum; and the entropy is applicable not only to finite and infinite cases but also to discrete and continuous cases. In order to meet these requirements, within the framework of credibility theory, Li and Liu [8] provided a new definition of fuzzy entropy to characterize the uncertainty resulting from information deficiency, which is caused by the impossibility of predicting the specified value that a fuzzy variable takes. Based on this definition, Li and Liu [9] proposed the fuzzy maximum entropy principle and proved some maximum entropy theorems.
This paper is devoted to formulating a fuzzy cross-entropy based on the credibility measure. For this purpose, we organize the paper as follows. The ‘Preliminaries’ section recalls some useful definitions and properties from credibility theory. The ‘Fuzzy cross-entropy’ section defines the fuzzy cross-entropy and studies some useful properties. In the ‘Minimum cross-entropy principle’ section, the minimum cross-entropy principle is proposed. At the end of this paper, a brief summary is given.
Preliminaries
Credibility theory [10] is a branch of mathematics for studying the behavior of fuzzy phenomena. Let Θ be a nonempty set, and let 𝒫(Θ) be the power set of Θ. Each element A of 𝒫(Θ) is called an event. In 2002, Liu and Liu [11] presented a credibility measure Cr{A} to express the chance that event A occurs. Furthermore, Li and Liu [12] proved that a set function is a credibility measure if and only if it satisfies the following axioms:
Axiom 1. (Normality) Cr{Θ}=1;
Axiom 2. (Monotonicity) Cr{A} ≤ Cr{B} whenever A ⊂ B;
Axiom 3. (Self-duality) Cr is self-dual, i.e., Cr{A} + Cr{A^c} = 1 for any event A;
Axiom 4. (Maximality) Cr{∪_i A_i} = sup_i Cr{A_i} for any events {A_i} with sup_i Cr{A_i} < 0.5.
If Cr is a credibility measure, the triplet (Θ, 𝒫(Θ), Cr) is called a credibility space. A fuzzy variable is defined as a function from a credibility space (Θ, 𝒫(Θ), Cr) to the set of real numbers. Let ξ be a fuzzy variable. Then, its membership function is derived from the credibility measure by:

μ(x) = (2 Cr{ξ = x}) ∧ 1, x ∈ ℜ.
Conversely, if ξ is a fuzzy variable with membership function μ, then, for any set B ⊆ ℜ, we have:

Cr{ξ ∈ B} = (1/2) (sup_{x∈B} μ(x) + 1 − sup_{x∈B^c} μ(x)).

This formula is also called the credibility inversion theorem.
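As an illustrative sketch (an addition, not part of the original article), the inversion formula can be evaluated numerically for a discrete fuzzy variable; the membership values below are hypothetical.

```python
# A minimal numerical sketch of the credibility inversion theorem for a
# discrete fuzzy variable; the membership values are hypothetical.
mu = {1: 0.3, 2: 1.0, 3: 0.6}  # membership function mu(x) on {1, 2, 3}

def credibility(B, mu):
    """Cr{xi in B} = (sup_{x in B} mu(x) + 1 - sup_{x not in B} mu(x)) / 2."""
    sup_in = max((mu[x] for x in mu if x in B), default=0.0)
    sup_out = max((mu[x] for x in mu if x not in B), default=0.0)
    return 0.5 * (sup_in + 1.0 - sup_out)

print(credibility({2, 3}, mu))                         # 0.85
print(credibility({1}, mu))                            # 0.15
print(credibility({2, 3}, mu) + credibility({1}, mu))  # self-duality: 1.0
```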
Definition 2.1.
Let ξ be a fuzzy variable taking values in {x_1, x_2, ⋯, x_n} (Li and Liu [8]). Then, its fuzzy entropy is defined as:

H[ξ] = ∑_{i=1}^{n} S(Cr{ξ = x_i}),

where S(t) = −t ln t − (1 − t) ln(1 − t).
Fuzzy entropy is used to quantify the uncertainty associated with fuzzy variables.
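The following sketch (added for illustration) computes the entropy of Definition 2.1 for hypothetical membership functions, obtaining the singleton credibilities Cr{ξ = x_i} from the credibility inversion theorem; it recovers the extreme cases of a crisp number and an equipossible variable.

```python
import math

def S(t):
    """S(t) = -t*ln(t) - (1-t)*ln(1-t), with S(0) = S(1) = 0."""
    return 0.0 if t <= 0.0 or t >= 1.0 else -t * math.log(t) - (1 - t) * math.log(1 - t)

def cr(x, mu):
    """Cr{xi = x} obtained from the credibility inversion theorem."""
    sup_other = max((mu[y] for y in mu if y != x), default=0.0)
    return 0.5 * (mu[x] + 1.0 - sup_other)

def entropy(mu):
    """H[xi] = sum_i S(Cr{xi = x_i}) (Definition 2.1)."""
    return sum(S(cr(x, mu)) for x in mu)

# hypothetical membership functions
mu_crisp = {0: 1.0}                    # crisp number: entropy 0
mu_equi  = {1: 1.0, 2: 1.0, 3: 1.0}    # equipossible: entropy n*ln 2
print(entropy(mu_crisp), entropy(mu_equi), 3 * math.log(2))
```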
Theorem 2.1.
Let ξ be a fuzzy variable taking values in {x_1, x_2, ⋯, x_n} (Li and Liu [8]). Then, we have:

0 ≤ H[ξ] ≤ n ln 2.

In particular, H[ξ] attains its minimum value 0 if and only if ξ is a crisp number, and H[ξ] attains its maximum value n ln 2 if and only if ξ is an equipossible fuzzy variable.
Definition 2.2.
Let ξ be a continuous fuzzy variable (Li and Liu [8]). Then, its fuzzy entropy is defined as:

H[ξ] = ∫_{−∞}^{+∞} S(Cr{ξ = x}) dx.
Theorem 2.2.
Let ξ be a continuous fuzzy variable taking values in [a, b] (Li and Liu [8]). Then, we have:

0 ≤ H[ξ] ≤ (b − a) ln 2.

In particular, H[ξ] attains its minimum value 0 if and only if ξ is a crisp number, and H[ξ] attains its maximum value (b − a) ln 2 if and only if ξ is an equipossible fuzzy variable.
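As a numerical illustration (an addition using the fact that Cr{ξ = x} = μ(x)/2 for a continuous fuzzy variable, as recalled later in this paper), the entropy of Definition 2.2 can be approximated by a midpoint rule; the triangular membership function is hypothetical.

```python
import math

def S(t):
    """S(t) = -t*ln(t) - (1-t)*ln(1-t), with S(0) = S(1) = 0."""
    return 0.0 if t <= 0.0 or t >= 1.0 else -t * math.log(t) - (1 - t) * math.log(1 - t)

def entropy(mu, a, b, n=100_000):
    """Midpoint-rule estimate of H[xi] = integral of S(mu(x)/2) dx over [a, b]."""
    dx = (b - a) / n
    return sum(S(mu(a + (i + 0.5) * dx) / 2.0) * dx for i in range(n))

triangular = lambda x: 1.0 - abs(x - 1.0)   # hypothetical, peak at x = 1 on [0, 2]
equipossible = lambda x: 1.0

print(entropy(triangular, 0.0, 2.0))        # strictly below the maximum
print(entropy(equipossible, 0.0, 2.0))      # approx (b - a)*ln 2 = 2*ln 2
print(2.0 * math.log(2.0))
```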
In 2007, Li and Liu [9] proposed a fuzzy maximum entropy principle, which tells us that out of all the membership functions satisfying the given constraints, we should select the one that maximizes the entropy.
Fuzzy cross-entropy
In this section, we define a fuzzy cross-entropy for quantifying the divergence of fuzzy variables from an a priori one. The relation between fuzzy entropy and fuzzy cross-entropy is also discussed.
Definition 3.1.
Let ξ and η be two discrete fuzzy variables taking values in {x_1, x_2, ⋯, x_n}. Then, the fuzzy cross-entropy of ξ from η is defined as:

D[ξ; η] = ∑_{i=1}^{n} T(Cr{ξ = x_i}, Cr{η = x_i}),

where T(s, t) = s ln(s/t) + (1 − s) ln((1 − s)/(1 − t)).
It is easy to prove that D[ξ; η] is permutationally symmetric, i.e., its value does not change if the outcomes are labeled differently.
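A minimal computational sketch of Definition 3.1 follows (added for illustration, with hypothetical membership functions); the singleton credibilities are obtained from the inversion theorem, and the cross-entropy of a variable from itself is zero.

```python
import math

def T(s, t):
    """T(s,t) = s*ln(s/t) + (1-s)*ln((1-s)/(1-t)), with 0*ln 0 read as 0."""
    out = 0.0
    if s > 0.0:
        out += s * math.log(s / t)
    if s < 1.0:
        out += (1.0 - s) * math.log((1.0 - s) / (1.0 - t))
    return out

def cr(x, m):
    """Cr{xi = x} from the credibility inversion theorem."""
    sup_other = max((m[y] for y in m if y != x), default=0.0)
    return 0.5 * (m[x] + 1.0 - sup_other)

def cross_entropy(mu, nu):
    """D[xi; eta] = sum_i T(Cr{xi = x_i}, Cr{eta = x_i}); mu and nu share a support."""
    return sum(T(cr(x, mu), cr(x, nu)) for x in mu)

# hypothetical membership functions on {1, 2, 3}
mu = {1: 0.4, 2: 1.0, 3: 0.7}
nu = {1: 1.0, 2: 1.0, 3: 1.0}   # equipossible

print(cross_entropy(mu, nu))     # positive divergence from the equipossible one
print(cross_entropy(mu, mu))     # 0.0: zero divergence from itself
```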
Definition 3.2.
Let ξ and η be two continuous fuzzy variables taking values in [a, b]. Then, the cross-entropy of ξ from η is defined as:

D[ξ; η] = ∫_a^b T(Cr{ξ = x}, Cr{η = x}) dx.

Let μ and ν be the membership functions of the continuous fuzzy variables ξ and η, respectively. Since Cr{ξ = x} = μ(x)/2 and Cr{η = x} = ν(x)/2, the cross-entropy of ξ from η can be rewritten as:

D[ξ; η] = ∫_a^b T(μ(x)/2, ν(x)/2) dx.
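The rewritten integral form lends itself to direct numerical approximation. The sketch below is an illustration (not from the article): it uses a midpoint rule with a hypothetical triangular membership function against the equipossible prior on [0, 2].

```python
import math

def T(s, t):
    """T(s,t) = s*ln(s/t) + (1-s)*ln((1-s)/(1-t)), with 0*ln 0 read as 0."""
    out = 0.0
    if s > 0.0:
        out += s * math.log(s / t)
    if s < 1.0:
        out += (1.0 - s) * math.log((1.0 - s) / (1.0 - t))
    return out

# hypothetical membership functions on [a, b] = [0, 2]
mu = lambda x: 1.0 - abs(x - 1.0)   # triangular fuzzy variable, peak at x = 1
nu = lambda x: 1.0                  # equipossible prior

a, b, n = 0.0, 2.0, 100_000
dx = (b - a) / n
# midpoint-rule estimate of D[xi; eta] = integral of T(mu(x)/2, nu(x)/2) dx
D = sum(T(mu(a + (i + 0.5) * dx) / 2.0, nu(a + (i + 0.5) * dx) / 2.0) * dx
        for i in range(n))
print(D)
```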
Remark 3.1.
It is easy to extend the concept of cross-entropy to fuzzy vectors. If ξ = (ξ_1, ξ_2, ⋯, ξ_m) and η = (η_1, η_2, ⋯, η_m) are discrete fuzzy vectors taking values in {x_1, x_2, ⋯, x_n} ⊂ ℜ^m, we have:

D[ξ; η] = ∑_{i=1}^{n} T(Cr{ξ = x_i}, Cr{η = x_i}).

If ξ and η are continuous fuzzy vectors, we have:

D[ξ; η] = ∫_{ℜ^m} T(Cr{ξ = x}, Cr{η = x}) dx.
Remark 3.2.
It is clear that T(s, t) is a function from [0, 1] × [0, 1] to [0, +∞). In addition, the following properties of T(s, t) can easily be proved: (a) T(s, t) is strictly convex with respect to (s, t) and attains its minimum value zero on the line s = t; and (b) for any 0 ≤ s ≤ 1 and 0 ≤ t ≤ 1, we have T(s, t) = T(1 − s, 1 − t).
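These properties are easy to check numerically. The following sketch (added for illustration, with hypothetical argument values) spot-checks the symmetry, the convexity along one segment, and the zero minimum on the line s = t.

```python
import math

def T(s, t):
    """T(s,t) = s*ln(s/t) + (1-s)*ln((1-s)/(1-t)) for s, t in (0, 1)."""
    return s * math.log(s / t) + (1 - s) * math.log((1 - s) / (1 - t))

# property (b): symmetry T(s, t) == T(1-s, 1-t)
print(abs(T(0.3, 0.8) - T(0.7, 0.2)) < 1e-12)            # True

# property (a), convexity, checked along one segment: the value at the
# midpoint of (0.1, 0.6) and (0.9, 0.2) does not exceed the average of
# the endpoint values
print(T(0.5, 0.4) <= 0.5 * (T(0.1, 0.6) + T(0.9, 0.2)))  # True

# minimum value zero on the line s = t
print(T(0.37, 0.37))                                     # 0.0
```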
Theorem 3.1.
For any fuzzy variables ξ and η, we have D[ξ; η] ≥ 0, and the equality holds if and only if ξ and η have the same membership function.
Proof.
Let μ and ν be the membership functions of the discrete fuzzy variables ξ and η, respectively. Since T(s, t) is strictly convex with respect to (s, t) and attains its minimum value zero on the line s = t, we have T(Cr{ξ = x_i}, Cr{η = x_i}) ≥ 0 for all i, which implies that:

D[ξ; η] = ∑_{i=1}^{n} T(Cr{ξ = x_i}, Cr{η = x_i}) ≥ 0.

Furthermore, for any 0 ≤ s∗ ≤ 1, the unique minimum point of T(s∗, t) is t = s∗. Thus, we have D[ξ; η] = 0 if and only if T(Cr{ξ = x_i}, Cr{η = x_i}) = 0, that is,

Cr{ξ = x_i} = Cr{η = x_i}

for all i = 1, 2, ⋯, n. If ξ and η are continuous fuzzy variables, the theorem can be proved in a similar way. The proof is complete. □
Theorem 3.2.
Let τ be the equipossible fuzzy variable with membership function ν(x_i) = 1 for all i = 1, 2, ⋯, n. Then, for any discrete fuzzy variable ξ taking values in {x_1, x_2, ⋯, x_n}, we have:

D[ξ; τ] = n ln 2 − H[ξ].
Proof.
According to the credibility inversion theorem, it is easy to prove that Cr{τ = x_i} = 0.5 for all i = 1, 2, ⋯, n. Since T(s, 0.5) = ln 2 − S(s), it follows from the definition of cross-entropy that:

D[ξ; τ] = ∑_{i=1}^{n} T(Cr{ξ = x_i}, 0.5) = ∑_{i=1}^{n} (ln 2 − S(Cr{ξ = x_i})) = n ln 2 − H[ξ].
The proof is complete. □
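The identity of Theorem 3.2 is easy to verify numerically. In the sketch below (added for illustration), the membership values of ξ are hypothetical and τ is the equipossible variable on the same support.

```python
import math

def S(t):
    """S(t) = -t*ln(t) - (1-t)*ln(1-t), with S(0) = S(1) = 0."""
    return 0.0 if t <= 0.0 or t >= 1.0 else -t * math.log(t) - (1 - t) * math.log(1 - t)

def T(s, t):
    """T(s,t) = s*ln(s/t) + (1-s)*ln((1-s)/(1-t)), with 0*ln 0 read as 0."""
    out = 0.0
    if s > 0.0:
        out += s * math.log(s / t)
    if s < 1.0:
        out += (1.0 - s) * math.log((1.0 - s) / (1.0 - t))
    return out

def cr(x, m):
    """Cr{xi = x} from the credibility inversion theorem."""
    sup_other = max((m[y] for y in m if y != x), default=0.0)
    return 0.5 * (m[x] + 1.0 - sup_other)

# hypothetical fuzzy variable xi and the equipossible tau on {1, 2, 3, 4}
mu  = {1: 0.2, 2: 1.0, 3: 0.5, 4: 0.8}
tau = {x: 1.0 for x in mu}

H = sum(S(cr(x, mu)) for x in mu)
D = sum(T(cr(x, mu), cr(x, tau)) for x in mu)
print(abs(D - (len(mu) * math.log(2) - H)) < 1e-10)   # True: D = n*ln 2 - H
```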
Theorem 3.3.
Let τ be the equipossible fuzzy variable with membership function ν(x) = 1 for all x ∈ [a, b]. Then, for any continuous fuzzy variable ξ taking values in [a, b], we have:

D[ξ; τ] = (b − a) ln 2 − H[ξ].
Proof.
Since Cr{τ = x} = ν(x)/2 = 0.5 for all x ∈ [a, b], it follows from the definition of cross-entropy that:

D[ξ; τ] = ∫_a^b T(Cr{ξ = x}, 0.5) dx = ∫_a^b (ln 2 − S(Cr{ξ = x})) dx = (b − a) ln 2 − H[ξ].
The proof is complete. □
Minimum cross-entropy principle
In many real problems, the membership function of a fuzzy variable is unavailable except for some partial information, for example, moment constraints, which may be based on observations. In this case, the maximum entropy principle (Li and Liu [9]) tells us that, out of all the membership functions satisfying the given constraints, we should choose the one that has maximum entropy. However, there may be another type of information, for example, an a priori membership function, which may be based on intuition or experience with the problem. If both the a priori membership function and the moment constraints are given, which membership function should we choose? The following minimum cross-entropy principle tells us that, out of all membership functions satisfying the given moment constraints, we should choose the one that is closest to the given a priori membership function.
There is nothing mysterious about this principle; it is just based on common sense. Our membership function must be consistent with the observations or other given information, and if there are many membership functions consistent with that information, we must choose the one that is nearest to our intuition and experience. On the other hand, if we have no a priori experience or intuition to guide us, we choose the membership function that is nearest to the equipossible one. In this sense, if the a priori membership function is not prescribed and the fuzzy variable is simple (bounded in the continuous case), the maximum entropy principle and the minimum cross-entropy principle are consistent because, by Theorems 3.2 and 3.3,

D[ξ; υ] = n ln 2 − H[ξ] in the discrete case and D[ξ; υ] = (b − a) ln 2 − H[ξ] in the continuous case,

where υ is the equipossible fuzzy variable; minimizing the cross-entropy from υ is therefore equivalent to maximizing the entropy.
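As a toy illustration of the principle (an addition, not from the original article), the sketch below grid-searches a one-parameter family of membership functions satisfying a simple stand-in constraint and selects the member closest, in the cross-entropy sense, to a hypothetical a priori membership function; a genuine application would replace the toy constraint with the moment constraints of interest.

```python
import math

def T(s, t):
    """T(s,t) = s*ln(s/t) + (1-s)*ln((1-s)/(1-t)), with 0*ln 0 read as 0."""
    out = 0.0
    if s > 0.0:
        out += s * math.log(s / t)
    if s < 1.0:
        out += (1.0 - s) * math.log((1.0 - s) / (1.0 - t))
    return out

def cr(x, m):
    """Cr{xi = x} from the credibility inversion theorem."""
    sup_other = max((m[y] for y in m if y != x), default=0.0)
    return 0.5 * (m[x] + 1.0 - sup_other)

def D(mu, nu):
    """Cross-entropy of the variable with membership mu from the one with nu."""
    return sum(T(cr(x, mu), cr(x, nu)) for x in mu)

# hypothetical a priori membership function on {0, 1, 2}
prior = {0: 0.5, 1: 1.0, 2: 0.5}

# toy stand-in for a moment constraint: the grade at x = 2 is fixed at 0.6,
# while the grade p at x = 0 remains free
candidates = ({0: p / 100.0, 1: 1.0, 2: 0.6} for p in range(1, 100))

# minimum cross-entropy principle: pick the admissible membership function
# closest to the prior in the cross-entropy sense
best = min(candidates, key=lambda m: D(m, prior))
print(best, D(best, prior))
```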
Conclusion
Based on the credibility measure, a definition of cross-entropy was proposed in this paper to measure the divergence of fuzzy variables from an a priori one, and some of its properties were investigated. Furthermore, a minimum cross-entropy principle was proposed as an important entropy optimization principle.
References
Zadeh, LA: Probability measures of fuzzy events. J. Math. Anal. Appl. 23, 421–427 (1968).
De Luca, A, Termini, S: A definition of nonprobabilistic entropy in the setting of fuzzy sets theory. Inf. Control. 20, 301–312 (1972).
Yager, RR: On measures of fuzziness and negation, part I: membership in the unit interval. Int. J. General Syst. 5, 221–229 (1979).
Kaufmann, A: Introduction to the Theory of Fuzzy Subsets. Academic Press, New York (1975).
Kosko, B: Fuzzy entropy and conditioning. Inf. Sci. 40, 165–174 (1986).
Pal, NR, Pal, SK: Higher order fuzzy entropy and hybrid entropy of a set. Inf. Sci. 61, 211–231 (1992).
Liu, B: A survey of entropy of fuzzy variables. J. Uncertain Syst. 1(1), 4–13 (2007).
Li, P, Liu, B: Entropy of credibility distributions for fuzzy variables. IEEE Trans. Fuzzy Syst. 16(1), 123–129 (2008).
Li, X, Liu, B: Maximum entropy principle for fuzzy variables. Int. J. Uncertainty Fuzziness Knowledge-Based Syst. 15(Supp 2), 40–48 (2007).
Liu, B: Uncertainty Theory. Springer-Verlag, Berlin (2004).
Liu, B, Liu, YK: Expected value of fuzzy variable and fuzzy expected value models. IEEE Trans. Fuzzy Syst. 10(4), 445–450 (2002).
Li, X, Liu, B: A sufficient and necessary condition for credibility measures. Int. J. Uncertainty Fuzziness Knowledge-Based Syst. 14(5), 527–535 (2006).
Acknowledgements
This work was supported by the National Natural Science Foundation of China (No. 71371027) and Program for New Century Excellent Talents in University under Grant No. NCET-13-0649.
Author information
Authors and Affiliations
School of Economics and Management, Beijing University of Chemical Technology, No. 15, Beisanhuan East Road, Beijing, 100029, China
Xiang Li
Corresponding author
Correspondence to Xiang Li.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0), which permits use, duplication, adaptation, distribution, and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
About this article
Cite this article
Li, X. Fuzzy cross-entropy. J. Uncertain. Anal. Appl. 3, 2 (2015). https://doi.org/10.1186/s40467-015-0029-5