Harmonic Measures of Fuzzy Entropy and their Normalization

 

Priti Gupta1, Abhishek Sheoran2*

 1,2 Department of Statistics, M.D. University, Rohtak, Haryana, INDIA.

*Corresponding Author:

[email protected]

Research Article

 


Abstract: In the present communication, we give a brief survey of the various measures of fuzziness investigated so far and discuss some critical aspects of the theory. Keeping in view the non-probabilistic nature of the experiments, two new measures of fuzzy entropy are introduced and their essential properties studied. The existing as well as the newly introduced measures of fuzzy entropy are then normalized.

Keywords: Fuzzy Entropy, Crisp Set, Membership Function, Normalization.

 

Introduction

The main concepts of information theory can be grasped by considering the most widespread means of human communication: language. Two important aspects of a concise language are as follows. First, the most common words (e.g., "a", "the" and "I") should be shorter than less common words (e.g., "benefit", "generation" and "mediocre"), so that sentences will not be too long. Such a tradeoff in word length is analogous to data compression and is the essential aspect of source coding. Second, if part of a sentence is unheard or misheard due to noise (e.g., a passing car), the listener should still be able to glean the meaning of the underlying message. Such robustness is as essential for an electronic communication system as it is for a language; properly building such robustness into communication is the task of channel coding. Source coding and channel coding are the fundamental concerns of information theory, whose fundamental theorem states that it is possible to transmit information over a noisy channel at any rate less than the channel capacity with an arbitrarily small probability of error. Information theory, as considered here, originates with Shannon (1948).
Are probabilistic methods and statistical techniques the best available tools for solving problems involving uncertainty? This question is now often answered in the negative, especially by computer scientists and engineers, who are motivated by the view that probability is inadequate for dealing with certain kinds of uncertainty, so that alternatives are needed to fill the gap. Zadeh (1965) introduced the fuzzy set as a mathematical construct in set theory, with no intention of using it to enhance, complement or replace probability theory. Fuzzy sets play a significant role in many deployed systems because of their capability to model non-statistical imprecision. A fuzzy set is a class of objects with a continuum of grades of membership.
Such a set is characterized by a membership function which assigns to each object a grade of membership ranging between 0 and 1. A fuzzy set A in a universe of discourse X = {x1, x2, ..., xn} is represented as A = {(x, µA(x)) : x ∈ X}, where µA(x) gives the degree of belongingness of the element x to the set A. If µA(x) is 0 or 1 for every element of X, there is no uncertainty about the set and it is said to be a crisp set. The function µA : X → [0, 1], which associates with each x ∈ X a grade of membership in the set A, is known as the membership function. The importance of fuzzy sets comes from the fact that they can deal with imprecise and inexact information.
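A minimal sketch of this representation (the dictionary layout and the sample grades below are illustrative, not taken from the paper):

```python
# A fuzzy set over a finite universe, represented as a dict mapping
# each element to its membership grade mu_A(x) in [0, 1].
A = {"x1": 0.2, "x2": 0.7, "x3": 1.0}

def is_crisp(fuzzy_set):
    """A set is crisp when every membership grade is exactly 0 or 1."""
    return all(m in (0.0, 1.0) for m in fuzzy_set.values())

def complement(fuzzy_set):
    """Complement set: mu(x) -> 1 - mu(x)."""
    return {x: 1.0 - m for x, m in fuzzy_set.items()}

print(is_crisp(A))                        # False
print(is_crisp({"x1": 0.0, "x2": 1.0}))   # True
print(is_crisp(complement({"x1": 1.0})))  # True: the complement of a crisp set is crisp
```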

 

Preliminaries

Fuzzy Information Measures

De Luca and Termini (1972) introduced the concept of a measure of fuzziness in order to obtain a global measure of the indefiniteness connected with situations described by fuzzy sets. Such a measure characterizes the sharpness of the membership function. It can also be regarded as an entropy, in the sense that it measures the uncertainty about the presence or absence of a certain property over the investigated set. They introduced a set of four axioms, which are widely accepted as the criteria for defining any fuzzy entropy. In fuzzy set theory, entropy is a measure of fuzziness which expresses the amount of average ambiguity, or the difficulty in deciding whether an element belongs to a set or not. A measure of fuzziness H(A) of a fuzzy set A should have at least the following properties:

P1 (Sharpness): H(A) is minimum if and only if A is a crisp set, i.e. µA(x) = 0 or 1 for all x ∈ X.

P2 (Maximality): H(A) is maximum if and only if A is the most fuzzy set, i.e. µA(x) = 0.5 for all x ∈ X.

P3 (Resolution): H(A*) ≤ H(A), where A* is a sharpened version of A.

P4 (Symmetry): H(A) = H(A^c), where A^c is the complement set of A.

Definition: A fuzzy set A* is called a sharpened version of the fuzzy set A if the following conditions are satisfied:

µA*(x) ≤ µA(x), if µA(x) ≤ 0.5, for all x ∈ X;

µA*(x) ≥ µA(x), if µA(x) ≥ 0.5, for all x ∈ X.

The above defined properties are natural requirements for a measure of fuzziness. All measures introduced so far satisfy these properties.

Since µA(x) and 1 - µA(x) give the same degree of fuzziness, De Luca and Termini (1972) defined the measure of fuzzy entropy of a fuzzy set A, corresponding to Shannon's (1948) entropy, as

H(A) = -Σ(i=1 to n) [µA(xi) log µA(xi) + (1 - µA(xi)) log(1 - µA(xi))]                (1.1)
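As a concrete check of the axioms, the De Luca-Termini entropy can be computed directly. The following sketch (function and variable names are illustrative) verifies sharpness (P1), maximality (P2) and symmetry (P4) numerically:

```python
import math

def dt_entropy(mu):
    """De Luca-Termini fuzzy entropy (natural-log base).

    mu: list of membership grades in [0, 1].
    """
    h = 0.0
    for m in mu:
        for p in (m, 1.0 - m):
            if 0.0 < p < 1.0:     # 0*log(0) is taken as 0
                h -= p * math.log(p)
    return h

crisp = [0.0, 1.0, 1.0, 0.0]
fuzzy = [0.5, 0.5, 0.5, 0.5]
print(dt_entropy(crisp))                 # 0.0 (P1: crisp sets have zero entropy)
print(round(dt_entropy(fuzzy), 4))       # 2.7726, i.e. 4*ln(2) (P2: maximal at 0.5)
a = [0.2, 0.7, 0.4]
a_comp = [1.0 - m for m in a]
print(math.isclose(dt_entropy(a), dt_entropy(a_comp)))  # True (P4: symmetry)
```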

In the next section, we give a brief survey of the various fuzziness measures investigated so far.

A Survey of Some Existing Measures of Fuzzy Entropy

This section reviews various developments in the area of fuzzy information measures. Kaufmann (1975) proposed a measure using the generalized relative Hamming distance:

f(A) = (2/n) Σ(i=1 to n) |µA(xi) - µC(xi)|                (2.1)

and defined another measure using the generalized relative Euclidean distance:

f(A) = (2/√n) [Σ(i=1 to n) (µA(xi) - µC(xi))^2]^(1/2)                (2.2)

where C is the crisp set nearest to A, i.e. µC(xi) = 0 if µA(xi) ≤ 0.5 and µC(xi) = 1 otherwise.

Ebanks (1983) defined a fuzzy information measure for a fuzzy set as:

[expression (2.3) not recovered]

Kapur (1997) introduced the following measure of fuzzy entropy, which uses the logarithmic scale:

H(α,β)(A) = (1/(β - α)) Σ(i=1 to n) log [ (µA(xi)^α + (1 - µA(xi))^α) / (µA(xi)^β + (1 - µA(xi))^β) ];  α ≥ 1, β ≤ 1, α ≠ β                (2.4)

After that, Parkash and Sharma (2004) introduced two further measures of fuzzy entropy, keeping in view the existing probabilistic measures:

[expression (2.5) not recovered]

[expression (2.6) not recovered]

Some other measures of fuzzy entropy have been discussed, characterized and generalized by various authors. In the next section, we propose some new measures of fuzzy entropy based on the harmonic mean.

 

Measures of Fuzzy Entropy and their Validity

Here, we introduce fuzzy information entropies corresponding to the harmonic mean, as follows:

Fuzzy Entropy Corresponding to Harmonic Mean

Firstly, we propose a new measure of fuzzy information given by the following mathematical expression:

Hh(A) = Σ(i=1 to n) [ai bi / (ai + bi)],  where ai = -µA(xi) log µA(xi) and bi = -(1 - µA(xi)) log(1 - µA(xi)),                (3.1)

so that each element contributes one half of the harmonic mean of its two Shannon terms.

To prove that Hh(A) is a valid measure of fuzzy entropy, we study its essential properties, which are as follows:

  1. Hh(A) is a concave function of µA(xi).
Proof: Differentiating (3.1) twice with respect to µA(xi), we find that the second derivative is negative for 0 < µA(xi) < 1. Hence, Hh(A) is a concave function.
  2. Hh(A) does not change when µA(xi) is replaced by 1 - µA(xi).
  3. Hh(A) is an increasing function of µA(xi) for 0 ≤ µA(xi) ≤ 0.5.
  4. Hh(A) is a decreasing function of µA(xi) for 0.5 ≤ µA(xi) ≤ 1.
  5. Hh(A) = 0 when A is a crisp set.
  6. Hh(A) attains its maximum value when µA(xi) = 0.5 for all i.
Since Hh(A) satisfies all the essential properties of being a measure of fuzzy entropy, it is a valid measure of fuzzy entropy. Values of Hh corresponding to different values of µA(xi) are computed in Table 1:

Table 1

µA(xi)    Hh(A)
0.0       0.0000
0.1       0.0672
0.2       0.1148
0.3       0.1476
0.4       0.1699
0.5       0.1733
0.6       0.1699
0.7       0.1476
0.8       0.1148
0.9       0.0672
1.0       0.0000

The values of Hh(A) are represented graphically in Figure 1, which shows that the measure introduced in equation (3.1) satisfies properties 1 to 6.

Figure 1: Graph of Hh(A) vs. µA(xi)
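The entries of Table 1 can be reproduced numerically. The closed form used below is a reconstruction inferred from the tabulated values: each element contributes ai*bi/(ai+bi), half the harmonic mean of the two Shannon terms ai = -µ log µ and bi = -(1-µ) log(1-µ):

```python
import math

def harmonic_fuzzy_entropy(mu):
    """Harmonic-mean fuzzy entropy: sum over elements of a*b/(a+b),
    where a and b are the two Shannon terms of each membership grade."""
    total = 0.0
    for m in mu:
        if m in (0.0, 1.0):
            continue  # crisp elements contribute nothing
        a = -m * math.log(m)
        b = -(1.0 - m) * math.log(1.0 - m)
        total += a * b / (a + b)
    return total

# Reproduce some Table 1 entries (single-element sets):
for m in (0.1, 0.3, 0.5):
    print(round(harmonic_fuzzy_entropy([m]), 4))
# 0.0672, 0.1476, 0.1733
```

The peak value 0.1733 is (ln 2)/4, i.e. one quarter of the De Luca-Termini maximum per element.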

 

Exponential Fuzzy Entropy Corresponding to Harmonic Mean

We propose here another measure of fuzzy entropy, HE(A), as follows:

[expression (3.2) not recovered]

To prove that the measure HE(A) introduced in equation (3.2) is a valid measure of fuzzy entropy, we study its essential properties, which are as follows:

  1. HE(A) is a concave function of µA(xi).
Proof: Differentiating (3.2) twice with respect to µA(xi), we find that the second derivative is negative for 0 < µA(xi) < 1. Hence, HE(A) is a concave function.
  2. HE(A) does not change when µA(xi) is replaced by 1 - µA(xi).
  3. HE(A) is an increasing function of µA(xi) for 0 ≤ µA(xi) ≤ 0.5.
  4. HE(A) is a decreasing function of µA(xi) for 0.5 ≤ µA(xi) ≤ 1.
  5. HE(A) = 0 when A is a crisp set.
  6. HE(A) attains its maximum value when µA(xi) = 0.5 for all i.
Since HE(A) satisfies all the essential properties of being a measure of fuzzy entropy, it is a valid measure of fuzzy entropy. Values of HE corresponding to different values of µA(xi) are computed in Table 2:

 

Table 2

µA(xi)    HE(A)
0.0       0.000000
0.1       0.159349
0.2       0.285773
0.3       0.378573
0.4       0.435609
0.5       0.454884
0.6       0.435609
0.7       0.378573
0.8       0.285773
0.9       0.159349
1.0       0.000000

 

The values of HE(A) are represented graphically in Figure 2, which shows that the measure introduced in equation (3.2) satisfies properties 1 to 6.

Figure 2: Graph of HE(A) vs. µA(xi)

 

Normalized Fuzzy Information Entropy

Need for Normalizing Fuzzy Information Measures

The measure of fuzzy entropy (1.1) due to De Luca and Termini measures the degree of entropy among the fuzzy values µA(xi): the greater the equality among the fuzzy values, the greater the value of H(A). The entropy attains its maximum value n log 2 when all the fuzzy values are equal, that is, when each µA(xi) = 0.5.

Example: Let us consider two fuzzy distributions A and B with different numbers of elements [membership values not recovered], and compute their fuzzy entropies H(A) and H(B).

We want to check which fuzzy distribution is more uniform, that is, for which distribution the fuzzy values are more nearly equal. From the values of the two fuzzy entropies, it appears that B is more uniform than A. The fallacy arises from the fact that fuzzy entropy depends not only on the degree of equality among the fuzzy values but also on the value of n. So long as n is the same, entropy can be used to compare the uniformity of fuzzy distributions; but if the numbers of outcomes differ, fuzzy entropy is not a satisfactory measure of uniformity. In that case, we eliminate the effect of n by normalizing the fuzzy entropy, that is, by defining a normalized measure of fuzzy entropy as

HN(A) = H(A) / max H(A) = H(A) / (n log 2)

For De Luca and Termini's [1] measure of fuzzy entropy, we have

HN(A) = 0.97095, and

HN(B) = 0.93508.

Obviously, HN(A) > HN(B); thus A is more uniform than B, which is the correct result. Hence, to compare the uniformity, equality or uncertainty of two fuzzy distributions, we should compare their normalized measures of fuzzy entropy.
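The effect of normalization can be illustrated with two hypothetical fuzzy sets of different sizes (the membership values below are assumptions for illustration; they are not the paper's original A and B):

```python
import math

def dt_entropy(mu):
    """De Luca-Termini fuzzy entropy, natural-log base."""
    h = 0.0
    for m in mu:
        for p in (m, 1.0 - m):
            if 0.0 < p < 1.0:
                h -= p * math.log(p)
    return h

def normalized_dt(mu):
    """Normalized entropy: divide by the maximum n*ln(2),
    attained when every membership grade is 0.5."""
    return dt_entropy(mu) / (len(mu) * math.log(2))

A = [0.4, 0.5, 0.6]                # n = 3, grades tightly clustered at 0.5
B = [0.3, 0.5, 0.5, 0.7, 0.9]      # n = 5, grades more spread out

print(dt_entropy(A) < dt_entropy(B))            # True: raw entropy favours the larger set
print(normalized_dt(A) > normalized_dt(B))      # True: normalization reverses the verdict
```

The raw entropies suggest B is more uniform simply because it has more elements; the normalized values correctly rank A as more uniform.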

 

Normalized Measure of Fuzzy Information Entropy

In this section, we propose a normalized measure of fuzzy entropy corresponding to (3.1),

Hh(A) = Σ(i=1 to n) [ai bi / (ai + bi)],  ai = -µA(xi) log µA(xi),  bi = -(1 - µA(xi)) log(1 - µA(xi)).

The maximum value of the above, attained when every µA(xi) = 0.5, is given by

max Hh(A) = n log 2 / 4.

Thus, the expression for the normalized measure is given by

HhN(A) = Hh(A) / (n log 2 / 4) = 4 Hh(A) / (n log 2).
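Assuming the reconstructed closed form of (3.1) (each element contributing ai*bi/(ai+bi) of its Shannon terms, with per-element maximum (ln 2)/4, matching the peak 0.1733 in Table 1), the normalization can be sketched as:

```python
import math

def harmonic_fuzzy_entropy(mu):
    total = 0.0
    for m in mu:
        if m in (0.0, 1.0):
            continue  # crisp elements contribute nothing
        a = -m * math.log(m)
        b = -(1.0 - m) * math.log(1.0 - m)
        total += a * b / (a + b)
    return total

def normalized_harmonic_entropy(mu):
    """Divide by the maximum n*ln(2)/4, attained when every grade is 0.5,
    so that sets of different sizes become comparable."""
    return harmonic_fuzzy_entropy(mu) / (len(mu) * math.log(2) / 4.0)

# The most fuzzy set normalizes to 1 regardless of n:
print(round(normalized_harmonic_entropy([0.5] * 3), 6))   # 1.0
print(round(normalized_harmonic_entropy([0.5] * 10), 6))  # 1.0
```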

Proceeding in a similar way, we can obtain the normalized measure of fuzzy entropy corresponding to (3.2) by dividing HE(A) by its maximum value, attained when every µA(xi) = 0.5 (approximately 0.454884 per element, from Table 2).

On similar lines, we can obtain the maximum values of other fuzzy entropies and consequently develop many other expressions for normalized measures of fuzzy entropy.

 

Conclusion

This work introduces two new measures of fuzzy entropy, namely the fuzzy entropy corresponding to the harmonic mean and the exponential fuzzy entropy corresponding to the harmonic mean, and studies their essential properties. The concept of a normalized measure of fuzzy entropy has also been introduced.

 

 

 

References

  1. De Luca A. and Termini S., “A definition of a Non-probabilistic Entropy in Setting of Fuzzy Sets”, Information and Control, Vol. 20, pp. 301-312, 1972.
  2. Ebanks B.R., “On Measure of fuzziness and their Representations”, Journal of Mathematical Analysis and Applications, Vol. 94, pp. 24-37, 1983.
  3. Kapur J.N., “Measure of Fuzzy Information”, Mathematical Science Trust Society, Vol. 1, 1997.
  4. Kaufmann A., “Introduction to the Theory of Fuzzy Subsets”, Academic Press, Vol. 1, 1975.
  5. Parkash O. and Sharma P.K., "Measures of Fuzzy Entropy and their Relations", International Journal of Management and Systems, Vol. 20, pp. 65-72, 2004.
  6. Shannon C.E., "A Mathematical Theory of Communication", Bell System Technical Journal, Vol. 27, pp. 379-423, 1948.
  7. Zadeh L.A., "Fuzzy Sets", Information and Control, Vol. 8, pp. 338-353, 1965.

 

 

 
 
 
 
 

Copyright Statperson Consultancy, www.statperson.com, 2013. All Rights Reserved.
