International Journal of Scientific & Engineering Research, Volume 4, Issue 2, February 2013

ISSN 2229-5518

Some New Trigonometric, Hyperbolic and Exponential Measures of Intuitionistic Fuzzy Information

Jha P., Mishra Vikas Kumar

Abstract: New trigonometric, hyperbolic and exponential measures of intuitionistic fuzzy entropy and intuitionistic fuzzy directed divergence are obtained, and some particular cases are discussed.

Index Terms: Intuitionistic Fuzzy Set, Intuitionistic Fuzzy Entropy, Intuitionistic Fuzzy Directed Divergence, Measures of Intuitionistic Fuzzy Information.

2000 Mathematics Subject Classification: 94A17 and 94D05

—————————— ——————————
1. Introduction: Uncertainty and fuzziness are basic features of human thinking and of many real-world problems. Fuzziness is found in our decisions, in our language and in the way we process information. The main use of information is to remove uncertainty and fuzziness. In fact, we measure information by the amount of probabilistic uncertainty removed in an experiment, and the measure of uncertainty removed is also called a measure of information, while a measure of fuzziness measures the vagueness and ambiguity of uncertainties. Shannon [3] used "entropy" to measure the degree of uncertainty or randomness in a probability distribution. Let X be a discrete random variable with probability distribution $P = (p_1, p_2, \ldots, p_n)$ in an experiment. The information contained in this experiment is given by

$$H(P) = -\sum_{i=1}^{n} p_i \log p_i \qquad (1)$$

which is the well-known Shannon entropy.

The concept of entropy has been widely used in different areas, e.g. communication theory, statistical mechanics, finance, pattern recognition and neural networks. Fuzzy set theory, developed by Lotfi A. Zadeh [9], has found wide applications in many areas of science and technology, e.g. clustering, image processing and decision making, because of its capability to model non-statistical imprecision or vague concepts.

It may be recalled that a fuzzy subset A of U (the universe of discourse) is characterized by a membership function $\mu_A : U \to [0,1]$ which represents the grade of membership of $x \in U$ in A as follows:

$\mu_A(x) = 0$ if x does not belong to A and there is no uncertainty,
$\mu_A(x) = 1$ if x belongs to A and there is no uncertainty,
$\mu_A(x) = 0.5$ if there is maximum uncertainty.

In fact $\mu_A(x)$ associates with each $x \in U$ a grade of membership in the set A. When $\mu_A(x)$ takes values only in {0, 1} it is the characteristic function of a crisp (i.e. non-fuzzy) set. Since $\mu_A(x)$ and $1 - \mu_A(x)$ give the same degree of fuzziness, corresponding to the entropy due to Shannon [3], De Luca and Termini [2] suggested the following measure of fuzzy entropy:

$$H(A) = -\sum_{i=1}^{n} \big[\mu_A(x_i)\log\mu_A(x_i) + (1 - \mu_A(x_i))\log(1 - \mu_A(x_i))\big] \qquad (2)$$

De Luca and Termini introduced a set of properties, and these properties are widely accepted as a criterion for defining any new fuzzy entropy. In fuzzy set theory, entropy is a measure of fuzziness which expresses the amount of average ambiguity or difficulty in deciding whether an element belongs to a set or not. So, a measure of average fuzziness in a fuzzy set should have at least the following properties to be a valid fuzzy entropy:
i) H(A) = 0 when $\mu_A(x_i) = 0$ or 1.
ii) H(A) increases as $\mu_A(x_i)$ increases from 0 to 0.5.
iii) H(A) decreases as $\mu_A(x_i)$ increases from 0.5 to 1.
iv) $H(\bar{A}) = H(A)$, where $\mu_{\bar{A}}(x_i) = 1 - \mu_A(x_i)$.
v) H(A) is a concave function of $\mu_A(x_i)$.
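To make (1) and (2) concrete, the following short Python sketch (not part of the original paper; the function names are illustrative) evaluates the Shannon entropy of a probability distribution and the De Luca-Termini entropy of a fuzzy set.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (1): H(P) = -sum p_i log p_i, with 0 log 0 taken as 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def de_luca_termini_entropy(mu):
    """De Luca-Termini fuzzy entropy (2) of the membership grades mu_A(x_i)."""
    h = 0.0
    for m in mu:
        if 0 < m < 1:                      # grades 0 and 1 contribute nothing
            h -= m * math.log(m) + (1 - m) * math.log(1 - m)
    return h

# A crisp set has zero fuzzy entropy; grades near 0.5 maximise it.
print(shannon_entropy([0.5, 0.5]))                 # log 2, about 0.693
print(de_luca_termini_entropy([0.0, 1.0, 0.5]))    # only the 0.5 grade contributes
```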


Kullback and Leibler [8] obtained the measure of directed divergence of a probability distribution $P = (p_1, p_2, \ldots, p_n)$ from a probability distribution $Q = (q_1, q_2, \ldots, q_n)$ as

$$D(P:Q) = \sum_{i=1}^{n} p_i \log\frac{p_i}{q_i} \qquad (3)$$

Let A and B be two standard fuzzy sets with the same supporting points $x_1, x_2, \ldots, x_n$ and with fuzzy vectors $\mu_A(x_1), \mu_A(x_2), \ldots, \mu_A(x_n)$ and $\mu_B(x_1), \mu_B(x_2), \ldots, \mu_B(x_n)$. The simplest measure of fuzzy directed divergence, as suggested by Bhandari and Pal (1993) [4], is

$$D(A:B) = \sum_{i=1}^{n} \mu_A(x_i)\log\frac{\mu_A(x_i)}{\mu_B(x_i)} + \sum_{i=1}^{n} (1 - \mu_A(x_i))\log\frac{1 - \mu_A(x_i)}{1 - \mu_B(x_i)} \qquad (4)$$

satisfying the conditions:
i) D(A:B) ≥ 0
ii) D(A:B) = 0 iff A = B
iii) D(A:B) = D(B:A)
iv) D(A:B) is a convex function of $\mu_A(x_i)$.
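As a concrete check on (3) and (4), here is a small Python sketch (illustrative only, with hypothetical function names); it assumes all grades lie strictly between 0 and 1 so that the logarithms are defined.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler directed divergence (3) of P from Q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def fuzzy_divergence(mu_a, mu_b):
    """Bhandari-Pal fuzzy directed divergence (4); grades assumed in (0, 1)."""
    d = 0.0
    for a, b in zip(mu_a, mu_b):
        d += a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))
    return d

print(kl_divergence([0.7, 0.3], [0.5, 0.5]))      # positive, zero only when P = Q
print(fuzzy_divergence([0.2, 0.9], [0.2, 0.9]))   # identical fuzzy sets give 0.0
```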
Later, Kapur [6], [7] introduced a number of trigonometric, hyperbolic and exponential measures of fuzzy entropy and fuzzy directed divergence. In sections 2 and 3 we introduce some new trigonometric, hyperbolic and exponential measures of intuitionistic fuzzy entropy and measures of intuitionistic fuzzy directed divergence. We first define the concept of an intuitionistic fuzzy set and then state the conditions required of intuitionistic fuzzy entropy and intuitionistic fuzzy directed divergence.

The concept of an intuitionistic fuzzy set was first given by K. T. Atanassov (1983) [1] as follows.

Intuitionistic fuzzy set: Let a set E be fixed. An intuitionistic fuzzy set (IFS) A of E is an object of the form $A = \{\langle x, \mu_A(x), \nu_A(x)\rangle : x \in E\}$, where the functions $\mu_A : E \to [0,1]$ and $\nu_A : E \to [0,1]$ define respectively the degree of membership and the degree of non-membership of the element $x \in E$ to the set A, which is a subset of E, and for every $x \in E$, $0 \le \mu_A(x) + \nu_A(x) \le 1$.
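For the numerical illustrations below it is convenient to represent an intuitionistic fuzzy set over a finite support simply as paired lists of membership and non-membership grades; the sketch (not from the paper, names are illustrative) also checks the defining constraint $0 \le \mu_A(x) + \nu_A(x) \le 1$.

```python
def make_ifs(mu, nu):
    """Represent an intuitionistic fuzzy set as (mu, nu) grade lists over x_1..x_n."""
    assert len(mu) == len(nu), "membership and non-membership lists must align"
    for m, v in zip(mu, nu):
        # defining constraint of an IFS at every supporting point
        assert 0.0 <= m <= 1.0 and 0.0 <= v <= 1.0 and m + v <= 1.0
    return list(mu), list(nu)

# Example IFS on three supporting points; 1 - mu - nu is the hesitation margin.
A = make_ifs([0.6, 0.1, 0.5], [0.3, 0.8, 0.5])
print(A)
```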

Conditions for measures of intuitionistic fuzzy entropy: A measure of intuitionistic fuzzy entropy should satisfy the following conditions:
1. It should be defined in the range $0 \le \mu_A(x) + \nu_A(x) \le 1$.
2. It should be continuous in this range.
3. It should be zero when $\mu_A(x) = 0$ and $\nu_A(x) = 1$.
4. It should not change when $\mu_A(x)$ is interchanged with $\nu_A(x)$.
5. It should be an increasing function of $\mu_A(x)$ in the range $0 \le \mu_A(x) \le 0.5$ and a decreasing function of $\nu_A(x)$ in the range $0 \le \nu_A(x) \le 0.5$.
6. It should be a concave function of $\mu_A(x)$.

Conditions for measures of intuitionistic fuzzy directed divergence: Let A and B be two standard intuitionistic fuzzy sets with the same supporting points $x_1, x_2, \ldots, x_n$, with membership values $\mu_A(x_1), \mu_A(x_2), \ldots, \mu_A(x_n)$ and $\mu_B(x_1), \mu_B(x_2), \ldots, \mu_B(x_n)$, and non-membership values $\nu_A(x_1), \nu_A(x_2), \ldots, \nu_A(x_n)$ and $\nu_B(x_1), \nu_B(x_2), \ldots, \nu_B(x_n)$. The measure of intuitionistic fuzzy directed divergence, denoted by D(A:B), should satisfy the following conditions:
i) D(A:B) ≥ 0
ii) D(A:B) = 0 iff A = B
iii) D(A:B) = D(B:A)
iv) D(A:B) is a convex function of $\mu_A(x_i)$.

2. New Measures of Intuitionistic Fuzzy Entropy

2.1 Trigonometric Measures of Intuitionistic Fuzzy Entropy

Consider the function $\sin \pi x$, where $0 \le x \le 1$, which is a concave function; it gives us

$$H_1(A) = \sum_{i=1}^{n} \sin(\pi\mu_A(x_i)) + \sum_{i=1}^{n} \sin(\pi\nu_A(x_i)) \qquad (5)$$

as a new measure of intuitionistic fuzzy entropy. Clearly (5) is defined in the range $0 \le \mu_A(x) + \nu_A(x) \le 1$. It is continuous in this range. It is zero when $\mu_A(x) = 0$ and $\nu_A(x) = 1$. It does not change when $\mu_A(x)$ is interchanged with $\nu_A(x)$. It is an increasing function of $\mu_A(x)$ in the range $0 \le \mu_A(x) \le 0.5$ and a decreasing function of $\nu_A(x)$ in the range $0 \le \nu_A(x) \le 0.5$. Let us


consider $\mu_A(x) = s$ and $\nu_A(x) = c$ (constant), and set $f(s) = \sin\pi s + \sin\pi c$. Differentiating twice with respect to s we get $f'(s) = \pi\cos\pi s$ and $f''(s) = -\pi^2\sin\pi s < 0$ for $0 < s < 1$. Therefore $H_1(A)$ is a concave function of $\mu_A(x)$. So (5) is a valid measure of intuitionistic fuzzy entropy.
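A minimal Python sketch of (5) (not part of the paper; names are illustrative) shows that it vanishes on a crisp intuitionistic fuzzy set and grows as the grades move toward 0.5.

```python
import math

def trig_entropy_h1(mu, nu):
    """Trigonometric intuitionistic fuzzy entropy (5)."""
    return sum(math.sin(math.pi * m) for m in mu) + \
           sum(math.sin(math.pi * v) for v in nu)

print(trig_entropy_h1([0.0, 1.0], [1.0, 0.0]))   # crisp set: 0.0 up to rounding
print(trig_entropy_h1([0.5], [0.5]))             # most fuzzy element: 2.0
```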
More generally, for any $\alpha$ with $0 < \alpha \le \pi$,

$$H_2(A) = \sum_{i=1}^{n} \sin(\alpha\mu_A(x_i)) + \sum_{i=1}^{n} \sin(\alpha\nu_A(x_i)) - \sin\alpha \qquad (6)$$

is also a new measure of intuitionistic fuzzy entropy; (5) is the special case of (6) with $\alpha = \pi$.

Another special case of (6) arises when $\alpha = \pi/2$; we get

$$H_3(A) = \sum_{i=1}^{n} \sin\frac{\pi\mu_A(x_i)}{2} + \sum_{i=1}^{n} \sin\frac{\pi\nu_A(x_i)}{2} - 1 \qquad (7)$$

Another trigonometric measure of intuitionistic fuzzy entropy is

$$H_4(A) = \sum_{i=1}^{n} \sin(\alpha\mu_A(x_i) + a) + \sum_{i=1}^{n} \sin(\alpha\nu_A(x_i) + a) - \sin(a + \alpha) \qquad (8)$$

(8) reduces to (6) when $a = 0$, to (7) when $a = 0$, $\alpha = \pi/2$, and to (5) when $a = 0$, $\alpha = \pi$; it is a two-parameter measure of intuitionistic fuzzy entropy. If we put $a = \pi/2$ in (8) we get

$$H_5(A) = \sum_{i=1}^{n} \cos(\alpha\mu_A(x_i)) + \sum_{i=1}^{n} \cos(\alpha\nu_A(x_i)) - \cos\alpha \qquad (9)$$

The measures of intuitionistic fuzzy entropy given above satisfy all the properties stated in section 1, so they are valid measures of intuitionistic fuzzy entropy.
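The two-parameter family (8) and its special cases can be explored numerically; the sketch below (illustrative, not from the paper) implements (8) and recovers (5), (6), (7) and (9) by choosing the parameters accordingly.

```python
import math

def trig_entropy_h4(mu, nu, alpha, a=0.0):
    """Two-parameter trigonometric measure (8); 0 < alpha <= pi, shift a."""
    total = sum(math.sin(alpha * m + a) for m in mu) + \
            sum(math.sin(alpha * v + a) for v in nu)
    return total - math.sin(a + alpha)

mu, nu = [0.3, 0.6], [0.5, 0.2]
print(trig_entropy_h4(mu, nu, math.pi))             # a = 0, alpha = pi: measure (5)
print(trig_entropy_h4(mu, nu, math.pi / 2))         # a = 0, alpha = pi/2: measure (7)
print(trig_entropy_h4(mu, nu, 2.0, a=math.pi / 2))  # cosine form (9) with alpha = 2
```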

2.2 Hyperbolic Measures of Intuitionistic Fuzzy Entropy

Using the functions $\sinh x$, $\cosh x$ and $\tanh x$ on $0 \le x \le 1$, we obtain the following measures of intuitionistic fuzzy entropy:

$$H_6(A) = \sinh 1 - \sum_{i=1}^{n} \sinh(\mu_A(x_i)) - \sum_{i=1}^{n} \sinh(\nu_A(x_i)) \qquad (10)$$

$$H_7(A) = \cosh 1 - \sum_{i=1}^{n} \cosh(\mu_A(x_i)) - \sum_{i=1}^{n} \cosh(\nu_A(x_i)) \qquad (11)$$

$$H_8(A) = \tanh 1 - \sum_{i=1}^{n} \tanh(\mu_A(x_i)) - \sum_{i=1}^{n} \tanh(\nu_A(x_i)) \qquad (12)$$
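As an illustration of (10), here is a sketch under the reading that the leading constant is $\sinh 1$ (the garbled source leaves it implicit); (11) and (12) follow the same pattern with cosh and tanh.

```python
import math

def sinh_entropy(mu, nu):
    """Hyperbolic intuitionistic fuzzy entropy (10), sinh-based."""
    return math.sinh(1.0) - sum(math.sinh(m) for m in mu) \
                          - sum(math.sinh(v) for v in nu)

print(sinh_entropy([1.0], [0.0]))   # crisp element: 0.0
print(sinh_entropy([0.5], [0.5]))   # most fuzzy element: about 0.133
```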

Since $x^m\sinh x$, $x^m\cosh x$ and $x^m\tanh x$ are also convex functions for $m \ge 1$, we get the following additional measures of intuitionistic fuzzy entropy:

$$H_9(A) = \sinh 1 - \sum_{i=1}^{n} \mu_A^m(x_i)\sinh(\mu_A(x_i)) - \sum_{i=1}^{n} \nu_A^m(x_i)\sinh(\nu_A(x_i)) \qquad (13)$$


$$H_{10}(A) = \cosh 1 - \sum_{i=1}^{n} \mu_A^m(x_i)\cosh(\mu_A(x_i)) - \sum_{i=1}^{n} \nu_A^m(x_i)\cosh(\nu_A(x_i)) \qquad (14)$$

$$H_{11}(A) = \tanh 1 - \sum_{i=1}^{n} \mu_A^m(x_i)\tanh(\mu_A(x_i)) - \sum_{i=1}^{n} \nu_A^m(x_i)\tanh(\nu_A(x_i)) \qquad (15)$$
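The weighted family (13)-(15) can be evaluated with one helper (a sketch, not from the paper, under the same reading of the leading constants; f is any of sinh, cosh or tanh):

```python
import math

def weighted_hyperbolic_entropy(mu, nu, m=1.0, f=math.sinh):
    """Weighted hyperbolic entropies (13)-(15): f(1) minus sum of g^m * f(g)."""
    return f(1.0) - sum((g ** m) * f(g) for g in mu) \
                  - sum((g ** m) * f(g) for g in nu)

# Single supporting point: zero on a crisp element, positive on the most fuzzy one.
print(weighted_hyperbolic_entropy([1.0], [0.0], m=2, f=math.cosh))   # 0.0
print(weighted_hyperbolic_entropy([0.5], [0.5], m=2, f=math.cosh))   # about 0.979
```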

2.3 Exponential Measures of Intuitionistic Fuzzy Entropy

Since $x^m e^{ax}$ is a convex function when $m \ge 1$ and $x > 0$, we get the measure of intuitionistic fuzzy entropy

$$H_{12}(A) = e^{a} - \sum_{i=1}^{n} \mu_A^m(x_i)\, e^{a\mu_A(x_i)} - \sum_{i=1}^{n} \nu_A^m(x_i)\, e^{a\nu_A(x_i)} \qquad (16)$$
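A small sketch of (16) (illustrative; the parameter values are arbitrary) confirms that it vanishes on a crisp element for any $m \ge 1$:

```python
import math

def exp_entropy_h12(mu, nu, m=1.0, a=1.0):
    """Exponential intuitionistic fuzzy entropy (16) with parameters m >= 1 and a."""
    s = sum((g ** m) * math.exp(a * g) for g in mu) + \
        sum((g ** m) * math.exp(a * g) for g in nu)
    return math.exp(a) - s

# Single supporting point: crisp gives 0, the most fuzzy element gives a positive value.
print(exp_entropy_h12([1.0], [0.0], m=2, a=1.0))   # 0.0
print(exp_entropy_h12([0.5], [0.5], m=1, a=1.0))   # about 1.07
```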

3. New Measures of Intuitionistic Fuzzy Directed Divergence

3.1 New Hyperbolic Measures of Intuitionistic Fuzzy Directed Divergence

Using the functions $\sinh x$, $\cosh x$ and $\tanh x$ we get the following hyperbolic measures of intuitionistic fuzzy directed divergence:

$$D_1(A:B) = \sum_{i=1}^{n} \mu_B(x_i)\sinh\frac{\mu_A(x_i)}{\mu_B(x_i)} + \sum_{i=1}^{n} \nu_B(x_i)\sinh\frac{\nu_A(x_i)}{\nu_B(x_i)} - \sum_{i=1}^{n} \mu_B(x_i)\sinh 1 - \sum_{i=1}^{n} \nu_B(x_i)\sinh 1 \qquad (17)$$

Clearly
i) $D_1(A:B) \ge 0$
ii) $D_1(A:B) = 0$ iff A = B
iii) $D_1(A:B) = D_1(B:A)$
iv) $D_1(A:B)$ is a convex function of $\mu_A(x_i)$.
$D_1(A:B)$ satisfies all the properties of intuitionistic fuzzy directed divergence, and therefore $D_1(A:B)$ is a valid measure of intuitionistic fuzzy directed divergence. In the same way we obtain

$$D_2(A:B) = \sum_{i=1}^{n} \mu_B(x_i)\cosh\frac{\mu_A(x_i)}{\mu_B(x_i)} + \sum_{i=1}^{n} \nu_B(x_i)\cosh\frac{\nu_A(x_i)}{\nu_B(x_i)} - \sum_{i=1}^{n} \mu_B(x_i)\cosh 1 - \sum_{i=1}^{n} \nu_B(x_i)\cosh 1 \qquad (18)$$

$$D_3(A:B) = \sum_{i=1}^{n} \mu_B(x_i)\tanh\frac{\mu_A(x_i)}{\mu_B(x_i)} + \sum_{i=1}^{n} \nu_B(x_i)\tanh\frac{\nu_A(x_i)}{\nu_B(x_i)} - \sum_{i=1}^{n} \mu_B(x_i)\tanh 1 - \sum_{i=1}^{n} \nu_B(x_i)\tanh 1 \qquad (19)$$
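A sketch of the sinh-based divergence (17) (illustrative only; it reads the subtracted terms as $\mu_B\sinh 1$ and $\nu_B\sinh 1$, which makes $D_1(A:A) = 0$, and assumes all grades of B are strictly positive so the ratios are defined):

```python
import math

def sinh_divergence(mu_a, nu_a, mu_b, nu_b):
    """Hyperbolic intuitionistic fuzzy directed divergence (17)."""
    d = 0.0
    for ma, mb in zip(mu_a, mu_b):
        d += mb * (math.sinh(ma / mb) - math.sinh(1.0))
    for va, vb in zip(nu_a, nu_b):
        d += vb * (math.sinh(va / vb) - math.sinh(1.0))
    return d

# The divergence of a set from itself is zero.
print(sinh_divergence([0.6], [0.3], [0.6], [0.3]))   # 0.0
print(sinh_divergence([0.9], [0.1], [0.6], [0.3]))   # about 0.32
```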
Again, since $x^m\sinh x$, $x^m\cosh x$ and $x^m\tanh x$ are also convex functions for $m \ge 1$, we get the following more general hyperbolic measures of intuitionistic fuzzy directed divergence:

$$D_4(A:B) = \sum_{i=1}^{n} \mu_A^m(x_i)\mu_B^{1-m}(x_i)\sinh\frac{\mu_A(x_i)}{\mu_B(x_i)} + \sum_{i=1}^{n} \nu_A^m(x_i)\nu_B^{1-m}(x_i)\sinh\frac{\nu_A(x_i)}{\nu_B(x_i)} - \sum_{i=1}^{n} \mu_B(x_i)\sinh 1 - \sum_{i=1}^{n} \nu_B(x_i)\sinh 1 \qquad (20)$$

$$D_5(A:B) = \sum_{i=1}^{n} \mu_A^m(x_i)\mu_B^{1-m}(x_i)\cosh\frac{\mu_A(x_i)}{\mu_B(x_i)} + \sum_{i=1}^{n} \nu_A^m(x_i)\nu_B^{1-m}(x_i)\cosh\frac{\nu_A(x_i)}{\nu_B(x_i)} - \sum_{i=1}^{n} \mu_B(x_i)\cosh 1 - \sum_{i=1}^{n} \nu_B(x_i)\cosh 1 \qquad (21)$$

$$D_6(A:B) = \sum_{i=1}^{n} \mu_A^m(x_i)\mu_B^{1-m}(x_i)\tanh\frac{\mu_A(x_i)}{\mu_B(x_i)} + \sum_{i=1}^{n} \nu_A^m(x_i)\nu_B^{1-m}(x_i)\tanh\frac{\nu_A(x_i)}{\nu_B(x_i)} - \sum_{i=1}^{n} \mu_B(x_i)\tanh 1 - \sum_{i=1}^{n} \nu_B(x_i)\tanh 1 \qquad (22)$$
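The generalized family (20)-(22) differs from (17)-(19) only in the weight $\mu_A^m\mu_B^{1-m}$ (respectively $\nu_A^m\nu_B^{1-m}$) in front of each hyperbolic term; a compact sketch (illustrative, one helper for all three, same reading of the subtracted terms as above) is:

```python
import math

def general_hyperbolic_divergence(mu_a, nu_a, mu_b, nu_b, m=1.0, f=math.sinh):
    """Generalized hyperbolic divergences (20)-(22); f is sinh, cosh or tanh."""
    d = 0.0
    for ma, mb in zip(mu_a, mu_b):
        d += (ma ** m) * (mb ** (1.0 - m)) * f(ma / mb) - mb * f(1.0)
    for va, vb in zip(nu_a, nu_b):
        d += (va ** m) * (vb ** (1.0 - m)) * f(va / vb) - vb * f(1.0)
    return d

# Setting m = 0 gives back the weights of (17)-(19); the text assumes m >= 1.
print(general_hyperbolic_divergence([0.9], [0.1], [0.6], [0.3], m=2, f=math.cosh))
```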


3.2 New Exponential Measures of Intuitionistic Fuzzy Directed Divergence

Since $x^m e^{ax}$ is a convex function when $m \ge 1$ and $x > 0$, we get the following measures of intuitionistic fuzzy directed divergence:

$$D_7(A:B) = \sum_{i=1}^{n} \mu_A^m(x_i)\mu_B^{1-m}(x_i)\, e^{a\mu_A(x_i)/\mu_B(x_i)} + \sum_{i=1}^{n} \nu_A^m(x_i)\nu_B^{1-m}(x_i)\, e^{a\nu_A(x_i)/\nu_B(x_i)} - e^{a} \qquad (23)$$

Special cases for m = 0 and m = 1 are

$$D_8(A:B) = \sum_{i=1}^{n} \mu_B(x_i)\, e^{a\mu_A(x_i)/\mu_B(x_i)} + \sum_{i=1}^{n} \nu_B(x_i)\, e^{a\nu_A(x_i)/\nu_B(x_i)} - e^{a} \qquad (24)$$

$$D_9(A:B) = \sum_{i=1}^{n} \mu_A(x_i)\, e^{a\mu_A(x_i)/\mu_B(x_i)} + \sum_{i=1}^{n} \nu_A(x_i)\, e^{a\nu_A(x_i)/\nu_B(x_i)} - e^{a} \qquad (25)$$
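A sketch of the exponential divergence family (23)-(25) (illustrative; the same caveat about strictly positive grades of B applies):

```python
import math

def exp_divergence(mu_a, nu_a, mu_b, nu_b, m=1.0, a=1.0):
    """Exponential divergence (23); m = 0 and m = 1 give the special cases (24), (25)."""
    d = -math.exp(a)
    for ma, mb in zip(mu_a, mu_b):
        d += (ma ** m) * (mb ** (1.0 - m)) * math.exp(a * ma / mb)
    for va, vb in zip(nu_a, nu_b):
        d += (va ** m) * (vb ** (1.0 - m)) * math.exp(a * va / vb)
    return d

# m = 1 is (25); a single supporting point with A = B and mu + nu = 1 gives 0.
print(exp_divergence([0.6], [0.4], [0.6], [0.4], m=1, a=1.0))   # 0.0
```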

4. Conclusion

In sections 2 and 3, by using the convexity or concavity of some trigonometric, hyperbolic and exponential functions and by verifying the conditions for intuitionistic fuzzy entropy and intuitionistic fuzzy directed divergence, we have obtained some new trigonometric, hyperbolic and exponential measures of intuitionistic fuzzy entropy and intuitionistic fuzzy directed divergence.

5. References

[1] K. T. Atanassov, Intuitionistic Fuzzy Sets, VII ITKR's Session, Sofia, June 1983, Deposed in Central Sci.-Techn. Library of Bulg. Acad. of Sci., 1697/84.
[2] A. De Luca and S. Termini, A Definition of a Non-probabilistic Entropy in the Setting of Fuzzy Sets Theory, Information and Control, 20, 301-312, 1972.
[3] C. E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, 27, 379-423, 623-659, 1948.
[4] D. Bhandari and N. R. Pal, Some New Information Measures for Fuzzy Sets, Information Sciences, 67, 204-228, 1993.
[5] F. M. Reza, An Introduction to Information Theory, McGraw-Hill, New York.
[6] J. N. Kapur, Some New Measures of Directed Divergence, Wiley Eastern Limited.
[7] J. N. Kapur, Trigonometrical, Hyperbolic and Exponential Measures of Fuzzy Entropy and Fuzzy Directed Divergence, MSTS.
[8] S. Kullback and R. A. Leibler, On Information and Sufficiency, Annals of Mathematical Statistics, 22, 79-86, 1951.
[9] L. A. Zadeh, Fuzzy Sets, Information and Control, 8, 338-353, 1965.

Jha P., Department of Mathematics, Govt. Chhattisgarh P.G. College, Raipur, Chhattisgarh (India). Email: purush.jha@gmail.com
Mishra Vikas Kumar, Department of Mathematics, Rungta College of Engineering and Technology, Raipur, Chhattisgarh (India). Email: vikas_mishravicky@yahoo.com
