International Journal of Scientific & Engineering Research, Volume 1, Issue 3, December-2010

ISSN 2229-5518

General Pseudoadditivity of Kapur's Entropy Prescribed by the Existence of Equilibrium

Priti Gupta, Vijay Kumar

Abstract—In this paper, the composability of the total channel capacity of a system of channels composed of two independent subsystems of channels is discussed. The capacity of this system of channels is a function of the entropies of the subsystems. Here, we show that the generalized entropies, namely the Tsallis entropy, the normalized Tsallis entropy and the Kapur entropy, are compatible with the composition law, are defined consistently with the condition of equilibrium, and satisfy pseudoadditivity.

Key words— Entropy, Kapur's Entropy, Pseudo-additivity

1 INTRODUCTION

In the process of transmitting signals, a communication system has to deal with the amount of information, the capacity of the communication channel and the effects of noise. Communication is a continuous process, and a channel is the 'pipe' along which information is conveyed. A wide variety of communication channels is available, from basic face-to-face conversation, through telecommunication channels like the telephone or e-mail, to computational channels like the medical record. Channels have attributes like capacity and noise, which determine their suitability for different tasks. When two parties exchange information across a channel at the same time, this is known as synchronous communication.

For example: Telephones are one of the commonest two-way synchronous channels. It is the nature of synchronous communication that it is interruptive, and these interruptions may have a negative impact on individuals with high cognitive loads.

On the contrary, when individuals can be separated in time, they may use an asynchronous channel to support their interaction. Since there can be no simultaneous discussion, conversations occur through a series of message exchanges. This can range from Post-it notes left on a colleague's desk to sophisticated electronic information systems. One of the benefits of asynchronous communication is that it is not inherently interruptive, and if a communication is not urgent, asynchronous channels may be a preferred way of communicating with otherwise busy individuals.
A communication system is thus a bundle of different components, and the utility of the overall system is determined by the appropriateness of all the components together. If even one element of the system bundle is inappropriate to the setting, the communication system can underperform. A communication system provides only a channel for mutual information exchange which is not a priori dedicated to certain categories of information only.

Dr. Priti Gupta is a professor in the Department of Statistics, Maharishi Dayanand University, Rohtak.
The channels can be modeled physically by trying to calculate the physical processes which modify the transmission of signals.

For example: In wireless communications the channel is often modeled by a random attenuation (known as fading) of the transmitted signal, followed by additive noise. Alternatively, it can be modeled by calculating the reflection off every object in the environment.

The attenuation term is a simplification of the underlying physical processes and captures the change in signal power over the course of the transmission. The noise in the model captures external interference and/or electronic noise in the receiver. If the attenuation term is complex-valued, it also describes the relative time a signal takes to get through the channel. The statistics of the random attenuation are determined by previous measurements or physical simulations.
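The attenuation-plus-noise description above can be sketched as a small simulation. The Rayleigh fade and the Gaussian noise parameters below are illustrative assumptions, not values taken from the text.

```python
import math
import random

def fading_channel(x, fade_scale=1.0, noise_std=0.1, rng=random):
    """Toy channel model: received = attenuation * transmitted + additive noise.

    The attenuation is drawn as a Rayleigh-distributed fade (a common
    simplification) and the noise is Gaussian; both are illustrative choices.
    """
    # Rayleigh fade magnitude from two independent Gaussian components.
    attenuation = math.sqrt(rng.gauss(0, fade_scale) ** 2
                            + rng.gauss(0, fade_scale) ** 2)
    noise = rng.gauss(0, noise_std)   # receiver/electronic noise
    return attenuation * x + noise

random.seed(0)
received = [fading_channel(1.0) for _ in range(5)]
print(received)
```

Each call returns one received sample; the statistics of the random attenuation would, as noted above, be estimated by averaging many such samples.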
Channel models may be continuous, in the sense that there is no limit to how precisely their values may be defined.
As of today, it is possible to assign multiple custom channels to a single ad unit. This feature enables us to track ad performance with greater flexibility and to view more granular information. When generating ad code, we can add up to five custom channels to a specific instance of ad code. Multiple channels can be very useful when we want to track one ad unit across several different metrics simultaneously.

For example: Suppose we run a sports website and place a leader board at the top and bottom of every page. To track the performance of each ad placement, we create two custom channels, 'Top Leader Board' and 'Bottom Leader Board', and regenerate our ad code appropriately.

Mr. Vijay Kumar is Senior Lecturer in Mathematics, Delhi College of Technology & Management, Palwal.

IJSER © 2010 http://www.ijser.org

Also, if we want to compare our football pages and our baseball pages at the same time, with multiple custom channels this is not a problem. Just create two new custom channels called 'Football Pages' and 'Baseball Pages' and add them to the appropriate ad units. Now each leader board will be tagged with two custom channels that let us know the position it is in (top or bottom) and the type of page on which it appears (football or baseball). Moreover, ad units tagged with multiple custom channels will log impressions or clicks in each channel. As a result, we will see a larger number of impressions and clicks when we view channel reports than aggregate reports.

Generally, you can hear static as a hiss on a radio or see it as snow on a television screen. Shannon realized that the noise added to one digital pulse would generally make its overall amplitude different from that of another, otherwise identical, pulse. Further, the noise amplitudes for two pulses are independent.

In the present work, we discuss the additivity (noise) of subsystems of channels; here, we consider two subsystems of a channel.

1.1 KAPUR'S ENTROPY

Another characterization of the pseudo-additivity of sub-systems A and B, obtained for a partition of the probability space into two segments, one for low-valued probabilities and the other for high-valued probabilities, is:

$$H_a^{(K)}(p_1, p_2, \dots, p_w) = H_a^{(K)}(p_l, p_m) + (p_l)^a\, H_a^{(K)}\!\left(\frac{p_1}{p_l}, \frac{p_2}{p_l}, \dots, \frac{p_{w_l}}{p_l}\right) + (p_m)^a\, H_a^{(K)}\!\left(\frac{p_{w_l+1}}{p_m}, \frac{p_{w_l+2}}{p_m}, \dots, \frac{p_w}{p_m}\right)$$

where $p_l = \sum_{i=1}^{w_l} p_i$ and $p_m = \sum_{i=w_l+1}^{w} p_i$ with $w_l + w_m = w$, and the sets $\{p_i/p_l\}$ and $\{p_i/p_m\}$ are conditional probabilities.

Also, $\left(p_i/p_l\right)^a > p_i/p_l$ iff $a < 1$ and $\left(p_i/p_l\right)^a < p_i/p_l$ iff $a > 1$, and similarly for $p_i/p_m$.

Here, $H_a^{(K)}(p_1, p_2, \dots, p_w)$ is of the form of a homogeneous function.

The additive requirement for the system (channel) puts constraints on the symmetry of the system under consideration:

it is indivisibly connected with the homogeneity of the system, which is a familiar additional assumption in ordinary circumstances. Channels exhibit complex systems and processes and attract great attention. A common feature of these systems is that they stay in non-equilibrium stationary states for significantly long periods. In these systems, channels are generally inhomogeneous.

In this paper, we discuss a general composition law for the system which satisfies pseudoadditivity. Three generalized entropies, the Tsallis entropy, the normalized Tsallis entropy and the Kapur entropy, are shown to be consistent with the equilibrium condition. Abe [9] discusses the thermodynamic system, but here we take the system of channels with the same entropies. From a thermodynamic viewpoint, the heat disturbs the signal, making it noisy.

2. COMPOSABILITY OF VARIOUS ENTROPIES PRESCRIBED BY THE EXISTENCE OF EQUILIBRIUM

Let $X(A, B)$ be the total channel capacity of the system of channels, divided into two independent subsystems of channels, $A$ and $B$. Let $X(A, B)$ satisfy additivity:

$$X(A, B) = X(A) + X(B)$$

The joint probability of the total system of channels is

$$P_{ij}(A, B) = P_i(A)\, P_j(B) \tag{2.1}$$

where $P_i(A)$ is the probability of finding subsystem $A$ in its $i$-th state and $P_j(B)$ is the probability of finding subsystem $B$ in its $j$-th state.
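Eq. (2.1) states that independent subsystems combine by simple multiplication of probabilities. A minimal sketch, with assumed subsystem distributions:

```python
def joint_distribution(p_A, p_B):
    """Joint probabilities P_ij(A,B) = P_i(A) * P_j(B) for independent subsystems."""
    return [[pa * pb for pb in p_B] for pa in p_A]

# Illustrative subsystem distributions (assumed, not from the paper).
p_A = [0.5, 0.3, 0.2]
p_B = [0.6, 0.4]
P = joint_distribution(p_A, p_B)

# A valid joint distribution sums to 1 over all (i, j) pairs.
total = sum(sum(row) for row in P)
print(round(total, 12))
```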

The entropy of the total system, $H(A, B) \equiv H[p(A, B)]$, is expressed in terms of the entropies of the subsystems, $H(A)$ and $H(B)$, i.e.,

$$H(A, B) = f[H(A), H(B)]$$

where $f$ is a symmetrical bivariate function:

$$f(x, y) = f(y, x) \tag{2.2}$$

The given function is a composite function.

The Boltzmann-Gibbs-Shannon entropy [3] is

$$H(p) = -\sum_{i=1}^{N} p_i \ln p_i \tag{2.3}$$

where $N$ is the total number of states at a given scale. For independent subsystems $A$ and $B$, it gives additivity:

$$H(A, B) = H(A) + H(B) \tag{2.4}$$

From [8], the generalized entropies instead satisfy

$$H_a(A, B) = H_a(A) + H_a(B) + \lambda(a)\, H_a(A)\, H_a(B)$$

where $\lambda(a)$ is a function of the entropic index.

Rényi's entropy [1] is

$$H_a^{(R)}(p) = \frac{1}{1-a} \ln \sum_{i=1}^{N} (p_i)^a, \qquad (a > 0) \tag{2.5}$$

and Tsallis' entropy [4] is

$$H_a^{(T)}(p) = \frac{1}{1-a} \left[ \sum_{i=1}^{N} (p_i)^a - 1 \right], \qquad (a > 0) \tag{2.6}$$

$H_a^{(R)}(p)$ and $H_a^{(T)}(p)$ are composite functions, and $H_a^{(T)}(p)$ satisfies pseudoadditivity as:

$$H_a^{(T)}(A, B) = H_a^{(T)}(A) + H_a^{(T)}(B) + (1-a)\, H_a^{(T)}(A)\, H_a^{(T)}(B) \tag{2.7}$$

The normalized Tsallis entropy is

$$H_a^{(NT)}(p) = \frac{H_a^{(T)}(p)}{\sum_{i=1}^{N} (p_i)^a} = \frac{1}{1-a} \left[ 1 - \frac{1}{\sum_{i=1}^{N} (p_i)^a} \right] \tag{2.8}$$

and it satisfies pseudoadditivity as

$$H_a^{(NT)}(A, B) = H_a^{(NT)}(A) + H_a^{(NT)}(B) + (a-1)\, H_a^{(NT)}(A)\, H_a^{(NT)}(B) \tag{2.9}$$

Here, we notice two important points:

(i) As $a \to 1$, Rényi's entropy, Tsallis' entropy and the normalized Tsallis entropy all converge to the Boltzmann-Gibbs-Shannon entropy.

(ii) For $a \in (0, 1)$, Rényi's entropy and the normalized Tsallis entropy are concave only, while for $a > 0$ Tsallis' entropy is always concave.

The generalized entropy of Kapur of order $a$ and type $\beta$ [7] is

$$H_{a,\beta}^{(K)}(p) = \frac{1}{\beta - a} \log_2 \frac{\sum_{i=1}^{N} (p_i)^a}{\sum_{i=1}^{N} (p_i)^{\beta}}, \qquad (a \ne \beta;\ a, \beta > 0) \tag{2.10}$$

In the limiting case, when $a \to 1$ and $\beta = 1$, $H_{a,\beta}^{(K)}(p)$ reduces to $H(p)$, and when $\beta = 1$, $H_{a,\beta}^{(K)}(p)$ reduces to $H_a^{(R)}(p)$.

Also, $H_{a,\beta}^{(K)}(p)$ is a composite function which satisfies pseudoadditivity as:

$$H_{a,\beta}^{(K)}(A, B) = H_{a,\beta}^{(K)}(A) + H_{a,\beta}^{(K)}(B) + \lambda(a, \beta)\, H_{a,\beta}^{(K)}(A)\, H_{a,\beta}^{(K)}(B) \tag{2.10a}$$
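Eqs. (2.5)-(2.9) can be checked numerically for independent subsystems. The sketch below verifies, on two small assumed distributions and an illustrative entropic index, that Rényi's entropy is additive while the Tsallis and normalized Tsallis entropies satisfy the quoted pseudoadditivity coefficients.

```python
import math

def renyi(p, a):
    """Rényi entropy (2.5): (1/(1-a)) * ln(sum p_i^a)."""
    return math.log(sum(x ** a for x in p)) / (1 - a)

def tsallis(p, a):
    """Tsallis entropy (2.6): (1/(1-a)) * (sum p_i^a - 1)."""
    return (sum(x ** a for x in p) - 1) / (1 - a)

def norm_tsallis(p, a):
    """Normalized Tsallis entropy (2.8): Tsallis divided by sum p_i^a."""
    return tsallis(p, a) / sum(x ** a for x in p)

def product(p, q):
    """Joint distribution of two independent subsystems, Eq. (2.1)."""
    return [x * y for x in p for y in q]

a = 0.7                      # illustrative entropic index
A = [0.5, 0.3, 0.2]          # assumed subsystem distributions
B = [0.6, 0.4]
AB = product(A, B)

# Rényi: additive, i.e. lambda(a) = 0.
assert math.isclose(renyi(AB, a), renyi(A, a) + renyi(B, a))

# Tsallis: pseudoadditive with lambda(a) = 1 - a, Eq. (2.7).
lhs = tsallis(AB, a)
rhs = tsallis(A, a) + tsallis(B, a) + (1 - a) * tsallis(A, a) * tsallis(B, a)
assert math.isclose(lhs, rhs)

# Normalized Tsallis: pseudoadditive with lambda(a) = a - 1, Eq. (2.9).
lhs = norm_tsallis(AB, a)
rhs = (norm_tsallis(A, a) + norm_tsallis(B, a)
       + (a - 1) * norm_tsallis(A, a) * norm_tsallis(B, a))
assert math.isclose(lhs, rhs)
print("pseudoadditivity checks passed")
```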
The above entropies share a common feature, "sum plus product":

$$f(x, y) = x + y + \lambda(a)\, xy$$

where $\lambda(a)$ is the frequency of the channel. For Rényi's entropy and the Boltzmann-Gibbs-Shannon entropy, $\lambda(a) = 0$; for Tsallis' entropy, $\lambda(a) = 1 - a$; for the normalized Tsallis entropy, $\lambda(a) = a - 1$; and, assuming, for Kapur's entropy,

$$\lambda(a, \beta) = (1 - a)(1 - \beta)$$

The additivity of the channel capacity is not obvious; this makes the channel highly non-trivial for non-extensive systems with long-range interactions.

The stationary state of the total system may be defined by the maximization of the total entropy with fixed $X(A, B)$, i.e.,

$$\delta H(A, B) = \delta f[H(A), H(B)] = 0 \tag{2.11}$$

$$\delta X(A) = -\delta X(B) \tag{2.12}$$

It is obvious that

$$\frac{\partial f[H(A), H(B)]}{\partial H(A)}\, \frac{dH(A)}{dX(A)} = \frac{\partial f[H(A), H(B)]}{\partial H(B)}\, \frac{dH(B)}{dX(B)} \tag{2.13}$$

In the equilibrium state,

$$\lambda(A) = \lambda(B) \tag{2.14}$$

where $\lambda$ is the frequency if $X$ is the channel capacity of the system.

Separation of variables is possible only if the derivatives of the composite function satisfy

$$\frac{\partial f[H(A), H(B)]}{\partial H(A)} = k(H(A), H(B))\, g(H(A))\, h(H(B)) \tag{2.15}$$

$$\frac{\partial f[H(A), H(B)]}{\partial H(B)} = k(H(A), H(B))\, h(H(A))\, g(H(B)) \tag{2.16}$$

where $k$, $g$ and $h$ are some functions; $k$ is a symmetrical function due to (2.2). In order to establish (2.14), note that in most physical systems $k$ is independent of the entropies of the subsystems, and the possibility of $k$ being entropy dependent is, in general, limited. Henceforth, $k$ will be taken to be unity. Equations (2.15) and (2.16) become:

$$\frac{\partial f[H(A), H(B)]}{\partial H(A)} = g(H(A))\, h(H(B)) \tag{2.17}$$

$$\frac{\partial f[H(A), H(B)]}{\partial H(B)} = h(H(A))\, g(H(B)) \tag{2.18}$$

From these equations it follows that

$$\frac{\partial^2 f[H(A), H(B)]}{\partial H(A)\, \partial H(B)} = \frac{\partial^2 f[H(A), H(B)]}{\partial H(B)\, \partial H(A)} \tag{2.19}$$

Differentiating (2.17) with respect to $H(B)$ and (2.18) with respect to $H(A)$ and substituting in (2.19), we get

$$\frac{1}{g(H(A))}\, \frac{dh(H(A))}{dH(A)} = \frac{1}{g(H(B))}\, \frac{dh(H(B))}{dH(B)} = \lambda \tag{2.20}$$

where $\lambda$ is a frequency and is known as the separation constant.

Putting the values of $g(H(A))$ and $g(H(B))$ in (2.17) and (2.18), we get

$$\frac{\partial f_\lambda[H(A), H(B)]}{\partial H(A)} = \frac{1}{\lambda}\, \frac{dh_\lambda(H(A))}{dH(A)}\, h_\lambda(H(B)) \tag{2.21}$$

$$\frac{\partial f_\lambda[H(A), H(B)]}{\partial H(B)} = \frac{1}{\lambda}\, \frac{dh_\lambda(H(B))}{dH(B)}\, h_\lambda(H(A)) \tag{2.22}$$

Integrating (2.21) with respect to $H(A)$ and (2.22) with respect to $H(B)$, we get

$$f_\lambda[H(A), H(B)] = \frac{1}{\lambda}\, h_\lambda(H(A))\, h_\lambda(H(B)) + c \tag{2.23}$$

where $c$ is the constant of integration. In order to get a convergent result in the limit $\lambda \to 0$, we set, without loss of generality, $c = -1/\lambda$ and simultaneously impose the condition

$$\lim_{\lambda \to 0} h_\lambda(H) = 1, \qquad \forall H \tag{2.24}$$

Thus,

$$f_\lambda[H(A), H(B)] = \frac{1}{\lambda} \left[ h_\lambda(H(A))\, h_\lambda(H(B)) - 1 \right] \tag{2.25}$$

Eq. (2.25) is the form of the composite function prescribed by the existence of equilibrium.

In order to find $h_\lambda$, let us suppose that both $A$ and $B$ are in the completely ordered state, i.e.,

$$H(A, B) = 0, \qquad h_\lambda(0) = 1$$

Again, let us suppose that only $B$ is in the completely ordered state, i.e.,

$$H(A, B) = H(A) \tag{2.26}$$

By (2.25), using $h_\lambda(0) = 1$, we get

$$H(A) = \frac{1}{\lambda} \left[ h_\lambda(H(A)) - 1 \right] \tag{2.27}$$

or

$$h_\lambda(H) = \lambda H + 1 \tag{2.28}$$

Putting (2.28) in (2.25), we get

$$H(A, B) = f[H(A), H(B)] = H(A) + H(B) + \lambda\, H(A)\, H(B) \tag{2.29}$$

Eq. (2.29) has the same form as Eq. (2.10a). This shows the validity of the composition law of the system for Kapur's entropy, which satisfies pseudoadditivity.

3. CONCLUSION

Here, we have derived the most general pseudoadditivity rule for a composite entropy based only on the existence of equilibrium. We have shown how the "sum plus product" pseudoadditivity of the three generalized entropies can be obtained as the simplest case. In the present work, the composability of entropy has been taken as a basic premise. This concept puts a stringent constraint on the possible forms of entropies: in [10], the entropy arising from the idea of quantum groups is not composable. It does not seem possible to define equilibrium states in the standard manner if the entropy does not satisfy composability.

However, it is worth noting that composability assumes divisibility of the total system into independent subsystems. Actually, the reliability of this independence also puts stringent constraints on the physical nature of the systems.


For example: If the total system contains long-range interactions between its microscopic components, the independence of the subsystems may hardly be reliable in general. In such a situation, a thorough generalization of the standard formalism may be required.
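The closed form $h_\lambda(H) = \lambda H + 1$ obtained in (2.28) can be substituted into (2.25) and checked numerically. The entropy values and $\lambda$ below are arbitrary illustrative numbers, not quantities from the paper.

```python
def h(H, lam):
    """Solution (2.28) of the separation condition: h_lambda(H) = lam*H + 1."""
    return lam * H + 1

def f(HA, HB, lam):
    """Composite function (2.25): f = (1/lam) * (h(HA) * h(HB) - 1)."""
    return (h(HA, lam) * h(HB, lam) - 1) / lam

lam, HA, HB = 0.4, 1.3, 0.8   # arbitrary illustrative values

# (2.29): expanding (2.25) gives the "sum plus product" composition law.
assert abs(f(HA, HB, lam) - (HA + HB + lam * HA * HB)) < 1e-12

# Boundary checks: a completely ordered B recovers H(A); both ordered give 0.
assert abs(f(HA, 0.0, lam) - HA) < 1e-12
assert f(0.0, 0.0, lam) == 0.0
print("composition law verified")
```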

REFERENCES

[1] A. Rényi, 1961, On measures of entropy and information, Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, pp. 547-561.

[2] B. Lesche, 1982, Journal of Statistical Physics, 27, 419.

[3] C. E. Shannon, 1948, A Mathematical Theory of Communication, Bell System Technical Journal, 27, pp. 379-423.

[4] C. Tsallis, Nonextensive Statistical Mechanics and Thermodynamics: Historical Background and Present Status. In: S. Abe, Y. Okamoto (eds.), Nonextensive Statistical Mechanics and Its Applications, Springer-Verlag, Heidelberg, 2001.

[5] C. Tsallis, 1988, Possible generalization of Boltzmann-Gibbs statistics, Journal of Statistical Physics, 52, pp. 479-487.

[6] J. N. Kapur, H. K. Kesavan, Entropy Optimization Principles with Applications, Academic Press, Boston, 1994.

[7] J. N. Kapur, 1967, Generalized entropy of order $a$ and type $\beta$, The Mathematical Seminar, 4, pp. 78-84.

[8] P. T. Landsberg, V. Vedral, 1998, Physics Letters A, 247, 211.

[9] S. Abe, 2001, General pseudoadditivity of composable entropy prescribed by the existence of equilibrium, Physical Review E, 63, 061105.

[10] S. Abe, 1997, Physics Letters A, 224, 326.
