α-mutual information
2015 Edition, February 1, 2015 - IEEE - Institute of Electrical and Electronics Engineers, Inc.

Rényi entropy and Rényi divergence have a long track record of usefulness in information theory and its applications. Alfréd Rényi himself never got around to generalizing mutual information in a similar way. In fact, the literature offers several possible ways to accomplish...
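One well-known candidate among these generalizations is Sibson's proposal. As a hedged sketch (the function names and the BSC example are illustrative, not taken from the paper), Sibson's α-mutual information can be computed numerically and checked to approach Shannon's mutual information as α → 1:

```python
import numpy as np

def sibson_mi(P_X, W, alpha, base=2.0):
    """Sibson's alpha-mutual information I_alpha(X;Y) for a discrete
    channel with rows W[x] = P(Y|X=x) and input distribution P_X:
    I_alpha = alpha/(alpha-1) * log sum_y ( sum_x P(x) W(y|x)**alpha )**(1/alpha)
    """
    inner = (P_X[:, None] * W**alpha).sum(axis=0)      # one term per output y
    s = (inner ** (1.0 / alpha)).sum()
    return alpha / (alpha - 1.0) * np.log(s) / np.log(base)

def shannon_mi(P_X, W, base=2.0):
    """Shannon mutual information I(X;Y) in the given log base."""
    P_XY = P_X[:, None] * W
    P_Y = P_XY.sum(axis=0)
    mask = P_XY > 0
    ratio = P_XY[mask] / (P_X[:, None] * P_Y[None, :])[mask]
    return (P_XY[mask] * np.log(ratio)).sum() / np.log(base)

# Binary symmetric channel with crossover 0.1, uniform input
P_X = np.array([0.5, 0.5])
W = np.array([[0.9, 0.1], [0.1, 0.9]])
print(shannon_mi(P_X, W))         # ≈ 0.531 bits, i.e. 1 - h(0.1)
print(sibson_mi(P_X, W, 1.0001))  # approaches the Shannon value as alpha -> 1
```

For this symmetric example one can also check that the value is nondecreasing in α, a standard property of Sibson's measure.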

Convexity/concavity of Rényi entropy and α-mutual information
2015 Edition, June 1, 2015 - IEEE - Institute of Electrical and Electronics Engineers, Inc.

Entropy is well known to be Schur concave on finite alphabets. Recently, the authors strengthened this result by showing that for any pair of probability distributions P and Q with Q majorized by P, the entropy of Q exceeds the entropy of P by at least the relative entropy D(P||Q). This...
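A quick numerical check of the stated inequality, with an illustrative pair of distributions (Q majorized by P; the distributions are chosen for illustration, not taken from the paper), might look like:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def D(p, q):
    """Relative entropy D(P||Q) in bits (assumes supp(P) within supp(Q))."""
    mask = p > 0
    return (p[mask] * np.log2(p[mask] / q[mask])).sum()

# Q is majorized by P: with both sorted in decreasing order,
# every partial sum of Q is at most the corresponding partial sum of P.
P = np.array([0.7, 0.2, 0.1])
Q = np.array([0.5, 0.3, 0.2])
assert all(Q[:k].sum() <= P[:k].sum() + 1e-12 for k in range(1, 4))

print(H(Q) - H(P) >= D(P, Q))  # True: the entropy gap dominates D(P||Q)
```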

New approach to eliminate structural redundancy in case resource pools using α-mutual information
2013 Edition, Volume 24, August 1, 2013 - China Aerospace Science and Industry Corporation (CASIC)

Structural redundancy elimination in case resource pools (CRP) is critical for avoiding performance bottlenecks and maintaining robust decision capabilities in cloud computing services. For these purposes, this paper proposes a novel approach to ensure redundancy elimination of a reasoning system...

Computation of Csiszár’s mutual information of order α
2008 Edition, July 1, 2008 - IEEE - Institute of Electrical and Electronics Engineers, Inc.

Csiszár introduced the mutual information of order α in [1] as a parameterized version of Shannon's mutual information. It involves a minimization over the probability space and cannot be computed in closed form. An alternating minimization algorithm for its...
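Csiszár's quantity minimizes an expected Rényi divergence over output distributions Q. As a hedged sketch (a brute-force grid search standing in for the paper's alternating-minimization algorithm, and a binary-output channel chosen for illustration), the minimization can be approximated directly:

```python
import numpy as np

def renyi_div(p, q, alpha):
    """Renyi divergence D_alpha(P||Q) of order alpha != 1, in bits."""
    return np.log2((p**alpha * q**(1 - alpha)).sum()) / (alpha - 1)

def csiszar_mi_grid(P_X, W, alpha, grid=10001):
    """min over output distributions Q of sum_x P(x) * D_alpha(W(.|x)||Q),
    approximated by a grid search over binary Q = (q, 1-q)."""
    best = np.inf
    for q in np.linspace(1e-6, 1 - 1e-6, grid):
        Q = np.array([q, 1 - q])
        val = sum(px * renyi_div(w, Q, alpha) for px, w in zip(P_X, W))
        best = min(best, val)
    return best

P_X = np.array([0.5, 0.5])
W = np.array([[0.9, 0.1], [0.1, 0.9]])  # BSC(0.1)
print(csiszar_mi_grid(P_X, W, 2.0))
```

For this symmetric channel the minimizing Q is uniform by symmetry, so the grid search can be sanity-checked against the closed-form value of the objective at Q = (1/2, 1/2).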

A generalized erasure channel in the sense of polarization for binary erasure channels
2016 Edition, September 1, 2016 - IEEE - Institute of Electrical and Electronics Engineers, Inc.

The polar transform of a binary erasure channel (BEC) yields channels that are themselves exactly BECs. Arıkan showed that polar codes for a BEC can be efficiently constructed by exploiting this property. This study proposes a new class of arbitrary-input generalized erasure channels, which can be...
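Concretely, one level of the polar transform turns a BEC with erasure probability ε into the pair BEC(2ε − ε²) and BEC(ε²), and the efficient construction iterates this recursion. A minimal sketch (function name illustrative):

```python
def bec_polarize(eps, n):
    """Erasure probabilities of the 2**n synthetic channels obtained by
    applying n levels of the polar transform to a BEC(eps).
    One transform step maps e -> (2e - e**2, e**2): a worse channel and
    a better one, whose erasure probabilities average back to e."""
    probs = [eps]
    for _ in range(n):
        probs = [p for e in probs for p in (2*e - e*e, e*e)]
    return probs

print(bec_polarize(0.5, 1))  # [0.75, 0.25]
```

Because each step preserves the average erasure probability, deeper recursions push the synthetic channels toward the extremes 0 and 1, which is the polarization phenomenon the entry refers to.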

Minimax Rényi Redundancy
Volume PP - IEEE - Institute of Electrical and Electronics Engineers, Inc.

The redundancy for universal lossless compression of discrete memoryless sources in Campbell's setting is characterized as a minimax Rényi divergence, which is shown to be equal to the maximal α-mutual information via a generalized redundancy-capacity theorem. Special attention...

Two measures of dependence
2016 Edition, November 1, 2016 - IEEE - Institute of Electrical and Electronics Engineers, Inc.

Motivated by a distributed task-encoding problem, two closely related families of dependence measures are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both reduce to the mutual information when the parameter...

Rényi information transfer: Partial Rényi transfer entropy and partial Rényi mutual information
2014 Edition, May 1, 2014 - IEEE - Institute of Electrical and Electronics Engineers, Inc.

Shannon and Rényi information theory have been applied to coupling estimation in complex systems using time series of their dynamical states. By analysing how information is transferred between constituent parts of a complex system, it is possible to infer the coupling parameters of...

One-class classification through mutual information minimization
2016 Edition, July 1, 2016 - IEEE - Institute of Electrical and Electronics Engineers, Inc.

In one-class classification problems, a model is synthesized using only information coming from the nominal state of the data-generating process. Many important applications can be cast in the one-class classification framework, such as anomaly detection and detection of changes in stationarity, and...

Which Boolean Functions Maximize Mutual Information on Noisy Inputs?
2014 Edition, Volume 60, August 1, 2014 - IEEE - Institute of Electrical and Electronics Engineers, Inc.

We pose a simply stated conjecture regarding the maximum mutual information a Boolean function can reveal about noisy inputs. Specifically, let Xⁿ be independent identically distributed Bernoulli(1/2), and let Yⁿ be the result of passing Xⁿ through a memoryless binary symmetric...
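The conjectured maximizer is a dictator function f(xⁿ) = x₁, which achieves I(f(Xⁿ); Yⁿ) = 1 − h(α), where h is the binary entropy function and α the channel's crossover probability. A small brute-force check (illustrative code, not from the paper) comparing the dictator against 3-input majority:

```python
import itertools, math

def mi_boolean(f, n, p):
    """I(f(X^n); Y^n) in bits, where X^n is iid Bernoulli(1/2) and Y^n
    is X^n passed through a memoryless BSC with crossover probability p."""
    joint = {}  # (f(x), y) -> probability
    for x in itertools.product((0, 1), repeat=n):
        fx = f(x)
        for y in itertools.product((0, 1), repeat=n):
            d = sum(a != b for a, b in zip(x, y))      # Hamming distance
            pr = 0.5**n * p**d * (1 - p)**(n - d)
            joint[fx, y] = joint.get((fx, y), 0.0) + pr
    pf, py = {}, {}
    for (fx, y), pr in joint.items():
        pf[fx] = pf.get(fx, 0.0) + pr
        py[y] = py.get(y, 0.0) + pr
    return sum(pr * math.log2(pr / (pf[fx] * py[y]))
               for (fx, y), pr in joint.items() if pr > 0)

p = 0.1
h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
dictator = mi_boolean(lambda x: x[0], 3, p)
majority = mi_boolean(lambda x: int(sum(x) >= 2), 3, p)
print(dictator, 1 - h)      # the dictator achieves exactly 1 - h(p)
print(majority < dictator)  # True for this p
```

The dictator value follows because Y₂, Y₃ are independent of X₁, so I(X₁; Yⁿ) = I(X₁; Y₁) = 1 − h(p).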
