University of Kashan, shams.mehdi@gmail.com
Abstract:
Fisher information measures the amount of information that a random variable carries about an unknown parameter. Mutual information quantifies the dependence between two random variables, and relative entropy measures the discrepancy between two probability distributions. In this paper, Fisher information is generalized to mutual information and relative entropy, and the monotonicity properties of Fisher information are examined. Concepts such as information correlation and the information correlation coefficient are then introduced. Finally, it is shown that Shannon differential entropy, which characterizes the behavior of a random variable, and conditional Fisher information can be used to determine the probability of estimation error.
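For reference, the classical quantities named above have the following standard forms (the paper's generalized versions are not reproduced here); here \(f(x;\theta)\) is a density with parameter \(\theta\), and \(f_{X,Y}\), \(f_X\), \(f_Y\) are joint and marginal densities:

\[
\mathcal{I}(\theta) = \mathbb{E}_{\theta}\!\left[\left(\frac{\partial}{\partial \theta} \log f(X;\theta)\right)^{2}\right],
\qquad
I(X;Y) = \iint f_{X,Y}(x,y) \,\log \frac{f_{X,Y}(x,y)}{f_X(x)\, f_Y(y)} \, dx \, dy,
\]
\[
D(f \,\|\, g) = \int f(x) \,\log \frac{f(x)}{g(x)} \, dx .
\]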
Type of Study: Original Manuscript | Subject: Statistics
Received: 2023/10/7 | Revised: 2024/11/9 | Accepted: 2024/05/15 | Published: 2024/11/6 | ePublished: 2024/11/6