Volume 10, Issue 3 (11-2024) | mmr 2024, 10(3): 72-99




Shams M. New Approach to Fisher Information. mmr 2024; 10 (3) :72-99
URL: http://mmr.khu.ac.ir/article-1-3353-en.html
University of Kashan, shams.mehdi@gmail.com@yahoo.com
Abstract:
Fisher information measures the amount of information that a random variable carries about an unknown parameter. Mutual information quantifies the dependence between two random variables, and relative entropy quantifies the discrepancy between two probability distributions. In this paper, Fisher information is generalized to analogues of mutual information and relative entropy, and the monotonicity properties of Fisher information are examined. Concepts such as information correlation and the information correlation coefficient are then introduced. Finally, it is shown that Shannon differential entropy, which measures the uncertainty of a random variable, and conditional Fisher information can be used to determine the probability of estimation error.
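To make the opening definition concrete, the following is a minimal illustrative sketch (not taken from the paper): for the normal model X ~ N(θ, 1), the score is d/dθ log f(x; θ) = x − θ, so the Fisher information I(θ) = E[(score)²] equals 1. The Monte Carlo estimate below checks this; the function name and sample size are choices made here for illustration.

```python
import random

def fisher_information_mc(theta, n=200_000, seed=0):
    """Monte Carlo estimate of Fisher information for X ~ N(theta, 1).

    Uses the definition I(theta) = E[(d/dtheta log f(X; theta))^2].
    For the unit-variance normal model the score is (x - theta),
    so the exact value is I(theta) = 1 for every theta.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(theta, 1.0)       # draw X ~ N(theta, 1)
        score = x - theta               # score function of the normal model
        total += score * score
    return total / n

if __name__ == "__main__":
    est = fisher_information_mc(theta=2.0)
    print(est)  # should be close to the exact value 1
```

Because I(θ) does not depend on θ here, rerunning with a different θ gives essentially the same estimate, which is one way to sanity-check a score-based implementation.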
Full-Text [PDF 1227 kb]
Type of Study: Original Manuscript | Subject: stat
Received: 2023/10/7 | Revised: 2024/11/9 | Accepted: 2024/05/15 | Published: 2024/11/6 | ePublished: 2024/11/6



Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

© 2024 CC BY-NC 4.0 | Mathematical Researches
