A Comparative Study of Traditional and Kullback-Leibler Divergence of Survival Functions Estimators for the Parameter of Lindley Distribution
DOI: https://doi.org/10.17713/ajs.v48i5.772
Abstract
A new point estimation method based on the Kullback-Leibler divergence of survival functions (KLS), which measures the distance between the empirical survival function and a prescribed parametric one, is used to estimate the parameter of the Lindley distribution. Simulation studies were carried out to compare the performance of the proposed estimator with the corresponding least squares (LS), maximum likelihood (ML), and maximum product of spacings (MPS) estimators.
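As a rough illustration of the idea, the following is a minimal Python sketch of one plausible KLS-type estimator, not the authors' implementation: it assumes the common empirical survival estimate S(x_(i)) = 1 - i/(n+1) at the order statistics, the standard Lindley survival function S(x; θ) = (1 + θ + θx)/(1 + θ) · exp(-θx), and a discretized divergence-style objective that is minimized numerically; all function names and the exact form of the objective are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar


def lindley_survival(x, theta):
    """Survival function of the Lindley distribution:
    S(x; theta) = (1 + theta + theta*x) / (1 + theta) * exp(-theta*x)."""
    return (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)


def kls_estimate(data):
    """Illustrative KLS-style estimator (an assumption, not the paper's code):
    minimize a discretized KL-type divergence between the empirical and the
    Lindley survival functions over theta."""
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    # Empirical survival at the order statistics: 1 - i/(n+1), i = 1..n.
    s_emp = 1.0 - np.arange(1, n + 1) / (n + 1.0)

    def objective(theta):
        s_mod = lindley_survival(x, theta)
        # Per-point term s*log(s/t) + (t - s) is nonnegative and zero
        # exactly when the model survival matches the empirical one.
        return np.sum(s_emp * np.log(s_emp / s_mod) + (s_mod - s_emp))

    res = minimize_scalar(objective, bounds=(1e-6, 100.0), method="bounded")
    return res.x
```

With a sample simulated from a Lindley distribution (a mixture of an Exp(θ) and a Gamma(2, θ) component with weights θ/(1+θ) and 1/(1+θ)), the minimizer recovers a value close to the true θ for moderate sample sizes.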
Copyright (c) 2019 Austrian Journal of Statistics

This work is licensed under a Creative Commons Attribution 3.0 International License.
The Austrian Journal of Statistics publishes open access articles under the terms of the Creative Commons Attribution (CC BY) License.
The Creative Commons Attribution License (CC-BY) allows users to copy, distribute and transmit an article, adapt the article and make commercial use of the article. The CC BY license permits commercial and non-commercial re-use of an open access article, as long as the author is properly attributed.
Copyright on any research article published by the Austrian Journal of Statistics is retained by the author(s). Authors grant the Austrian Journal of Statistics a license to publish the article and identify itself as the original publisher. Authors also grant any third party the right to use the article freely as long as its original authors, citation details and publisher are identified.
Manuscripts should be unpublished and not be under consideration for publication elsewhere. By submitting an article, the author(s) certify that the article is their original work, that they have the right to submit the article for publication, and that they can grant the above license.