A fully parametric approach to minimum power-divergence estimation

Author(s)

La Vecchia, Davide

Access

Full text unavailable

Description

We approach parameter estimation based on the power divergence derived from the Havrda-Charvát generalized entropy. Unlike other robust estimators relying on divergence measures, the procedure is fully parametric and avoids the complications of bandwidth selection; it therefore extends naturally to multivariate distributions. The parameter estimator is indexed by a single constant q, which balances the trade-off between robustness and efficiency. As q approaches 1, the procedure reduces to maximum likelihood estimation; for q = 1/2, it minimizes a fully parametric empirical version of the Hellinger distance. We study the mean squared error under contamination by means of a multi-parameter generalization of the change-of-variance function and devise an analytic min-max criterion for selecting q. Optimal values of q between 1/2 and 1 yield remarkable robustness while incurring only negligible losses of efficiency relative to maximum likelihood. The method remains accurate in relatively large multivariate problems in the presence of a substantial fraction of bad data.
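The trade-off described above can be illustrated with a small sketch. The code below is not the paper's exact procedure; it is a minimal, hypothetical implementation of the related maximum Lq-likelihood idea for a univariate normal model, where the deformed logarithm Lq(u) = (u^(1-q) - 1)/(1-q) recovers the ordinary log-likelihood as q approaches 1 and downweights low-density observations (outliers) for q < 1. The function names and the q = 0.6 choice are illustrative assumptions, not taken from the source.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def lq(u, q):
    """Deformed logarithm Lq(u) = (u^(1-q) - 1)/(1-q); tends to log(u) as q -> 1."""
    if abs(q - 1.0) < 1e-8:
        return np.log(u)
    return (u ** (1.0 - q) - 1.0) / (1.0 - q)

def mlq_normal(x, q):
    """Fit a normal distribution by maximizing sum_i Lq(f(x_i; mu, sigma)).

    For q < 1, observations with small fitted density contribute less,
    giving robustness; q -> 1 recovers ordinary maximum likelihood.
    (Illustrative sketch, not the estimator defined in the paper.)"""
    def neg_obj(theta):
        mu, log_sigma = theta
        dens = norm.pdf(x, loc=mu, scale=np.exp(log_sigma))
        return -np.sum(lq(np.clip(dens, 1e-300, None), q))

    res = minimize(neg_obj,
                   x0=[np.median(x), np.log(np.std(x))],
                   method="Nelder-Mead")
    mu_hat, log_sigma_hat = res.x
    return mu_hat, np.exp(log_sigma_hat)

# N(0,1) sample with 5% gross outliers centered at 10
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 190), rng.normal(10.0, 1.0, 10)])

mu_ml, _ = mlq_normal(x, q=1.0)   # essentially MLE: pulled toward the outliers
mu_rob, _ = mlq_normal(x, q=0.6)  # robust choice of q: much closer to 0
```

A value of q between 1/2 and 1 (here 0.6) leaves the estimate close to the true center 0, while the q = 1 (maximum likelihood) fit is dragged toward the contaminating cluster.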

Partner institution

Language

English

Date

2009
