A blocked Gibbs sampler for NGG-mixture models via a priori truncation

Argiento, Raffaele; Bianchini, Ilaria; Guglielmi, Alessandra
2016-01-01

Abstract

We define a new class of random probability measures that approximates the well-known normalized generalized gamma (NGG) process. The new process is built from the representation of NGG processes as discrete measures, where the weights are obtained by normalizing the jumps of a Poisson process and the support consists of independent and identically distributed location points, but retaining only the jumps larger than a threshold ε. As a consequence, the number of jumps of the new process, called the ε-NGG process, is almost surely finite. A prior distribution for ε can be elicited. We use this process as the mixing measure in a mixture model for density and cluster estimation, and build an efficient blocked Gibbs sampler to simulate from the posterior. Finally, we discuss applications and performance of the model on two popular datasets, as well as comparisons with competing algorithms, the slice sampler and a posteriori truncation.
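The construction summarized in the abstract (a Poisson number of jumps above the threshold ε, jump sizes drawn from the NGG Lévy intensity restricted to (ε, ∞), normalization of the jumps, and i.i.d. support points from a base measure) can be simulated directly. The following Python sketch illustrates such an a priori truncation under assumed parameter names (σ, τ, total mass a), a standard normal base measure used only as a placeholder, and a rejection sampler for the jump sizes; these choices are ours for illustration and are not taken from the paper.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma


def sample_eps_ngg(eps, sigma, tau, a, rng, base_measure=None):
    """Draw one realization of an eps-NGG process: (weights, locations).

    Assumed NGG Levy intensity (sigma in (0, 1), tau > 0, total mass a > 0):
        rho(s) = a * sigma / Gamma(1 - sigma) * s**(-1 - sigma) * exp(-tau * s),
    keeping only jumps larger than the threshold eps.
    """
    if base_measure is None:
        base_measure = rng.standard_normal  # placeholder base measure P0

    levy = lambda s: a * sigma / gamma(1.0 - sigma) * s ** (-1.0 - sigma) * np.exp(-tau * s)

    # Expected number of jumps above eps: Lambda(eps) = int_eps^inf rho(s) ds
    lam, _ = quad(levy, eps, np.inf)
    n_jumps = rng.poisson(lam)
    if n_jumps == 0:
        # With positive probability no jump exceeds eps; return an empty draw
        return np.array([]), np.array([])

    # Jump sizes ~ density proportional to s^(-1-sigma) exp(-tau s) on (eps, inf),
    # by rejection: Pareto(shape=sigma, scale=eps) proposal, accept w.p. exp(-tau (s - eps))
    jumps = np.empty(n_jumps)
    filled = 0
    while filled < n_jumps:
        proposal = eps * (1.0 + rng.pareto(sigma))
        if rng.uniform() < np.exp(-tau * (proposal - eps)):
            jumps[filled] = proposal
            filled += 1

    weights = jumps / jumps.sum()        # normalization of the retained jumps
    locations = base_measure(n_jumps)    # i.i.d. support points from P0
    return weights, locations


rng = np.random.default_rng(0)
w, x = sample_eps_ngg(eps=1e-4, sigma=0.3, tau=1.0, a=1.0, rng=rng)
```

The Pareto proposal matches the s^(-1-σ) tail of the intensity on (ε, ∞), so only the exponential factor has to be accounted for in the acceptance step; this keeps the rejection sampler simple at the cost of a low acceptance rate when τε is large.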
article
2016
Argiento, Raffaele; Bianchini, Ilaria; Guglielmi, Alessandra
(2016). A blocked Gibbs sampler for NGG-mixture models via a priori truncation [journal article]. In STATISTICS AND COMPUTING. Retrieved from http://hdl.handle.net/10446/191949
File(s) attached to this record:

Argiento2016_Article_ABlockedGibbsSamplerForNGG-mix.pdf
Access: archive administrators only
Version: publisher's version
License: Aisberg default license
Size: 2.85 MB
Format: Adobe PDF

14-1_stat_comp_4aperto (1).pdf
Access: Open Access since 15/02/2016
Description: This is a post-peer-review, pre-copyedit version of an article published in Statistics and Computing. The final authenticated version is available online at: http://dx.doi.org/10.1007/s11222-015-9549-6
Version: postprint - refereed/accepted version
License: Aisberg default license
Size: 5.94 MB
Format: Adobe PDF

Use this identifier to cite or link to this item: https://hdl.handle.net/10446/191949
Citations
  • Scopus 17
  • Web of Science 16