BNP clustering using mixtures with interacting atoms

Nonparametric mixture models are routinely used for Bayesian density estimation and clustering. Specifying a mixture model requires choosing a kernel (a probability density function) that governs the law of the data within each cluster, and the law of a random probability measure that provides the random weights and atoms of the mixture. Well-known examples such as Dirichlet process mixtures and their generalizations assume that the cluster-specific parameters are i.i.d. from a common base distribution. This assumption preserves analytical tractability but can lead to poor model-based clustering performance, especially when the model is (even slightly) misspecified. We propose a general framework to introduce interaction among the cluster-specific parameters by normalizing discrete random measures driven by general point processes. In particular, we focus on repulsive point processes, which help recover well-separated and interpretable clusters. We show that our general formulation leads to an efficient MCMC sampler that avoids complex split-merge reversible jump moves; we further extend the proposed approach to cluster high-dimensional data by means of a novel class of anisotropic determinantal point processes. We then establish several distributional properties of our prior and discuss the marginal and predictive distributions induced on the latent parameters of the mixture.
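
A minimal sketch of the kind of hierarchical construction alluded to above (the specific distributional choices here, such as the weight distribution H, are illustrative assumptions rather than the exact specification of the talk):

\[
\{x_1,\dots,x_M\} \sim \Phi \ \ \text{(a point process on the parameter space, e.g. a repulsive DPP)}, \qquad
w_1,\dots,w_M \mid M \overset{\text{iid}}{\sim} H,
\]
\[
\tilde p(\cdot) = \sum_{j=1}^{M} \frac{w_j}{\sum_{\ell=1}^{M} w_\ell}\, \delta_{x_j}(\cdot), \qquad
\theta_i \mid \tilde p \overset{\text{iid}}{\sim} \tilde p, \qquad
y_i \mid \theta_i \overset{\text{ind}}{\sim} k(\cdot \mid \theta_i), \quad i=1,\dots,n.
\]

When the atoms x_1, ..., x_M come from a repulsive point process rather than i.i.d. draws from a base measure, the mixture components are a priori encouraged to sit far apart, which is what drives the well-separated clusters mentioned above.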

Zoom link: LINK