The Kullback–Leibler Divergence Between Lattice Gaussian Distributions

Frank Nielsen

Abstract


A lattice Gaussian distribution of given mean and covariance matrix is a discrete distribution supported on a lattice that maximizes Shannon’s entropy under these mean and covariance constraints. Lattice Gaussian distributions find applications in cryptography and in machine learning. The set of Gaussian distributions on a given lattice can be handled as a discrete exponential family whose partition function is related to the Riemann theta function. In this paper, we first report a formula for the Kullback–Leibler divergence between two lattice Gaussian distributions and then show how to efficiently approximate it numerically, either via Rényi’s α-divergences or via the projective γ-divergences. We illustrate how to use the Kullback–Leibler divergence to calculate the Chernoff information on the dually flat structure of the manifold of lattice Gaussian distributions.
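As a minimal numerical sketch of the quantities discussed above (not the paper’s method), the snippet below computes the Kullback–Leibler divergence between two one-dimensional lattice Gaussians on the integer lattice Z by truncating the theta-like partition sum to a finite window. The function names and the truncation window are illustrative assumptions.

```python
import math

def lattice_gaussian_pmf(mu, sigma, support):
    # Discrete Gaussian weights on a truncated integer support;
    # Z below is a truncated 1-D theta-like partition sum.
    w = [math.exp(-((k - mu) ** 2) / (2 * sigma ** 2)) for k in support]
    Z = sum(w)
    return [wi / Z for wi in w]

def kl_divergence(p, q):
    # KL(p || q) = sum_k p_k log(p_k / q_k) over a shared support.
    return sum(pk * math.log(pk / qk) for pk, qk in zip(p, q) if pk > 0)

# Window [-50, 50] is wide enough that the Gaussian tails are negligible here.
support = range(-50, 51)
p = lattice_gaussian_pmf(0.0, 1.0, support)
q = lattice_gaussian_pmf(0.5, 1.5, support)
print(kl_divergence(p, q))
```

In higher dimensions the partition sum ranges over all lattice points and is given by the Riemann theta function, which is why the exact divergence must in practice be approximated, e.g. by the divergence families mentioned in the abstract.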
