
Estimation of Information-Theoretic Quantities for Particle Clouds

Abstract

When compared to alternative approaches such as Gaussian Mixture Models (GMMs), particle clouds more faithfully represent uncertainty. A concern about particle clouds, however, is their inability to provide the analyst with closed-form expressions for many standard information-theoretic quantities such as entropy and divergence. Recent advances in information theory have provided techniques that can approximately estimate such quantities. One approach in the literature is the use of the k-th nearest neighbor (k-NN) algorithm to first estimate the probability density function that the particle cloud represents. Given these density estimates, one can then compute various information-theoretic quantities. In this paper we review the k-NN algorithm and then discuss two applications. The first application is the estimation of the entropy of a particle cloud. Specifically, we show that the entropy of a nonlinear Hamiltonian system is conserved if canonical coordinates are used as a coordinate frame. The second application is the estimation of the divergence between two particle clouds. Specifically, we use the estimated Bhattacharyya divergence to solve an uncorrelated track (UCT) correlation problem.
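As a rough illustration of the kind of estimators the abstract describes (not the authors' implementation), the sketch below shows a standard Kozachenko-Leonenko k-NN entropy estimate and a k-NN density-based Bhattacharyya divergence estimate for particle clouds stored as NumPy arrays of shape (N, d). The function names, the choice of k, and the leave-one-out handling of self-distances are assumptions made here for illustration.

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln


    def knn_entropy(cloud, k=4):
        # Kozachenko-Leonenko k-NN estimate of differential entropy (in nats)
        # for a particle cloud stored as an (N, d) array.
        n, d = cloud.shape
        # Distance from each particle to its k-th nearest neighbor; the query
        # returns the particle itself first, so ask for k + 1 neighbors.
        eps = cKDTree(cloud).query(cloud, k=k + 1)[0][:, -1]
        # Log-volume of the unit ball in d dimensions.
        log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0)
        return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))


    def bhattacharyya_divergence(cloud_p, cloud_q, k=4):
        # k-NN estimate of the Bhattacharyya divergence -ln int sqrt(p q) dx
        # between two particle clouds, using k-NN density estimates.
        n, d = cloud_p.shape
        m = cloud_q.shape[0]
        log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0)
        # log p-hat at the particles of p (leave-one-out: skip the zero self-distance).
        rp = cKDTree(cloud_p).query(cloud_p, k=k + 1)[0][:, -1]
        log_p = np.log(k) - np.log(n - 1) - log_vd - d * np.log(rp)
        # log q-hat evaluated at the particles of p.
        rq = cKDTree(cloud_q).query(cloud_p, k=k)[0][:, -1]
        log_q = np.log(k) - np.log(m) - log_vd - d * np.log(rq)
        # Bhattacharyya coefficient as a Monte Carlo average under p.
        bc = np.mean(np.exp(0.5 * (log_q - log_p)))
        return -np.log(bc)


    # Hypothetical sanity check: for a standard 2-D Gaussian cloud the estimate
    # should approach the closed-form entropy 0.5 * d * ln(2*pi*e) ~ 2.84 nats.
    rng = np.random.default_rng(0)
    print(knn_entropy(rng.standard_normal((5000, 2))))

As in the paper's setup, the same k-NN density estimates that feed the entropy calculation can be reused for divergence estimation; here the Bhattacharyya coefficient is approximated as a Monte Carlo average of sqrt(q/p) over the particles of the first cloud.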

Downloads

You can find the compiled version of this paper using the PDF icon at the top of this page.

BibTeX citation

@inproceedings{kulumani2016d,
    Author = {Kulumani, Shankar and Hussein, Islam I. and Roscoe, Christopher W. T. and Wilkins, Matthew P. and Schumacher, Jr., Paul},
    Booktitle = {Proceedings of the AIAA/AAS Astrodynamics Specialists Conference, Long Beach, California},
    Month = {September},
    Title = {Estimation of Information-Theoretic Quantities for Particle Clouds},
    Year = {2016}}