Non-Asymptotic Statistical Similarity Measures
Final Report Abstract
This project investigated the extent to which non-asymptotic statistical similarity measures can be used to advance the methodological state of the art in statistical signal processing and information theory.

In the area of sequential detection, it was shown that the problem of robust sequential detection can be formulated as an f-divergence minimization, where the divergence-defining function f is implicitly determined by an integral equation. In addition, an interesting special case was considered: a non-parametric version of the so-called Kiefer–Weiss problem. Here, one seeks the distribution under which the average run length of a sequential test is maximized, as well as the testing policy that minimizes this maximum (minimax optimality). It was shown that the non-parametric Kiefer–Weiss problem leads to a sequential test with some unexpected properties. In particular, it can require randomized stopping times, meaning that the test stops in a given state only with a certain probability. Moreover, unlike its parametric counterpart, the test is not necessarily truncated.

A second research topic in the area of sequential detection was the quantization of continuous observations that is optimal with respect to the average run length of the detector. This problem is of practical interest because high-resolution analog-to-digital converters increasingly pose a bottleneck, owing to their comparatively high energy consumption and the chip area they occupy in integrated circuits. It was shown how optimal quantizers can be designed, and lower bounds on the average run length of the corresponding sequential tests were derived.

In the area of stochastic control, methods were developed to design a controller that keeps a system in a safe regime with high probability. The novelty is that this probability guarantee holds not only at a given point in time but over the entire trajectory of the system. Moreover, not only the probability itself is taken into account, but also the degree to which the constraint defining the safe regime is exceeded. Interestingly, techniques originally developed for the analysis of robust sequential tests could be applied to this problem.

In the area of parameter estimation, an alternative to the widely used Bayesian Cramér–Rao bound was developed. Instead of the Fisher information, this bound uses a divergence to a reference distribution that can be chosen flexibly. In addition, the typical cost function, the mean squared error, can be extended to Bregman cost functions, a class of cost functions defined via Bregman divergences that is commonly used in statistics and optimization.

In the area of information theory, an approach was developed that makes it possible to define variants of important quantities such as entropy, typical sequences, and mutual information that remain well-defined for non-vanishing error probabilities and finite block lengths. The underlying idea is to use the sequential test as a fundamental building block, since it provides a universal lower bound on the average number of observations required for any desired error probabilities. The results in this area are promising but not yet fully developed.
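As background for the f-divergence formulation mentioned above, recall the standard definition: for a convex function f with f(1) = 0 and distributions P and Q with densities p and q,

```latex
D_f(P \,\|\, Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) \mathrm{d}x .
```

Familiar special cases are the Kullback–Leibler divergence, obtained with f(t) = t log t, and the total variation distance, obtained with f(t) = |t - 1|/2. In the robust sequential detection problem studied in the project, f is not chosen in advance but arises implicitly from an integral equation.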
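For completeness, the Bregman cost functions mentioned above can be stated explicitly: for a strictly convex, differentiable generator φ, the Bregman divergence between points x and y is

```latex
B_\varphi(x, y) = \varphi(x) - \varphi(y) - \langle \nabla \varphi(y),\, x - y \rangle .
```

The mean squared error corresponds to the special case φ(x) = ‖x‖², so the extension of the bound to Bregman cost functions strictly generalizes the usual setting.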
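To illustrate the basic sequential testing primitive underlying these results, the following sketch implements a textbook Wald sequential probability ratio test (SPRT) for binary-quantized observations. This is a generic illustration, not the robust or minimax-optimal procedures developed in the project; the function name and parameters are chosen here for exposition.

```python
import math
import random

def sprt_bernoulli(p0, p1, alpha, beta, sample, max_n=10_000):
    """Wald SPRT for H0: X ~ Bernoulli(p0) vs. H1: X ~ Bernoulli(p1).

    alpha, beta are the target error probabilities; Wald's thresholds
    approximately guarantee them. Returns (decision, run_length), where
    decision is 1 for H1 and 0 for H0.
    """
    a = math.log((1 - beta) / alpha)   # upper threshold -> accept H1
    b = math.log(beta / (1 - alpha))   # lower threshold -> accept H0
    llr, n = 0.0, 0
    while b < llr < a and n < max_n:
        x = sample()
        n += 1
        # log-likelihood ratio increment of one binary observation
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
    return (1 if llr >= a else 0), n

rng = random.Random(0)
decision, run_length = sprt_bernoulli(
    p0=0.3, p1=0.7, alpha=0.05, beta=0.05,
    sample=lambda: rng.random() < 0.7,  # data generated under H1
)
```

The average of `run_length` over many runs is the average run length that the quantizer designs in the project aim to minimize: coarser quantization reduces the information per observation and therefore increases the expected number of samples needed to decide.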
Publications
- Fauß, Michael; Zoubir, Abdelhak M. & Poor, H. Vincent. Minimax optimal sequential hypothesis tests for Markov processes. The Annals of Statistics, 48(5).
- Reinhard, Dominik; Fauß, Michael & Zoubir, Abdelhak M. An Asymptotically Pointwise Optimal Procedure for Sequential Joint Detection and Estimation. ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 5020-5024.
- Fauß, Michael; Zoubir, Abdelhak M. & Poor, H. Vincent. Minimax Robust Detection: Classic Results and Recent Advances. IEEE Transactions on Signal Processing, 69 (2021), 2252-2283.
- Fauß, Michael; Stein, Manuel S. & Poor, H. Vincent. On Optimal Quantization in Sequential Detection. IEEE Transactions on Signal Processing, 70 (2022), 4440-4453.
- Fauß, Michael; Dytso, Alex & Poor, H. Vincent. A Kullback-Leibler Divergence Variant of the Bayesian Cramér-Rao Bound. Signal Processing, 207, 108933.
