To model modern large-scale datasets, we need efficient algorithms to infer a set of P unknown model parameters from N noisy measurements. What are the fundamental limits on the accuracy of parameter inference, given limited measurements, signal-to-noise ratios, prior information, and computational tractability requirements? How can we combine prior information with measurements to achieve these limits? Classical statistics gives incisive answers to these questions as the measurement density approaches infinity. However, modern 'big data' problems are often high-dimensional: the measurement density N/P remains finite. This regime is important for a variety of fields, and to study it we formulate and analyze high-dimensional inference as a problem in the statistical physics of quenched disorder. This analysis reveals that widely cherished Bayesian inference algorithms are suboptimal, and yields tractable, optimal algorithms to replace them.

Host: Misha Chertkov
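As a point of reference, the sketch below is a minimal, illustrative example (not taken from the talk) of the setup the abstract describes: inferring P parameters from N = alpha * P noisy linear measurements with a Gaussian prior, using the Bayesian MAP (ridge) estimator that the abstract argues can be suboptimal at finite measurement density alpha = N/P. All parameter names and values here are assumptions chosen for illustration.

```python
# Illustrative sketch of high-dimensional linear inference at finite
# measurement density alpha = N / P (assumed values, not from the talk).
import numpy as np

rng = np.random.default_rng(0)

P = 500             # number of unknown parameters (hypothetical size)
alpha = 2.0         # measurement density N / P, held finite
N = int(alpha * P)  # number of noisy measurements
sigma = 0.5         # measurement noise standard deviation

# Ground-truth parameters drawn from a standard Gaussian prior.
w_true = rng.standard_normal(P)

# Random Gaussian measurement matrix (rows normalized to unit signal power)
# and noisy linear measurements y = X w + noise.
X = rng.standard_normal((N, P)) / np.sqrt(P)
y = X @ w_true + sigma * rng.standard_normal(N)

# MAP estimate under the Gaussian prior and Gaussian noise model,
# i.e. ridge regression with regularization strength sigma^2.
lam = sigma**2
w_map = np.linalg.solve(X.T @ X + lam * np.eye(P), X.T @ y)

# Per-parameter mean squared error of the MAP estimate.
mse = np.mean((w_map - w_true) ** 2)
print(f"alpha = N/P = {alpha:.1f}, per-parameter MSE: {mse:.4f}")
```

In the classical regime (alpha going to infinity) this MAP estimate recovers w_true arbitrarily well; the high-dimensional analysis in the talk concerns what happens when alpha stays finite, where such estimators need not be optimal.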