In "probabilistic inference" problems one tries to estimate the true state of some quantities of interest, given only noisy or ambiguous sensor measurements of related quantities. Such problems arise in a very wide variety of fields, with applications in communications, signal processing, computer vision, speech and audio recognition, machine learning, physics, and robotics. A common formalism that can be used to attack all these problems is using message-passing algorithms that operate on "factor graphs." In this talk, I will describe and compare two particularly important algorithms for probabilistic inference: the celebrated "belief propagation" (BP) algorithm; and the "divide and concur" (D&C) algorithm recently developed by Gravel and Elser, which I will show can also be understood as a message-passing algorithm. I will describe the relative advantages of the two algorithms, as well as various methods that can improve their performance. The D&C algorithm has some notable advantages compared with BP, in that it more naturally deals with problems with continuous valued variables, or with variables that lack local evidence. Another advantage of D&C is that its "difference-map" dynamics enables it to avoid "traps."
I will also describe a new decoder (developed with Yige Wang and Stark Draper) for low-density parity-check codes that combines ideas from the D&C and BP algorithms. This "difference-map belief propagation" (DMBP) decoder significantly improves error-floor performance compared with state-of-the-art BP decoders, while maintaining a similar computational complexity.
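For reference, the BP baseline that DMBP modifies can be sketched as a standard sum-product LDPC decoder in the log-likelihood-ratio (LLR) domain. Everything below (names, loop structure, parameters) is an illustrative assumption; the actual DMBP message update is what the talk describes:

    import numpy as np

    # A bare-bones sum-product (BP) LDPC decoder in the LLR domain.
    # H is the 0/1 parity-check matrix; llr holds the channel
    # log-likelihood ratios (positive values favor bit 0).
    def bp_decode(H, llr, iters=50):
        edges = [tuple(e) for e in np.argwhere(H == 1)]  # (check, var)
        v2c = {e: llr[e[1]] for e in edges}              # var-to-check msgs
        hard = (llr < 0).astype(int)
        for _ in range(iters):
            # Check-node update: tanh rule over the other incoming messages.
            c2v = {}
            for (c, v) in edges:
                prod = np.prod([np.tanh(0.5 * v2c[(c2, v2)])
                                for (c2, v2) in edges
                                if c2 == c and v2 != v])
                c2v[(c, v)] = 2.0 * np.arctanh(
                    np.clip(prod, -0.999999, 0.999999))
            # Variable-node update: total LLR, hard decision, extrinsics.
            total = llr.copy()
            for (c, v) in edges:
                total[v] += c2v[(c, v)]
            hard = (total < 0).astype(int)
            if not (H @ hard % 2).any():                 # all checks satisfied
                break
            for (c, v) in edges:
                v2c[(c, v)] = total[v] - c2v[(c, v)]
        return hard

DMBP replaces these plain BP message dynamics with difference-map-style updates; error floors in BP decoding are commonly attributed to trapping sets, the decoding analogue of the "traps" mentioned above, which suggests why the modified dynamics help while keeping the per-iteration cost comparable.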