Condensed matter physicists have a sophisticated array of numerical techniques that they use to study classical and quantum many-body models. In parallel, the machine learning community has developed a very successful set of algorithms aimed at classifying, characterizing, and interpreting complex data sets, such as images and natural language recordings. We briefly show that standard neural network architectures for supervised learning can identify phases and phase transitions in a variety of condensed matter Hamiltonians, directly from raw state configurations sampled with standard Monte Carlo and treated like images. We then show how such Monte Carlo configurations can be used to train a stochastic variant of a neural network, called a Restricted Boltzmann Machine (RBM), for unsupervised learning applications. We demonstrate how RBMs, once trained, can be sampled much like a physical Hamiltonian to produce configurations useful for estimating physical observables, among other applications. Finally, we explore the representational power of RBMs and comment on their application to the simulation of quantum systems.

Host: Lukasz Cincio
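As a rough illustration of the RBM workflow sketched in the abstract (train on Monte Carlo spin configurations, then Gibbs-sample the trained model to generate configurations for estimating observables), the snippet below implements a minimal binary RBM trained with one-step contrastive divergence in NumPy. This is a sketch under stated assumptions, not the speaker's code: the 0/1 encoding of spins, the hyperparameters, and all names (e.g. `cd1_update`, `gibbs_sample`) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class RBM:
    """Binary-binary Restricted Boltzmann Machine trained with CD-1."""

    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def sample_hidden(self, v):
        # probability and a stochastic sample of the hidden units given visibles
        p = sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_visible(self, h):
        # probability and a stochastic sample of the visible units given hiddens
        p = sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_update(self, v0):
        # positive phase: hidden statistics clamped to the data
        ph0, h0 = self.sample_hidden(v0)
        # negative phase: one block-Gibbs step away from the data
        pv1, v1 = self.sample_visible(h0)
        ph1, _ = self.sample_hidden(pv1)
        # approximate gradient of the log-likelihood (contrastive divergence)
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
        self.b += self.lr * (v0 - pv1).mean(axis=0)
        self.c += self.lr * (ph0 - ph1).mean(axis=0)

    def gibbs_sample(self, v, n_steps=100):
        # sample the trained model much like a physical Hamiltonian:
        # alternate hidden/visible updates for n_steps block-Gibbs sweeps
        for _ in range(n_steps):
            _, h = self.sample_hidden(v)
            _, v = self.sample_visible(h)
        return v


# Hypothetical usage: `configs` would be an array of shape (n_samples, n_spins)
# holding 0/1-encoded spin configurations from a Monte Carlo run.
# rbm = RBM(n_visible=configs.shape[1], n_hidden=32)
# for epoch in range(100):
#     rbm.cd1_update(configs)
# samples = rbm.gibbs_sample(configs[:100])
# magnetization = np.abs(2 * samples - 1).mean()   # an example observable
```

Observables such as the magnetization above are then estimated from the RBM-generated configurations in the same way they would be from direct Monte Carlo samples.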