Deep Belief Networks and Statistical Mechanics

Read an amazing paper today by I. P. Waldmann, "Dreaming of Atmospheres," on deep belief networks and an application to predicting exoplanet atmospheric content from spectral data. For me it kills two birds with one paper. My friends Duane Lee and Mike Lund and I have hit a bottleneck on a paper we're currently working on: we haven't been able to extract meaningful atmospheric chemical information from exoplanet emission spectra in reasonable computational time. This paper proposes a way around that issue, using a deep belief network to predict chemical species from spectral information as input; it's genius. I'll probably email the guys after this post to let them know we can move past this roadblock.

The second issue it resolves for me: finally, someone breaking down neural networks in the language of statistical mechanics. I've thought for some time that neural networks are a representation of the grand canonical ensemble, and it's been tough to find a good reference that looks at the problem from this perspective. This paper breaks it down beautifully, framing the Restricted Boltzmann Machine architecture in the microcanonical ensemble, a necessary step toward the grand canonical one. I was asked to give a lecture on deep learning networks in the physics department next week, so this will be a perfect reference.
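To make the statistical-mechanics framing concrete, here is a toy sketch of an RBM as a Boltzmann distribution over an energy function, with the partition function computed by brute force. Everything here (dimensions, random couplings, zero biases) is an illustrative assumption, not anything from Waldmann's paper:

```python
import numpy as np

# Toy Restricted Boltzmann Machine, stat-mech style (illustrative only).
# Joint probability of visible units v and hidden units h is a
# Boltzmann distribution over an energy function:
#   E(v, h) = -a.v - b.h - v.W.h
#   p(v, h) = exp(-E(v, h)) / Z
# where Z, the partition function, sums over all binary configurations.

rng = np.random.default_rng(0)
n_visible, n_hidden = 3, 2                             # toy sizes (assumption)
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # visible-hidden couplings
a = np.zeros(n_visible)                                # visible biases
b = np.zeros(n_hidden)                                 # hidden biases

def energy(v, h):
    """Energy of a joint configuration (v, h)."""
    return -a @ v - b @ h - v @ W @ h

def partition_function():
    """Brute-force Z over all 2^n_visible * 2^n_hidden binary states.

    Only feasible at toy sizes; real RBM training avoids computing Z
    directly (e.g. via contrastive divergence).
    """
    Z = 0.0
    for i in range(2 ** n_visible):
        v = np.array([(i >> k) & 1 for k in range(n_visible)], float)
        for j in range(2 ** n_hidden):
            h = np.array([(j >> k) & 1 for k in range(n_hidden)], float)
            Z += np.exp(-energy(v, h))
    return Z

Z = partition_function()
v = np.ones(n_visible)
h = np.ones(n_hidden)
p = np.exp(-energy(v, h)) / Z  # Boltzmann probability of this configuration
print(p)
```

The "restricted" part is that W only couples visible to hidden units (no visible-visible or hidden-hidden terms), which is what makes the conditionals factorize and sampling tractable.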

AAAS and grant writing

This weekend I'm working on a section of a grant that I'm writing and finally finishing off my AAAS fellowship application. Gotta get to that! Oh, and today's Halloween... which is unfortunate, since I probably won't have a chance to get out and celebrate.

Written on October 29, 2016