Jascha Sohl-Dickstein

I am a graduate student in the Redwood Center for Theoretical Neuroscience at the University of California, Berkeley. I am a member of Bruno Olshausen's lab and of the Biophysics Graduate Group. My email address is jascha@berkeley.edu.

I am interested in how we learn to perceive the world. There is evidence that much of our representation of the world is learned during development rather than being pre-programmed - everything from the way light intensity is correlated across adjacent patches of the retina, all the way up to the behavior (and existence!) of objects. We seem to infer most of human-scale physics from examples of sensory input.

How this unsupervised learning problem is solved - how we learn the structure inherent in the world just by experiencing examples of it - is not well understood. This is the problem I am interested in tackling.

There are two large (known) parts to this problem. The first is designing models flexible enough to describe *anything* without becoming unwieldy; greater flexibility frequently comes at the cost of an explosion in the number of parameters and/or a computationally costly implementation. The second is training these models once they have been designed: almost any complex model you can write down is impractical to evaluate exactly, because a factor called the "partition function", which comes from the constraint that the probabilities of all states must sum to 1, is intractable to compute, and this inability to evaluate the model exactly makes training exceedingly difficult. I am attempting to work on both parts of this problem, though I've had more success so far with the second.
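
To make the partition function concrete (the notation below is mine, not from the original page): an energy-based model over states x assigns

    p_\theta(x) = \frac{e^{-E_\theta(x)}}{Z(\theta)},
    \qquad
    Z(\theta) = \sum_{x'} e^{-E_\theta(x')}

For a model over N binary units the sum defining Z(\theta) ranges over 2^N states, so Z, and with it the normalized likelihood, cannot be evaluated exactly once N grows beyond a few dozen units.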

Current projects

I am working with Peter Battaglino and Michael DeWeese on a technique for parameter estimation in probabilistic models with intractable partition functions, involving minimization of probability flows. See the arXiv pre/e-print (http://arxiv.org/abs/0906.4779). Matlab code implementing Minimum Probability Flow learning for the Ising model and RBM cases, and comparing performance to other techniques in the RBM case, is available on my public github repository (https://github.com/Sohl-Dickstein/Jascha-Sohl-Dickstein-research-code).
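
As a rough illustration of what the MPF objective looks like for an Ising model, here is a hypothetical Python/NumPy sketch written for this page (it is not the Matlab code from the repository); it assumes binary {0,1} units and the standard single-bit-flip neighbourhood:

    import numpy as np

    def ising_energy(X, J, b):
        # E(x) = -0.5 * x^T J x - b^T x, evaluated for every row of X
        return -0.5 * np.einsum('ni,ij,nj->n', X, J, X) - X @ b

    def mpf_objective(X, J, b):
        # Minimum Probability Flow objective with single-bit-flip connectivity:
        # K = (1/|D|) sum_{x in D} sum_{x' a one-bit flip of x} exp((E(x) - E(x')) / 2)
        n_samples, n_units = X.shape
        E_data = ising_energy(X, J, b)
        K = 0.0
        for i in range(n_units):
            X_flip = X.copy()
            X_flip[:, i] = 1.0 - X_flip[:, i]      # flip unit i in every data vector
            E_flip = ising_energy(X_flip, J, b)
            K += np.sum(np.exp(0.5 * (E_data - E_flip)))
        return K / n_samples

    # toy usage with random binary data and a small random symmetric coupling matrix
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(100, 10)).astype(float)
    J = rng.normal(scale=0.1, size=(10, 10))
    J = 0.5 * (J + J.T)
    np.fill_diagonal(J, 0.0)
    b = rng.normal(scale=0.1, size=10)
    print(mpf_objective(X, J, b))

Minimizing K with respect to J and b (with any gradient-based optimizer) fits the Ising parameters without ever computing the partition function, since only energy differences between each data state and its bit-flip neighbours appear.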

I am working with Jimmy Wang and Bruno Olshausen to build a Lie algebraic model of the transformations which occur in natural video. See an arXiv pre/e-print (http://arxiv.org/abs/1001.1027), a poster pdf (http://marswatch.astro.cornell.edu/jascha/pdfs/jimmy_jascha_bavrd_09.pdf), or Jimmy's web page (https://redwood.berkeley.edu/jwang/research.html).
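
The underlying picture (my paraphrase, not text from the paper) is that a continuous transformation of an image patch x can be written as x(s) = exp(s A) x(0) for a learned generator matrix A. The hypothetical snippet below only applies such a generator using SciPy's matrix exponential; it is not the learning algorithm from the paper:

    import numpy as np
    from scipy.linalg import expm   # matrix exponential

    def apply_lie_transform(x, A, s):
        # transport a flattened image patch along the one-parameter group
        # generated by A:  x(s) = expm(s * A) @ x(0)
        return expm(s * A) @ x

    # toy usage: an antisymmetric generator gives a norm-preserving (rotation-like) transform
    rng = np.random.default_rng(0)
    D = 16                                   # e.g. a 4x4 patch, flattened
    A = rng.normal(size=(D, D))
    A = A - A.T
    x0 = rng.normal(size=D)
    x1 = apply_lie_transform(x0, A, s=0.1)   # small transformation of the patch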

I am working with Nicol Harper and Chris Rodgers to build a device allowing human echolocation.

I am working with Jack Culpepper and Bruno Olshausen on novel uses of sampling algorithms in learning. Specifically, we are exploring efficient ways to maintain the full posterior during EM, and ways to exactly calculate the log likelihood and partition function of a distribution by treating the sampling chain as an alternative analytic form for that distribution.

I am working on my own to build energy based probabilistic models out of unstructured recurrent neural networks, and train them on natural stimuli.

Notes and work in progress

  • Entropy of Generic Distributions - Calculates the entropy that can be expected for a distribution drawn at random from the simplex of all possible distributions (a quick numerical illustration is sketched just below)
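
Purely to illustrate the quantity that note is about (this snippet is mine, and it assumes "at random" means uniform on the simplex, i.e. Dirichlet(1, ..., 1), which may not match the note's convention), a Monte Carlo estimate can be compared against the closed form psi(N+1) - psi(2) nats for that uniform case:

    import numpy as np
    from scipy.special import digamma

    def mean_entropy_uniform_simplex(n, n_samples=100_000, seed=0):
        # Monte Carlo estimate of E[H(p)] in nats for p drawn uniformly from
        # the n-dimensional probability simplex (p ~ Dirichlet(1, ..., 1))
        rng = np.random.default_rng(seed)
        P = rng.dirichlet(np.ones(n), size=n_samples)
        # clip guards against log(0) if any component underflows to zero
        H = -np.sum(P * np.log(np.clip(P, 1e-300, None)), axis=1)
        return H.mean()

    n = 10
    print(mean_entropy_uniform_simplex(n))     # Monte Carlo estimate
    print(digamma(n + 1) - digamma(2))         # closed form for the uniform case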

The following are titles for informal notes I intend to write, but haven't gotten to/finished yet. If any of the following sound interesting to you, pester me and they will appear more quickly.

  • Natural gradients made quick and dirty
  • A log bound on the growth of intelligence with system size
  • The field of experts model learns Gabor-like receptive fields when trained via minimum probability flow or score matching
  • For small time bins, generalized linear models and causal Boltzmann machines become equivalent
  • How to construct phase space volume preserving recurrent networks
  • Maximum likelihood learning as constraint satisfaction
  • A spatial derivation of score matching

Interests

If you know anything about the following, I would love to pick your brain:

  • compressive sensing (specifically - connections to information theory / manifolds)
  • echo state networks / liquid state machines (specifically - what can we say about the classes of transformations the input undergoes?)
  • non-equilibrium statistical mechanics (Jaynes, Crooks, Jarzynski...)
  • criticality (especially as related to steady state, non-equilibrium, stat mech)
  • ways to close the sensori-motor loop (what's the objective function for the brain?)
  • brain-machine interfaces (especially - thoughts about algorithm design on the machine side, and ways to adapt intelligently, and online, to brain output)

Code

Code related to my research is available on my public github repository (https://github.com/Sohl-Dickstein/Jascha-Sohl-Dickstein-research-code). The code there includes:

  • Matlab code implementing Minimum Probability Flow learning for the Ising model and RBM cases, along with comparisons to other training techniques in the RBM case.

Relevant publications

J Sohl-Dickstein, JC Wang, BA Olshausen. An Unsupervised Algorithm For Learning Lie Group Transformations. (2009) http://arxiv.org/abs/1001.1027

J Sohl-Dickstein, P Battaglino, M DeWeese. Minimum probability flow learning. (2009) http://arxiv.org/abs/0906.4779

C Abbey, J Sohl-Dickstein, BA Olshausen. Higher-order scene statistics of breast images. Proceedings of SPIE (2009) http://link.aip.org/link/?PSISDG/7263/726317/1

POSTER - J Sohl-Dickstein, BA Olshausen. Learning in energy based models via score matching. Cosyne (2007) - this (dense!) poster introduces a spatial derivation of score matching, applies it to learning in a Field of Experts model, and then extends Field of Experts to work with heterogeneous experts (to form a "tapestry of experts"). download poster

POSTER - J Wang, J Sohl-Dickstein, BA Olshausen. Unsupervised learning of Lie group operators from natural movies. Bay Area Vision Research Day (2009). download poster


Papers from my previous life as a Martian

Before coming to Berkeley, I worked as a research associate on the Pancam team for the Mars Exploration Rover mission. Publications stemming from that work are listed below:

Kinch et al. Dust deposition on the Mars Exploration Rover Panoramic Camera (Pancam) calibration targets. Journal of Geophysical Research-Planets (2007)

Johnson et al. Radiative transfer modeling of dust-coated Pancam calibration target materials: Laboratory visible/near-infrared spectrogoniometry. J. Geophys. Res. (2006)

Joseph et al. In-flight calibration and performance of the Mars Exploration Rover Panoramic Camera (Pancam) Instruments. J. Geophys. Res. (2006)

Parker et al. Stratigraphy and sedimentology of a dry to wet eolian depositional system, Burns formation, Meridiani Planum, Mars. Earth and Planetary Science Letters (2005)

Soderblom et al. Pancam multispectral imaging results from the Opportunity rover at Meridiani Planum. Science (2004)

Soderblom et al. Pancam multispectral imaging results from the Spirit rover at Gusev crater. Science (2004)

Smith et al. Athena microscopic imager investigation. Journal of Geophysical Research-Planets (2003)

Bell et al. Hubble Space Telescope Imaging and Spectroscopy of Mars During 2001. American Geophysical Union (2001)