Sunday 3 November 2013

"Film Night"

Here are videos of some interesting talks. Grab a bag of popcorn — or more healthily, a bag of pork scratchings — and settle in to enjoy ...

Recent Developments in Deep Learning
Geoff Hinton, 30 May 2013 (65 minutes)

Neural nets have been around for a long time, but people could not get them to work as well as they had hoped. The accepted explanation was that learning by backpropagation would never give good results, because:
  • It required labelled training data.
  • Learning time was very slow with deep nets.
  • It got stuck in local optima.
Surprisingly, every one of these statements was wrong. Deep neural nets are now practical, and they can do amazing things.
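The backpropagation being discussed is compact enough to sketch. This is my own minimal example (not from the talk): a one-hidden-layer net learning XOR by gradient descent on mean squared error, in plain NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic problem a single-layer net cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 units, sigmoid activations throughout.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain rule through the sigmoids.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates (averaged over the 4 examples).
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);  b1 -= lr * d_h.mean(axis=0)
```

With a deep stack of such layers this plain recipe runs into exactly the problems the talk lists (slow learning, a need for labels); the surprise is how well it works once those obstacles are engineered around.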

Monads and Gonads
Douglas Crockford, 15 January 2013 (49 minutes)

Yet another monads tutorial? Is it any more comprehensible than the others? It's certainly more entertaining, if only because Crockford asserts that you don't need to understand Haskell, you don't need to understand Category Theory, and if you've got the balls, you don't even need static types. (As you might expect, he uses JavaScript. I really enjoyed this talk, but I found that I had to redo his JavaScript in Python to make sure I had understood it right.)
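In that spirit, here is my own small Python sketch (the names are mine, not Crockford's) of the Maybe monad: `unit` wraps a value, and `bind` applies a function only when there is a value to apply it to, so failure propagates without explicit checks.

```python
def unit(value):
    """Wrap a plain value into the monad (here: the value itself, or None)."""
    return value

def bind(monadic_value, func):
    """Apply func to the wrapped value, short-circuiting on None."""
    if monadic_value is None:
        return None
    return func(monadic_value)

def get(key):
    """A lookup that may fail, returning None instead of raising."""
    return lambda d: d.get(key)

# Chained lookups that would otherwise need nested None checks:
config = {"server": {"host": "localhost"}}
host = bind(bind(unit(config), get("server")), get("host"))
# host == "localhost"
missing = bind(bind(unit(config), get("server")), get("port"))
# missing is None: the failure propagated quietly through both binds
```

The point of the exercise is that `bind` centralises the "did the previous step fail?" bookkeeping, which is the pattern the talk builds its other monads from.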

Mission Impossible: Constructing Charles Babbage's Analytical Engine
Doron Swade, 8 May 2012 (87 minutes)

This is longer than the others, but there's really no heavy lifting in this talk. Swade outlines the very ambitious project to complete the design and construction of Charles Babbage's Analytical Engine. (If you want to support the effort, you can make a contribution at the project's website.)

1 comment:

  1. Neural nets have been the method of choice for predicting protein secondary structure for about 20 years. 4-layer neural nets work quite well for that application.
    See, for example, "Unifying secondary-structure, fold-recognition, and new-fold methods for protein structure prediction", Dagstuhl seminar 02471, Schloss Dagstuhl, Germany.