In his talk Inventing on Principle,
Bret Victor sets the scene by explaining his own guiding principle,
that "creators need an immediate connection with what they create."
The interactive demonstrations in
his talk are very impressive. You look at them and think "Why isn't my
interaction with the computer like that? Why am I still struggling,
still working by remote control?" The difference between his demos and
the way we usually work is as dramatic as the difference between
batch-processing and time-sharing.
"When you make a decision," says Victor, "you need to see the effect
of that immediately — there can't be any delay, can't be anything
hidden." The demos are lovely, and it's certainly nice to get rapid
feedback, but "need"? Do we really need it?
We can find a pretty clear answer to that question if we turn
around and consult the paper Conditions for Intuitive Expertise:
A Failure to Disagree by psychologists Daniel Kahneman and Gary Klein
(American Psychologist, 64(6), September 2009, pp. 515-526).
For many
years, Kahneman and Klein have been the central figures in two opposed
camps of psychologists studying human expertise. Kahneman, in the
"heuristics and biases" camp, enjoyed demonstrating that human
intuition is often flawed and that experts can often be
overconfident about their decisions. Klein, in the "naturalistic
decision making" camp, enjoyed celebrating the uncanny ability of
experts to make good decisions in complex situations and under time
pressure.
For example, Klein found from many interviews with firefighters that
rather than thinking of a number of options (or even just two) and
deciding between them, they usually only thought of one option — the
right one — and then quickly checked that it was appropriate for the
current situation. Their intuition really works. On the other hand,
"experts" in other fields don't seem so capable. Unfortunately, their
intuitions are still compelling and feel just as correct to them
as the firefighters' — but this feeling is
deceptive. For example, in
Moneyball,
Michael Lewis describes how
baseball scouts fail time and again to choose the most effective
players because they fall for the "representativeness" heuristic:
they pick good lookers, rather than good performers.
Was it possible to reconcile these two conflicting points of view?
Kahneman and Klein decided to try to settle their differences by
writing a paper together. Both camps were clearly right some of the
time — experts are of course sometimes accurate and sometimes wide
of the mark. But was there a pattern? "What are the activities," they
asked, "in which skilled intuitive judgement develops with experience?
And what are the activities in which experience is more likely to
produce overconfidence than genuine skill?"
In the end, the two psychologists demonstrated "a failure to
disagree". Reconciling the evidence from each camp, they describe in
their paper the precise conditions for skilled intuition to
develop. Firstly, there must be a "high-validity" environment, or in
other words there must be cues in the observable environment which
indicate the true nature of the current situation. For example,
firefighters and paediatric nurses inhabit high-validity environments,
because they can perceive early indications that a building will
collapse or that a baby will suffer from an infection. They will often
be unaware of the precise cues which guide their intuitive decisions,
but the cues are there all the same.
Secondly, people must have an opportunity to learn the cues and to get
feedback when they are right and when they are wrong. But humans are
pretty good at this, provided they are in a high-validity
environment. As Kahneman and Klein note: "Where simple and valid cues
exist, humans will find them if they are given sufficient experience
and enough rapid feedback to do so". But the key for us here is the
feedback: to build skilled intuition this must be "both rapid and
unequivocal".
So, let's now come back to Bret Victor and his principle of rapid
feedback. Do we need it? Computer programming certainly forms a
high-validity environment. We should be able to become intuitive
experts, but as you can now see, that depends critically on whether
the feedback we get is "both rapid and unequivocal". In most
programming environments nowadays the feedback is neither.
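To make the contrast concrete, here is a minimal sketch, in Python and using only the standard library, of the kind of feedback loop many of us actually rely on: poll a source file and re-run its unit tests whenever it changes. The file names are placeholders of my own invention; even this crude loop tightens the edit-save-switch-run cycle, yet the feedback is still neither truly immediate nor unequivocal, because you still have to read the test output and work out what it means.

    import subprocess
    import sys
    import time
    from pathlib import Path

    # Hypothetical file names, purely for illustration.
    SOURCE = Path("chili.py")
    TEST_MODULE = "test_chili"

    def watch_and_rerun(poll_interval=0.5):
        """Re-run the unit tests every time the source file changes."""
        last_mtime = None
        while True:
            mtime = SOURCE.stat().st_mtime
            if mtime != last_mtime:
                last_mtime = mtime
                # Rapid-ish feedback (a change is noticed within a second),
                # but far from unequivocal: we still have to interpret the output.
                subprocess.run([sys.executable, "-m", "unittest", TEST_MODULE])
            time.sleep(poll_interval)

    if __name__ == "__main__":
        watch_and_rerun()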
(I leave it as an exercise for the reader to work out whether a
kitchen is a high-validity environment. The point Victor was making
about rapid feedback seems to have been missed by so many people that
he felt it necessary to construct a web-page explaining it all over
again. That account is certainly worth reading too, and you can find it here:
Learnable
Programming.)
Sunday, 27 January 2013
Sunday, 20 January 2013
A Rather Strange Cookbook
Almost a review of The 4-Hour
Chef by Tim Ferriss
The ideas I want to explore in this blog started to form while I was reading this book, so perhaps I should say a little bit about it. Tim Ferriss — how can I put it delicately? — is something of a self-publicist. The book has many five-star reviews on Amazon. One of the few one-star reviews asks: "Why does he feel the need to fake the ratings for his book? Over 50 five star reviews pop up the same day the book is published, almost at the same time, by reviewers who didn't review any other book." It certainly looks as though the techniques described in Trust Me, I'm Lying by Ryan Holiday have been used to promote this particular cookbook. So is there anything there beyond the hype? Well, actually, yes there is.
Ferriss is a hacker. Not a computer hacker, of course, but a hacker nevertheless. He's obsessed with how to become an expert in something with much less than the normal effort. The conventional wisdom is that becoming an expert in something takes around 10,000 hours of effortful practice. (See, for example, Outliers by Malcolm Gladwell. Effortful practice means working at something to the limit of your current ability, not just idling at a level you find easy.) Ferriss claims that it's possible to hack expertise, to go from zero to the top 5% in much less time than this — maybe only 10% or 20% of the time. How?
The first 70 pages of the book are devoted to "meta-learning": principles and examples of how to make a programme for learning anything. (The examples include things like how to shoot basketball hoops.) The next section of the book can be read as a 120-page basic cookery course, but it can also be deconstructed and used as a detailed example of how to apply the meta-learning principles in practice.
Ferriss's meta-learning principles are for the most part not particularly novel — if you are familiar with educational theory you will recognise many of them as descriptions of best practice. His description of how to learn a foreign language will be very familiar to anyone who has used the Michel Thomas language courses. But the idea that top performers might not be good examples of that best practice is nowadays a bit heretical. Ferriss suggests that rather than look to superstars for tips on how to practice, we would be better off finding the outliers who have achieved some success despite not being well endowed by nature.
Something that Ferriss's meta-method emphasises, over and above the meta-learning principles, is seeking help from expert tutors. (And this is one place where, because of his celebrity, he may be better placed to find help than you or me.) This is an interesting angle, because most educational theory looks at the problem the other way around, from the perspective of an expert wanting to teach novices. As a novice wanting to learn something, it might appear that the principles are all you need, but that's not the case: to build your learning programme most effectively you also need some hands-on expert advice. (You need to learn the expert's tacit knowledge, which the expert might not even know how to teach; Ferriss gives some ideas on how you might approach this.)
It's intuitively obvious that this would be a good idea, but perhaps not so obvious just how powerful it is. We know from educational research that personal tutors are unreasonably effective — this is the so-called "two-sigma problem". Nearly 30 years ago, Benjamin Bloom compared group teaching with one-to-one tutoring, and found that an average student with one-to-one tutoring performed at the same level as a top 2% student with group teaching. That's really quite astonishing, isn't it? See The 2 sigma problem: The search for methods of group instruction as effective as one-to-one tutoring, Educational Researcher, 13(6), pp. 4-16 (1984).
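If you want to check the arithmetic behind the name, "two sigma" simply means two standard deviations above the mean, and on a normal distribution that puts the tutored average student at roughly the 98th percentile, which is where the "top 2%" figure comes from. Here is a quick back-of-envelope check in Python; the percentile is just this textbook calculation, not a figure taken from Bloom's paper.

    from statistics import NormalDist

    # Fraction of a normal population scoring below the mean plus two
    # standard deviations: the "two sigma" in Bloom's title.
    below = NormalDist().cdf(2.0)
    print(f"{below:.1%} of students score below two sigma above the mean")
    # Prints 97.7%, i.e. the tutored average student performs at about
    # the level of the top 2-3% of the conventionally taught class.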
But how is all this relevant to you and me? Well, Ferriss's ideas on cooking are quite interesting in themselves, and I'll come back to them another time. But as programmers we have our own expertise problem. In my "day job" I teach small classes of electrical engineering students how to program in Python. Well, I try to. I am no longer surprised at how difficult this is. A few people just get it straight away, like I did when I originally learned to program. Most struggle. I've found a few things that seem to help (and I'll come back to these another time too), but surely we can do much better. If there really is a way to gain expertise 10 times faster, we could certainly use it.
On the other hand, experienced programmers could always learn to be better too. I'd certainly like to learn more and be a better programmer. It's a generally accepted idea that programming teams often have some programmers who are 10 times more productive than others, but there's been relatively little interest in systematically finding out what the difference is and learning to be that 10-times programmer yourself.
If you're an expert and you want to improve, you will certainly have to make your own learning programme, and Ferriss's ideas are a fairly accessible place to start. So: can we apply Ferriss's principles? What 20% of expert programming knowledge will give 80% of the results? What things are common amongst the best performers, but infrequently taught? What do you know that I need to know too? I think it's not just novices but also experts who could do a lot better.
So overall, yes, despite the hype, I do think it's worth reading The 4-Hour Chef, if only to help imagine what might be possible.
Sunday, 13 January 2013
Start of Day
Over the Christmas holidays I was reading a rather strange cookbook
by Tim Ferriss when some vague ideas about programming and cooking
started to crystallise in my mind. Too vague really to turn into a
paper or an essay. Perhaps a blog might be a better medium to explore
these ideas? Let's see.
One of the vague ideas is that the way we do programming today, the way we build systems, is wrong. Surely we can do a lot better? Think for example of Bret Victor's talk Inventing on Principle or Alan Kay's talk Programming and Scaling. (If you haven't already watched these, you need to stop reading and watch them right now. Go on. Do it!) Is it possible that we could really develop all our programs like this? Is it possible that our programs could be 100 or even 1000 times shorter? Even if the answer to these questions is only "maybe", then surely we should still put a lot of effort into trying, rather than condemn future programmers to suffer as we have suffered?
So part of what I'd like to do here is to review these and other related ideas; to look at "the road not taken"; to understand the ideas better by trying to apply them in practice.
Another vague idea is that programming and cooking have more in common than programming and constructing buildings. Developers sometimes like to call themselves "software engineers" or even "software architects", but the analogy is a bit thin. As Glenn Vanderburg points out, "software engineering" is quite unusual: the term "engineering" is generally reserved for those techniques which actually work in practice. Not so with software. And, despite the popularity of "patterns" in the software world, anyone who has actually read Christopher Alexander should understand that trying to copy the current practice of architecture is not such a clever idea.
Think instead of the organisation of a commercial kitchen, run by a "chef" — literally the "chief" cook — and not by a "culinary architect". Think of the respect for technique and the role of apprenticeship in learning those craft skills. Think of a more domestic scene, maybe on a Sunday morning. You break off from hacking some code and you go to the kitchen to put together a chili for lunch. Are these really such different activities? Can we learn something from the comparison? Whatever their medium, hackers like to understand how things really work, and relish the opportunity to practice their skills. (Perhaps the main difference between food and coding is that it takes a hacker to appreciate someone else's code, but anyone can appreciate food.)
So another part of what I'd like to do here is to see what we can learn from the comparison. But Nathan Myhrvold is not the only hacker to find food interesting in its own right. So I'll also take the opportunity to look at tools, techniques, ingredients and of course recipes.
Let's see how this goes.