Geoffrey Hinton has made a number of breakthroughs in neural networks, and many of his students and advisees, such as Yann LeCun, have gone on to make further advances of their own.
Although he's been working in the field for decades, Geoff and his research have gained a lot of well-deserved attention recently as deep learning systems have outperformed other approaches on a wide range of tasks, from image recognition to speech recognition.
Much of this success is due to the power of a workhorse algorithm called backpropagation that he helped pioneer.
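For readers who haven't seen it in action, backpropagation pushes a network's prediction error backward through each layer via the chain rule, producing a gradient for every weight, which is then used for gradient descent. Here's a minimal NumPy sketch (my illustration, not Hinton's code; the XOR task, network size, and learning rate are arbitrary choices for demonstration):

```python
# Sketch of backpropagation: train a one-hidden-layer network on XOR
# by propagating the error gradient backward through the chain rule.
# Illustrative only; real systems use autodiff libraries.
import numpy as np

rng = np.random.default_rng(0)

# Toy supervised data: labeled XOR examples.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Small network: 2 inputs -> 4 hidden units -> 1 output.
W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1)      # hidden activations
    out = sigmoid(h @ W2)    # predictions

    # Backward pass: chain rule, layer by layer.
    # d(loss)/d(out) for squared error, times the sigmoid derivative.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent update for each weight matrix.
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h

print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))  # approx. [0, 1, 1, 0]
```

Note that the whole procedure depends on those labels y: the error being propagated backward is defined against known correct answers, which is exactly the limitation Geoff raises below.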
Many scientists with an amazing contribution like that to their credit would do everything to push it and extend its influence. But Geoff is a modest guy. He now thinks that AI researchers need to move beyond backpropagation and "start again." The key issue is that, while backpropagation works well on supervised learning problems, where we have a lot of labeled training data, it doesn't work for unsupervised learning, which seems to be how we humans learn most things about the world.

When I saw him in Toronto last week, Geoff said it was time for a new wave of researchers with different approaches. He repeated (with a wry smile) the old adage that "science advances one funeral at a time."
Fortunately, more and more smart young people are flocking to the field, developing and testing new approaches. The biggest conference in the field, NIPS, keeps getting bigger and selling out faster and faster.
I'm optimistic that we will not only find new applications for existing approaches, but also invent a lot of new ones.