Dear friends,
I chatted recently with MIT researcher Lex Fridman on his Artificial Intelligence podcast, where we discussed our experiences teaching deep learning. It was the most fun I’ve had in an interview lately, and you can watch the video here.
Lex asked me which machine learning concepts students struggle with most. I don't think any single concept is especially difficult; rather, studying deep learning is a lot like studying math. No individual math concept (addition, subtraction, and so on) is harder than the others, but it's hard to understand division if you don't already understand multiplication. Similarly, deep learning involves many concepts, such as LSTMs with attention, that build on other concepts, like LSTMs, which in turn build on RNNs.
If you’re taking a course on deep learning and struggling with an advanced concept like how ResNets work, you might want to review earlier concepts like how a basic ConvNet works.
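To make the ResNet example concrete, here's a minimal sketch of the idea. I'm using PyTorch purely for illustration; class names like ConvBlock and ResidualBlock are my own, and the same point holds in any framework. A residual block is built from the same convolutional layers as a basic ConvNet block; the only new ingredient is the skip connection that adds the block's input back to its output:

```python
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    """A basic ConvNet building block: convolution, batch norm, ReLU."""

    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x):
        return torch.relu(self.bn(self.conv(x)))


class ResidualBlock(nn.Module):
    """A ResNet-style block: the same conv layers, plus a skip connection."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)  # skip connection: add the input back in


x = torch.randn(1, 16, 32, 32)     # a dummy batch of 16-channel feature maps
print(ResidualBlock(16)(x).shape)  # torch.Size([1, 16, 32, 32])
```

If the forward pass of ConvBlock makes sense to you, the one genuinely new line in ResidualBlock is `out + x`. That's the sense in which the advanced concept sits directly on top of the basic one.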
As deep learning matures, our community builds new ideas on top of old ones. This is great for progress, but unfortunately it also creates longer “prerequisite chains” for learning the material. Putting in extra effort to master the basics will pay off when you get to more advanced topics.
Keep learning!
Andrew