Dear friends,
Nearly a decade ago, I got excited by self-taught learning and unsupervised feature learning: ways to learn features from unlabeled data that can later be reused in a supervised task. Back then, these ideas contributed only marginally to practical performance, but I’m pleased to see them resurface and gain real traction in self-supervised learning.
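As a concrete illustration of that pattern (not of the specific methods from back then), here is a minimal sketch: an unsupervised feature learner is fit on unlabeled data alone, and the learned features then feed a supervised classifier trained on a small labeled set. PCA stands in for the feature learner, and all the data here is synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Plentiful unlabeled data and a small labeled set (synthetic stand-ins).
unlabeled = rng.normal(size=(5000, 50))
labeled_x = rng.normal(size=(100, 50))
labeled_y = rng.integers(0, 2, size=100)

# Step 1: learn a feature representation from unlabeled data alone.
# PCA is a stand-in; the same pattern applies to autoencoders and the like.
feature_learner = PCA(n_components=10).fit(unlabeled)

# Step 2: reuse the learned features for the supervised task.
clf = LogisticRegression().fit(feature_learner.transform(labeled_x), labeled_y)
print(clf.score(feature_learner.transform(labeled_x), labeled_y))
```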
Many of you know the story of how growing scale in computation and data, coupled with algorithmic innovation, drove the rise of deep learning. Recent progress in self-supervised learning appears to be powered by the same forces: greater computational and data scale, which lets us train large neural networks on much larger unlabeled datasets, together with new algorithms such as contrastive predictive coding.
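Contrastive methods share a simple core objective. The sketch below is my own minimal NumPy rendering of the InfoNCE loss used in contrastive predictive coding and related methods (not CPC’s full architecture): each anchor embedding should rank its own positive pair above the other examples in the batch.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE: each anchor must identify its matching positive among
    all positives in the batch, which serve as in-batch negatives."""
    # Normalize so dot products are cosine similarities.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = (a @ p.T) / temperature  # (batch, batch) similarity matrix

    # Softmax cross-entropy where row i's correct "class" is column i.
    row_max = logits.max(axis=1, keepdims=True)
    log_probs = logits - row_max - np.log(
        np.exp(logits - row_max).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Toy usage: positives are noisy copies of the anchors, so the loss is low.
rng = np.random.default_rng(0)
anchors = rng.normal(size=(8, 16))
positives = anchors + 0.05 * rng.normal(size=(8, 16))
print(info_nce_loss(anchors, positives))
```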
Today feels very much like the early, heady days a decade-plus ago, when we saw neural networks start to work in practical settings. The number of exciting research directions seems larger than ever!
Keep learning,
Andrew