Multiple-Instance
Same Patient, Different Views: Contrastive pretraining improves medical imaging AI.
When you lack labeled training data, pretraining a model on unlabeled data can compensate. New research pretrained a model in three successive stages to boost performance on a medical imaging task.
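The headline points to contrastive pretraining over multiple views of the same patient. The snippet below is a minimal sketch, assuming a PyTorch setup rather than the paper's own code: two images of the same patient form a positive pair, every other image in the batch serves as a negative, and an InfoNCE-style loss pulls positives together. The encoder, tensor shapes, and the `info_nce_loss` helper are illustrative assumptions.

```python
# Minimal sketch of a contrastive pretraining step (assumed PyTorch setup,
# not the authors' implementation). Two views of the same patient are a
# positive pair; other images in the batch act as negatives.
import torch
import torch.nn.functional as F

def info_nce_loss(z_a, z_b, temperature=0.1):
    """z_a, z_b: [batch, dim] embeddings of two views of the same patients."""
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)
    logits = z_a @ z_b.t() / temperature      # pairwise cosine similarities
    targets = torch.arange(z_a.size(0))       # matching index = positive pair
    # Symmetric loss: each view must identify its partner within the batch.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Toy usage with a stand-in encoder; a real setup would encode two different
# images (or augmentations) of each patient with a shared network.
encoder = torch.nn.Sequential(torch.nn.Flatten(),
                              torch.nn.Linear(3 * 64 * 64, 128))
view_a = torch.randn(8, 3, 64, 64)   # one image per patient
view_b = torch.randn(8, 3, 64, 64)   # a second image of the same patients
loss = info_nce_loss(encoder(view_a), encoder(view_b))
loss.backward()
```

Because the positive pairs come from genuinely different images of the same patient rather than synthetic augmentations alone, the embedding is pushed to capture patient-level features rather than view-specific artifacts.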