Contrastive Self-Supervised Data Fusion for Satellite Imagery

Item Type Conference or Workshop Item (Paper)
Abstract Self-supervised learning has great potential for the remote sensing domain, where unlabelled observations are abundant but labels are hard to obtain. This work leverages unlabelled multi-modal remote sensing data for augmentation-free contrastive self-supervised learning. Deep neural network models are trained to maximize the similarity of latent representations obtained with different sensing techniques from the same location, while distinguishing them from other locations. We showcase this idea with two self-supervised data fusion methods and compare against standard supervised and self-supervised learning approaches on a land-cover classification task. Our results show that contrastive data fusion is a powerful self-supervised technique to train image encoders that are capable of producing meaningful representations: simple linear probing performs on par with fully supervised approaches, and fine-tuning with as little as 10% of the labelled data yields higher accuracy than supervised training on the entire dataset.
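
The core idea described in the abstract is a cross-modal contrastive objective: two encoders embed co-located patches from different sensors, and an InfoNCE-style loss treats patches from the same location as positives and all other locations in the batch as negatives. The sketch below (PyTorch) is a minimal illustration of that setup, not the authors' implementation; the PatchEncoder architecture, the cross_modal_info_nce helper, the channel counts, and the temperature value are illustrative assumptions.

# Minimal sketch (not the paper's code) of augmentation-free cross-modal contrastive learning:
# two encoders embed co-located patches from two sensing modalities, and an InfoNCE-style
# loss pulls embeddings of the same location together while pushing apart other locations.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PatchEncoder(nn.Module):
    """Tiny CNN encoder; stands in for the deep backbones used in practice."""

    def __init__(self, in_channels: int, embed_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # L2-normalise so dot products below are cosine similarities.
        return F.normalize(self.net(x), dim=-1)


def cross_modal_info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.07) -> torch.Tensor:
    """Symmetric InfoNCE loss: matching rows of z1/z2 (same location) are positives,
    all other rows in the batch act as negatives."""
    logits = z1 @ z2.t() / tau                            # (B, B) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)  # positives on the diagonal
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))


if __name__ == "__main__":
    # Hypothetical channel counts, e.g. a 2-band radar and a 13-band optical sensor.
    radar_encoder = PatchEncoder(in_channels=2)
    optical_encoder = PatchEncoder(in_channels=13)
    radar = torch.randn(8, 2, 64, 64)      # batch of co-located patch pairs
    optical = torch.randn(8, 13, 64, 64)
    loss = cross_modal_info_nce(radar_encoder(radar), optical_encoder(optical))
    loss.backward()                        # both encoders receive gradients
    print(f"contrastive loss: {loss.item():.3f}")

Because the two branches view the same location through different modalities, positive pairs arise naturally from the data itself, which is what makes the approach augmentation-free; the trained encoders can then be evaluated by linear probing or fine-tuning on a labelled downstream task such as land-cover classification.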
Authors Scheibenreif, Linus Mathias; Mommert, Michael & Borth, Damian
Research Team AIML Lab
Journal or Publication Title ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Language English
Subjects computer science
HSG Classification contribution to scientific community
Date 2022
Publisher ISPRS
Volume V-3-2022
Page Range 705-711
Number of Pages 7
Event Title ISPRS Congress
Event Location Nice
Event Dates 6-11 June 2022
Depositing User Prof. Dr. Michael Mommert
Date Deposited 22 Jun 2022 10:26
Last Modified 17 Aug 2022 17:05
URI: https://www.alexandria.unisg.ch/publications/266528

Download

Scheibenreif2022_ContrastiveSSLDataFusion.pdf (PDF, 6MB)

Citation

Scheibenreif, Linus Mathias; Mommert, Michael & Borth, Damian (2022): Contrastive Self-Supervised Data Fusion for Satellite Imagery. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, V-3-2022, 705-711. ISPRS Congress, Nice, 6-11 June 2022.
