ID2502: Self-supervised Transformer Model for EEG Data


Master's Thesis:

Electroencephalography (EEG) data can be very useful in various fields of medicine, both to establish medical diagnoses and for continuous monitoring of patients under anesthesia. However, large labeled EEG datasets are not easily available, due to the high cost of acquisition and annotation. This limits the performance of deep learning models trained on EEG data. In this thesis, we explore self-supervised pretraining for EEG data in the context of depth-of-anesthesia detection.

Details

You will investigate self-supervised pre-training methods and architectures for EEG data, and their transferability to the task of depth-of-anesthesia detection. You will also explore how these models can be used to detect artifacts in EEG signals.

Tasks

  • Literature review on:
    • Self-supervised transformer models for EEG, with a focus on low-channel setups
    • Artifact detection in EEG signals
  • Training of a self-supervised transformer model on EEG signals from the VitalDB database
  • Evaluation of the model on an internal depth-of-anesthesia classification dataset
  • Clear and structured code documentation
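To give a flavor of the kind of model the thesis would start from, below is a minimal sketch of masked-patch self-supervised pretraining for low-channel EEG in PyTorch. All names and hyperparameters (patch length, model width, mask ratio, etc.) are illustrative assumptions, not specifications from the project; the actual architecture and pretext task are part of the thesis work.

```python
import torch
import torch.nn as nn


class MaskedEEGPretrainer(nn.Module):
    """Illustrative masked-patch reconstruction model for low-channel EEG.

    Hypothetical design: the signal is split into non-overlapping time
    patches, a random subset of patch tokens is replaced by a learned mask
    token, and a transformer encoder reconstructs the masked patches.
    """

    def __init__(self, n_channels=2, patch_len=50, d_model=64,
                 n_heads=4, n_layers=2):
        super().__init__()
        self.patch_len = patch_len
        # Linear patch embedding: (channels * patch_len) -> d_model
        self.embed = nn.Linear(n_channels * patch_len, d_model)
        enc_layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        # Reconstruction head back to raw patch values
        self.head = nn.Linear(d_model, n_channels * patch_len)
        self.mask_token = nn.Parameter(torch.zeros(d_model))

    def forward(self, x, mask_ratio=0.3):
        # x: (batch, channels, time); time must be divisible by patch_len
        b, c, t = x.shape
        # Split into non-overlapping patches: (b, c, n_patches, patch_len)
        patches = x.unfold(2, self.patch_len, self.patch_len)
        n = patches.shape[2]
        patches = patches.permute(0, 2, 1, 3).reshape(b, n, c * self.patch_len)
        tokens = self.embed(patches)
        # Randomly mask a fraction of the patch tokens
        mask = torch.rand(b, n, device=x.device) < mask_ratio
        if not mask.any():           # guarantee at least one masked patch
            mask[:, 0] = True
        tokens = torch.where(mask.unsqueeze(-1),
                             self.mask_token.expand_as(tokens), tokens)
        recon = self.head(self.encoder(tokens))
        # MSE reconstruction loss on the masked patches only
        loss = ((recon - patches) ** 2)[mask].mean()
        return loss, recon
```

After pretraining on unlabeled recordings, the encoder would typically be reused with a small classification head and fine-tuned on the labeled depth-of-anesthesia data; artifact-laden segments could also be flagged via unusually high reconstruction error, though evaluating that idea is part of the thesis itself.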

Requirements

  • Proficiency in Python and familiarity with deep learning frameworks (PyTorch)
  • Preferred starting date: March/April 2025 (can be discussed)
  • Biomedical signal analysis lecture (preferred)

Supervisors

Arijana Bohr, M. Sc.

Researcher & PhD Candidate

Dr. Emmanuelle Salin

PostDoc and Group Leader

Please use the application form to apply for the topic. We will then get in contact with you.