Supernova light curve classification using attention and other techniques
Date: 14/09/2023
Author: Ibsen, Amanda
Abstract
Even in the era of deep learning, the huge amount of astronomical data available has not yet allowed us to solve the problem of Supernova (SN) light curve classification.
These explosive transients are photometrically difficult to label, not only because of their irregular sampling and noise, but also because of the dissimilarities that exist between different surveys and datasets. This is one of the major issues in the field, as models often work well on simulations but not necessarily on real data. This work explores Attention mechanisms and other Deep Learning techniques with the aim of extracting meaningful feature representations from SNe in order to classify their light curves.
The main contributions of this thesis are: (1) I compare several Neural Network architectures commonly used for Time Series and light curve classification to establish some baselines; (2) I propose a model that uses simple additive Self-Attention and improves early classification of light curves; and (3) I present a Transformer-based architecture adapted for light curve reconstruction and classification, and show how, by framing it as a probabilistic Variational Auto-Encoder, the combination of a regular latent space and a Multi-headed Attention mechanism can help mitigate the difficulties of dealing with different datasets.