One flexible model for multiclass gravitational wave signal and glitch generation

Tom Dooney, R. Lyana Curier, Daniel Stanley Tan, Melissa Lopez, Chris Van Den Broeck, Stefano Bromuri

Research output: Contribution to journal › Article › Academic › peer review

Abstract

Simulating realistic time-domain observations of gravitational waves (GWs) and other events of interest in GW detectors, such as transient noise bursts called glitches, can help to advance GW data analysis. Simulated data can be used in downstream analysis tasks by augmenting datasets for signal searches, balancing datasets for machine learning applications, validating detection schemes, and constructing mock data challenges. In this work, we present the conditional derivative GAN (cDVGAN), a novel conditional model in the generative adversarial network (GAN) framework for simulating multiple classes of time-domain observations representing GWs and detector glitches. cDVGAN can also generate generalized hybrid samples that span the variation between classes through interpolation in the conditioned class vector. cDVGAN introduces an additional player into the typical two-player adversarial game of GANs: an auxiliary discriminator that analyzes the first-order derivative of the time series. Our results show that this yields synthetic data that better capture the features of the original data. cDVGAN conditions on three classes in the time domain: two denoised from blip and tomte glitch events in LIGO's third observing run (O3), and a third representing binary black hole (BBH) mergers. Our proposed cDVGAN outperforms four baseline GAN models in replicating the features of the three classes. Specifically, our experiments show that training convolutional neural networks (CNNs) on cDVGAN-generated data improves the detection of samples embedded in detector noise beyond what synthetic data from other state-of-the-art GAN models achieve. With the same CNN architecture, our best synthetic dataset yields as much as a 4.2% increase in area-under-the-curve (AUC) performance compared to synthetic datasets from the baseline GANs.
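The auxiliary discriminator described above operates on the first-order derivative of the generated time series. A minimal sketch of how such an auxiliary input could be formed with a finite-difference approximation is shown below; the function name, batch shape, and 4096 Hz sampling rate are illustrative assumptions, not the paper's code.

```python
import numpy as np

def first_order_derivative(batch, dt=1.0):
    # Finite-difference approximation of the first derivative along
    # the time axis for a batch of 1D time series of shape
    # (n_samples, n_timesteps); output has one fewer time step.
    return np.diff(batch, axis=1) / dt

# Hypothetical batch of 4 generated waveforms, 1024 samples each
waveforms = np.random.randn(4, 1024)
derivs = first_order_derivative(waveforms, dt=1.0 / 4096)  # assumed 4096 Hz sampling
print(derivs.shape)  # (4, 1023)
```

In this setup, `derivs` would be fed to the auxiliary discriminator while the untouched `waveforms` go to the standard discriminator, giving the generator gradient feedback on both the signal and its rate of change.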
Moreover, training the CNN with class-interpolated hybrid samples from cDVGAN outperforms CNNs trained only on the standard classes when identifying real samples embedded in LIGO detector background at signal-to-noise ratios ranging from 1 to 16 (a 4% AUC improvement for cDVGAN). We also illustrate an application of cDVGAN in a data-augmentation example, showing that it is competitive with a traditional augmentation approach. Lastly, we test cDVGAN's BBH signals in a fitting-factor study, showing that the synthetic signals are generally consistent with the semianalytical model used to generate the training signals and with the corresponding parameter space.
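The class-interpolated hybrid samples mentioned above come from mixing the conditioned class vector rather than the waveforms themselves. A minimal sketch of that interpolation, assuming one-hot class labels (the label assignments here are illustrative only):

```python
import numpy as np

def interpolate_classes(onehot_a, onehot_b, alpha):
    # Convex combination of two one-hot class vectors; passing the
    # result as the GAN's condition yields a "hybrid" sample that
    # lies between the two classes.
    return alpha * np.asarray(onehot_a) + (1.0 - alpha) * np.asarray(onehot_b)

blip = [1.0, 0.0, 0.0]  # illustrative one-hot label for the blip class
bbh = [0.0, 0.0, 1.0]   # illustrative one-hot label for the BBH class
hybrid = interpolate_classes(blip, bbh, alpha=0.5)
print(hybrid)  # [0.5 0.  0.5]
```

Sweeping `alpha` from 0 to 1 traces a family of condition vectors spanning the variation between the two classes, which is how the hybrid training sets described in the abstract could be populated.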

Original language: English
Article number: 022004
Number of pages: 19
Journal: Physical Review D
Volume: 110
Issue number: 2
DOIs
Publication status: Published - 15 Jul 2024

Bibliographical note

Publisher Copyright:
© 2024 American Physical Society.

Funding

This research was conducted within the ET Technologies project (Project No. PROJ-03612), which is partly funded by EFRO, the Province of Limburg and The Dutch Ministry of Economic Affairs and Climate Policy within the REACT-EU Programme of OP Zuid. The authors are grateful for contributions by members of the ET Technologies research team, in particular, Stefano Schmidt, Andrew Miller, and Sarah Caudill. This material is based upon work supported by NSF's LIGO Laboratory which is a major facility fully funded by the National Science Foundation. The authors are grateful for computational resources provided by the LIGO Laboratory and supported by the National Science Foundation Grants No. PHY-0757058 and No. PHY-0823459.

Funders (funder number):
European Regional Development Fund
ET Technologies
Ministerie van Economische Zaken en Klimaat
National Science Foundation (PHY-0823459, PHY-0757058)
