Artificial intelligence is allowing scientists to see the sources of gravitational waves faster and more accurately than ever before. Credit: James Josephide
Following the recent overwhelming success of deep learning and artificial intelligence in research, industry and medicine, researchers from the ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav) and the University of Western Australia (UWA), including PhD student and the paper’s first author Chayan Chatterjee, have built a deep learning model using an Artificial Neural Network to pinpoint where in the sky gravitational wave signals come from. The model can localise the source of gravitational waves produced by colliding pairs of black holes potentially as much as a thousand times faster than any other technique.
Professor Amitava Datta, a scientist from UWA who contributed to the study, says: ‘This work is a very interesting example of learning patterns from simulated data for predicting the outcome of real events, in this case the location of gravitational wave sources. Perhaps this approach using deep learning will be more and more useful in basic sciences in general.’
Data intensive astronomy expert Kevin Vinsen from the International Centre for Radio Astronomy Research says: ‘This project is an excellent example of how a multi-disciplinary approach can solve the problem’.
The basic structure of an Artificial Neural Network. The circles represent the neurons or nodes and the arrows represent connections from one neuron to another. Credit: Chayan Chatterjee
Gravitational waves are small ripples in the space-time continuum caused by colossal stellar events such as colliding black holes. In September 2015, following recent advances in detector sensitivity, the LIGO Scientific Collaboration detected gravitational waves for the first time. This was a landmark achievement in human discovery and heralded the birth of the new field of gravitational wave astronomy.
The need for speed and accuracy is particularly important in the context of gravitational wave localisation—scientists need to tell a global network of telescopes where in the sky to point as quickly as possible, so they can catch any electromagnetic light that may also have come from the gravitational wave event. The current algorithm used to locate gravitational wave sources in real time takes a few seconds to run; more accurate methods usually take several hours. The light generated by gravitational wave events can be very short-lived at certain wavelengths—short gamma ray bursts, for example, last a mere 2-3 seconds—so scientists need methods that can process huge volumes of data as quickly and accurately as possible.
The idea behind deep learning is simple: it’s an algorithm designed to mimic the functioning of neurons in our brain to carry out tasks, like categorising observed stimuli. This is done by making the network learn the correlations between a labelled input dataset and the output it is trying to predict. Just as electric signals flow through the synapses between neurons in our brain, the input information provided to an Artificial Neural Network travels through layers of nodes (usually several layers deep), with each layer introducing some non-linearity to the input. This non-linearity helps the network learn complex features of the data. The ‘learning’ happens through a rigorous ‘training’ of the network. During training, the predictions of the network are compared with the true values, and the parameters of the network are adjusted to minimise any erroneous gaps.
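To make this concrete, here is a minimal sketch of the training loop described above: a tiny network with one hidden layer of non-linear (sigmoid) nodes, whose weights are nudged after each example to shrink the gap between prediction and true label. This is an illustrative toy (learning the XOR function), not the architecture used in the published study; all sizes and learning rates are arbitrary choices.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    """Non-linear 'activation' applied at each node."""
    return 1.0 / (1.0 + math.exp(-x))

# Toy network: 2 inputs -> 4 hidden nodes -> 1 output.
n_in, n_hid = 2, 4
W1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
b1 = [0.0] * n_hid
W2 = [random.uniform(-1, 1) for _ in range(n_hid)]
b2 = 0.0

# Labelled input dataset: XOR of the two inputs.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    """Pass the input through the layers of the network."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    y = sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)
    return h, y

def mean_loss():
    """Mean squared gap between predictions and true labels."""
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

lr = 0.5
initial_loss = mean_loss()
for _ in range(5000):                      # the 'training' of the network
    for x, t in data:
        h, y = forward(x)
        dy = 2 * (y - t) * y * (1 - y)     # gradient at the output node
        for j in range(n_hid):
            dh = dy * W2[j] * h[j] * (1 - h[j])
            W2[j] -= lr * dy * h[j]        # adjust parameters to shrink error
            for i in range(n_in):
                W1[j][i] -= lr * dh * x[i]
            b1[j] -= lr * dh
        b2 -= lr * dy
final_loss = mean_loss()
```

After training, `final_loss` is smaller than `initial_loss`: the weight adjustments have reduced the erroneous gaps the paragraph describes.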
In their recently published paper, Chatterjee and the team from UWA successfully trained the Artificial Neural Network to learn the input data for source localisation. The data was pre-processed to extract the important physical parameters from simulated gravitational wave signals injected into random noise. The network classified these signals into several classes and accurately identified the source direction of the gravitational waves. The model localised the test samples much faster than other methods and at a low computational cost. The researchers plan to extend this work to pairs of merging neutron stars and neutron star-black hole systems.
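The two ingredients mentioned here—injecting simulated signals into random noise, and turning sky localisation into a classification problem—can be sketched as follows. This is a hypothetical illustration, not the team’s actual pipeline: the sky grid, the toy ‘chirp’ waveform, and all parameter values are assumptions made for clarity.

```python
import math
import random

random.seed(1)

# Hypothetical coarse sky grid: split right ascension (0-24 h) and
# declination (-90 to +90 deg) into bins, so each sky patch is one class
# for the network to predict.
RA_BINS, DEC_BINS = 12, 6

def sky_class(ra_hours, dec_deg):
    """Map a sky position to a class index (0 .. RA_BINS*DEC_BINS - 1)."""
    i = min(int(ra_hours / 24.0 * RA_BINS), RA_BINS - 1)
    j = min(int((dec_deg + 90.0) / 180.0 * DEC_BINS), DEC_BINS - 1)
    return j * RA_BINS + i

def make_sample(n=256, snr=5.0):
    """One labelled training example: a toy chirp-like waveform
    injected into Gaussian random noise, labelled with the class of
    its randomly drawn sky position."""
    ra, dec = random.uniform(0, 24), random.uniform(-90, 90)
    # Frequency sweeps upward over time, loosely mimicking a binary merger.
    signal = [math.sin(2 * math.pi * (0.01 * t + 0.0004 * t * t))
              for t in range(n)]
    strain = [snr * s + random.gauss(0, 1) for s in signal]
    return strain, sky_class(ra, dec)

strain, label = make_sample()
```

A network trained on many such (strain, label) pairs learns to predict the sky patch directly from the noisy data, which is what makes the approach so fast at run time: localisation becomes a single forward pass rather than an hours-long parameter search.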
Chatterjee says: ‘Hopefully the methods we introduce can also be translated to other areas of research and industry and help further untap the seemingly limitless potential of deep learning and artificial intelligence’.
OzGrav’s Chief Investigator Professor Linqing Wen who led the study says: ‘The future is wide open for gravitational wave discovery using the machine learning technique’.
This paper is the outcome of a multi-disciplinary collaboration between UWA’s Gravitational Wave Astronomy Group, led by OzGrav’s Chief Investigator Professor Linqing Wen, data intensive astronomy expert Kevin Vinsen from the International Centre for Radio Astronomy Research (ICRAR), and Professor Amitava Datta of UWA’s Computer Science and Software Engineering.
Link to publication: https://journals.aps.org/prd/abstract/10.1103/PhysRevD.100.10302