My LinkedIn feed presented me with a write-up of work done at the US Department of Energy’s Pacific Northwest National Laboratory by a website called Verdic, titled “A deep neural network is being harnessed to analyse nuclear events”:
[T]he data is shrouded in external noise, which can hinder the discovery of more uncommon signals. Even a light switch being turned on in a building can produce noise and subsequently affect the data.
It’s not actually very informative – it doesn’t say how “deep” the network is – but since it is
running on a standard desktop computer
I suspect not too deep. Although it is of course entirely possible that it “runs on a desktop computer” only for the (much cheaper) classification task and needs to be trained on something more powerful.
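If that is the case, the split would look something like this: train once on the big machine, ship the weights, and let the desktop do nothing but cheap forward passes. A minimal PyTorch sketch – the architecture, the pulse length of 256 samples and the file name are all my inventions, not anything from the write-up:

```python
# Hypothetical illustration of the train-big / deploy-small split; nothing
# here comes from the write-up, the architecture and shapes are made up.
import torch
import torch.nn as nn


def make_model() -> nn.Sequential:
    # Toy pulse classifier; assume each pulse is a vector of 256 samples.
    return nn.Sequential(
        nn.Linear(256, 64),
        nn.ReLU(),
        nn.Linear(64, 2),  # "good" pulse vs. "bad" pulse
    )


# --- on the big machine ---
device = "cuda" if torch.cuda.is_available() else "cpu"
model = make_model().to(device)
# ... expensive training loop goes here ...
torch.save(model.state_dict(), "pulse_classifier.pt")

# --- on the desktop: only the cheap forward pass ---
cpu_model = make_model()
cpu_model.load_state_dict(torch.load("pulse_classifier.pt", map_location="cpu"))
cpu_model.eval()
with torch.no_grad():
    label = cpu_model(torch.randn(1, 256)).argmax(dim=1)  # classify one pulse
```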
This doesn’t stop the write-up from proclaiming that
Deep learning is likely to become the AI technology that allows cognitive systems to surpass human intelligence for specific applications.
And I have to admit that I really don’t understand the described training procedure:
A sample of 32,000 pulses was used to adapt the network, programming it to learn the changing features the pulses exhibited that would be critical when interpreting the data. Jesse Ward then sent over thousands of additional pulses so that the network could begin to deduce what signals were good and which were bad; as time progressed, the more complex the pulses became.
What exactly is the difference between how the first 32k pulses were used and how the thousands that followed were? It sounds to me a bit as if the first step is feature construction – maybe using an auto-encoder network – and the second one discriminative learning. But yeah, far from clear.
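If that guess is right, the two phases would look roughly like the following PyTorch sketch – unsupervised auto-encoder pre-training on the 32k pulses, then a discriminative pass on labelled ones. Everything concrete here (layer sizes, pulse length, label counts, number of epochs) is invented for illustration:

```python
# Speculative reconstruction of the two-phase procedure: auto-encoder feature
# learning first, discriminative fine-tuning second. All data is random noise
# and all sizes are invented; this is my reading, not PNNL's published method.
import torch
import torch.nn as nn
import torch.nn.functional as F

pulses = torch.randn(32_000, 256)  # stand-in for the 32k unlabelled pulses

encoder = nn.Sequential(nn.Linear(256, 32), nn.ReLU())
decoder = nn.Linear(32, 256)
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()))

# Phase 1: learn pulse features by reconstructing the inputs.
for _ in range(10):
    loss = F.mse_loss(decoder(encoder(pulses)), pulses)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Phase 2: "thousands of additional pulses", now with good/bad labels;
# train a classifier on top of the learned features.
labelled = torch.randn(4_000, 256)
labels = torch.randint(0, 2, (4_000,))  # 1 = good signal, 0 = bad
classifier = nn.Sequential(encoder, nn.Linear(32, 2))
opt2 = torch.optim.Adam(classifier.parameters())

for _ in range(10):
    loss = F.cross_entropy(classifier(labelled), labels)
    opt2.zero_grad()
    loss.backward()
    opt2.step()
```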
There’s unfortunately no link to any scientific publication in the write-up, so I’ll add one from 2015: Bockermann et al., Online Analysis of High-Volume Data Streams in Astroparticle Physics (pdf), which tackles the problem of
A central problem in all these experiments is the distinction of the crucial gamma events from the background noise that is produced by hadronic rays and is inevitably recorded. This task is widely known as the gamma-hadron separation problem and is an essential step in the analysis chain.
i.e. something very similar to the problem described above. They did it by moving the whole analysis chain to an online, streaming setting – but I’ll let the paper speak for itself.
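Stripped of the physics, the gamma-hadron separation is a binary classification task over per-event features, which you can prototype in a few lines. A toy scikit-learn sketch – the data is synthetic and the random forest is my stand-in, not necessarily the model from the paper:

```python
# Gamma-hadron separation reduced to its machine-learning core: binary
# classification of events. The features are synthetic and the random forest
# is my choice of classifier, not necessarily the paper's exact setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 10_000
X = rng.normal(size=(n, 8))  # 8 made-up per-event features
# Fabricated ground truth: 1 = gamma event, 0 = hadronic background.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```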
Enjoy the read!