By identifying all hits possibly caused by atmospheric muons (triggering ones and sub-threshold) and also cutting out the correlated (bursty) noise hits, we retrieve a data set of hits that represents the Poissonian part of the noise in the detector. This is the basis for a corrected significance statement for any supernova candidate trigger as well as a chance to improve our supernova detection capabilities beyond the Milky Way.
In hadronic cascades - in contrast to electromagnetic showers - a multitude of neutrons is produced. These neutrons subsequently scatter down to thermal energies and can get captured by hydrogen atoms in the ice. The de-excitation following the capture leads to a delayed photon signal which follows an exponential decay with a decay time of approximately 200 microseconds. Taking HitSpool data around every HESE event will allow us to record this delayed signal and thus provides the possibility to determine the hadronic and electromagnetic fractions of cascade events. Based on this, interaction types can, for example, be distinguished and the energy reconstruction can be refined.
A neutrino signal is expected to arise from hadron acceleration in solar flares. These flare neutrinos would confirm the hadronic nature of solar flares. In combination with photons this process would give insight into the acceleration mechanism(s). Neutrino observatories could help to constrain current parameters in solar flare physics.
Monitoring values for HitSpool data taking and processing. SNDAQ monitoring could be included here as well; eventually this is to be integrated into I3Live.
We summarize here the analyses of HitSpool data related to supernova candidate triggers. We furthermore show the 2 ms fast analysis data stream of SNDAQ.
The displayed significance values ξ are calculated from HitSpool data after subthreshold muon correction, using the entire available data range [-30 s, +90 s] around the trigger time. The size of the data point indicates the binning in which the candidate triggered (small = 500 ms, medium = 1.5 s, large = 4.0 s).
Entire list of completely processed datasets
In summary, we have four levels of hit identification, starting from all recorded hits up to hits cleaned of triggering and subtrigger muons. In order to account for the bursty noise in IceCube, we apply a non-paralyzing deadtime to all hits, just as done in SNDAQ.
The hits that survive all of the criteria mentioned below are considered to be the noise hit sample that we investigate. The idea is that a supernova signal will present itself as an enhancement of this noise rate. By subtracting all hits possibly caused by atmospheric muons we hope to reduce the rate of false-positive alerts and to detect signals even at lower significances.
All (SLC and HLC) hits from all DOMs present during data taking.
This hit category is basically comparable to the standard supernova scalers. An artificial deadtime of 247.5 μs is applied to the hit series of every DOM; only hits surviving this deadtime are considered in this group (see the sketch after this list).
Hits causing standard triggers (SMT3, SMT8, string and volume triggers) are identified by re-applying the trigger conditions to all hitspool hits and tagging the ones responsible for a trigger launch. Subtracting these hits from the rest forms the NoTrig group and eliminates the hits from triggering atmospheric muons from the data set.
Hits in the detector that are caused by atmospheric muons below the trigger threshold are identified by using the HiveSplitter cluster algorithm in combination with NoiseEngine.
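To make the deadtime step concrete, below is a minimal sketch of how a non-paralyzing artificial deadtime of 247.5 μs can be applied to the hit series of a single DOM. The function name and data layout are illustrative only; this is not the actual SNDAQ or hitspool processing code.

```python
# Minimal sketch of a non-paralyzing artificial deadtime (illustrative only,
# not the actual SNDAQ/hitspool implementation).  A hit is accepted if it
# arrives at least `deadtime_us` after the last *accepted* hit of the same DOM;
# rejected hits do not extend the deadtime.
DEADTIME_US = 247.5  # artificial deadtime in microseconds

def apply_deadtime(hit_times_us, deadtime_us=DEADTIME_US):
    """Return the hit times (one DOM, in microseconds) surviving the deadtime."""
    surviving = []
    last_accepted = None
    for t in sorted(hit_times_us):
        if last_accepted is None or t - last_accepted >= deadtime_us:
            surviving.append(t)
            last_accepted = t
    return surviving

# Example: the two hits within 247.5 us of the first accepted hit are dropped.
print(apply_deadtime([0.0, 100.0, 200.0, 300.0, 600.0]))  # -> [0.0, 300.0, 600.0]
```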
Performed on a sliding time window, as done in the online SNDAQ analysis. Since hitspool data usually cover only [-30 s, +60 s] around the trigger time, we perform two significance calculations: one considering a symmetrical time window [-30 s, +30 s] and one using the full data range, i.e. an asymmetrical time window [-30 s, +60 s]. Both analyses are done with all four data types (see above), but only in the bin size in which the original SNDAQ scaler online analysis triggered.
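As an illustration of the sliding-window idea (not the actual SNDAQ likelihood or its exact significance definition), the sketch below compares the summed hit count in a search window of the trigger bin size against the mean and spread estimated from the neighbouring bins; all names and the background-window choice are assumptions.

```python
import numpy as np

# Illustrative sliding-window significance (simplified stand-in for SNDAQ):
# counts      -- hit counts per base bin (e.g. the cleaned noise hits)
# search_bins -- width of the search window in base bins (the trigger bin size)
# bg_bins     -- base bins on each side of the search window used as background
def sliding_significance(counts, search_bins, bg_bins):
    counts = np.asarray(counts, dtype=float)
    sig = np.full(len(counts), np.nan)
    for i in range(bg_bins, len(counts) - search_bins - bg_bins):
        observed = counts[i:i + search_bins].sum()
        bg = np.concatenate([counts[i - bg_bins:i],
                             counts[i + search_bins:i + search_bins + bg_bins]])
        expected = bg.mean() * search_bins
        spread = np.sqrt(bg.var() * search_bins) + 1e-9  # avoid division by zero
        sig[i] = (observed - expected) / spread
    return sig
```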
The supernova flux parameterization by Pagliaroli et al. (here and here) is a minimal multi-parameter model that can be fit to the lightcurve in order to obtain valuable insight into the underlying process, such as the signal onset time, the rise time τr, the accretion time constant τa and the cooling time constant τc.
Since the full parameterization mentioned above is not needed in order to determine the signal onset and the rise time τr, we fit a simpler expression to the rising edge of the signal, following the model of Halzen & Raffelt for the signal onset (read more here).
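A minimal fitting sketch for this rising-edge fit is given below; the saturating-exponential form, parameter names and starting values are assumptions for illustration and do not reproduce the exact expression used in the analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed simple rising-edge model: flat background before the onset t0,
# then an exponential rise with rise time tau_r towards r_bg + amp.
def rising_edge(t, r_bg, amp, t0, tau_r):
    dt = np.clip(np.asarray(t, dtype=float) - t0, 0.0, None)  # zero before onset
    return r_bg + amp * (1.0 - np.exp(-dt / tau_r))

def fit_onset(t, rate):
    """t: bin centres [s], rate: summed detector rate per bin [Hz]."""
    r_bg0 = np.median(rate[: len(rate) // 4])            # pre-onset background level
    p0 = (r_bg0, max(rate) - r_bg0, t[len(t) // 2], 0.1)  # crude starting values
    popt, pcov = curve_fit(rising_edge, t, rate, p0=p0)
    return popt, np.sqrt(np.diag(pcov))                   # best fit and 1-sigma errors
```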
Lightcurves are produced from all four data groups mentioned above, each in a 6-panel plot representing the six binnings in which the data are grouped: 1 ms, 2 ms, 500 ms, 1.5 s, 4 s and 10 s wide bins. The bin size in which the candidate triggered SNDAQ is underlaid in red.
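A sketch of how such a 6-panel lightcurve figure could be produced from a hit-time array is shown below (matplotlib-based; the function, colour choice and data layout are illustrative assumptions).

```python
import numpy as np
import matplotlib.pyplot as plt

BINNINGS = [0.001, 0.002, 0.5, 1.5, 4.0, 10.0]  # the six bin widths in seconds

def plot_lightcurves(hit_times, t_min, t_max, trigger_binsize=None):
    """hit_times: hit times relative to the trigger [s] for one data group."""
    fig, axes = plt.subplots(3, 2, figsize=(10, 8), sharex=True)
    for ax, width in zip(axes.ravel(), BINNINGS):
        edges = np.arange(t_min, t_max + width, width)
        counts, _ = np.histogram(hit_times, bins=edges)
        ax.step(edges[:-1], counts / width, where="post")
        ax.set_title("%g s binning" % width)
        if width == trigger_binsize:
            ax.set_facecolor("#ffecec")  # mark the binning in which SNDAQ triggered
    fig.supxlabel("time relative to trigger [s]")
    fig.supylabel("rate [Hz]")
    return fig
```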
The in-ice detector (without DeepCore) is grouped in sets of 100 DOMs with a bin size of 500 ms and displayed in a sequence to illustrate the local hit distribution during the given time window. All hits refers to all hitspool hits, whereas cleaned hits are those hits surviving deadtime, trigger correction and subtrigger subtraction.
The SNDAQ 2 ms datastream is transferred North for supernova candidate triggers with ξ_orig > 7.30, or with ξ_orig > 4.00 and ξ_corr > 7.30.
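Written out as a condition (a trivial helper that simply restates the thresholds above):

```python
def forward_2ms_stream(xi_orig, xi_corr):
    """True if the SNDAQ 2 ms datastream is to be transferred North."""
    return xi_orig > 7.30 or (xi_orig > 4.00 and xi_corr > 7.30)
```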
Lightcurves for these alerts will also be presented here. If a HitSpool data set is available for the SN alarm as well, we provide both on the same page for convenience.
In hadronic cascades - in contrast to electromagnetic showers - a multitude of neutrons is produced. These neutrons subsequently scatter down to thermal energies and can get captured by hydrogen atoms in the ice. The de-excitation following the capture leads to a delayed photon signal which follows an exponential decay with a decay time of approximately 200 microseconds. Taking HitSpool data around every HESE event will allow us to record this delayed signal and thus provides the possibility to determine the hadronic and electromagnetic fractions of cascade events. Based on this, interaction types can, for example, be distinguished and the energy reconstruction can be refined.
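As a rough illustration of how the delayed component could be extracted (all function names, binning and starting values are assumptions, not an official analysis script), one could fit the post-event hit-time histogram with a flat noise floor plus an exponential with the ~200 μs capture time:

```python
import numpy as np
from scipy.optimize import curve_fit

TAU_CAPTURE_US = 200.0  # approximate decay time of the delayed signal (see text)

# Flat noise floor plus an exponentially decaying neutron-capture component.
def delayed_profile(t_us, noise, amp):
    return noise + amp * np.exp(-np.asarray(t_us, dtype=float) / TAU_CAPTURE_US)

def fit_delayed_component(t_us, counts):
    """t_us: bin centres of time after the HESE event [us]; counts: hits per bin.
    Returns the best-fit (noise, amp) and their 1-sigma uncertainties."""
    counts = np.asarray(counts, dtype=float)
    p0 = (np.median(counts), max(counts[0] - np.median(counts), 1.0))
    popt, pcov = curve_fit(delayed_profile, t_us, counts, p0=p0)
    return popt, np.sqrt(np.diag(pcov))
```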
A neutrino signal is expected to arise from hadron acceleration in solar flares. These flare neutrinos would confirm the hadronic nature of solar flares. In combination with photons this process would give insight into the acceleration mechanism(s). Neutrino observatories could help to constrain current parameters in solar flare physics.
Monitoring and logs from HitSpool data taking via the HitSpool interface at SPS. HitSpool Moni on I3Live
HitSpool data processing logs and monitoring from the Condor job cluster, visualized.
A short checklist of what to check on a weekly basis with respect to processing and data taking of HitSpool data.
The various issues that may arise in HitSpool data grabbing or processing are documented here.