Questions & Answers
WG Reviewer: Alexander Kappas
Renormalization:
Test statistics distribution: I am not sure I understand why this plot shows “the good compliance of scrambled background with simulated background maps”. Is it because it is centered around 0? What would this plot look like if you used simulated background for <C_l,bg^eff> instead of scrambled data (I guess that’s what you do), or a single simulated background that is then scrambled?
- In fact, this is a problem that originally showed up in Lisa’s analysis of the IC79 sample after she started using the energy weights. In particular, the Monte Carlo TS distribution of the background becomes very broad because of large fluctuations in C_l^eff. In contrast, by using scrambled experimental data we have the same zenith and energy information for each map, and hence the peak is very narrow. Lisa has done a detailed analysis for IC79 on her wiki page showing that renormalization of the TS gives better agreement between MC and scrambled experimental data and suppresses the large fluctuations of the Monte Carlo maps.
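As an illustration only, the renormalization idea can be sketched as follows, assuming the TS is built from per-multipole deviations of C_l^eff from the scrambled-background expectation, normalized by the scrambled-background spread (all numbers below are toy values, not the analysis configuration):

```python
import numpy as np

rng = np.random.default_rng(42)
n_maps, l_max = 1000, 100

# Toy stand-ins for C_l^eff spectra: many scrambled experimental maps
# (narrow spread) and one observed map -- illustrative values only.
cl_bg = rng.normal(loc=1.0, scale=0.1, size=(n_maps, l_max))
cl_obs = rng.normal(loc=1.0, scale=0.1, size=l_max)

# Renormalize each multipole by the scrambled-background mean and spread,
# so large absolute fluctuations at individual l no longer dominate the TS.
mean_bg = cl_bg.mean(axis=0)
std_bg = cl_bg.std(axis=0)
ts = np.sum((cl_obs - mean_bg) / std_bg)  # signed deviations summed over l
```

With this normalization, multipoles with intrinsically large fluctuations are down-weighted instead of broadening the TS distribution.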
Weights:
I don’t understand the sentence “The weight distributions differs in shape from the spectral distribution, …” Which distributions do you compare, and where does the shape difference manifest itself?
- It is a comparison to the above plot of the (typical) C_l^eff power spectra. The shape difference can be seen at small l: whereas the C_l^eff distribution is a monotonically decreasing function with C_l^eff > 0 for l → 0, the TS weight distribution has a maximum at l ~ 50 and then goes to 0 for l → 0, since it is dominated by the large fluctuations of the background maps at large angular scales (small l).
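A toy sketch of this shape difference, assuming the weights are built as the mean signal-minus-background deviation divided by the background spread (the functional forms and numbers are purely illustrative, not taken from the analysis):

```python
import numpy as np

l = np.arange(1, 301)

# Toy spectra: background C_l^eff falls monotonically toward large l,
# but its fluctuations blow up at large angular scales (small l).
cl_bg_mean = 1.0 / l
cl_bg_std = 5.0 / l**1.5                     # large spread for l -> 0
cl_sig_mean = 1.0 / l + 0.3 * np.exp(-0.5 * ((l - 50) / 15) ** 2)

# Weight: mean deviation over background spread. The diverging small-l
# fluctuations drive the weight to 0 for l -> 0, producing a maximum
# at intermediate l even though C_l^eff itself decreases monotonically.
w = (cl_sig_mean - cl_bg_mean) / cl_bg_std
```

Here the weight vanishes at small l despite C_l^eff being largest there, which is the shape difference described above.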
Also, I don’t understand why, for different numbers of sources (which I guess are distributed uniformly over the sky), the difference to the background always peaks at l ~ 50. l defines the angular scale of the structure, and that should change with the number of sources, shouldn’t it?
- Since the solid angle of each single source is given by the point spread function (an average PSF angle of ~1°, equivalent to a solid angle of 7e-5 · 4π), this is the relevant angular scale for the multipole analysis. The shape would only start to differ if we injected more and more overlapping point sources, i.e. O(10^4) sources with at least one event per source. Since we do not have that many astrophysical neutrinos, this is not the case, and the maximum will always be at l ~ 50.
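The quoted numbers can be checked with the standard cone solid-angle formula, taking the ~1° average PSF angle as the cone's half-opening angle:

```python
import numpy as np

# Solid angle of a cone with half-opening angle equal to the average
# PSF angle of ~1 degree: Omega = 2*pi*(1 - cos(theta)).
theta = np.deg2rad(1.0)
omega = 2 * np.pi * (1 - np.cos(theta))
fraction = omega / (4 * np.pi)   # fraction of the full sky, ~7.6e-5

# Roughly 1/fraction non-overlapping sources would tile the sky,
# consistent with the O(10^4) overlap scale quoted above.
n_max = 1 / fraction             # ~1.3e4
```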
Skymap Simulation:
How do you inject signal events? Is this with a Poisson PDF with mean mu_tot?
- Well, no. I will try to explain it briefly here, although the explanation in the section Simulation of Skymaps is probably more detailed. The problem with using mu_tot for the Poisson PDF is that the detector configuration changes between the samples (years), so by just using mu_tot we cannot account for the different zenith acceptances of the different samples. The problem is solved by simulating the skymaps separately for each year. Once the source strength is fixed for one sample, we can go from year to year by simply using the effective-area ratios at the given position. For visualization it may also help to look at page 4 of the presentation I prepared for the point source call on 26 June (you can find it under “Talks” on the Wikipage).
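A minimal sketch of the per-year scaling, assuming Poisson draws per sample and purely hypothetical effective-area values at a given source position (the sample names and numbers are illustrative, not the analysis values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-sample effective areas at the source position
# (arbitrary units); one sample serves as the reference that fixes
# the source strength.
a_eff = {"IC59": 0.8, "IC79": 1.0, "IC86": 1.2}
ref = "IC79"
mu_ref = 5.0   # mean number of injected events in the reference sample

# Scale the expectation for each year by the effective-area ratio at
# the source position, then draw the injected counts per sample.
n_injected = {
    sample: rng.poisson(mu_ref * area / a_eff[ref])
    for sample, area in a_eff.items()
}
```

This way the different zenith acceptances enter through the per-year effective areas rather than through a single mu_tot.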
Performance:
Doesn’t the above-mentioned crossing point actually tell us that the diffuse cosmic neutrino flux (under the assumptions made for the source distribution, strength, etc.) is produced by at least 1000 sources, because otherwise it is excluded by your analysis?
- I would probably put it the other way around: under the given assumptions we need fewer than ~10^3 sources to get a 90% CL rejection of the background-only hypothesis in 50% of the cases. But yes, in general I think you are right. I’m just trying to be careful with the word “exclusion” here.
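The “90% CL rejection in 50% of the cases” criterion can be sketched on toy TS distributions (the Gaussian shapes and parameters below are illustrative stand-ins, not the analysis distributions):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy TS trials: background-only maps and maps with some hypothetical
# number of injected sources.
ts_bg = rng.normal(0.0, 1.0, 10_000)
ts_sig = rng.normal(2.0, 1.0, 10_000)

# 90% CL rejection of the background-only hypothesis in 50% of cases:
# the median signal TS must exceed the 90% quantile of the background.
threshold = np.quantile(ts_bg, 0.9)
power = np.mean(ts_sig > threshold)       # fraction of rejecting trials
passes = np.median(ts_sig) > threshold    # the 50%-of-cases criterion
```

The crossing point is then the number of sources at which `passes` switches from True to False.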
External Reviewer: Summer Blot
I just want to clarify - the background-only MC sky maps are validated by using scrambled experimental data?
- Yes, that’s correct.
I didn’t see anything about systematic uncertainties, except that you avoid a_l^m=0 to minimize your zenith systematic uncertainty. How are you explicitly handling systematics? Which ones are important? Or is your sensitivity/discovery potential dominated by statistical uncertainties? Related note - Looking through your backup plots I see that you are calculating your sensitivity with two values of gamma. Is that the standard way to estimate the impact of that uncertainty in this kind of analysis?
- In fact, it is the standard procedure in the PS WG to estimate the systematic uncertainties for the relevant parameter space post-unblinding. These calculations are based on MC samples including the relevant systematic variations, mainly deviations in the absorption and scattering coefficients or the DOM efficiency. In order to determine the influence of changes in these parameters, the whole procedure of simulation and analysis has to be repeated on these MC samples, since the point spread function and the effective area needed in the skymap simulation can also change. Finally, the deviations (in comparison to the best-fit sample) are added in quadrature to deduce an overall systematic error. For point source analyses this value is typically of the order of 10%.
Compared to that, the only statistical error is the error on the fit parameters used to describe the test-statistic distribution. For each fit parameter it is of the order of 2%, and therefore presumably smaller than the systematic error.
The reason for showing results for two gamma values is a different one. On the one hand, we have the gamma=2.0 spectral index, which is used as a benchmark value to make the performance of all the different analyses on different samples comparable to each other. On the other hand, we use gamma=2.13, since it is the best fit from the multi-year diffuse muon sample. In fact, the flux parameters are not included as systematic errors, since they are fixed values in the analysis which might not be optimal in case the flux has a completely different spectral index, but this is not a systematic effect. Additionally, the possibility of obtaining a fake sensitivity was tested in the previous IC79 multipole analysis. It turns out that after renormalization the analysis is stable against changes in the flux normalization and spectral index.
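The quadrature combination mentioned above can be illustrated with hypothetical individual shifts (the names and percentages below are placeholders, not the measured systematic deviations):

```python
import numpy as np

# Hypothetical individual systematic shifts relative to the best-fit
# sample, as fractional deviations (illustrative values only).
shifts = {"absorption": 0.06, "scattering": 0.05, "dom_efficiency": 0.05}

# Add the individual deviations in quadrature to get the overall
# systematic error, here landing near the quoted ~10% scale.
total_sys = np.sqrt(sum(v**2 for v in shifts.values()))
```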