A very unclear paper?

  • #1
ChrisVer
Gold Member
Trying to move here a discussion about a paper that I had elsewhere...
I'm referring to this paper:
https://arxiv.org/abs/1604.05394
and in particular Table 2, the mass of the 2nd heavy neutralino [itex]m_{\tilde{\chi}_2}[/itex] is negative: [itex]m_{\tilde{\chi}_2}=-135.3 \text{ GeV}[/itex].
Can it be a typo? (It has to be.)

Also something more, concerning the light scalar Higgs [itex]m_{H_1}=93.8\ \text{GeV}[/itex]: I was told that LEP studied that mass range extensively [without any evidence of a new particle]. Since the paper states that they also used LEP results in their fits, how could they construct such a benchmark model? In particular, how could such a field still exist there while evading the previous searches (how exotic would its properties need to be)?

Finally, going into the "experiment" part of the paper, they give the plots of the centrality fraction [itex]f_{cent} = \frac{E^{\Delta R<0.1}}{E^{\Delta R<0.2}}[/itex] and the ratio of the 2- over 1-subjettiness, [itex]\tau_{21}[/itex].
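(For concreteness, here is a minimal sketch of how I understand [itex]f_{cent}[/itex] is computed, my own illustration from the definition above and not the paper's code; constituent energies and (eta, phi) positions are assumed given:)
[code]
import numpy as np

def f_cent(energies, etas, phis, jet_eta, jet_phi):
    """Centrality fraction f_cent = E(dR<0.1) / E(dR<0.2) around the jet axis."""
    dphi = np.mod(phis - jet_phi + np.pi, 2 * np.pi) - np.pi  # wrap to [-pi, pi]
    dr = np.hypot(etas - jet_eta, dphi)                       # DeltaR to the jet axis
    e_outer = energies[dr < 0.2].sum()
    return energies[dr < 0.1].sum() / e_outer if e_outer > 0 else 0.0
[/code]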
My question mainly comes from their comment that:
jets issued from the fragmentation of light quarks are always harder to distinguish from ditau boosted objects, as their properties are similar to the ditau case (see Figure 3).
To be honest, from Figure 3 I see the complete opposite: by eye (not even using a BDT) I can cut away most of the "QCD" component, something I can generally do in all the plots for the tau-leptons.


subnote:
Also, I would be interested in hearing from someone what the boosted ditau tagging is (such as the one shown in the Fig. 7 left panel; it's not really explained in the paper), and whether anyone has a clue how the 3% and 1% systematic uncertainties on the background (not even conservative) in the right panel were obtained.
 

Answers and Replies

  • #2
Trying to move here a discussion about a paper that I had elsewhere...
I'm referring to this paper:
https://arxiv.org/abs/1604.05394
and in particular Table 2, the mass of the 2nd heavy neutralino [itex]m_{\tilde{\chi}_2}[/itex] is negative: [itex]m_{\tilde{\chi}_2}=-135.3 \text{ GeV}[/itex].
Can it be a typo? (It has to be.)
I don't think it is a typo. The physical mass is of course positive, but the neutralino mass matrix can have negative eigenvalues.

Also something more, concerning the light scalar Higgs [itex]m_{H_1}=93.8\ \text{GeV}[/itex]: I was told that LEP studied that mass range extensively [without any evidence of a new particle]. Since the paper states that they also used LEP results in their fits, how could they construct such a benchmark model? In particular, how could such a field still exist there while evading the previous searches (how exotic would its properties need to be)?
Well, they consider an extension of the MSSM with the addition of a Standard Model singlet. The [itex]A_1[/itex] and [itex]H_1[/itex] in the table are the pseudoscalar and scalar that are almost pure singlet, and therefore interact very little with the known SM particles. So the LEP limit doesn't necessarily apply.
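Schematically (my notation, not necessarily the paper's): if the mostly-singlet mass eigenstate is [itex]H_1 \approx \cos\theta \, S + \sin\theta \, h[/itex], with [itex]S[/itex] the singlet and [itex]h[/itex] a doublet state, then every coupling of [itex]H_1[/itex] to SM particles is rescaled by [itex]\sin\theta[/itex], so the LEP Higgsstrahlung cross section [itex]\sigma(e^+e^- \to Z H_1) \propto \sin^2\theta[/itex] can be pushed below the LEP limit for small enough mixing, even at [itex]m_{H_1}\approx 94 \text{ GeV}[/itex].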

Finally, going into the "experiment" part of the paper, they give the plots of the centrality fraction [itex]f_{cent} = \frac{E^{\Delta R<0.1}}{E^{\Delta R<0.2}}[/itex] and the ratio of the 2- over 1-subjettiness, [itex]\tau_{21}[/itex].
My question mainly comes from their comment that:
To be honest, from Figure 3 I see the complete opposite: by eye (not even using a BDT) I can cut away most of the "QCD" component, something I can generally do in all the plots for the tau-leptons.
I think they mean that standard tau tagging techniques (like [itex]f_{cent}[/itex]) separate single taus from QCD jets better than they separate ditau jets from QCD jets (as seen in Figure 3).


subnote:
Also, I would be interested in hearing from someone what the boosted ditau tagging is (such as the one shown in the Fig. 7 left panel; it's not really explained in the paper), and whether anyone has a clue how the 3% and 1% systematic uncertainties on the background (not even conservative) in the right panel were obtained.
Ditau tagging tries to take advantage of the two-prong structure inside the jet (by using [itex]\tau_{21}[/itex], for example, which measures how much better a jet's substructure is described by two prongs rather than one) to distinguish jets with two collimated taus from standard QCD jets.
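To make that concrete, here is a minimal sketch (my own illustration, not the paper's implementation) of N-subjettiness; in practice the subjet axes come from something like exclusive kT clustering, but here they are assumed given:
[code]
import numpy as np

def tau_n(pts, etas, phis, axes, r0=1.0):
    """N-subjettiness: tau_N = sum_i pT_i * min_k dR(i, axis_k) / (R0 * sum_i pT_i).
    'axes' is a list of (eta, phi) subjet axes; r0 is the characteristic jet radius."""
    dists = []
    for ax_eta, ax_phi in axes:
        dphi = np.mod(phis - ax_phi + np.pi, 2 * np.pi) - np.pi  # wrap to [-pi, pi]
        dists.append(np.hypot(etas - ax_eta, dphi))
    mins = np.min(np.stack(dists), axis=0)  # distance of each constituent to its nearest axis
    return float((pts * mins).sum() / (r0 * pts.sum()))

# tau21 = tau_n(pts, etas, phis, two_axes) / tau_n(pts, etas, phis, one_axis)
# A small tau21 means the jet is much better described by two prongs than by one.
[/code]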
The right panel in Figure 7 shows the effect the systematics have on the sensitivity. But I agree with you, it seems to be a non-conservative, low estimate.
 
  • #3
ChrisVer
Gold Member
The physical mass is of course positive, but the neutralino mass matrix can have negative eigenvalues.
Isn't the mass matrix symmetric?

Well, they consider an extension of the MSSM with the addition of a Standard Model singlet. The [itex]A_1[/itex] and [itex]H_1[/itex] in the table are the pseudoscalar and scalar that are almost pure singlet, and therefore interact very little with the known SM particles. So the LEP limit doesn't necessarily apply.
It's written in the paper that the S and PS are almost pure singlets, interacting with the SM particles only via their mixing with the [itex]H_u, H_d[/itex] fields... However, I don't understand how exactly this works. Do they mean [itex]h_u, h_d[/itex] (the doublet scalars)? Because they denote the Higgs superfields by capital letters.

I think they mean that standard tau tagging techniques (like [itex]f_{cent}[/itex]) separate single taus from QCD jets better than they separate ditau jets from QCD jets (as seen in Figure 3).
Well, if that's what they meant, I think it's OK. But in general it seems their BDT is able to get better background rejection for the taus than for the multijets.
I believe the answer, though, is in a plot like Fig. 2, E=200, [itex]\tau_{21}[/itex]: it looks like a huge fraction of the taus is in the first bin (they even exceed the logarithmic scale of the plot). Probably this affects the result a lot?

shows the effect the systematics have on the sensitivity
Is that generally how a systematic uncertainty would alter the sensitivity? It looks like a pretty general relation, then.

by using [itex]\tau_{21}[/itex], for example, which measures how much better a jet's substructure is described by two prongs rather than one
Thanks for this clarification. I was planning to find a clear way to "describe" and understand that variable...
 
  • #4
Isn't the mass matrix symmetric?
A symmetric mass matrix (or a Hermitian one, in the complex case) can have negative eigenvalues, just not complex ones.
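As a generic illustration (not the actual neutralino matrix): the real symmetric matrix [itex]M=\begin{pmatrix} 0 & m \\ m & 0\end{pmatrix}[/itex] has eigenvalues [itex]\pm m[/itex]. The sign can be absorbed into a phase redefinition of the corresponding Majorana field, so the physical mass is [itex]|m|[/itex]; quoting the signed eigenvalue in a table is a common convention.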

It's written in the paper that the S and PS are almost pure singlets, interacting with the SM particles only via their mixing with the [itex]H_u, H_d[/itex] fields... However, I don't understand how exactly this works. Do they mean [itex]h_u, h_d[/itex] (the doublet scalars)? Because they denote the Higgs superfields by capital letters.
Because they are scalars, they will mix with the Higgs doublet scalars. Their fermionic partners would mix with the fermionic partners of the Higgs doublet scalars (the higgsinos).

But in general it seems their BDT is able to get better background rejection for the taus than for the multijets.
It is in general not surprising that in some cases ditau jets are more easily separated from single-tau jets than from QCD jets. QCD jets are much more varied in their appearance and properties, while single-tau jets have specific properties; as long as the ditau structure can be probed, a ditau jet should be well separated from the single-tau case.


I believe the answer, though, is in a plot like Fig. 2, E=200, [itex]\tau_{21}[/itex]: it looks like a huge fraction of the taus is in the first bin (they even exceed the logarithmic scale of the plot). Probably this affects the result a lot?
You mean Figure 3, E=200, [itex]\tau_{21}[/itex]?
I don't know how much this affects the result, but I don't understand the plot: single-tau jets should have a higher [itex]\tau_{21}[/itex], like QCD jets, as they have a one-prong structure.

Is that generally how a systematic uncertainty would alter the sensitivity? It looks like a pretty general relation, then.
Increasing the systematic uncertainty will always reduce the sensitivity, obviously. In their plot, the change in sensitivity is just encoded in the formula they use for the sensitivity.
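For example, with a common approximate form of the significance (the paper may use something more refined, e.g. an Asimov formula, but the trend is the same), [itex]Z = S/\sqrt{B + (\epsilon B)^2}[/itex] with [itex]\epsilon[/itex] the relative background systematic: for S = 10 and B = 100 one gets Z = 1.0 for [itex]\epsilon = 0[/itex], Z ≈ 0.96 for [itex]\epsilon = 0.03[/itex], and Z ≈ 0.71 for [itex]\epsilon = 0.10[/itex].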
 
