Notes from talks and discussions in the SUSY weeks


Tuesday, April 26


Frank Golf: New Physics Searches from Run 2 at CMS

-Inclusive all-hadronic searches now include monojet regions to improve sensitivity to compressed spectra
-Ongoing questions about the large number of search regions in CMS analyses: the choice of regions in many-region CMS searches is motivated by the background decomposition rather than by optimization for signal.
-Invisible Z background characterized with photon+jets; various questions about the validity of the extrapolation and about data/MC agreement (a schematic transfer-factor formula is given after this list).
-Sensitivity improvement in the all-hadronic search from 8 TeV to 13 TeV: the reach is extended both in gluino mass for fixed LSP mass, as expected from the cross section, and in LSP mass for fixed gluino mass. Compressed regions populate high-HT bins through high-pT ISR, often with 3+ ISR jets. Do we understand that much ISR well? The peak sensitivity does not come from high-multiplicity ISR, so the result is probably not seriously sensitive to these tails.
-Compressed spectra: CMS is whiting-out the m_stop = m_t + m_LSP corridor until various questions are resolved:
  • What do we need to understand about MC to trust transfer factors in tails of distributions?
  • Where does the sensitivity come from -- high ISR recoil, ISR multiplicity, b-jet multiplicity -- and do we trust signal MC in these regions?
  • Where does the signal yield come from?
  • How important is signal contamination, and how is it handled?
  • How important are differences in how signal and background samples are generated?
  • How can we bring in other production mechanisms or decays to probe these scenarios?
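
A schematic form of the photon+jets transfer-factor estimate discussed above (schematic notation, not the exact CMS implementation; in practice the ratio is taken bin-by-bin in the search variables and corrected for photon purity and reconstruction effects):

```latex
N^{\mathrm{SR}}_{Z\to\nu\bar{\nu}}(\text{pred}) =
N^{\mathrm{CR}}_{\gamma+\text{jets}}(\text{data}) \times R_{Z/\gamma},
\qquad
R_{Z/\gamma} =
\frac{N^{\mathrm{SR}}_{Z\to\nu\bar{\nu}}(\text{MC})}{N^{\mathrm{CR}}_{\gamma+\text{jets}}(\text{MC})}
```

The questions above are essentially about how well R_{Z/gamma} is modeled in the tails (high HT, high jet multiplicity) that drive the compressed-spectrum sensitivity.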

Discussion: Anomalies (Wed 27 Apr)


Brief summaries of prevailing anomalies in the LHC data:
  • Theory [Link] Introduction by Ian Low
  • Exp-1 [Link] Flavour, diboson, Z+jets+MET, 1-lepton stop, ATLAS diphoton by Andreas Hoecker
  • Exp-2 [Link] H -> taumu, multilepton, multilepton+b's (ttH, ttW, general), CMS diphoton by Frank Golf

On the diphoton excess it was suggested to look at the number of b-jets in the event.

Detailed discussion around the Z+jets+MET excess: generally larger backgrounds in CMS in the "ATLAS" signal region. CMS sees a softer Z spectrum and no deformation of the Z mass. CMS also has a slightly larger diboson background, and a stronger dependence of the flavour-symmetric background on the object pT cuts.

Brian Petersen's slides about Ian Low et al.'s model [1507.01601] with a light stop_R and intermediate-mass EWkinos. The model is not excluded by the current ATLAS analyses; the closest reach comes from the same-sign dilepton + 1b + jets + MET analysis.

For the multilepton-with-b-jets analyses it is crucial to improve the knowledge and MC description of ttW backgrounds.

Discussion: Holes in SUSY Searches (Thu 28 Apr)


  • RPV stop to light flavors
  • direct slepton (incl. stau) production searches, especially at low mass, just above LEP limit
  • searches where the MET aligns with a jet, i.e. signatures that slip through because of the deltaPhi requirement in the all-hadronic searches. Example: hidden valley, e.g. Zprime to a pair of dark quarks. Buzzword: semivisible jets.
  • pure weak production with no colored sparticles within reach of the LHC. e.g. approximately degenerate higgsinos, winos, or only bino
    • insult to injury: at splittings < ~1 GeV the soft lepton signature is BR suppressed to O(%) level. Gunion 990.2309
    • at splittings too large for visible lifetimes but too small to give enough kinematics for detectable soft leptons
      Small study of splitting of Higgsinos in ATLAS pMSSM models:
    • another example: 1604.07438
  • non-standard LSPs that at times give different signatures or branching fractions, e.g. sneutrinos, singlino, Dirac gauginos, … The work to understand how much of this theoretical phase space is covered has not been done: “the Aschenputtel (Cinderella) of the community”.
  • What’s the status of VBF production searches? Preliminary results SUS-14-005 and SUS-14-019 from CMS, and section 9 of the ATLAS paper 1509.07152 …
  • WH+MET, HH+MET, ZH+MET, or the equivalent with X20 -> X10 + photon: sensitivity is still very limited … i.e. generically all diboson + MET combinations in all permutations
  • multiple soft photons
  • should we split out the ISR uncertainty from the limit curves and show it separately, to more clearly indicate what the effect is?
  • weak production with hadronic RPV, i.e. colored-type searches that need to reach very low cross sections. Can jet substructure help, assuming boosted objects, given the low cross section for the amount of visible energy in the event?
  • Does the O(10) GeV sbottom mass still present a hole? This is the sbottom that does not couple to the Z.
  • Cascades that end with light pseudoscalars that ultimately decay into pions, etas, photon pairs, dimuons, …
  • Where are we with full coverage for long-lived particles? Anything that has a SM Higgs in it is worth searching for an additional displaced signature, because the Higgs is an excellent portal.
  • If we took the g-2 excess seriously, what guidance does this give for searches? Are we fully covered there?
    g-2 plot from ATLAS pMSSM paper: figure 19
  • stop at the top mass decaying to chargino plus b instead of W plus b, with the chargino degenerate with the LSP. This seems excluded already.
    Excluded by mono-jet search. See ATLAS summary paper 1506.08616
  • stop to top + LSP for stop mass near the top mass and LSP mass near zero; experiments should keep pushing via both the top-pair cross section and top spin correlations.
  • for any search that’s picking up on a Higgs, we should add h -> tautau
  • long lived squirks
  • trileptons with the same charge
    There is no veto of these in the ATLAS same-sign search
  • 1.9TeV gluino with 1TeV stop
  • stop with a gravitino LSP: the kinematics differ from the neutralino-LSP case. This affects our conclusions as to what is already excluded.
  • any RPV to light flavor
  • WW+MET without cheating by including sleptons in the decay to pump up the leptonic BR.
  • stealth mechanism in compressed spectra where the sparticle is almost degenerate with a SM particle like top or higgs or Z or W.
  • stealth-like scenarios that are stealthy because of a large multiplicity of colored particles, or even of pions and other low-mass light-flavor SM particles.
  • Are there holes in the R-hadron searches?
  • New/different signatures from SUSY in extra dimensions? E.g. three-body stop/sbottom decays to stau_R or sneutrino_L LOSP e.g. 1404.7554
  • Auto-concealment of supersymmetry in extra dimensions e.g. 1412.0805
  • Hiding MET with MET -- degenerate with compressed searches, but worth including for completeness.


Discussion: Theory-Experiment communication (Fri 29 Apr)


public data:
--for real science
--delayed in time
--open question: how much time should be spent on tracking down crackpots?

how to present MVAs.
-For search variables, one possibility is to give the code for the "black box", where an outsider could pass in the relevant variables and get the MVA discriminant value out for a given event (a minimal sketch of such an interface is given below).

-For object selection, the performance is what really matters (efficiency, fake rate, etc.). Getting to within a factor of 2 is good enough. It is often too hard to do this well for very complicated signatures; maybe worth waiting to see how the searches develop before investing serious work in implementing complex objects.
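
As an illustration of the "black box" idea mentioned above, here is a minimal sketch of what such a released interface could look like (the variable names, weights, and logistic form are invented placeholders, not any actual CMS/ATLAS MVA; a real release would ship the trained classifier itself):

```python
# Hypothetical sketch of a published MVA "black box": an outsider passes in the
# per-event input variables and gets the discriminant value back. Variable names,
# weights, and the logistic form below are invented placeholders.
import math

INPUT_VARIABLES = ["met", "ht", "n_jets", "n_bjets", "min_dphi_jet_met"]
TOY_WEIGHTS = {"met": 0.004, "ht": 0.001, "n_jets": 0.15,
               "n_bjets": 0.40, "min_dphi_jet_met": 0.8}
TOY_BIAS = -4.0

def mva_discriminant(event):
    """Return a discriminant value in (0, 1) for one event.

    `event` must supply every variable in INPUT_VARIABLES, defined exactly as
    in the analysis (object selections, units, etc.).
    """
    missing = [v for v in INPUT_VARIABLES if v not in event]
    if missing:
        raise KeyError("missing input variables: %s" % missing)
    score = TOY_BIAS + sum(TOY_WEIGHTS[v] * event[v] for v in INPUT_VARIABLES)
    return 1.0 / (1.0 + math.exp(-score))  # toy stand-in for the real MVA output

# Example usage with made-up kinematics:
print(mva_discriminant({"met": 350.0, "ht": 900.0, "n_jets": 5,
                        "n_bjets": 2, "min_dphi_jet_met": 1.2}))
```

The important part is the contract: the exact list of input variables and their definitions, plus a deterministic function from those inputs to the discriminant.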

How to handle multi-binned SUSY searches and combinations.
CMS proposes to use super signal regions, which group related signal regions that may target a similar signature. Order 10 of these super signal regions will be used to present integrated background predictions and the observed counts.
This solves the issue of needing to provide correlations, and gives a discovery-like region.
General consensus that providing detailed correlation matrices between different signal regions is more trouble than it's worth on both experimental and re-interpretive sides.
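
To make concrete how far one can get with just the integrated numbers from a super signal region, here is a minimal toy-MC CLs counting-experiment sketch (all inputs are invented placeholders; a real recast would take N_obs, B, and sigma_B from the published tables and the signal yield from lumi x xsec x acceptance x efficiency):

```python
# Toy-MC CLs limit for a single counting region (e.g. one super signal region).
# All numerical inputs below are invented placeholders, not real CMS/ATLAS numbers.
import numpy as np

rng = np.random.default_rng(42)

def cls(n_obs, b, sigma_b, s, n_toys=50_000):
    """CLs for signal yield s, given observed count and background prediction."""
    # Gaussian-constrained background, truncated at zero.
    b_toys = np.clip(rng.normal(b, sigma_b, n_toys), 0.0, None)
    n_sb = rng.poisson(b_toys + s)          # toys under signal + background
    n_b = rng.poisson(b_toys)               # toys under background only
    p_sb = np.mean(n_sb <= n_obs)           # CL_{s+b}
    p_b = np.mean(n_b <= n_obs)             # CL_b
    return p_sb / p_b if p_b > 0 else 1.0

def upper_limit(n_obs, b, sigma_b, s_max=50.0, step=0.25):
    """Smallest signal yield excluded at 95% CL (CLs < 0.05), by simple scan."""
    for s in np.arange(step, s_max, step):
        if cls(n_obs, b, sigma_b, s) < 0.05:
            return s
    return None

# Invented example: 12 events observed, 9.0 +/- 2.5 expected background.
s95 = upper_limit(n_obs=12, b=9.0, sigma_b=2.5)
print(f"~95% CL upper limit on signal yield: {s95} events")
# Compare with s = lumi * xsec * (acceptance x efficiency) for the model of interest.
```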

In general, don't let better become the enemy of good: a factor of 2 in the cross section is plenty good for reinterpretation efforts.

Simple message is more important than exceedingly great precision.

ATLAS has detailed list of reinterpretation information to go out with each result.
CMS has policies with similar information, though it is less consistently enforced. CMS plans to re-up the effort to get this information out consistently.
No major additions requested for the list of info provided by experiments in paper.

SK: there's actually a wishlist:
So there are still open points, like object efficiencies, which CMS plans to provide but which are not on the ATLAS guidelines.

Concluding comments (SK):

- the Les Houches recommendations 1a) and 3a) are more and more followed by the ATLAS+CMS SUSY groups, and there is hope that this will indeed become the standard; there's really been a lot of progress in this respect!
- the CMS SUSY group also makes an effort towards recommendation 2a), i.e. efficiencies.
- what's missing: a follow-up on the other Les Houches recommendations
- what's hard to do: make the exotics groups comply, too

My overall impression (GR): ATLAS is doing a pretty decent job. A few things we could do better:
  1. provide run cards for our signal samples
  2. publish object efficiencies in performance papers, not just data/MC scale factors
  3. more consistency; ATLAS SUSY group is pretty good, but Exotics group less so?
I see plenty of pushback for anything further, e.g. correlation matrices in shape fits, object-level efficiencies/resolution beyond what is provided in ATLAS performance papers (e.g. performance in extreme SUSY kinematic phase space). It would be helpful to provide the experiments with some motivation for providing more information than is currently being given out. How does the experiment benefit? One concrete example has been holes in current searches, but even here, just to be provocative, some would argue that no holes have been uncovered that the experiments were not already aware of. The benefit for experiments needs to be better fleshed out.

We did not discuss public release of data; we will take this up next week when Joe Incandela is around.

Discussion: Status of Naturalness (May 2 Mon)


Introductory talk slides from Monica: NaturalSUSYKITP

And here are quick notes on the topics covered during Monday morning discussion:
- Is abandoning naturalness as a guiding principle an option? Most argued against: there is no well-defined "fine-tuning wall", and there are many loopholes and ways to reduce fine-tuning (e.g. Dirac gluinos, ...), so experiments should not abandon naturalness as a search strategy
- On the other hand, experimentally one should not focus on current natural spectra
- There are still windows of lower-mass coverage for stops, and to a lesser extent for gluinos, in areas of parameter space
- Displaced gluinos: the transition between prompt and displaced decays
- A general lightest-stop mass limit usable in fine-tuning arguments is more difficult to extract than for gluinos
- Open space in stop parameter space is much wider than just a "corner", more like a "swimming pool"
- HL-LHC reach for stops and gluinos?
- 3TeV gluino 1.5TeV stop gets higgs mass right without being too unnatural
- from now to the HL-LHC there is not much room for improvement in discovery reach, while the tuning worsens by a factor of ~10 per particle
- Hiding SUSY particles in light mass "holes": tuning price of holes?
- 1TeV Higgsino motivations
- viability of sub-TeV states with TeV-scale inos -> EW production + small branching ratios
- stops decaying to t+chi0 and b+chi+ with higgsinos, with ~50%/50% branching ratios, still have poor limits if the best expected exclusions are considered (maybe covered with a full combination?)
- is there an oversimplification of current limits when quoting stop exclusions in naturalness arguments?
- reinterpretation of gluino models with stop -> charm + neutralino
- measures of fine-tuning depend on various "UV" assumptions: quantify naturalness progress as the worsening from pre-LHC to now instead of focusing on an absolute measure? (equivalent to the most conservative choice of assumptions, since in that case there was no tuning pre-LHC; see the formula after this list)
- more systematic studies of role of sbottom searches (is it needed for covering corners of parameter space?)
- ATLAS vs CMS philosophy of cutting harder and having few signal regions vs. slicing and dicing in many signal regions: performance differences in corners of sbottom and stop searches where SM background is larger?
- status of searches in a general parameter space as a function of m_Q3, m_U3, X_t (a reduced p(M)SSM, with no requirement on the 3rd-generation spectrum from the Higgs mass and no non-LHC constraints) as a manageable parameter space to assess the current status of SUSY searches vs. naturalness. Can cover longer cascades that are difficult to cover with simplified models. Differences in results between the two approaches?
- Searches for EWK-inos in VBF vs ISR - status/possibilities of cut optimizations in VBF searches for inos?
- search program/benchmarks for decoupled higgsino scenarios
- strategies for signatures falling in between the SUSY-exotica divide
- EWKino searches in NMSSM: include decays with light A involved on top of "standard" W,Z,h.
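
For reference, the fine-tuning measure behind most of these points is the Barbieri-Giudice sensitivity; the "UV" assumptions enter through the choice of which parameters p_i are treated as fundamental and at which scale they are defined (the set shown is just one common choice):

```latex
\Delta_{\mathrm{BG}} = \max_i \left| \frac{\partial \ln m_Z^2}{\partial \ln p_i} \right| ,
\qquad
p_i \in \{\mu,\ m_{H_u}^2,\ m_{H_d}^2,\ M_3,\ A_t,\ \dots\}
```

Quoting the ratio of Delta_BG now to Delta_BG pre-LHC, as suggested above, is one way to reduce the dependence on that choice.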

Discussion: Non-Conventional Signatures (Tue 3 May)

Are there any (valuable) cases where we can add a single parameter or make a small change to a well-known simplified model and generate interesting new signatures for non-conventional SUSY?
  • Don’t have to be holes, but could be things where we don’t understand the limits as well as we should
  • Single-sparticle production in RPV scenarios (resonant sneutrino to emu/etau/mutau)
    • 5 TeV single squark/stop/whatever that is too heavy for pair production (mono-top style)
  • Change the production cross section (EWK->strong) and see how our exclusion looks
  • Final states
    • Multi-jets + multi-leptons (sleptons in the decays)
    • 2 jets + lepton + no b (we cover 3 or 4)
    • Leptons + few jets (EWK style – can we have higher cross sections?)

Are there useful or interesting new simplified models for non-standard signatures?
  • We can search for a signature, but having a model to optimize the search is very helpful!
  • Resonance to 4tau, a la EWK cascade through a non-SM Higgs through aa
  • Pseudo-scalar to mumu
  • Better-motivated multi-photon searches? (Adding scalars)
  • Stau co-annihilation (soft taus + ISR + MET), or general soft leptons+MET

  • For LLP, can we define a set of simplified models such that
    • Each model might occur in nature, even if only as part of a larger model
    • Each model has a reasonable number of parameters (≤3) to specify the model
    • Each model is sufficiently generic to allow results to be re-interpreted by theorists
    • A limited set of such models (≤20) covers a large range of allowed, interesting models
  • We should try to avoid “anything + a displaced vertex” type models
  • Classes of non-conventional signatures
    • Long-lived invisible particles that decay to something visible (experimentally we break this up further)
    • Long-lived visible particles that decay to something visible
    • Long lived visible particles that decay to something invisible
    • Long-lived visible particles that do not decay and escape the detector
    • Long-lived particles that stop inside the detector and eventually decay
  • Is there any “simplified model” that covers special cases?
    • (S)Quirks? Monopoles / q-balls / high-charge bound states? Does SUSY particularly care for any of these, even in corners of phase space?
    • These kinds of final states are great candidates for being killed by cleaning cuts in normal searches; do we miss things as a result?
    • Could the cleaning of the 0L analysis cause problems for a short-lived LSP?

Should we try less model-driven measures of “simplified weirdness” on top of inclusive search results?
  • Low-rejection observables for testing events for deviations from the standard model in more than just the raw data counts. These could be jet properties, event properties, detector observables…
  • If so, what sorts of observables should we use, and are there models that exhibit such weirdness?
  • Displaced vertices going to something “easy” (jets / leptons)
  • Strange jet properties (substructure, EM-rich and many tracks, had-rich and no tracks…)
  • Disappearing tracks or very high dE/dX
  • Excessive numbers of {hits in the tracker; segments in the muon system (behind jets or not); low-energy tracks; muon-like tracks (calo MIPs)}

During the afternoon discussion it was pointed out that the application of "simplified weirdness" is best for cases where one can double-tag and rely on a lack of correlation between candidates to estimate the background. It seems appropriate also to apply these to systematics-dominated search regions, or at least regions with reasonable statistics; applying these measures to regions with a few events does not make a lot of sense.
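
A minimal version of the double-tag argument (a sketch only, assuming exactly two candidates per event and a per-candidate fake probability f that is uncorrelated between the two candidates, which is precisely the assumption that must be validated):

```latex
P(\text{exactly 1 tag}) = 2f(1-f), \qquad P(\text{2 tags}) = f^2
\;\Longrightarrow\;
N^{\text{pred}}_{2\text{-tag}} \simeq N_{\text{evt}}\, f^2 ,
```

with f extracted from the observed single-tag rate. With only a few events in the region, neither f nor the no-correlation assumption can be checked, which is the point about statistics made above.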

Discussion: SUSY via searches for additional Higgs(inos) (May 4 Wed -- Star Wars day)


Overview of what we'd like to discuss and cover.

(*) How to get SM-like Higgs: Alignment vs. decoupling
(*) Heavy H searches & holes (decays to EW particles?). Interpreting searches.
(*) Higgs from SUSY cascades (strong production + EW production)
(*) Higgsino mass splittings and search strategies.


(*) How to get SM-like Higgs in the MSSM: Alignment vs. decoupling
  • Alignment without decoupling in the MSSM is possible due to an accidental cancellation between the tree-level and loop-level contributions to a parameter called Z_6, which is the coefficient of the (H_1^\dagger H_1)(H_1^\dagger H_2)+h.c. term of the scalar potential in the Higgs basis. Although this cancellation is accidental (and is not a consequence of some symmetry), it will show up in a comprehensive scan of the pMSSM parameter space.
  • Naively, one can reach the alignment limit at tree level by setting v_u=v_d, v_u=0 or v_d=0 (see the formulas after this list). The case of v_u=v_d corresponds to tan(beta)=1, which yields a tree-level Higgs mass of zero; when radiative corrections are included, this is probably not a viable alignment limit. The case of v_u=0 corresponds to a massless top quark, which is not viable. Likewise for the case of v_d=0, corresponding to a massless bottom quark. But there exists a model in the literature by Bogdan Dobrescu and Patrick Fox [https://arxiv.org/abs/1001.3147] in which v_d=0 and the bottom quark Yukawa coupling is radiatively generated. Question: how does the alignment limit work in this model?
  • Question: An ATLAS plot was shown claiming that M_A<400 GeV was excluded (for all tan(beta) values), based on the precision h(125) data. What were the underlying assumptions that went into this plot? Answer: this plot is based on the hMSSM formalism, where the tree-level MSSM Higgs sector is augmented by one additional parameter that is tuned to get m_h=125 GeV. This one additional parameter reflects the part of the radiative corrections that contributes the bulk of the mass shift from the tree-level to the loop-corrected value of m_h. But, of course, this is a very crude approximation, and it completely misses the region of the MSSM parameter space in which approximate alignment (via Z_6=0) arises.
  • What is the meaning of the "exact alignment limit"? In the MSSM is there a SUSY parameter regime in which one obtains precise SM Higgs couplings that is completely insensitive to the value of m_A? Answer: identifying the alignment limit by demanding that the off diagonal term of a squared-mass matrix vanishes is an approximation. For example, higher dimensional operators are neglected. So, setting an effective Z_6=0 should be viewed as a leading term that has additional corrections that cannot be described by an effective squared-mass matrix for the CP-even Higgs bosons.
  • Bottom line: the masses of the non-minimal Higgs bosons in the 200--400 GeV range [at moderate values of tan(beta)] cannot yet be ruled out in light of precision h(125) data or by direct searches for H and A decaying to tau pairs. So, keep on looking!
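
A compact way to see the statements above, in the Higgs-basis notation used in the alignment literature (a sketch only; overall sign and normalization conventions for the Z_i vary):

```latex
\mathcal{M}^2_{\text{CP-even}} =
\begin{pmatrix} Z_1 v^2 & Z_6 v^2 \\ Z_6 v^2 & m_A^2 + Z_5 v^2 \end{pmatrix},
\qquad
Z_6^{\text{tree (MSSM)}} = -\tfrac{1}{4}\,(g^2+g'^2)\,\sin 2\beta\,\cos 2\beta .
```

A SM-like h(125) is obtained either for m_A^2 >> |Z_6| v^2 (decoupling) or for Z_6 -> 0 (alignment without decoupling). At tree level Z_6 vanishes only for tan(beta)=1 or v_u=0 / v_d=0, none of which is viable as noted above, so approximate alignment in the MSSM relies on the radiative corrections to Z_6 cancelling the tree-level piece.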

(*) Heavy H searches & holes (decays to EW particles?). Interpreting searches.
  • One does not have to be in the exact alignment limit to be consistent with current precision measurements of h(125) couplings. With modest deviations from alignment, various couplings that vanish in exact alignment can be phenomenologically relevant. Examples: Hhh, ZAh couplings. In the following we will focus on those decays that are open even in the exact alignment limit.
  • Gluon fusion production with A -> tt is very challenging. The difference between S+B and B is at most at the 1% level. This is for tan(beta)=1. For higher values of tan(beta) the effect is even smaller.
  • Question: why is the peaking structure in A->tt more pronounced at low mA? Due to the smaller width? Why is the peak shifted from the mass? Comments: at lower masses we get bigger rates, and the position of the peak depends on the signal + background interference.
  • The plots presented in the talk are for a CP-odd A. The prospects for a scalar H are similar, even if slightly worse.
  • ttH, with H -> tt, can also be probed by an SSDL search. How does it perform compared to multi-lepton + jets or 1 lepton + many jets? Maybe we can get additional information by considering Ht(q) or Ht(W) production? Rates are comparable to ttH production.
  • The ttH, H -> tt search really needs lots of data: >300 fb-1.
  • At high values of tan(beta) we might have regions in which the cross section for pp -> bbH with H -> neutralinos or charginos is not much smaller than the Drell-Yan (DY) production of charginos and neutralinos. Can it help in looking for EW-inos? Maybe we already have some reach by recasting the bb + MET searches.
  • H decays into staus, e.g. in the tau-phobic MSSM scenario, can help in looking for staus, since the DY production of staus is quite rate-limited and pp -> stau stau -> 2 tau + MET is difficult to search for.
  • In searches for H and A production, where H,A -> tau tau, the limits quoted in the m_A vs. tan(beta) plane depend on whether the H,A have decay modes into lighter SUSY particles (such as chargino/neutralino pairs). Indeed in the m_h^{mod+} benchmark, BR's to chi chi can be as large as 35%, depending on the value of tan(beta). It would be useful to present limits in the m_A vs. tan(beta) plane in benchmark scenarios where there are NO decays of H,A into lighter SUSY states.

(*) How to get SM-like Higgs in the NMSSM: Alignment vs. decoupling
  • It is easier to achieve the alignment limit in the NMSSM---it points to a region of parameter space where lambda=0.6--0.7 and tan(beta)=1--3. In this region, the one-loop radiative corrections are not as significant compared to the MSSM Higgs sector.
  • The leading-log contributions to Z_1 and Z_6 are proportional to the fourth power of the top-quark Yukawa coupling. There are also important logarithmic one-loop corrections proportional to lambda, but these are only relevant for large values of lambda at large tan(beta), which lies outside the alignment regime.
  • New signals in Higgs+Higgs and Higgs+vector boson final states due to new singlet-like scalars was reviewed in the summary talk by Brian Petersen on Tuesday (link).

(*) Higgs from SUSY cascades (strong production + EW production)
  • Somewhat surprisingly, the h->gamma gamma search is the most powerful for the electroweak SUSY channels at low mass. At higher mass, multi-leptons take over, and for the highest masses, h->bb tends to win out. For low-mass chi20 -> chi10 + higgs decays (chi20 mass not much above 125 GeV), the low mass does not allow for much MET, and the channels with available triggers without MET can do well. For higher masses, the MET is larger, and the h->bb channels with higher branching fractions start to achieve the best sensitivity.
  • In Higgsino->h+LSP, is it natural to have the Higgsino just above the h(125) mass? Answer: yes.
  • Question: Are there reasonable susy scenarios with very boosted Higgs and low MET? Yes, it could happen with a Higgsino just above the Higgs mass, if the Higgsino is given a significant boost.
  • For searches with h->bb in cascades, the inclusive jets + b-jets + MET searches already do very well. For resolved h->bb candidates, the combinatorics of finding the correct b-jet pair become difficult.
  • For boosted h->bb the combinatorics issue goes away, and there may be more to gain if the Higgs are sufficiently boosted.
  • For low mass cases with very little MET, multi-lepton searches can do very well with the leptons coming from Higgs decays.

(*) Higgsino mass splittings and search strategies.
  • All branching fractions are shown for electrons. The branching fractions for the muons are similar, unless the mass splitting is very small (comparable to the muon mass).
  • For Higgsinos, what happens to the rest of the branching fraction when it doesn't go to W*? Can very often go to a pion and the chi0. That case is especially important for very small mass splitting between the Higgsino states.
  • In the case of the charged pion from the Higgsino decay, it could be worth adding to the search a requirement of a soft isolated pion. This could happen at the end of essentially any susy cascade with a Higgsino or Wino LSP.
  • For the neutral pion decay, the situation is hopeless, since there is no way to identify whether the pi0 comes from the correct decay vertex.

Follow-up discussion to SUSY via searches for additional Higgs(inos) (May 5 Thursday, 11am)
  • Searches for heavy Higgs in two Higgs doublet models such as the MSSM offer an attractive opportunity to further probe EWSB. Current experimental limits are strong for large values of tan(beta) from H/A->tau+tau searches. At moderate and low values of tan(beta), however, sensitivity drops off rapidly. In particular, there are no limits for m_A above the t-tbar threshold for roughly tan(beta)<10. Searches in this region, known as "the wedge," were discussed in the previous session.
  • In the case of compressed chargino decays, the entire SUSY decay product may be too soft to be detected. In such cases, the "monojet" approach of looking for the invisible system boosted against an ISR jet is the only search option. It is interesting to note that the "monojet" searches from CMS and ATLAS typically actually allow more than one jet to be present in the event to increase the efficiency to signal events.
  • VBF production of Higgsinos is another possibility. It is not really clear whether this is better than the ISR monojet searches, but for the h(125)-to-invisible measurements, the VBF channel is currently the most sensitive.

Discussion: Complete vs Simplified Models (Thu 5 May)

What >2 or >3 parameter models are worth our time?
  • Is the m_gluino vs. m_squark model still useful? (Yes)
  • Answer: experimentalists are ok to just stick to SMS, and leave it to theorists to explore beyond.
  • Answer: for compressed regions of existing SMSs it is worth having dedicated signal regions in the experimental searches that focus explicitly on boost + usual signature. Might it be useful to do a better job re-interpreting mono-X analyses in our SUSY SMS planes?
  • Are we systematically covering RPV SUSY? Not clear that we have the right SMS basis for this yet. Requires some work if we want to be in a position during LS2 to use these for a pMSSM-like scan (see below)
  • We could begin by adding lifetimes to a number of 2-parameter models, which could be interesting
  • Do we sufficiently cover asymmetric decays? General feeling that we do, but if there are cases that theorists find that produce substantially different signatures, the experiments could begin looking into them again

What is the framework through which we can identify or prioritize new interesting final states if it is not the pMSSM?
  • If it is the pMSSM, is it still worth making these studies public?
  • If we do the work for the pMSSM, should we make a “truth-level” study framework public?
  • General sense that this is the role of theorists, but if the experimentalists embark on a pMSSM-like scan again during LS2, and if we begin with a ‘truth-level’ analysis, we should start from a format that is preferred by the theory community, and we should invest some effort in understanding whether it is the pMSSM that should be used or not
  • If we do go down the road of the pMSSM, then we should relax all loop-level constraints, as there are ways around those, and they might have excluded scenarios that would have appeared as “holes”