The LHC program is in excellent shape: it will continue its journey through a series of upgrades up to its highest luminosity (and hence sensitivity) until 2035. As of today, after the Higgs (H) discovery, the accumulated data show no sign of a new particle. With increasing sensitivity, its discovery potential will improve, and new particles may show up that would dramatically change the course of high-energy physics (HEP). However, the LHC has inherent weaknesses that will limit its sensitivity no matter how high the integrated luminosity is or how long the collider runs. Take, for example, the measurement of the Higgs couplings to fermions and bosons. These couplings are sensitive to the imprint of possible new particles beyond the Standard Model (SM). The figure shown here was at the center of many discussions at the last Linear Collider Workshop, held in Morioka in December 2016. I will try to describe it in some detail.

Particle collisions and decays are quantum phenomena and therefore probabilistic. The larger the number of events produced, the lower^{1} the statistical (random) error. A smaller error leads to a better sensitivity to unaccounted-for effects. This is true for all measurements at colliders, and indeed for any measurement whatsoever, quantum or classical.
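As a minimal numerical illustration of this 1/√N scaling (a toy counting experiment, not tied to any particular analysis):

```python
import math

def relative_stat_error(n_events: int) -> float:
    # Relative statistical error of a simple counting measurement:
    # sqrt(N)/N = 1/sqrt(N)
    return 1.0 / math.sqrt(n_events)

for n in (100, 10_000, 1_000_000):
    print(f"N = {n:>9,}: statistical error = {relative_stat_error(n):.2%}")
```

A hundred times more events buys a factor of ten in precision, which is why integrated luminosity matters so much.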

Aside from the statistical error, there is another type of error, called systematic. Systematic errors are more difficult to estimate, as they stem from multiple sources: effects that are unaccounted for or only approximated, limited theoretical precision, event selection and reconstruction errors, detector simulation biases (including acceptance), and more. Systematic errors do not significantly decrease with the number of events, and measurements are ultimately limited by them.
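The fact that a constant systematic floor eventually dominates can be seen with a toy calculation (the 2% floor below is an arbitrary illustrative value, not an LHC number):

```python
import math

def total_error(n_events: int, syst: float) -> float:
    # Statistical and systematic errors added in quadrature:
    # the statistical part shrinks as 1/sqrt(N), the systematic part does not.
    stat = 1.0 / math.sqrt(n_events)
    return math.sqrt(stat**2 + syst**2)

for n in (10_000, 1_000_000, 100_000_000):
    print(f"N = {n:>11,}: total error = {total_error(n, 0.02):.3%}")
```

Beyond a certain point, collecting more events no longer improves the measurement: the total error saturates at the systematic floor.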

The LHC, the Large Hadron Collider at CERN in Geneva (Switzerland), ran at full throttle in 2016, accumulating 40 fb^{-1}, and promises to break new records in the coming years, reaching its ultimate sensitivity to new physics with up to 3000 fb^{-1} of accumulated luminosity. That will take time (~18 years) and money. But CERN is committed to exploiting the LHC to the last drop, as expected from a responsible organization.

**What are the inherent precision limitations of the LHC?**

All colliders and HEP experiments have limitations: detector precision in space and time, detector spatial acceptance, collider luminosity, … Their values may differ depending on the technology used, but they are always present.

However, pp colliders have a specific limitation that does not show up at lepton colliders: the proton itself. The proton is not an elementary particle but a pack of confined quarks (q), anti-quarks (\bar{q}) and gluons (g), collectively called partons. At these energies, the interest lies more in the collisions of its basic constituents than in the proton itself as a whole. Interesting processes are, for example, g g \rightarrow H or q q \rightarrow q q H. To finely study these interactions, the initial state (here g g or q q), the final state (H or q q H) and the H decay products must all be known precisely.

Concerning the final states, the burden is on the detectors, and the precision keeps improving with the technology. But for the initial states it is a different matter, as the types of the initial partons are not directly measurable, and neither are their relative energies, momenta and spin states^{2}. This raises significant ambiguities and errors. The uncertainty due to the composite nature of the proton is a substantial part of the total systematic error at pp colliders.

In addition, the rest of the proton's contents (the other q and g) may also interact, making the recorded event a superposition of different processes (the underlying event), a very complicated picture that adds to the systematic errors. One more thing: the increase in luminosity comes in part from a higher proton density in the colliding bunches. The probability of getting more than one interaction in a given bunch crossing becomes large, and the pile-up (the superposition of several collisions from different protons in the recorded event) may reach 140. Picking out the event of interest and correctly identifying and attributing each track, in other words correctly reconstructing the event, becomes a difficult task prone to additional errors. And last but not least, the theoretical calculations that have to deal with this complex initial state are also quite involved, especially when precision requires accounting for higher-order corrections. The theoretical uncertainties contribute substantially to the global systematic error.
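To get a feel for how severe a pile-up of ~140 is, one can model the number of interactions per bunch crossing as a Poisson variable (a standard back-of-the-envelope assumption, not a full simulation):

```python
import math

def poisson_pmf(k: int, mu: float) -> float:
    # Probability of exactly k inelastic collisions in one bunch crossing,
    # assuming a Poisson distribution with mean mu.
    return math.exp(-mu) * mu**k / math.factorial(k)

mu = 140.0  # average pile-up quoted for the high-luminosity LHC
# Probability that the hard interaction comes with no extra collision at all:
print(f"P(no extra collision) = {poisson_pmf(0, mu):.1e}")
```

A clean, single-interaction crossing is essentially impossible; every recorded event has to be disentangled from ~140 overlapping collisions.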

Proton collisions give rise to a large number of possible interactions and final states. Some of them, although of a different nature, may mimic the studied final state. These spurious events, called background, often come in numbers much larger than the studied process. Reducing them requires highly sophisticated selection algorithms, which inevitably introduce additional systematic errors.

After the Higgs boson discovery, which was a fantastic achievement for the theory, the machine, the experiments and the analysis, it is now essential to measure all its parameters with high precision: mass, the various decay modes and couplings, … However, the limitations discussed above make pp colliders unfit for the challenge.

**Effect of the systematic and statistical errors on the measurement of the Higgs couplings**

The graph shown at the last Linear Collider Workshop (LCWS), held in Morioka in December 2016, sheds some light on what should be done next to improve the precision. The picture is quite busy, but I will explain the various items here.

The vertical axis gives the precision, in %, of the measurement of seven coupling constants (\kappa) governing the decays of the Higgs boson. The Higgs boson is a very unstable particle: as soon as it is produced, it decays into pairs of Z, W, q\bar{q}, … Any discrepancy between the measured couplings and the Standard Model prediction would signal the onset of a different mechanism or the contribution of a new particle.

For each decay, the various colors correspond to different conditions for two colliders. The dark and light green bars are for the pp machine, the LHC; the red and yellow bars are for the e+e- collider, the ILC; and the blue ones show LHC and ILC data combined. The light green LHC bars include the current irreducible errors from the theoretical calculations. The dark green bars assume a smaller theoretical error that could be obtained if more extensive calculations involving additional complex subprocesses (higher-order processes, foreseen in the coming years but not fully guaranteed [2]) are performed.

*Fig. 1: Precision on the Higgs couplings to Z, W, b quark, … [1]*

The legend caption gives, in TeV or GeV, the collider energy and, in fb^{-1}, the integrated luminosity: for the LHC, the value that will be reached in 2035 (18 years from now), and likewise for the ILC. The larger the fb^{-1}, the longer the data-taking time. The red bars correspond to 8 years of data taking; the yellow bars would need 12 more years, so ~20 years in total.

The conclusion is clear: e+e- colliders provide a factor of 10 (or more) better precision than pp colliders like the LHC for the couplings to Z, W, b, c and t, if one assumes the currently achieved theoretical precision (namely, comparing the light green and yellow bars). There is a notable exception: the coupling to γ. This is essentially due to the very small probability of the Higgs decaying into 2 γ (H \rightarrow \gamma \gamma)^{3}. Here we see the value of the complementarity of the two types of colliders: neither of them alone provides a good accuracy, but combining the data drastically improves the picture.
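The gain from combining two independent measurements follows from inverse-variance weighting; here is a small sketch (the 5% inputs are hypothetical values chosen purely to show the mechanics, not the actual κ_γ errors):

```python
import math

def combined_error(err1: float, err2: float) -> float:
    # Uncertainty of the inverse-variance-weighted average of two
    # independent measurements with uncertainties err1 and err2.
    w1, w2 = 1.0 / err1**2, 1.0 / err2**2
    return 1.0 / math.sqrt(w1 + w2)

# Two machines each measuring the same coupling to 5%:
print(f"combined error = {combined_error(0.05, 0.05):.4f}")
```

Two equally precise measurements combine to an error smaller by √2, while a much worse input adds almost nothing; this is why the combination helps most when the two machines are comparably sensitive, as they are for the coupling to γ.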

The e+ and e- being elementary particles, the collisions do not suffer from the limitations pp colliders have. The initial state is well known and unambiguous (even the particle spin direction can be selected), the theoretical calculations are more manageable, the background is much smaller, and there is no pile-up and no underlying event, … The systematic errors are therefore dramatically smaller. The limitation here comes from the statistical error: the luminosity and the Higgs production rate are much lower. Overall, however, the figure shows that e+e- colliders deliver a much better precision.

**Does one need this high accuracy?**

High precision on the coupling measurements is needed to probe possible deviations from the Standard Model. As said above, a tiny discrepancy would signal the existence of new particles or mechanisms, a guide towards a much more global and unified theory. These “beyond the Standard Model” theories are many (see a few in the table below). Experimental measurements are needed to single out the one selected by Nature.

**But how precise should it be?**

At LCWS it was recalled, as demonstrated for example in arXiv:1206.3560, that the expected effects on these couplings from beyond-SM models like Minimal Supersymmetry are of the order of 1-3%, unreachable at the LHC but fully accessible to e+e- colliders.

*Table 1: Required precision on the Higgs couplings to vector bosons (\Delta hVV, V=Z, W) or quarks (\Delta H\bar{t}t, \Delta H\bar{b}b) needed to be sensitive to three possible extensions of the Standard Model.*

This is just one example of the advantage of e+e- over pp colliders for precision physics; many other examples (Higgs self-coupling, top physics, …) will be discussed in this blog.

History shows that high-energy physics has evolved by alternating e+e- and pp colliders/accelerators; both have tremendous value. Proton-proton colliders have proven to be discovery machines (the Sp\bar{p}S for the discovery of the Z and W, the Tevatron for the top quark, the LHC for the Higgs), but precision is the realm of e+e- colliders. LEP provided a precise analysis of the Z and W and brought the Standard Model to the high-level status it has today.

With two “recently” discovered but still poorly known particles, the top quark and the Higgs boson, and no other particle showing up, and thus no hint of which energy to tune a future pp discovery machine to, there is a clear incentive to turn to precision physics, namely to e+e- colliders, in sync with the LHC running to its term.

References:

- “The Physical Case for the International Linear Collider”
- Snowmass Higgs working group report
- The LCWS timetable (most presentations are available)
- The LCWS presentation by Roman Poeschl and the summary talk by Marcel Vos

See also:

- Cosmos Magazine about the “Next king Collider”

- The error is proportional to √N/N = 1/√N, where N is the number of events, as long as the event distribution is “normal”, i.e. Gaussian: for N = 100 events the error is 10%, for N = 10,000 events it is 1% ↩
- The initial state can actually be traced back by analyzing the final elements of the interaction and relying on the proton composition probabilities obtained from other measurements, as long as the event is complete (no invisible particle) ↩
- The γ having no mass, the Higgs does not couple to it directly, and this process only occurs through higher-order corrections, which are rarer than leading-order ones ↩