Comptes Rendus Physique
Volume 18, n° 5-6
pages 358-364 (May 2017)
DOI: 10.1016/j.crhy.2017.09.010
On the universality (or not) of beautiful penguins
Sur l'universalité (ou non) des beaux pingouins
 

Yasmine Amhis
 Laboratoire de l'accélérateur linéaire, Campus scientifique d'Orsay, 91898 Orsay cedex, France 

Abstract

Despite the enduring resilience of the Standard Model of particle physics, there remain reasons to expect that it is not a "final" theory. In particular, the Standard Model cannot explain either dark matter or the observed matter–antimatter asymmetry of the universe. LHCb is a forward-acceptance spectrometer at the Large Hadron Collider, dedicated to precision measurements of heavy-flavour particles. Because new particles can appear virtually in the decays of heavy-flavour particles, and thus alter their properties, such measurements are inherently sensitive to much higher mass scales than direct searches. We present in this article how the presence of new particles can be probed by testing Lepton Universality in the decays of hadrons containing a b-quark.

Résumé

Malgré la solidité du modèle standard de la physique des particules, il y a de bonnes raisons de penser que ce n'est pas la théorie « ultime ». En particulier, le modèle standard ne peut expliquer ni la matière noire, ni l'asymétrie matière–antimatière dans l'Univers. LHCb est un spectromètre du LHC (Large Hadron Collider) consacré à des mesures de précision des particules de saveurs lourdes. Comme des particules de nouvelle physique peuvent contribuer de manière virtuelle aux désintégrations de particules de saveurs lourdes et ainsi modifier leurs propriétés, de telles mesures sont sensibles à des masses bien plus élevées que celles accessibles grâce aux mesures directes. Dans le présent article, nous expliquons comment la présence de particules nouvelles peut être détectée en testant l'universalité du couplage aux leptons dans les désintégrations de hadrons contenant un quark b.


Keywords: LHCb, Standard Model, Indirect search

Mots-clés : LHCb, Modèle standard, Recherche indirecte


A powerful yet incomplete theory

The Standard Model (SM) of particle physics was invented in the late 1960s. It is a theory built on three pillars: special relativity, quantum mechanics, and symmetries. In a fashion similar to a Lego game, only very few different kinds of fundamental particles describe visible matter as we know it. The SM is structured in three families or generations of fermions named quarks and leptons. There are six types of quarks: up, down, charm, strange, top, and bottom. The bottom quark (b) is also referred to as the beauty quark. There are three charged leptons (the electron e, the muon μ and the tau τ) and three associated neutral leptons, the neutrinos (νe, νμ and ντ). By construction in the SM, three interactions, i.e. the electromagnetic, weak and strong interactions, are responsible for the transitions between these particles. These interactions are mediated by particles called gauge bosons. The electromagnetic interaction is mediated by the photon, the strong interaction is mediated by gluons, and the weak interaction is mediated by the W± and Z bosons [[1]]. Depending on the properties of the fermions and bosons at play, such as electric charge and other quantum numbers, particles are affected by the interactions in different ways. For example, because they carry a charge of "colour", only quarks are affected by the strong interaction. Last but not least, the Higgs boson generates the mass of the particles via a symmetry-breaking mechanism. The SM of particle physics has often been described as an elegant theory; this is partly due to its strong and precise predictive power. The SM is also appealing since it offers a wide class of physical observables that can be confronted with experimental measurements. An example will be discussed in this article. This being said, the SM fails to describe some fairly fundamental aspects of nature. The SM does not accommodate gravity, nor can it explain the mass hierarchy of many orders of magnitude observed between particles.
For instance, the mass of the electron is 0.511 MeV/c², while the mass of the top quark is 173 GeV/c² [[1]]. Furthermore, satellite experiments such as Planck show that visible matter only comprises 5% of the universe, the rest being attributed to dark energy (68%) and dark matter (27%) [[2]]. In other words, the SM is excellent at describing what seems to be the tip of an iceberg. Both the theory and experimental physics communities share the viewpoint that there must exist sources of New Physics (NP), new theories able to propose a more complete description of nature. One of the main purposes of the experiments located at the Large Hadron Collider (LHC) at CERN is to search for new particles. During data acquisition, protons collide at the LHC every 25 ns, at centre-of-mass energies between 7 and 13 TeV. Detectors were built around each interaction point; their aim is to collect the products of these collisions in the best possible way. Experimentally, there are two strategies to search for these unknown particles. The first approach is to "hunt" directly for new particles in the decay products of the proton–proton collisions. This is how the famous Higgs boson was discovered in 2012 by the ATLAS and CMS collaborations [[3]]. A second approach consists in searching for new particles in an indirect way. The heart of indirect searches relies on a class of physics observables that may differ from SM predictions in the presence of NP. This method is heavily exploited by the LHCb experiment. An example of an indirect search will be discussed in this article.

The LHCb detector

The LHCb detector [[4]], shown in Fig. 1, was designed to study the asymmetry between matter and antimatter and to search for physics beyond the SM in b-hadron decays. The excellent performance of the detector was such that the physics programme was very rapidly extended to also explore charm, electroweak and, more recently, heavy-ion physics. It is a single-arm spectrometer located at interaction point (IP) 8 of the LHC. The choice of detector geometry is justified by the fact that both b and b̄ hadrons are predominantly produced in the same forward or backward cone. Charged tracks are detected by a vertex locator (VELO) close to the IP, and their momenta and charges are subsequently determined with tracking stations (TT, T1, T2 and T3) placed on either side of a magnet providing an integrated field of 4 Tm. Ring-imaging Cherenkov (RICH) counters are used to distinguish kaons from pions. Electromagnetic and hadronic calorimeters (ECAL and HCAL), together with an instrumented preshower (SPD/PS) system, provide measurements of the energies of neutral as well as charged particles, and muons are identified with a dedicated detection system (M1–M5). For final states involving electrons, the ECAL performs the same role, providing a trigger signal as well as positive electron identification. Stable charged hadrons (pions, kaons, and protons) are distinguished based on their signatures in the RICH detectors. The positive identification of all the final-state particles reduces the second largest potential source of background: that from other, higher-rate, charm or bottom hadron decays. The tracking stations are essential to measure the momenta of the particles involved in the decay, and the VELO provides precise information on the origin of the tracks.
This provides extremely powerful information to reject background, since signal tracks must originate from common vertices that are displaced from the primary vertex of the LHC proton–proton collision (due to the non-negligible lifetimes of the decaying bottom hadrons), whereas the largest potential source of background comes from random combinations of tracks that originate from the primary vertex. The LHCb trigger is the key to the success of the experiment. It operates in two stages. A hardware trigger reduces the frequency of events from the LHC beam-crossing rate of up to 40 MHz to a rate of 1 MHz, at which the entire detector can be read out. This is achieved by reconstructing high-transverse-momentum particles (hadrons, electrons and photons in the calorimeters, muons in the muon chambers). High-transverse-momentum particles are much more likely to be associated with the decays of high-mass particles produced in the collisions than to originate directly from the collision debris, and are therefore highly effective for triggering on signal events. In the second stage, the high-level trigger uses information from the whole detector to reduce the event rate from 1 MHz down to 2–5 kHz, at which rate events can be written to storage and later used for data analysis. During Run 1, LHCb collected data representing an integrated luminosity of 3 fb⁻¹ at centre-of-mass energies of 7 and 8 TeV (fb = femtobarn). The experimental results discussed in this article are based on this dataset.
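The two-stage data reduction described above can be put into numbers with a back-of-the-envelope sketch. The rates are those quoted in the text; everything else is simple arithmetic:

```python
# Back-of-the-envelope sketch of the LHCb trigger data reduction,
# using the rates quoted in the text (40 MHz beam crossing,
# 1 MHz after the hardware trigger, 2-5 kHz written to storage).

crossing_rate = 40e6   # Hz, LHC beam-crossing rate seen by the detector
hardware_out  = 1e6    # Hz, output of the hardware trigger
hlt_out       = 3.5e3  # Hz, mid-point of the 2-5 kHz high-level-trigger output

hardware_rejection = crossing_rate / hardware_out  # crossings per kept event
hlt_rejection      = hardware_out / hlt_out
total_rejection    = crossing_rate / hlt_out

print(f"hardware stage keeps 1 event in {hardware_rejection:.0f}")
print(f"high-level stage keeps 1 event in {hlt_rejection:.0f}")
print(f"overall: 1 event in {total_rejection:.0f} is stored")
```

The overall reduction factor of roughly 10⁴ is what makes it possible to read out and permanently store only the collisions most likely to contain interesting heavy-flavour decays.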



Fig. 1. The LHCb detector [[4]].

Penguins used as probes for New Physics searches

The original penguin diagram as formulated by J. Ellis, describing a Flavour-Changing Neutral Current in which a b quark decays to an s quark while a gluon produces a quark–antiquark pair, is shown in Fig. 2; the complete story of where the, maybe quite odd, term penguin comes from is told in the CERN Courier dated May 2013. The type of penguin decays discussed in this article is displayed in Fig. 3. Here, a pair of oppositely charged leptons ℓ⁺ℓ⁻, where ℓ = e, μ, τ, is produced by a Z boson or a photon in the final state. It is important to recall that quarks cannot be found "free" in nature; they are always confined together with one or two other quarks in hadrons.1 The(se) additional quark(s) are called spectator(s); they are not shown in Fig. 3 since they play a less important role than the b-quark. Quantum Field Theory,2 on which particle physics is constructed, provides tools that allow us to calculate the probability of a decay. The complexity of a Feynman diagram, which can be assessed for example by the number of gauge bosons involved, tells us how likely a decay is to occur. Because more than one gauge boson is involved in the penguin decays discussed in this article, these decays are suppressed, i.e. less probable than simpler ones. For instance, the decay of a B meson to J/ψK, which occurs through the exchange of only one W boson, is about a thousand times more likely than the decay of a B to Kμ⁺μ⁻, which occurs only via a b→sℓ⁺ℓ⁻ transition in the SM.



Fig. 2. The original penguin diagram for a b quark to an s quark decay, where a gluon produces a quark–antiquark pair. Image taken from CERN Courier, May 2013.



Fig. 3. Example of a penguin diagram in the SM: decay of a beauty quark into a strange quark, where a γ or a Z⁰ produces two leptons.

The phenomenology of b-decays can be described in a convenient manner, and hopefully one not too off-putting for the reader, with an Effective Hamiltonian formalism [[5]]:

(1) H_eff = (G_F/√2) Σ_{i=1..10} λ_i^CKM C_i(μ) O_i(μ)

where G_F is the Fermi constant and λ_i^CKM are the Cabibbo–Kobayashi–Maskawa terms describing the coupling between the W bosons and the quarks [[1]]. In this formalism, contributions generated by the electromagnetic, weak and strong interactions can be factorised depending on their energy scale, μ. The low-energy part of the transition, typically below the mass of the b-quark and mediated for example by soft gluons, is encoded in what are called matrix elements, the expectation values of the operators O_i(μ). These elements can be evaluated using "numerical" techniques such as Lattice Quantum Chromodynamics [[6]]. Given the way the Effective Hamiltonian is constructed, a good knowledge of the matrix elements is crucial, since they cloud the information that one can extract from the factors C_i(μ), called Wilson coefficients. The Wilson coefficients, on the other hand, describe short-distance contributions to the transition occurring at high energy, typically of the order of the W-boson mass. These coefficients can be calculated using perturbative methods [[7]]. What makes b→sℓ⁺ℓ⁻ transitions a particularly interesting laboratory is the fact that NP particles can appear virtually in these diagrams; they can therefore compete with well-known SM processes. For instance, in Fig. 3, the SM gauge boson Z could be replaced by a new particle, such as a heavier gauge boson, for example a Z′. These new phenomena are often predicted to appear at large energy scales, of the order of the TeV. The Wilson coefficients can therefore be modified by the presence of these new phenomena, and can hence be seen as probes to test the existence of NP at certain energy scales. This is a typical example of an indirect search for new particles.

Fig. 4 shows the variation of the differential rate, i.e. the probability of the occurrence of a b→sℓ⁺ℓ⁻ decay, as a function of q², where q² is the square of the invariant mass of the di-lepton system. It is worth noting that the Wilson coefficients (C_i) contribute differently in different q² regions. For example, the very low q² region is dominated by the Wilson coefficient C₇, corresponding to a Feynman diagram where a photon is produced, as illustrated in Fig. 3. Therefore, to fully exploit the rich phenomenology of these decays, experimental measurements are performed in different regions of q².
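Measuring a rate in a q² region simply means integrating the differential rate over that bin. The sketch below illustrates this with a made-up smooth function; it is NOT the SM prediction for dΓ/dq², only a stand-in to show the kind of integral that enters such binned measurements:

```python
# Toy illustration of a rate measured in bins of q^2.  The shape below is
# an arbitrary smooth stand-in, NOT the SM prediction for dGamma/dq^2 in
# b -> s l+ l- decays; it only shows how a differential rate is integrated
# over a q^2 bin.

import numpy as np
from scipy.integrate import quad

def dGamma_dq2(q2):
    """Hypothetical differential rate (arbitrary units) vs q^2 in GeV^2/c^4."""
    return np.exp(-0.2 * q2) * (1.0 + 0.1 * q2)

# q^2 bins similar in spirit to those used in lepton-universality analyses
bins = [(0.045, 1.1), (1.1, 6.0)]

for lo, hi in bins:
    rate, _ = quad(dGamma_dq2, lo, hi)  # numerical integral over the bin
    print(f"integrated rate in {lo} < q2 < {hi}: {rate:.3f} (arbitrary units)")
```

Each q² bin thus yields one number per lepton species, and it is these bin-by-bin integrals that enter the numerator and denominator of ratios such as Equation (2).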



Fig. 4. Differential rate as a function of q² in b→sℓ⁺ℓ⁻ transitions.

Example of lepton universality measurements

High-Energy Physics offers many observable physical quantities that have been, or are currently being, used to test the presence of NP in an indirect way at LHCb. One could mention, for example, the branching fraction of the very rare decay Bs⁰→μ⁺μ⁻ [[8]] or the weak phase φs measured in Bs⁰→J/ψK⁺K⁻ decays [[9]]. To push the discussion on indirect searches further, Lepton Universality (LU) tests in b→sℓ⁺ℓ⁻ transitions are explored in this article. In the SM, the ratio of the couplings of the gauge bosons to the different lepton families is predicted to be equal to unity [[10], [11], [12], [13], [14]]. LU is often referred to as an accidental symmetry, because it is not required explicitly in the construction of the SM theory. Using a sample of data corresponding to an integrated luminosity of 3 fb⁻¹ collected during the first run of the LHC, the LHCb Collaboration performed two LU tests using b→sℓ⁺ℓ⁻ decays. A first measurement, R_K, published in 2014, analysed B⁺→K⁺ℓ⁺ℓ⁻ decays [[15]]. More recently, in 2017, a second measurement, R_K*, analysed B⁰→K*⁰ℓ⁺ℓ⁻ decays [[16]]. For each possible hadron H in the final state (H = K⁺, K*⁰), the expression of the ratio used to test LU is shown below:
(2) R_H = [ ∫ (dΓ(B→Hμ⁺μ⁻)/dq²) dq² ] / [ ∫ (dΓ(B→He⁺e⁻)/dq²) dq² ]

Following the discussion in Section 3, the measurements are performed in bins of q². Experimentally, these two measurements rely on very similar techniques, and they are delicate to conduct for similar reasons. In both cases, part of the challenge comes from the presence of electrons in the final state, which impacts the measurements in two ways. First, when electrons are bent in the magnet, they emit bremsstrahlung photons and therefore lose part of their energy. In the data processing of the LHCb experiment, algorithms were designed to allow the recovery of some of these photons. The second challenge comes from the constraints that have to be imposed in the hardware trigger to cope with the high occupancy in the ECAL. The combination of all of these effects leads mainly to a degradation of the resolution of the invariant mass of the reconstructed decays. The main experimental steps to measure R_H can be summarised as follows. Signal candidates of interest have to be extracted from the recorded data using an event selection.3 An event selection consists of a set of requirements applied to the kinematics of the decay, for instance the momentum of the B, the topology of the decay inside the detector, etc. Its aim is to suppress as efficiently as possible candidates, called background candidates, that could originate from random combinations of particles, while keeping the number of signal candidates high. Sophisticated multivariate methods, based on Machine Learning algorithms, are also used intensively in different stages of this cleaning process. While these approaches are very effective in suppressing a large fraction of the background candidates, additional contributions can remain. As an illustration, one can consider decay chains whose final-state particles are identical to those of the decay of interest, for example the decay B⁰→K*⁰J/ψ with J/ψ→ℓ⁺ℓ⁻.
These processes occur via tree-level Feynman diagrams and are therefore abundant compared with the decays of interest. The background candidates remaining after the selection have to be taken into account. A maximum-likelihood fit is performed to extract the numbers of signal candidates, also called raw yields, of the muonic and electronic modes in the data that passed all the selection steps. A set of Probability Density Functions (PDFs) is adjusted to describe the data in the best possible way. Fig. 5 shows the reconstructed invariant masses of B⁰→K*⁰μ⁺μ⁻ (left) and B⁰→K*⁰e⁺e⁻ (right), as well as the projections of the PDFs for the signal and background components of the fit. Each step of the data processing, each decision taken or cut applied during the analysis has a certain cost, an efficiency. These efficiencies have to be estimated and used to correct the raw yields in order to measure the numerator and the denominator of Equation (2). Thanks to the use of many calibration samples, detailed studies have been conducted to ensure the accuracy and robustness of these efficiencies. It is imperative to perform a series of tests to ensure the stability of the efficiencies across the available phase space, so that no biases are introduced in the final measurements. Finally, R_K was found to be equal to 0.745 +0.090 −0.074 (stat) ± 0.036 (syst) for 1 < q² < 6 GeV²/c⁴, and R_K* was found to be equal to 0.66 +0.11 −0.07 (stat) ± 0.03 (syst) for 0.045 < q² < 1.1 GeV²/c⁴ and 0.69 +0.11 −0.07 (stat) ± 0.05 (syst) for 1.1 < q² < 6 GeV²/c⁴. The LHCb results, as well as the measurements previously performed by the B-factories Belle and BaBar [[17], [18]], are shown in Fig. 6.
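The last step described above, turning efficiency-corrected raw yields into the ratio of Equation (2), can be sketched in a few lines. All numbers below are invented for illustration only; they are NOT the LHCb yields or efficiencies, and the error propagation is a deliberately naive Poisson treatment:

```python
# Minimal sketch of converting fitted raw yields into the lepton-universality
# ratio R_H of Equation (2).  All inputs are hypothetical toy numbers.

import math

def ratio_with_error(n_mu, eff_mu, n_e, eff_e):
    """Efficiency-corrected yield ratio with naive Poisson error propagation.

    n_mu, n_e     : raw signal yields returned by the maximum-likelihood fit
    eff_mu, eff_e : total selection x trigger x reconstruction efficiencies
    """
    corrected_mu = n_mu / eff_mu
    corrected_e = n_e / eff_e
    r = corrected_mu / corrected_e
    # Poisson yields: sigma_N = sqrt(N); efficiencies treated as exact here,
    # whereas the real analyses assign systematic uncertainties to them.
    rel_err = math.sqrt(1.0 / n_mu + 1.0 / n_e)
    return r, r * rel_err

# Hypothetical inputs: muon modes are reconstructed more efficiently than
# electron modes (bremsstrahlung, trigger constraints, see text).
r, err = ratio_with_error(n_mu=1000, eff_mu=0.05, n_e=300, eff_e=0.015)
print(f"R_H = {r:.3f} +/- {err:.3f}")
```

Even this toy shows why the electron efficiencies matter so much: a small bias in eff_e shifts R_H directly, which is why the text stresses calibration samples and stability checks across the phase space.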



Fig. 5. Invariant mass of reconstructed B⁰→K*⁰ℓ⁺ℓ⁻ candidates, with ℓ⁺ = μ⁺ (left) or ℓ⁺ = e⁺ (right). The dashed line is the signal PDF, the shaded shapes are the background PDFs and the solid line is the total PDF. The fit residuals normalised to the data uncertainty are shown at the bottom of each distribution.



Fig. 6. LHCb, Belle and BaBar R_K (left) and R_K* (right) measurements in bins of q² [[17], [18]].

After the publication of both results, a very large number of articles appeared proposing interpretations of these measurements in the context of physics beyond the SM. There is a general consensus within the theory community that these results are unlikely to be due to a poor knowledge of hadronic effects causing an underestimation of the SM predictions. To explain the results, some theorists proposed NP models based on new particles named leptoquarks [[19], [20], [21]], which carry both quark and lepton quantum numbers. Others invoked the existence of heavy new gauge bosons, such as a Z′ [[22], [23]]. In both cases, it will be interesting to test, and maybe confirm, the existence of these new particles in direct searches at the LHC. Moreover, phenomenological analyses that do not rely on a particular NP model attempt to explain the measurements using global fits to the Wilson coefficients. Deviations of three to four standard deviations in the Wilson coefficient C₉ with respect to the SM prediction are reported in Refs. [[24], [25]], for instance.
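The notion of a "deviation in standard deviations" can be illustrated, in a much cruder way than the global fits of Refs. [[24], [25]], by computing a naive pull of each ratio measurement against the SM expectation of approximately unity. The central values below are the published LHCb ones with symmetrised uncertainties; the official significances differ somewhat because they use asymmetric uncertainties and full likelihoods:

```python
# Naive illustration of quantifying the tension between a measurement and
# the SM expectation.  Real global fits float Wilson coefficients and handle
# correlations and theory uncertainties; the pull below ignores all of that.

import math

def pull(measured, sigma_stat, sigma_syst, predicted=1.0):
    """Deviation in standard deviations, adding uncertainties in quadrature."""
    sigma = math.sqrt(sigma_stat**2 + sigma_syst**2)
    return (measured - predicted) / sigma

# Published LHCb central values, with symmetrised uncertainties for
# simplicity; the SM prediction for these ratios is close to unity.
print(f"R_K  pull: {pull(0.745, 0.082, 0.036):.1f} sigma")  # 1 < q2 < 6
print(f"R_K* pull: {pull(0.69, 0.09, 0.05):.1f} sigma")     # 1.1 < q2 < 6
```

A single pull per measurement is of course far weaker evidence than a coherent pattern across many observables, which is precisely what the global fits to the Wilson coefficients aim to quantify.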

Conclusion

In the last decades, experimental high-energy physicists have been testing the SM ever more stringently. Despite the numerous direct NP searches for specific particles, such as supersymmetric ones or new bosons, conducted by the ATLAS and CMS Collaborations [[26]], the SM has been shown to be a very robust and resilient theory. In this article, we discussed a method to test for the presence of NP by examining Lepton Universality. Two results from the LHCb Collaboration, based on b→sℓ⁺ℓ⁻ transitions, indicate tensions with the SM predictions. It should be recalled that, in the SM, no difference between the couplings of the gauge bosons to muons and electrons is expected. A class of NP models has been proposed by the theory community to explain and interpret these results. To conclude, one can say that exciting times are certainly ahead of us, especially since the LHCb experiment will undergo an upgrade in the near future [[27], [28]]. This upgrade will allow us to collect more data, to improve the statistical accuracy of measurements such as the Lepton Universality ones, and to widen the spectrum of possible measurements. It will also contribute to shedding more light on the reported anomalies and establishing their nature.


Acknowledgements

The author would like to thank her colleagues for the lively discussions and collaborations, J. Lefrançois, M.-H. Schune, T. Gershon, R. Quagliani, V. Lisovskyi, F. Mercier, and V.V. Gligorov.

References

[1] Particle Data Group, C. Patrignani et al., Review of particle physics, Chin. Phys. C 40 (10) (2016) 100001.
[2] Planck Collaboration, P.A.R. Ade et al., Planck 2015 results. XIII. Cosmological parameters, Astron. Astrophys. 594 (2016) A13, arXiv:1502.01589.
[3] ATLAS and CMS Collaborations, G. Aad et al., Measurements of the Higgs boson production and decay rates and constraints on its couplings from a combined ATLAS and CMS analysis of the LHC pp collision data at √s = 7 and 8 TeV, J. High Energy Phys. 08 (2016) 045, arXiv:1606.02266.
[4] LHCb Collaboration, R. Aaij et al., LHCb detector performance, Int. J. Mod. Phys. A 30 (07) (2015) 1530022, arXiv:1412.6352.
[5] See for example: G. Buchalla, A.J. Buras, M.E. Lautenbacher, Weak decays beyond leading logarithms, Rev. Mod. Phys. 68 (1996) 1125, arXiv:hep-ph/9512380.
[6] See for example: Fermilab Lattice, MILC and HPQCD Collaborations, C. Aubin et al., Semileptonic decays of D mesons in three-flavor lattice QCD, Phys. Rev. Lett. 94 (2005) 011601.
[7] C. Bobeth, M. Misiak, J. Urban, Photonic penguins at two loops and m_t dependence of BR[B→X_s ℓ⁺ℓ⁻], Nucl. Phys. B 574 (2000) 291, arXiv:hep-ph/9910220.
[8] LHCb Collaboration, R. Aaij et al., Measurement of the Bs⁰→μ⁺μ⁻ branching fraction and effective lifetime and search for B⁰→μ⁺μ⁻ decays, Phys. Rev. Lett. 118 (19) (2017) 191801, arXiv:1703.05747.
[9] LHCb Collaboration, R. Aaij et al., Precision measurement of CP violation in Bs⁰→J/ψK⁺K⁻ decays, Phys. Rev. Lett. 114 (4) (2015) 041801, arXiv:1411.3104.
[10] M. Bordone, G. Isidori, A. Pattori, On the Standard Model predictions for R_K and R_K*, Eur. Phys. J. C 76 (8) (2016) 440, arXiv:1605.07633.
[11] B. Capdevila, S. Descotes-Genon, L. Hofer, J. Matias, Hadronic uncertainties in B→K*μ⁺μ⁻: a state-of-the-art analysis, J. High Energy Phys. 04 (2017) 016, arXiv:1701.08672.
[12] N. Serra, R.S. Coutinho, D. van Dyk, Measuring the breaking of lepton flavor universality in B→K*ℓ⁺ℓ⁻, Phys. Rev. D 95 (3) (2017) 035029, arXiv:1610.08761.
[13] W. Altmannshofer, C. Niehoff, P. Stangl, D.M. Straub, Status of the B→K*μ⁺μ⁻ anomaly after Moriond 2017, Eur. Phys. J. C 77 (6) (2017) 377, arXiv:1703.09189.
[14] S. Jäger, J. Martin Camalich, Reassessing the discovery potential of the B→K*ℓ⁺ℓ⁻ decays in the large-recoil region: SM challenges and BSM opportunities, Phys. Rev. D 93 (1) (2016) 014028, arXiv:1412.3183.
[15] LHCb Collaboration, R. Aaij et al., Test of lepton universality using B⁺→K⁺ℓ⁺ℓ⁻ decays, Phys. Rev. Lett. 113 (2014) 151601, arXiv:1406.6482.
[16] LHCb Collaboration, R. Aaij et al., Test of lepton universality with B⁰→K*⁰ℓ⁺ℓ⁻ decays, arXiv:1705.05802.
[17] BaBar Collaboration, J.P. Lees et al., Measurement of branching fractions and rate asymmetries in the rare decays B→K(*)ℓ⁺ℓ⁻, Phys. Rev. D 86 (2012) 032012, arXiv:1204.3933.
[18] Belle Collaboration, J.-T. Wei et al., Measurement of the differential branching fraction and forward–backward asymmetry for B→K(*)ℓ⁺ℓ⁻, Phys. Rev. Lett. 103 (2009) 171801, arXiv:0904.0770.
[19] D. Becirevic, O. Sumensari, A leptoquark model to accommodate R_K and R_K*, arXiv:1704.05835.
[20] S. Fajfer, N. Kosnik, Vector leptoquark resolution of R_K and R_D(*) puzzles, Phys. Lett. B 755 (2016) 270, arXiv:1511.06024.
[21] G. Hiller, I. Nisandzic, R_K and R_K* beyond the Standard Model, arXiv:1704.05444.
[22] G. D'Amico et al., Flavour anomalies after the R_K* measurement, arXiv:1704.05438.
[23] J.F. Kamenik, Y. Soreq, J. Zupan, Lepton flavor universality violation without new sources of quark flavor violation, arXiv:1704.06005.
[24] W. Altmannshofer, P. Stangl, D.M. Straub, Interpreting hints for lepton flavor universality violation, arXiv:1704.05435.
[25] B. Capdevila et al., Patterns of New Physics in b→sℓ⁺ℓ⁻ transitions in the light of recent data, arXiv:1704.05340.
[26] C. Autermann, Experimental status of supersymmetry after the LHC Run-I, Prog. Part. Nucl. Phys. 90 (2016) 125, arXiv:1609.01686.
[27] LHCb Collaboration, R. Aaij et al., Framework TDR for the LHCb Upgrade: Technical Design Report, Tech. Rep. CERN-LHCC-2012-007, LHCb-TDR-12, April 2012.
[28] LHCb Collaboration, R. Aaij et al., Expression of Interest for a Phase-II LHCb Upgrade: Opportunities in Flavour Physics, and Beyond, in the HL-LHC Era, Tech. Rep. CERN-LHCC-2017-003, CERN, Geneva, February 2017.

1  Mesons are formed of a quark and an antiquark; baryons are formed of three quarks or three antiquarks.
2  "An Introduction to Quantum Field Theory" by Michael E. Peskin and Daniel V. Schroeder, Avalon Publishing, is an excellent reference on this topic.
3  Important data-processing stages, such as detector calibration and alignment, track reconstruction, particle identification, etc., will not be covered in this article. See Ref. [[4]] for further information.


© 2017 Académie des sciences