This book details a new and ground-breaking contribution to the search for a successor to the Standard Model (SM) of particle physics, the largest modern endeavour in the field. In the hope of revealing a discrepancy with the SM's predictions, this work discusses two unprecedented measurements at the frontier of experimental precision: a measurement of the W-boson mass and a test of the fundamental axiom of lepton flavour universality (LFU) in W-boson decays. Both measurements are made by analysing collision data from the LHCb experiment at the Large Hadron Collider (LHC) at CERN, and together they establish a new field of high-precision Standard Model tests with LHCb. This book also describes the development of new software tools for the optimisation of the LHCb trigger system, which helps to ensure that LHCb's exciting physics program can continue to prosper into the future. The book is accessible to those with graduate- or master's-level training in experimental particle physics.
This thesis presents the measurement of the Higgs boson cross section in the diphoton decay channel. The measurement relies on proton-proton collision data at a center-of-mass energy √s = 13 TeV recorded by the ATLAS experiment at the Large Hadron Collider (LHC). The collected data correspond to the full Run-2 dataset with an integrated luminosity of 139 fb−1. The measured cross sections are used to constrain anomalous Higgs boson interactions in the Effective Field Theory (EFT) framework. The results presented in this thesis represent a reduction by a factor of 2 of the various photon and jet energy scale and resolution systematic uncertainties with respect to the previous ATLAS publication. The thesis details the calibration of electron and photon energies in ATLAS, in particular the measurement of the presampler energy scale and the estimation of its systematic uncertainty. This calibration was used to perform a measurement of the Higgs boson mass in the H → γγ and H → 4l channels using the 36 fb−1 dataset.
The first part of this thesis presents the measurement of the inclusive cross-section for electron production from heavy-flavour decays in the electron transverse momentum range 7 GeV
This unique volume introduces and discusses the methods of validating computer simulations in scientific research. The core concepts, strategies, and techniques of validation are explained by an international team of pre-eminent authorities, drawing on expertise from various fields ranging from engineering and the physical sciences to the social sciences and history. The work also offers new and original philosophical perspectives on the validation of simulations. Topics and features: introduces the fundamental concepts and principles related to the validation of computer simulations, and examines philosophical frameworks for thinking about validation; provides an overview of the various strategies and techniques available for validating simulations, as well as the preparatory steps that have to be taken prior to validation; describes commonly used reference points and mathematical frameworks applicable to simulation validation; reviews the legal prescriptions, and the administrative and procedural activities related to simulation validation; presents examples of best practice that demonstrate how methods of validation are applied in various disciplines and with different types of simulation models; covers important practical challenges faced by simulation scientists when applying validation methods and techniques; offers a selection of general philosophical reflections that explore the significance of validation from a broader perspective. This truly interdisciplinary handbook will appeal to a broad audience, from professional scientists spanning all natural and social sciences, to young scholars new to research with computer simulations. Philosophers of science, and methodologists seeking to increase their understanding of simulation validation, will also find much to benefit from in the text.
In an epoch when particle physics is awaiting a major step forward, the Large Hadron Collider (LHC) at CERN, Geneva will soon be operational. It will collide a beam of high-energy protons with another similar beam circulating in the same 27 km tunnel but in the opposite direction, resulting in the production of many elementary particles, some never created in the laboratory before. It is widely expected that the LHC will discover the Higgs boson, the particle which supposedly lends masses to all other fundamental particles. In addition, the question as to whether there is some new law of physics at such high energy is likely to be answered through this experiment. The present volume contains a collection of articles written by international experts, both theoreticians and experimentalists, from India and abroad, which aims to acquaint a non-specialist with some basic issues related to the LHC. At the same time, it is expected to be a useful, rudimentary companion to introductory exposition and technical expertise alike, and it is hoped to become unique of its kind. The fact that there is substantial Indian involvement in the entire LHC endeavour, at all levels including fabrication, physics analysis procedures as well as theoretical studies, is also amply brought out in the collection.
The Large Hadron Collider (LHC) is the highest-energy collider ever built. It resides near Geneva in a tunnel 3.8 m wide, with a circumference of 26.7 km, which was excavated in 1983-1988 to initially house the electron-positron collider LEP. The LHC was approved in 1995, and it took until 2010 to reach reliable operation. By now, large integrated luminosities have been accumulated for physics analyses in the four collider experiments: ATLAS, CMS, LHCb and ALICE. The LHC operates with an extensive cryogenic plant, using a multi-stage injection system comprising the PS and SPS accelerators (still in use for particle physics experiments at lower energies). The beams are guided by 1232 superconducting high-field dipole magnets. Intense work is underway in preparation for the High Luminosity LHC, aimed at upgrading the LHC and detectors for collecting ten times more luminosity, and extending the collider's life to the early 2040s. So far, the (HL-)LHC project represents an accumulation of around one hundred thousand person-years of innovative work by technicians, engineers, and physicists from all over the world; probably the largest scientific effort ever in the history of humanity. The book is driven by the realisation of the unique value of this accelerator complex and by the recognition of the status of high energy physics, described by a Standard Model which still leaves too many questions unanswered to be the appropriate theory of elementary particles and their interactions. Following the Introduction are three chapters which focus on the initial decade of operation, leading to the celebrated discovery of the Higgs boson; on the techniques and physics of the luminosity upgrade; and finally on major options, such as using the LHC in a concurrent, power-economic, electron-hadron scattering mode, upgrading it to higher energies, or eventually employing it as an injector for the next big machine.
The various technical and physics chapters, provided by 61 authors, characterise the fascinating opportunities the LHC offers for the two decades ahead (possibly longer), with the goal of substantially advancing our understanding of nature.
This book discusses the study of double-charm B decays and the first observation of the B0 → D0D0K*0 decay using Run I data from the LHCb experiment. It also describes in detail the Run III upgrade of the LHCb tracking system and the trigger and tracking strategy for the LHCb upgrade, as well as the development and performance studies of a novel standalone tracking algorithm for the scintillating fibre tracker that will be used for the LHCb upgrade. This algorithm alone allows the LHCb upgrade physics program to achieve very high sensitivity to decays containing long-lived particles in their final states, as well as to boost the physics capabilities for the reconstruction of low-momentum particles.
This thesis presents innovative contributions to the CMS experiment in the new trigger system for the restart of the LHC collisions in Run II, as well as original analysis methods and important results that led to official publications of the Collaboration. The author's novel reconstruction algorithms, deployed on the Field-Programmable Gate Arrays of the new CMS trigger architecture, have brought a gain of over a factor of 2 in efficiency for the identification of tau leptons, with a very significant impact on important H boson measurements, such as its decays to tau lepton pairs and the search for H boson pair production. The author also describes a novel analysis of HH → bb tautau, a high-priority physics topic in a difficult channel. The original strategy, the optimisation of event categories, and the control of the background have made the result one of the most sensitive probes of the Higgs boson self-coupling among all possible channels at the LHC.
The Black Book of Quantum Chromodynamics is an in-depth introduction to the particle physics of current and future experiments at particle accelerators. The book offers the reader an overview of practically all aspects of the strong interaction necessary to understand and appreciate modern particle phenomenology at the energy frontier. It assumes a working knowledge of quantum field theory at the level of introductory textbooks used in advanced undergraduate or standard postgraduate lectures. The book expands this knowledge with an intuitive understanding of relevant physical concepts, an introduction to modern techniques, and their application to the phenomenology of the strong interaction at the highest energies. Aimed at graduate students and researchers, it also serves as a comprehensive reference for LHC experimenters and theorists. This book offers an exhaustive presentation of the technologies developed and used by practitioners in the field of fixed-order perturbation theory and an overview of results relevant for the ongoing research programme at the LHC. It includes an in-depth description of various analytic resummation techniques, which form the basis for our understanding of the QCD radiation pattern and how strong production processes manifest themselves in data, and a concise discussion of numerical resummation through parton showers, which form the basis of event generators for the simulation of LHC physics, and their matching and merging with fixed-order matrix elements. It also gives a detailed presentation of the physics behind the parton distribution functions, which are a necessary ingredient for every calculation relevant for physics at hadron colliders such as the LHC, and an introduction to non-perturbative aspects of the strong interaction, including inclusive observables such as total and elastic cross sections, and non-trivial effects such as multiple parton interactions and hadronization.
The book concludes with a useful overview contextualising data from previous experiments, such as the Tevatron and Run I of the LHC, which have shaped our understanding of QCD at hadron colliders.
This second open access volume of the handbook series deals with detectors, large experimental facilities and data handling, both for accelerator and non-accelerator based experiments. It also covers applications in medicine and life sciences. A joint CERN-Springer initiative, the "Particle Physics Reference Library" provides revised and updated contributions based on previously published material in the well-known Landolt-Börnstein series on particle physics, accelerators and detectors (volumes 21A, B1, B2, C), which took stock of the field approximately one decade ago. Central to this new initiative is publication under full open access.