Computational particle physics
Computational particle physics refers to the methods and computing tools developed in and used by particle physics research. Like computational chemistry or computational biology, it is, for particle physics, both a specific branch and an interdisciplinary field relying on computer science, theoretical and experimental particle physics and mathematics. The main fields of computational particle physics are: lattice field theory (numerical computations), automatic calculation of particle interaction or decay (computer algebra) and event generators (stochastic methods); see the references below.


Computing tools

* Computer algebra: Many of the computer algebra languages were developed initially to help with particle physics calculations: Reduce, Mathematica, Schoonschip, FORM, GiNaC.
* Data grids: The largest planned use of grid systems is the analysis of data produced by the LHC. Large software packages, such as the LHC Computing Grid (LCG), have been developed to support this application. A similar effort in the wider e-Science community is the GridPP collaboration, a consortium of particle physicists from UK institutions and CERN.
* Data analysis tools: These tools are motivated by the fact that particle physics experiments and simulations often create large datasets (see the references).
* Software libraries: Many software libraries are used for particle physics computations. Also important are packages that simulate particle physics interactions using Monte Carlo techniques, i.e. event generators; a toy example of such a simulation is sketched after this list.
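
As a minimal illustration of the stochastic approach behind event generators (a sketch only, not code from any real generator; the angular distribution, event count and seed are chosen purely for illustration), the following C++ program generates decay angles distributed as 1 + cos^2(theta), the leading-order angular shape of e+e- -> mu+mu-, by rejection sampling:

    #include <iostream>
    #include <random>

    int main() {
        std::mt19937 rng(42);                               // fixed seed for reproducibility
        std::uniform_real_distribution<double> u(0.0, 1.0);

        const int n_events = 100000;
        int accepted = 0;
        double sum_cos2 = 0.0;

        while (accepted < n_events) {
            double c = 2.0 * u(rng) - 1.0;                  // propose cos(theta) uniformly on [-1, 1]
            // Accept with probability (1 + c^2)/2: the maximum of 1 + c^2 on [-1, 1] is 2.
            if (u(rng) < 0.5 * (1.0 + c * c)) {
                ++accepted;
                sum_cos2 += c * c;
            }
        }
        // For p(c) proportional to 1 + c^2, the exact value of <c^2> is 2/5.
        std::cout << "<cos^2 theta> = " << sum_cos2 / n_events << " (exact: 0.4)\n";
        return 0;
    }

Production event generators such as PYTHIA and Herwig build on this same accept/reject idea, applied to full matrix elements, parton showers and hadronization models rather than a single angular distribution.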


History

Particle physics played a role in the early history of the Internet; the World Wide Web was created by Tim Berners-Lee while working at CERN, and the first website went online in 1991.


Computer algebra

Note: this section contains an excerpt from ''Computer Algebra in Particle Physics'' by Stefan Weinzierl.

Particle physics is an important field of application for computer algebra and exploits the capabilities of computer algebra systems (CAS). This leads to valuable feedback for the development of CAS. Looking at the history of computer algebra systems, the first programs date back to the 1960s. The first systems were almost entirely based on LISP ("LISt Processing language"). LISP is an interpreted language and, as the name indicates, was designed for the manipulation of lists. Its importance for symbolic computer programs in the early days has been compared to the importance of FORTRAN for numerical programs in the same period. Already in this first period, the program REDUCE had some special features for applications in high-energy physics. An exception to the LISP-based programs was SCHOONSCHIP, written in assembler language by Martinus J. G. Veltman and specially designed for applications in particle physics. The use of assembler code led to an incredibly fast program (compared to the interpreted programs of the time) and allowed the calculation of more complex scattering processes in high-energy physics. It has been said that the program's importance was recognized in 1999, when half of the Nobel Prize in Physics was awarded to Veltman. The program MACSYMA also deserves explicit mention, since it triggered important developments with regard to algorithms.

In the 1980s, new computer algebra systems started to be written in C. This enabled better exploitation of the computer's resources (compared with the interpreted language LISP) while maintaining portability (which would not have been possible in assembler language). This period also saw the appearance of the first commercial computer algebra systems, of which Mathematica and Maple are the best-known examples. In addition, a few dedicated programs appeared; an example relevant to particle physics is FORM by J. Vermaseren, a (portable) successor to SCHOONSCHIP. More recently, the maintainability of large projects became more and more important, and the overall programming paradigm changed from procedural programming to object-oriented design. In terms of programming languages, this was reflected by a move from C to C++. Following this change of paradigm, the GiNaC library was developed, which allows symbolic calculations within C++. Code generation for computer algebra can also be used in this area.
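
As a brief illustration of what embedding symbolic calculations in C++ looks like with GiNaC (a minimal sketch; the expression is arbitrary and chosen only to show the basic API):

    #include <iostream>
    #include <ginac/ginac.h>

    int main() {
        GiNaC::symbol x("x"), y("y");

        // Build a symbolic expression and manipulate it algebraically.
        GiNaC::ex e = GiNaC::pow(x + y, 3);
        std::cout << e.expand() << std::endl;  // expanded polynomial: x^3+3*x^2*y+3*x*y^2+y^3 (term order may vary)
        std::cout << e.diff(x) << std::endl;   // symbolic derivative: 3*(x+y)^2
        return 0;
    }

Such a program is compiled like any other C++ code and linked against the GiNaC library (typically together with the CLN library for arbitrary-precision arithmetic), which reflects exactly the portability advantage over the assembler-era systems described above.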


Lattice field theory

Lattice field theory was created by Kenneth Wilson in 1974. Simulation techniques were later developed from statistical mechanics (a toy example is sketched below). Since the early 1980s, lattice QCD (LQCD) researchers have pioneered the use of massively parallel computers in large scientific applications, using virtually all available computing systems, including traditional mainframes, large PC clusters and high-performance systems. In addition, lattice QCD has been used as a benchmark for high-performance computing, starting with the IBM Blue Gene supercomputer. Eventually, national and regional QCD grids were created: LATFOR (continental Europe), UKQCD and USQCD. The International Lattice Data Grid (ILDG), formed in 2002, is an international venture comprising grids from the UK, the US, Australia, Japan and Germany.
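
To make the link to statistical mechanics concrete, here is a minimal sketch of the kind of Monte Carlo simulation used in lattice field theory: a Metropolis update of a free scalar field on a one-dimensional periodic lattice. This is a toy model, not production lattice QCD code; the action, lattice size and parameters are illustrative only.

    #include <cmath>
    #include <iostream>
    #include <random>
    #include <vector>

    int main() {
        const int L = 64;            // lattice sites (periodic boundary)
        const double m2 = 0.5;       // mass squared in lattice units
        const double step = 1.0;     // width of the Metropolis proposal
        const int sweeps = 20000;

        std::vector<double> phi(L, 0.0);
        std::mt19937 rng(1234);
        std::uniform_real_distribution<double> u(0.0, 1.0);

        // Local contribution of site x (with trial value v) to the Euclidean action
        //   S = sum_x [ (phi(x+1) - phi(x))^2 / 2 + m2 * phi(x)^2 / 2 ].
        auto local_action = [&](int x, double v) {
            double left  = phi[(x - 1 + L) % L];
            double right = phi[(x + 1) % L];
            return 0.5 * ((v - left) * (v - left) + (right - v) * (right - v))
                 + 0.5 * m2 * v * v;
        };

        double sum_phi2 = 0.0;
        for (int s = 0; s < sweeps; ++s) {
            for (int x = 0; x < L; ++x) {
                double trial = phi[x] + step * (2.0 * u(rng) - 1.0);
                double dS = local_action(x, trial) - local_action(x, phi[x]);
                if (dS <= 0.0 || u(rng) < std::exp(-dS))   // Metropolis accept/reject
                    phi[x] = trial;
            }
            for (double v : phi) sum_phi2 += v * v;        // accumulate <phi^2> (no thermalization cut, for brevity)
        }
        std::cout << "<phi^2> = " << sum_phi2 / (double(sweeps) * L) << std::endl;
        return 0;
    }

Replacing the scalar field with four-dimensional SU(3) gauge links and fermion determinants turns this toy loop into the computational workload that motivated the parallel machines and QCD grids described above.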


See also

* Les Houches Accords
* CHEP Conference
* Computational physics


References

* ''Computational Particle Physics for Event Generators and Data Analysis'', https://arxiv.org/abs/1301.1211 (retrieved 24 August 2020); also available at https://www.researchgate.net/publication/234060239_Computational_Particle_Physics_for_Event_Generators_and_Data_Analysis
* ''International research network for computational particle physics'', https://www2.ccs.tsukuba.ac.jp/projects/ILFTNet/ (retrieved 24 August 2020)

External links

* Computational High Energy Physics (CHEP) group page, Brown University
* International Research Network for Computational Particle Physics, Center for Computational Sciences, Univ. of Tsukuba, Japan
* History of computing at CERN