In theoretical physics, quantum field theory (QFT) is a theoretical framework that combines classical field theory, special relativity, and quantum mechanics.
QFT is used in particle physics to construct physical models of subatomic particles and in condensed matter physics to construct models of quasiparticles.
QFT treats particles as excited states (also called quanta) of their underlying quantum fields, which are more fundamental than the particles. The equation of motion of the particle is determined by minimization of the Lagrangian, a functional of fields associated with the particle. Interactions between particles are described by interaction terms in the Lagrangian involving their corresponding quantum fields. Each interaction can be visually represented by Feynman diagrams according to perturbation theory in quantum mechanics.
History
Quantum field theory emerged from the work of generations of theoretical physicists spanning much of the 20th century. Its development began in the 1920s with the description of interactions between light and electrons, culminating in the first quantum field theory—quantum electrodynamics. A major theoretical obstacle soon followed with the appearance and persistence of various infinities in perturbative calculations, a problem only resolved in the 1950s with the invention of the renormalization procedure. A second major barrier came with QFT's apparent inability to describe the weak and strong interactions, to the point where some theorists called for the abandonment of the field theoretic approach. The development of gauge theory and the completion of the Standard Model in the 1970s led to a renaissance of quantum field theory.
Theoretical background

Quantum field theory results from the combination of classical field theory, quantum mechanics, and special relativity. A brief overview of these theoretical precursors follows.
The earliest successful classical field theory is one that emerged from Newton's law of universal gravitation, despite the complete absence of the concept of fields from his 1687 treatise ''Philosophiæ Naturalis Principia Mathematica'' (English: ''Mathematical Principles of Natural Philosophy''). The force of gravity as described by Newton is an "action at a distance"—its effects on faraway objects are instantaneous, no matter the distance. In an exchange of letters with Richard Bentley, however, Newton stated that "it is inconceivable that inanimate brute matter should, without the mediation of something else which is not material, operate upon and affect other matter without mutual contact."
It was not until the 18th century that mathematical physicists discovered a convenient description of gravity based on fields—a numerical quantity (a vector in the case of the gravitational field) assigned to every point in space indicating the action of gravity on any particle at that point. However, this was considered merely a mathematical trick.
Fields began to take on an existence of their own with the development of electromagnetism in the 19th century. Michael Faraday coined the English term "field" in 1845. He introduced fields as properties of space (even when it is devoid of matter) having physical effects. He argued against "action at a distance", and proposed that interactions between objects occur via space-filling "lines of force". This description of fields remains to this day.
The theory of classical electromagnetism was completed in 1864 with Maxwell's equations, which described the relationship between the electric field, the magnetic field, electric current, and electric charge. Maxwell's equations implied the existence of electromagnetic waves, a phenomenon whereby electric and magnetic fields propagate from one spatial point to another at a finite speed, which turns out to be the speed of light. Action-at-a-distance was thus conclusively refuted.

Despite the enormous success of classical electromagnetism, it was unable to account for the discrete lines in atomic spectra, nor for the distribution of blackbody radiation in different wavelengths. Max Planck's study of blackbody radiation marked the beginning of quantum mechanics. He treated atoms, which absorb and emit electromagnetic radiation, as tiny oscillators with the crucial property that their energies can only take on a series of discrete, rather than continuous, values. These are known as quantum harmonic oscillators. This process of restricting energies to discrete values is called quantization. Building on this idea, Albert Einstein proposed in 1905 an explanation for the photoelectric effect, that light is composed of individual packets of energy called photons (the quanta of light). This implied that electromagnetic radiation, while being waves in the classical electromagnetic field, also exists in the form of particles.
In 1913, Niels Bohr introduced the Bohr model of atomic structure, wherein electrons within atoms can only take on a series of discrete, rather than continuous, energies. This is another example of quantization. The Bohr model successfully explained the discrete nature of atomic spectral lines. In 1924, Louis de Broglie proposed the hypothesis of wave–particle duality, that microscopic particles exhibit both wave-like and particle-like properties under different circumstances. Uniting these scattered ideas, a coherent discipline, quantum mechanics, was formulated between 1925 and 1926, with important contributions from Max Planck, Louis de Broglie, Werner Heisenberg, Max Born, Erwin Schrödinger, Paul Dirac, and Wolfgang Pauli.
In the same year as his paper on the photoelectric effect, Einstein published his theory of special relativity, built on Maxwell's electromagnetism. New rules, called Lorentz transformations, were given for the way time and space coordinates of an event change under changes in the observer's velocity, and the distinction between time and space was blurred. It was proposed that all physical laws must be the same for observers at different velocities, i.e. that physical laws be invariant under Lorentz transformations.
Two difficulties remained. Observationally, the Schrödinger equation underlying quantum mechanics could explain the stimulated emission of radiation from atoms, where an electron emits a new photon under the action of an external electromagnetic field, but it was unable to explain spontaneous emission, where an electron spontaneously decreases in energy and emits a photon even without the action of an external electromagnetic field. Theoretically, the Schrödinger equation could not describe photons and was inconsistent with the principles of special relativity—it treats time as an ordinary number while promoting spatial coordinates to linear operators.
Quantum electrodynamics
Quantum field theory naturally began with the study of electromagnetic interactions, as the electromagnetic field was the only known classical field as of the 1920s.
Through the works of Born, Heisenberg, and Pascual Jordan in 1925–1926, a quantum theory of the free electromagnetic field (one with no interactions with matter) was developed via canonical quantization by treating the electromagnetic field as a set of quantum harmonic oscillators. With the exclusion of interactions, however, such a theory was yet incapable of making quantitative predictions about the real world.
In his seminal 1927 paper ''The quantum theory of the emission and absorption of radiation'', Dirac coined the term quantum electrodynamics (QED), a theory that adds upon the terms describing the free electromagnetic field an additional interaction term between electric current density and the electromagnetic vector potential. Using first-order perturbation theory, he successfully explained the phenomenon of spontaneous emission. According to the uncertainty principle in quantum mechanics, quantum harmonic oscillators cannot remain stationary, but they have a non-zero minimum energy and must always be oscillating, even in the lowest energy state (the ground state). Therefore, even in a perfect vacuum, there remains an oscillating electromagnetic field having zero-point energy. It is this quantum fluctuation of electromagnetic fields in the vacuum that "stimulates" the spontaneous emission of radiation by electrons in atoms. Dirac's theory was hugely successful in explaining both the emission and absorption of radiation by atoms; by applying second-order perturbation theory, it was able to account for the scattering of photons, resonance fluorescence and non-relativistic Compton scattering. Nonetheless, the application of higher-order perturbation theory was plagued with problematic infinities in calculations.
In 1928, Dirac wrote down a wave equation that described relativistic electrons—the Dirac equation. It had the following important consequences: the spin of an electron is 1/2; the electron ''g''-factor is 2; it led to the correct Sommerfeld formula for the fine structure of the hydrogen atom; and it could be used to derive the Klein–Nishina formula for relativistic Compton scattering. Although the results were fruitful, the theory also apparently implied the existence of negative energy states, which would cause atoms to be unstable, since they could always decay to lower energy states by the emission of radiation.
The prevailing view at the time was that the world was composed of two very different ingredients: material particles (such as electrons) and quantum fields (such as photons). Material particles were considered to be eternal, with their physical state described by the probabilities of finding each particle in any given region of space or range of velocities. On the other hand, photons were considered merely the excited states of the underlying quantized electromagnetic field, and could be freely created or destroyed. It was between 1928 and 1930 that Jordan, Eugene Wigner, Heisenberg, Pauli, and Enrico Fermi discovered that material particles could also be seen as excited states of quantum fields. Just as photons are excited states of the quantized electromagnetic field, so each type of particle had its corresponding quantum field: an electron field, a proton field, etc. Given enough energy, it would now be possible to create material particles. Building on this idea, Fermi proposed in 1932 an explanation for beta decay known as Fermi's interaction. Atomic nuclei do not contain electrons ''per se'', but in the process of decay, an electron is created out of the surrounding electron field, analogous to the photon created from the surrounding electromagnetic field in the radiative decay of an excited atom.
It was realized in 1929 by Dirac and others that negative energy states implied by the Dirac equation could be removed by assuming the existence of particles with the same mass as electrons but opposite electric charge. This not only ensured the stability of atoms, but it was also the first proposal of the existence of antimatter. Indeed, the evidence for positrons was discovered in 1932 by Carl David Anderson in cosmic rays. With enough energy, such as by absorbing a photon, an electron-positron pair could be created, a process called pair production; the reverse process, annihilation, could also occur with the emission of a photon. This showed that particle numbers need not be fixed during an interaction. Historically, however, positrons were at first thought of as "holes" in an infinite electron sea, rather than a new kind of particle, and this theory was referred to as the Dirac hole theory. QFT naturally incorporated antiparticles in its formalism.
Infinities and renormalization
Robert Oppenheimer showed in 1930 that higher-order perturbative calculations in QED always resulted in infinite quantities, such as the electron self-energy and the vacuum zero-point energy of the electron and photon fields, suggesting that the computational methods of the time could not properly deal with interactions involving photons with extremely high momenta. It was not until 20 years later that a systematic approach to remove such infinities was developed.
A series of papers was published between 1934 and 1938 by Ernst Stueckelberg that established a relativistically invariant formulation of QFT. In 1947, Stueckelberg also independently developed a complete renormalization procedure. Unfortunately, such achievements were not understood and recognized by the theoretical community.
Faced with these infinities, John Archibald Wheeler and Heisenberg proposed, in 1937 and 1943 respectively, to supplant the problematic QFT with the so-called S-matrix theory. Since the specific details of microscopic interactions are inaccessible to observations, the theory should only attempt to describe the relationships between a small number of observables (''e.g.'' the energy of an atom) in an interaction, rather than be concerned with the microscopic minutiae of the interaction. In 1945, Richard Feynman and Wheeler daringly suggested abandoning QFT altogether and proposed action-at-a-distance as the mechanism of particle interactions.
In 1947, Willis Lamb and Robert Retherford measured the minute difference in the 2''S''1/2 and 2''P''1/2 energy levels of the hydrogen atom, also called the Lamb shift. By ignoring the contribution of photons whose energy exceeds the electron mass, Hans Bethe successfully estimated the numerical value of the Lamb shift. Subsequently, Norman Myles Kroll, Lamb, James Bruce French, and Victor Weisskopf again confirmed this value using an approach in which infinities cancelled other infinities to result in finite quantities. However, this method was clumsy and unreliable and could not be generalized to other calculations.
The breakthrough eventually came around 1950 when a more robust method for eliminating infinities was developed by Julian Schwinger, Richard Feynman, Freeman Dyson, and Shinichiro Tomonaga. The main idea is to replace the calculated values of mass and charge, infinite though they may be, by their finite measured values. This systematic computational procedure is known as renormalization and can be applied to arbitrary order in perturbation theory. As Tomonaga said in his Nobel lecture: "Since those parts of the modified mass and charge due to field reactions [become] infinite, it is impossible to calculate them by the theory. However, the mass and charge observed in experiments are not the original mass and charge but the mass and charge as modified by field reactions, and they are finite. On the other hand, the mass and charge appearing in the theory are… the values modified by field reactions. Since this is so, and particularly since the theory is unable to calculate the modified mass and charge, we may adopt the procedure of substituting experimental values for them phenomenologically... This procedure is called the renormalization of mass and charge… After long, laborious calculations, less skillful than Schwinger's, we obtained a result... which was in agreement with the Americans'."
By applying the renormalization procedure, calculations were finally made to explain the electron's anomalous magnetic moment (the deviation of the electron ''g''-factor from 2) and vacuum polarization. These results agreed with experimental measurements to a remarkable degree, thus marking the end of a "war against infinities".
At the same time, Feynman introduced the path integral formulation of quantum mechanics and Feynman diagrams. The latter can be used to visually and intuitively organize and to help compute terms in the perturbative expansion. Each diagram can be interpreted as paths of particles in an interaction, with each vertex and line having a corresponding mathematical expression, and the product of these expressions gives the scattering amplitude of the interaction represented by the diagram.
It was with the invention of the renormalization procedure and Feynman diagrams that QFT finally arose as a complete theoretical framework.
Non-renormalizability
Given the tremendous success of QED, many theorists believed, in the few years after 1949, that QFT could soon provide an understanding of all microscopic phenomena, not only the interactions between photons, electrons, and positrons. Contrary to this optimism, QFT entered yet another period of depression that lasted for almost two decades.
The first obstacle was the limited applicability of the renormalization procedure. In perturbative calculations in QED, all infinite quantities could be eliminated by redefining a small (finite) number of physical quantities (namely the mass and charge of the electron). Dyson proved in 1949 that this is only possible for a small class of theories called "renormalizable theories", of which QED is an example. However, most theories, including the Fermi theory of the weak interaction, are "non-renormalizable". Any perturbative calculation in these theories beyond the first order would result in infinities that could not be removed by redefining a finite number of physical quantities.
The second major problem stemmed from the limited validity of the Feynman diagram method, which is based on a series expansion in perturbation theory. In order for the series to converge and low-order calculations to be a good approximation, the coupling constant, in which the series is expanded, must be a sufficiently small number. The coupling constant in QED is the fine-structure constant , which is small enough that only the simplest, lowest order, Feynman diagrams need to be considered in realistic calculations. In contrast, the coupling constant in the strong interaction is roughly of the order of one, making complicated, higher order, Feynman diagrams just as important as simple ones. There was thus no way of deriving reliable quantitative predictions for the strong interaction using perturbative QFT methods.
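The difference in scale between the two couplings can be made concrete with a toy numerical comparison. The sketch below only shows how a term of order ''n'' scales as coupling^''n''; real perturbative series have nontrivial coefficients and are at best asymptotic, so this is purely illustrative:

```python
# Toy illustration (not a real calculation): an order-n term in a
# perturbative expansion scales roughly like coupling**n.
alpha_qed = 1 / 137.036   # fine-structure constant (approximate value)
alpha_strong = 1.0        # strong coupling at low energies, order one

for n in range(1, 5):
    print(f"order {n}: QED ~ {alpha_qed ** n:.2e}, strong ~ {alpha_strong ** n:.2e}")
```

Each additional order suppresses a QED term by roughly another factor of 1/137, while the strong-interaction terms stay of order one, which is why low-order diagrams suffice for QED but not for the strong interaction.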
With these difficulties looming, many theorists began to turn away from QFT. Some focused on symmetry principles and conservation laws, while others picked up the old S-matrix theory of Wheeler and Heisenberg. QFT was used heuristically as guiding principles, but not as a basis for quantitative calculations.
Schwinger, however, took a different route. For more than a decade he and his students had been nearly the only exponents of field theory, but in 1966 he found a way around the problem of the infinities with a new method he called source theory. Developments in pion physics, in which the new viewpoint was most successfully applied, convinced him of the great advantages of mathematical simplicity and conceptual clarity that its use bestowed.
In source theory there are no divergences, and no renormalization. It may be regarded as the calculational tool of field theory, but it is more general. Using source theory, Schwinger was able to calculate the anomalous magnetic moment of the electron, which he had done in 1947, but this time with no ‘distracting remarks’ about infinite quantities.
Schwinger also applied source theory to his QFT theory of gravity, and was able to reproduce all four of Einstein's classic results: gravitational red shift, deflection and slowing of light by gravity, and the perihelion precession of Mercury. The neglect of source theory by the physics community was a major disappointment for Schwinger: "The lack of appreciation of these facts by others was depressing, but understandable." —J. Schwinger
Standard Model
In 1954, Yang Chen-Ning and Robert Mills generalized the local symmetry of QED, leading to non-Abelian gauge theories (also known as Yang–Mills theories), which are based on more complicated local symmetry groups. In QED, (electrically) charged particles interact via the exchange of photons, while in non-Abelian gauge theory, particles carrying a new type of "charge" interact via the exchange of massless gauge bosons. Unlike photons, these gauge bosons themselves carry charge.
Sheldon Glashow developed a non-Abelian gauge theory that unified the electromagnetic and weak interactions in 1960. In 1964, Abdus Salam and John Clive Ward arrived at the same theory through a different path. This theory, nevertheless, was non-renormalizable.
Peter Higgs, Robert Brout, François Englert, Gerald Guralnik, Carl Hagen, and Tom Kibble proposed in their famous ''Physical Review Letters'' papers that the gauge symmetry in Yang–Mills theories could be broken by a mechanism called spontaneous symmetry breaking, through which originally massless gauge bosons could acquire mass.
By combining the earlier theory of Glashow, Salam, and Ward with the idea of spontaneous symmetry breaking, Steven Weinberg wrote down in 1967 a theory describing electroweak interactions between all leptons and the effects of the Higgs boson. His theory was at first mostly ignored, until it was brought back to light in 1971 by Gerard 't Hooft's proof that non-Abelian gauge theories are renormalizable. The electroweak theory of Weinberg and Salam was extended from leptons to quarks in 1970 by Glashow, John Iliopoulos, and Luciano Maiani, marking its completion.
Harald Fritzsch, Murray Gell-Mann, and Heinrich Leutwyler discovered in 1971 that certain phenomena involving the strong interaction could also be explained by non-Abelian gauge theory. Quantum chromodynamics (QCD) was born. In 1973, David Gross, Frank Wilczek, and Hugh David Politzer showed that non-Abelian gauge theories are "asymptotically free", meaning that under renormalization, the coupling constant of the strong interaction decreases as the interaction energy increases. (Similar discoveries had been made numerous times previously, but they had been largely ignored.) Therefore, at least in high-energy interactions, the coupling constant in QCD becomes sufficiently small to warrant a perturbative series expansion, making quantitative predictions for the strong interaction possible.
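Asymptotic freedom can be sketched with the standard one-loop renormalization-group running of the QCD coupling. The reference values below (a coupling of about 0.118 at a scale of 91.19 GeV, with five quark flavors) are commonly quoted inputs assumed for illustration, not figures taken from the text above:

```python
import math

def alpha_s(Q, alpha_ref=0.118, mu=91.19, n_f=5):
    """One-loop QCD running coupling (a sketch under assumed inputs).
    1/alpha(Q) = 1/alpha(mu) + b0/(2*pi) * ln(Q/mu), with b0 = 11 - 2*n_f/3.
    The reference point alpha_s(91.19 GeV) ~ 0.118 is an assumption."""
    b0 = 11 - 2 * n_f / 3
    return 1 / (1 / alpha_ref + b0 / (2 * math.pi) * math.log(Q / mu))

# Asymptotic freedom: the coupling shrinks as the energy scale Q grows.
for Q in (10, 100, 1000):
    print(f"alpha_s({Q} GeV) = {alpha_s(Q):.3f}")
```

Because the one-loop coefficient b0 is positive for fewer than 17 quark flavors, the coupling decreases monotonically with energy, which is the content of the Gross–Wilczek–Politzer result described above.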
These theoretical breakthroughs brought about a renaissance in QFT. The full theory, which includes the electroweak theory and chromodynamics, is referred to today as the Standard Model of elementary particles. The Standard Model successfully describes all fundamental interactions except gravity, and its many predictions have been met with remarkable experimental confirmation in subsequent decades. The Higgs boson, central to the mechanism of spontaneous symmetry breaking, was finally detected in 2012 at CERN, marking the complete verification of the existence of all constituents of the Standard Model.
Other developments
The 1970s saw the development of non-perturbative methods in non-Abelian gauge theories. The 't Hooft–Polyakov monopole was discovered theoretically by 't Hooft and Alexander Polyakov, flux tubes by Holger Bech Nielsen and Poul Olesen, and instantons by Polyakov and coauthors. These objects are inaccessible through perturbation theory.
Supersymmetry also appeared in the same period. The first supersymmetric QFT in four dimensions was built by Yuri Golfand and Evgeny Likhtman in 1970, but their result failed to garner widespread interest due to the Iron Curtain. Supersymmetry only took off in the theoretical community after the work of Julius Wess and Bruno Zumino in 1973.
Among the four fundamental interactions, gravity remains the only one that lacks a consistent QFT description. Various attempts at a theory of quantum gravity led to the development of string theory, itself a type of two-dimensional QFT with conformal symmetry. Joël Scherk and John Schwarz first proposed in 1974 that string theory could be ''the'' quantum theory of gravity.
Condensed matter physics
Although quantum field theory arose from the study of interactions between elementary particles, it has been successfully applied to other physical systems, particularly to many-body systems in condensed matter physics.
Historically, the Higgs mechanism of spontaneous symmetry breaking was a result of Yoichiro Nambu's application of superconductor theory to elementary particles, while the concept of renormalization came out of the study of second-order phase transitions in matter.
Soon after the introduction of photons, Einstein performed the quantization procedure on vibrations in a crystal, leading to the first quasiparticle—phonons. Lev Landau claimed that low-energy excitations in many condensed matter systems could be described in terms of interactions between a set of quasiparticles. The Feynman diagram method of QFT was naturally well suited to the analysis of various phenomena in condensed matter systems.
Gauge theory is used to describe the quantization of magnetic flux in superconductors, the resistivity in the quantum Hall effect, as well as the relation between frequency and voltage in the AC Josephson effect.
Principles
For simplicity, natural units are used in the following sections, in which the reduced Planck constant and the speed of light are both set to one.
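As a small worked example of natural units: with the reduced Planck constant and the speed of light set to one, a particle's mass directly sets an inverse length, its reduced Compton wavelength 1/m. Converting back to ordinary units uses the standard conversion constant ħc ≈ 197.327 MeV·fm; the electron value below is an illustrative input:

```python
# In natural units (hbar = c = 1), the reduced Compton wavelength is 1/m.
# To restore ordinary units, use hbar*c ~ 197.327 MeV*fm (standard value).
hbar_c = 197.327        # MeV * fm (approximate)
m_electron = 0.511      # electron rest energy in MeV (approximate)

lambda_fm = hbar_c / m_electron   # reduced Compton wavelength in femtometers
print(f"electron reduced Compton wavelength ~ {lambda_fm:.0f} fm")
```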
Classical fields
A classical field is a function of spatial and time coordinates. Examples include the gravitational field in Newtonian gravity and the electric field and magnetic field in classical electromagnetism. A classical field can be thought of as a numerical quantity assigned to every point in space that changes in time. Hence, it has infinitely many degrees of freedom.
Many phenomena exhibiting quantum mechanical properties cannot be explained by classical fields alone. Phenomena such as the photoelectric effect are best explained by discrete particles (photons), rather than a spatially continuous field. The goal of quantum field theory is to describe various quantum mechanical phenomena using a modified concept of fields.
Canonical quantization and path integrals are two common formulations of QFT. To motivate the fundamentals of QFT, an overview of classical field theory follows.
The simplest classical field is a real scalar field — a real number
In mathematics
Mathematics is an area of knowledge that includes the topics of numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. These topics are represented in ...
at every point in space that changes in time. It is denoted as , where is the position vector, and is the time. Suppose the Lagrangian of the field, , is
:
where is the Lagrangian density, is the time-derivative of the field, is the gradient operator, and is a real parameter (the "mass" of the field). Applying the Euler–Lagrange equation on the Lagrangian:
: $\frac{\partial}{\partial t}\left(\frac{\partial\mathcal{L}}{\partial\dot\phi}\right) + \sum_{i=1}^{3}\frac{\partial}{\partial x^i}\left(\frac{\partial\mathcal{L}}{\partial(\partial\phi/\partial x^i)}\right) - \frac{\partial\mathcal{L}}{\partial\phi} = 0
we obtain the equations of motion for the field, which describe the way it varies in time and space:
: $\frac{\partial^2\phi}{\partial t^2} - \nabla^2\phi + m^2\phi = 0$
This is known as the Klein–Gordon equation.
The Klein–Gordon equation is a wave equation, so its solutions can be expressed as a sum of normal modes (obtained via Fourier transform) as follows:
: $\phi(\mathbf{x}, t) = \int \frac{\mathrm{d}^3p}{(2\pi)^3}\,\frac{1}{\sqrt{2\omega_{\mathbf p}}}\left(a_{\mathbf p}\,e^{i\mathbf p\cdot\mathbf x - i\omega_{\mathbf p} t} + a_{\mathbf p}^*\,e^{-i\mathbf p\cdot\mathbf x + i\omega_{\mathbf p} t}\right)$
where $a_{\mathbf p}$ is a complex number (normalized by convention), $*$ denotes complex conjugation, and $\omega_{\mathbf p}$ is the frequency of the normal mode:
: $\omega_{\mathbf p} = \sqrt{|\mathbf p|^2 + m^2}$
Thus each normal mode corresponding to a single $\mathbf p$ can be seen as a classical harmonic oscillator with frequency $\omega_{\mathbf p}$.
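The dispersion relation can be checked numerically; the following sketch verifies, by finite differences, that a plane wave $\phi = \cos(px - \omega t)$ in one spatial dimension satisfies the Klein–Gordon equation exactly when $\omega = \sqrt{p^2 + m^2}$ (the values of $p$ and $m$ are arbitrary illustrative choices):

```python
import numpy as np

# A plane wave phi(x, t) = cos(p*x - w*t) solves the 1+1-dimensional
# Klein-Gordon equation  d^2phi/dt^2 - d^2phi/dx^2 + m^2 phi = 0
# precisely when w = sqrt(p^2 + m^2).  Natural units throughout.

def kg_residual(p, m, omega, x=0.3, t=0.7, h=1e-4):
    """Finite-difference residual of the Klein-Gordon operator on a plane wave."""
    phi = lambda x, t: np.cos(p * x - omega * t)
    d2t = (phi(x, t + h) - 2 * phi(x, t) + phi(x, t - h)) / h**2  # second time derivative
    d2x = (phi(x + h, t) - 2 * phi(x, t) + phi(x - h, t)) / h**2  # second space derivative
    return d2t - d2x + m**2 * phi(x, t)

p, m = 1.5, 2.0
omega = np.sqrt(p**2 + m**2)                 # dispersion relation of the normal mode
print(abs(kg_residual(p, m, omega)))         # ~0 up to discretization error
print(abs(kg_residual(p, m, omega + 0.5)))   # clearly nonzero for a wrong frequency
```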
Canonical quantization
The quantization procedure for the above classical field to a quantum operator field is analogous to the promotion of a classical harmonic oscillator to a quantum harmonic oscillator.
The displacement of a classical harmonic oscillator is described by
: $x(t) = \frac{1}{\sqrt{2\omega}}\left(a\,e^{-i\omega t} + a^*\,e^{i\omega t}\right)$
where $a$ is a complex number (normalized by convention), and $\omega$ is the oscillator's frequency. Note that $x$ is the displacement of a particle in simple harmonic motion from the equilibrium position, not to be confused with the spatial label $\mathbf x$ of a quantum field.
For a quantum harmonic oscillator, $x(t)$ is promoted to a linear operator $\hat x(t)$:
: $\hat x(t) = \frac{1}{\sqrt{2\omega}}\left(\hat a\,e^{-i\omega t} + \hat a^\dagger\,e^{i\omega t}\right)$
Complex numbers $a$ and $a^*$ are replaced by the annihilation operator $\hat a$ and the creation operator $\hat a^\dagger$, respectively, where $\dagger$ denotes Hermitian conjugation. The commutation relation between the two is
: $\left[\hat a, \hat a^\dagger\right] = 1$
The Hamiltonian of the simple harmonic oscillator can be written as
: $\hat H = \omega\left(\hat a^\dagger \hat a + \frac{1}{2}\right)$
The vacuum state $|0\rangle$, which is the lowest energy state, is defined by
: $\hat a\,|0\rangle = 0$
and has energy $\omega/2$.
One can easily check that $\left[\hat H, \hat a^\dagger\right] = \omega\,\hat a^\dagger$, which implies that $\hat a^\dagger$ increases the energy of the simple harmonic oscillator by $\omega$. For example, the state $\hat a^\dagger|0\rangle$ is an eigenstate of energy $3\omega/2$.
Any energy eigenstate of a single harmonic oscillator can be obtained from $|0\rangle$ by successively applying the creation operator $\hat a^\dagger$, and any state of the system can be expressed as a linear combination of the states
: $|n\rangle \propto \left(\hat a^\dagger\right)^n |0\rangle$
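The ladder-operator algebra can be made concrete with a truncated matrix representation; the following sketch (with $\omega = 1$ and an arbitrary truncation size $N$, both illustrative choices) verifies the commutator, the spectrum $\omega(n + 1/2)$, and the energy of $\hat a^\dagger|0\rangle$:

```python
import numpy as np

# Truncated Fock-basis matrices for the harmonic-oscillator ladder operators.
# a|n> = sqrt(n)|n-1>, so a has sqrt(1..N-1) on its first superdiagonal.
N = 40
a = np.diag(np.sqrt(np.arange(1, N)), k=1)   # annihilation operator
adag = a.conj().T                            # creation operator
omega = 1.0
H = omega * (adag @ a + 0.5 * np.eye(N))     # H = w (a†a + 1/2)

# [a, a†] = 1 holds exactly away from the truncation edge
comm = a @ adag - adag @ a
print(np.allclose(comm[:N-1, :N-1], np.eye(N-1)))   # True

# Eigenvalues are w(n + 1/2); in particular a†|0> has energy 3w/2
E = np.sort(np.linalg.eigvalsh(H))
print(E[:3])   # [0.5 1.5 2.5]
```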
A similar procedure can be applied to the real scalar field $\phi$, by promoting it to a quantum field operator $\hat\phi$, while the annihilation operator $\hat a_{\mathbf p}$, the creation operator $\hat a_{\mathbf p}^\dagger$ and the angular frequency $\omega_{\mathbf p}$ are now for a particular $\mathbf p$:
: $\hat\phi(\mathbf{x}, t) = \int \frac{\mathrm{d}^3p}{(2\pi)^3}\,\frac{1}{\sqrt{2\omega_{\mathbf p}}}\left(\hat a_{\mathbf p}\,e^{i\mathbf p\cdot\mathbf x - i\omega_{\mathbf p} t} + \hat a_{\mathbf p}^\dagger\,e^{-i\mathbf p\cdot\mathbf x + i\omega_{\mathbf p} t}\right)$
Their commutation relations are:
: $\left[\hat a_{\mathbf p}, \hat a_{\mathbf p'}^\dagger\right] = (2\pi)^3\,\delta^3(\mathbf p - \mathbf p'),\qquad \left[\hat a_{\mathbf p}, \hat a_{\mathbf p'}\right] = \left[\hat a_{\mathbf p}^\dagger, \hat a_{\mathbf p'}^\dagger\right] = 0$
where $\delta^3$ is the Dirac delta function. The vacuum state $|0\rangle$ is defined by
: $\hat a_{\mathbf p}\,|0\rangle = 0\quad\text{for all }\mathbf p$
Any quantum state of the field can be obtained from $|0\rangle$ by successively applying creation operators $\hat a_{\mathbf p}^\dagger$ (or by a linear combination of such states), e.g.
: $\hat a_{\mathbf p_1}^\dagger\,\hat a_{\mathbf p_2}^\dagger\,\hat a_{\mathbf p_3}^\dagger\,|0\rangle$
While the state space of a single quantum harmonic oscillator contains all the discrete energy states of one oscillating particle, the state space of a quantum field contains the discrete energy levels of an arbitrary number of particles. The latter space is known as a Fock space, which can account for the fact that particle numbers are not fixed in relativistic quantum systems. The process of quantizing an arbitrary number of particles instead of a single particle is often also called second quantization.
The foregoing procedure is a direct application of non-relativistic quantum mechanics and can be used to quantize (complex) scalar fields, Dirac fields, vector fields (''e.g.'' the electromagnetic field), and even strings. However, creation and annihilation operators are only well defined in the simplest theories that contain no interactions (so-called free theory). In the case of the real scalar field, the existence of these operators was a consequence of the decomposition of solutions of the classical equations of motion into a sum of normal modes. To perform calculations on any realistic interacting theory, perturbation theory would be necessary.
The Lagrangian of any quantum field in nature would contain interaction terms in addition to the free theory terms. For example, a quartic interaction term could be introduced to the Lagrangian of the real scalar field:
: $\mathcal{L} = \frac{1}{2}(\partial_\mu\phi)(\partial^\mu\phi) - \frac{1}{2}m^2\phi^2 - \frac{\lambda}{4!}\phi^4$
where $\mu$ is a spacetime index, $\partial_0 = \partial/\partial t$, $\partial_i = \partial/\partial x^i$ for $i = 1, 2, 3$, etc. The summation over the index $\mu$ has been omitted following the Einstein notation. If the parameter $\lambda$ is sufficiently small, then the interacting theory described by the above Lagrangian can be considered as a small perturbation from the free theory.
Path integrals
The path integral formulation of QFT is concerned with the direct computation of the scattering amplitude of a certain interaction process, rather than the establishment of operators and state spaces. To calculate the probability amplitude for a system to evolve from some initial state $|\phi_I\rangle$ at time $t = 0$ to some final state $|\phi_F\rangle$ at $t = T$, the total time $T$ is divided into $N$ small intervals. The overall amplitude is the product of the amplitude of evolution within each interval, integrated over all intermediate states. Let $H$ be the Hamiltonian (''i.e.'' generator of time evolution), then
: $\langle\phi_F|e^{-iHT}|\phi_I\rangle = \int \mathrm{d}\phi_1\,\mathrm{d}\phi_2\cdots\mathrm{d}\phi_{N-1}\,\langle\phi_F|e^{-iHT/N}|\phi_{N-1}\rangle\cdots\langle\phi_2|e^{-iHT/N}|\phi_1\rangle\langle\phi_1|e^{-iHT/N}|\phi_I\rangle$
Taking the limit $N \to \infty$, the above product of integrals becomes the Feynman path integral:
: $\langle\phi_F|e^{-iHT}|\phi_I\rangle = \int \mathcal{D}\phi(t)\,\exp\!\left(i\int_0^T \mathrm{d}t\,L\right)$
where $L$ is the Lagrangian involving $\phi$ and its derivatives with respect to spatial and time coordinates, obtained from the Hamiltonian $H$ via Legendre transformation. The initial and final conditions of the path integral are respectively
: $\phi(\mathbf x, 0) = \phi_I(\mathbf x),\qquad \phi(\mathbf x, T) = \phi_F(\mathbf x)$
In other words, the overall amplitude is the sum over the amplitude of every possible path between the initial and final states, where the amplitude of a path is given by the exponential in the integrand.
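The time-slicing identity behind this construction can be checked numerically in imaginary (Euclidean) time, where the short-time kernels are real Gaussians rather than oscillatory integrands. The sketch below uses the standard free-particle Euclidean kernel (endpoint values are arbitrary illustrative choices) and verifies that integrating over one intermediate position composes two short-time amplitudes into the amplitude for the doubled interval:

```python
import numpy as np

# Euclidean free-particle kernel: K(x', x; tau) = sqrt(m/(2 pi tau)) * exp(-m (x'-x)^2 / (2 tau)).
# Composition rule (one slice of the path integral):
#   integral dx1 K(xF, x1; tau) K(x1, xI; tau) = K(xF, xI; 2 tau)

def K(xp, x, tau, m=1.0):
    return np.sqrt(m / (2 * np.pi * tau)) * np.exp(-m * (xp - x)**2 / (2 * tau))

xI, xF, tau = -0.4, 1.1, 0.3
x1 = np.linspace(-30.0, 30.0, 20001)       # grid for the intermediate position
dx = x1[1] - x1[0]
composed = np.sum(K(xF, x1, tau) * K(x1, xI, tau)) * dx   # Riemann sum over x1
print(composed, K(xF, xI, 2 * tau))        # the two values agree
```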
Two-point correlation function
In calculations, one often encounters expressions like
: $\langle 0|T\{\phi(x)\phi(y)\}|0\rangle \quad\text{or}\quad \langle\Omega|T\{\phi(x)\phi(y)\}|\Omega\rangle$
in the free or interacting theory, respectively. Here, $x$ and $y$ are position four-vectors, $T$ is the time ordering operator that shuffles its operands so the time-components $x^0$ and $y^0$ increase from right to left, and $|\Omega\rangle$ is the ground state (vacuum state) of the interacting theory, different from the free ground state $|0\rangle$. This expression represents the probability amplitude for the field to propagate from $y$ to $x$, and goes by multiple names, like the two-point propagator, two-point correlation function, two-point Green's function or two-point function for short.
The free two-point function, also known as the Feynman propagator, can be found for the real scalar field by either canonical quantization or path integrals to be
: $\langle 0|T\{\phi(x)\phi(y)\}|0\rangle = D_F(x - y) = \lim_{\epsilon\to 0}\int \frac{\mathrm{d}^4p}{(2\pi)^4}\,\frac{i}{p^2 - m^2 + i\epsilon}\,e^{-ip\cdot(x - y)}$
In an interacting theory, where the Lagrangian or Hamiltonian contains terms or that describe interactions, the two-point function is more difficult to define. However, through both the canonical quantization formulation and the path integral formulation, it is possible to express it through an infinite perturbation series of the ''free'' two-point function.
In canonical quantization, the two-point correlation function can be written as:
: $\langle\Omega|T\{\phi(x)\phi(y)\}|\Omega\rangle = \lim_{T\to\infty(1-i\epsilon)} \frac{\langle 0|T\left\{\phi_I(x)\phi_I(y)\exp\!\left[-i\int_{-T}^{T}\mathrm{d}t\,H_I(t)\right]\right\}|0\rangle}{\langle 0|T\left\{\exp\!\left[-i\int_{-T}^{T}\mathrm{d}t\,H_I(t)\right]\right\}|0\rangle}$
where $\epsilon$ is an infinitesimal number and $\phi_I$ is the field operator under the free theory. Here, the exponential should be understood as its power series expansion. For example, in $\phi^4$-theory, the interacting term of the Hamiltonian is $H_I(t) = \int \mathrm{d}^3x\,\frac{\lambda}{4!}\phi_I(\mathbf x, t)^4$, and the expansion of the two-point correlator in terms of $\lambda$ becomes
: $\langle\Omega|T\{\phi(x)\phi(y)\}|\Omega\rangle = \lim_{T\to\infty(1-i\epsilon)} \frac{\langle 0|T\left\{\phi_I(x)\phi_I(y)\left[1 - i\int_{-T}^{T}\mathrm{d}t\,H_I(t) + \cdots\right]\right\}|0\rangle}{\langle 0|T\left\{1 - i\int_{-T}^{T}\mathrm{d}t\,H_I(t) + \cdots\right\}|0\rangle}$
This perturbation expansion expresses the interacting two-point function in terms of quantities that are evaluated in the ''free'' theory.
In the path integral formulation, the two-point correlation function can be written
: $\langle\Omega|T\{\phi(x)\phi(y)\}|\Omega\rangle = \frac{\int \mathcal{D}\phi\;\phi(x)\phi(y)\,\exp\!\left[i\int \mathrm{d}^4z\,\mathcal{L}\right]}{\int \mathcal{D}\phi\;\exp\!\left[i\int \mathrm{d}^4z\,\mathcal{L}\right]}$
where $\mathcal{L}$ is the Lagrangian density. As in the previous paragraph, the exponential can be expanded as a series in $\lambda$, reducing the interacting two-point function to quantities in the free theory.
Wick's theorem further reduces any $n$-point correlation function in the free theory to a sum of products of two-point correlation functions. For example,
: $\langle 0|T\{\phi(x_1)\phi(x_2)\phi(x_3)\phi(x_4)\}|0\rangle = D_F(x_1 - x_2)\,D_F(x_3 - x_4) + D_F(x_1 - x_3)\,D_F(x_2 - x_4) + D_F(x_1 - x_4)\,D_F(x_2 - x_3)$
Since interacting correlation functions can be expressed in terms of free correlation functions, only the latter need to be evaluated in order to calculate all physical quantities in the (perturbative) interacting theory. This makes the Feynman propagator one of the most important quantities in quantum field theory.
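The combinatorics of Wick's theorem can be sketched in code: enumerate all ways of partitioning the field insertions into pairs, each pairing contributing a product of propagators. The snippet below (the `D_F(...)` strings are purely symbolic labels) reproduces the 3 pairings for four points, and $(2n)!/(2^n n!)$ pairings in general:

```python
# Enumerate the full pairings (Wick contractions) of a list of field insertions.

def pairings(points):
    """Yield every partition of `points` into unordered pairs."""
    if not points:
        yield []
        return
    first, rest = points[0], points[1:]
    for i, partner in enumerate(rest):
        remaining = rest[:i] + rest[i + 1:]
        for tail in pairings(remaining):
            yield [(first, partner)] + tail

four = list(pairings(["x1", "x2", "x3", "x4"]))
for p in four:
    print(" * ".join(f"D_F({a}-{b})" for a, b in p))   # the three terms above
print(len(four), len(list(pairings(list(range(6))))))  # 3 pairings, then 15 for six points
```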
Feynman diagram
Correlation functions in the interacting theory can be written as a perturbation series. Each term in the series is a product of Feynman propagators in the free theory and can be represented visually by a Feynman diagram. For example, the $\lambda^1$ term in the two-point correlation function in the $\phi^4$ theory is
: $\frac{-i\lambda}{4!}\int \mathrm{d}^4z\,\langle 0|T\{\phi(x)\phi(y)\phi(z)\phi(z)\phi(z)\phi(z)\}|0\rangle$
After applying Wick's theorem, one of the terms is
: $12\cdot\frac{-i\lambda}{4!}\int \mathrm{d}^4z\,D_F(x - z)\,D_F(y - z)\,D_F(z - z)$
This term can instead be obtained from the Feynman diagram
: [Diagram: two external points $x$ and $y$, each connected by an edge to an internal vertex $z$, which also carries a loop edge beginning and ending at $z$.]
The diagram consists of
* ''external vertices'' connected with one edge and represented by dots (here labeled $x$ and $y$).
* ''internal vertices'' connected with four edges and represented by dots (here labeled $z$).
* ''edges'' connecting the vertices and represented by lines.
Every vertex corresponds to a single field factor at the corresponding point in spacetime, while the edges correspond to the propagators between the spacetime points. The term in the perturbation series corresponding to the diagram is obtained by writing down the expression that follows from the so-called Feynman rules:
# For every internal vertex $z_i$, write down a factor $-i\lambda\int \mathrm{d}^4z_i$.
# For every edge that connects two vertices $z_1$ and $z_2$, write down a factor $D_F(z_1 - z_2)$.
# Divide by the symmetry factor of the diagram.
With the symmetry factor $2$, following these rules yields exactly the expression above. By Fourier transforming the propagator, the Feynman rules can be reformulated from position space into momentum space.
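The multiplicity 12 quoted above, and hence the symmetry factor $12/4! = 1/2$, can be recovered by brute force. The sketch below labels the four fields of the vertex `z1`…`z4` (hypothetical labels for counting only), enumerates all 15 Wick pairings of the six fields, and counts those in which $x$ and $y$ each contract with a vertex field, which is exactly the $x$–$z$, $y$–$z$, $z$–$z$ diagram:

```python
# Count the Wick contractions of phi(x) phi(y) phi(z1) phi(z2) phi(z3) phi(z4)
# that produce the diagram above: x and y each pair with a vertex field
# (equivalently, x is NOT paired with y), leaving the last two z's as a loop.

def pairings(items):
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for i, partner in enumerate(rest):
        for tail in pairings(rest[:i] + rest[i + 1:]):
            yield [(first, partner)] + tail

fields = ["x", "y", "z1", "z2", "z3", "z4"]
all_pairings = list(pairings(fields))
tadpole = [p for p in all_pairings
           if not any("x" in pair and "y" in pair for pair in p)]
print(len(all_pairings), len(tadpole))   # 15 total, 12 giving this diagram
```

Dividing the 12 contractions by the $4!$ in the vertex factor gives $1/2$, consistent with the symmetry factor of 2 in the Feynman rules.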
In order to compute the $n$-point correlation function to the $k$-th order, list all valid Feynman diagrams with $n$ external points and $k$ or fewer vertices, and then use Feynman rules to obtain the expression for each term. To be precise,
: $\langle\Omega|T\{\phi(x_1)\cdots\phi(x_n)\}|\Omega\rangle$
is equal to the sum of (expressions corresponding to) all connected diagrams with $n$ external points. (Connected diagrams are those in which every vertex is connected to an external point through lines. Components that are totally disconnected from external lines are sometimes called "vacuum bubbles".) In the $\phi^4$ interaction theory discussed above, every vertex must have four legs.
In realistic applications, the scattering amplitude of a certain interaction or the decay rate of a particle can be computed from the S-matrix, which itself can be found using the Feynman diagram method.
Feynman diagrams devoid of "loops" are called tree-level diagrams, which describe the lowest-order interaction processes; those containing $n$ loops are referred to as $n$-loop diagrams, which describe higher-order contributions, or radiative corrections, to the interaction. Lines whose end points are vertices can be thought of as the propagation of virtual particles.
Renormalization
Feynman rules can be used to directly evaluate tree-level diagrams. However, naïve computation of loop diagrams such as the one shown above will result in divergent momentum integrals, which seems to imply that almost all terms in the perturbative expansion are infinite. The renormalization procedure is a systematic process for removing such infinities.
Parameters appearing in the Lagrangian, such as the mass $m$ and the coupling constant $\lambda$, have no direct physical meaning: $m$, $\lambda$, and the field strength $\phi$ are not experimentally measurable quantities and are referred to here as the bare mass, bare coupling constant, and bare field, respectively. The physical mass and coupling constant are measured in some interaction process and are generally different from the bare quantities. While computing physical quantities from this interaction process, one may limit the domain of divergent momentum integrals to be below some momentum cut-off $\Lambda$, obtain expressions for the physical quantities, and then take the limit $\Lambda \to \infty$. This is an example of regularization, a class of methods to treat divergences in QFT, with $\Lambda$ being the regulator.
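The logic of a momentum cut-off can be illustrated numerically with a generic logarithmically divergent Euclidean integral (an integral chosen for illustration, not the specific diagram above): $I(\Lambda) = \int_0^\Lambda \mathrm{d}p\, p^3/(p^2 + m^2)^2$ grows without bound as $\Lambda \to \infty$, but the divergence is a pure $\ln\Lambda$, so subtracting it (the job of a counterterm) leaves a finite, cut-off-independent part:

```python
import numpy as np

# I(Lambda) = integral_0^Lambda dp p^3 / (p^2 + m^2)^2 diverges like ln(Lambda),
# as one finds for typical one-loop integrals after angular integration.

def I(cutoff, m=1.0, n=200_000):
    p = np.linspace(0.0, cutoff, n)
    f = p**3 / (p**2 + m**2)**2
    dp = p[1] - p[0]
    return np.sum((f[:-1] + f[1:]) / 2) * dp   # trapezoid rule

for cutoff in (10.0, 100.0, 1000.0):
    print(cutoff, I(cutoff), I(cutoff) - np.log(cutoff))
# I(Lambda) keeps growing, but I(Lambda) - ln(Lambda) approaches a constant
```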
The approach illustrated above is called bare perturbation theory, as calculations involve only the bare quantities such as mass and coupling constant. A different approach, called renormalized perturbation theory, is to use physically meaningful quantities from the very beginning. In the case of $\phi^4$ theory, the field strength is first redefined:
: $\phi = Z^{1/2}\,\phi_r$
where $\phi$ is the bare field, $\phi_r$ is the renormalized field, and $Z$ is a constant to be determined. The Lagrangian density becomes:
: $\mathcal{L} = \frac{1}{2}(\partial_\mu\phi_r)(\partial^\mu\phi_r) - \frac{1}{2}m^2\phi_r^2 - \frac{\lambda}{4!}\phi_r^4 + \frac{1}{2}\delta_Z(\partial_\mu\phi_r)(\partial^\mu\phi_r) - \frac{1}{2}\delta_m\phi_r^2 - \frac{\delta_\lambda}{4!}\phi_r^4$
where $m$ and $\lambda$ are the experimentally measurable, renormalized, mass and coupling constant, respectively, and
: $\delta_Z = Z - 1,\qquad \delta_m = m_0^2 Z - m^2,\qquad \delta_\lambda = \lambda_0 Z^2 - \lambda$
are constants to be determined. The first three terms are the Lagrangian density written in terms of the renormalized quantities, while the latter three terms are referred to as "counterterms". As the Lagrangian now contains more terms, the Feynman diagrams should include additional elements, each with their own Feynman rules. The procedure is outlined as follows. First select a regularization scheme (such as the cut-off regularization introduced above or dimensional regularization); call the regulator $\Lambda$. Compute Feynman diagrams, in which divergent terms will depend on $\Lambda$. Then, define $\delta_Z$, $\delta_m$, and $\delta_\lambda$ such that Feynman diagrams for the counterterms will exactly cancel the divergent terms in the normal Feynman diagrams when the limit $\Lambda \to \infty$ is taken. In this way, meaningful finite quantities are obtained.
It is only possible to eliminate all infinities to obtain a finite result in renormalizable theories, whereas in non-renormalizable theories infinities cannot be removed by the redefinition of a small number of parameters. The Standard Model of elementary particles is a renormalizable QFT, while quantum gravity is non-renormalizable.
Renormalization group
The renormalization group, developed by Kenneth Wilson, is a mathematical apparatus used to study the changes in physical parameters (coefficients in the Lagrangian) as the system is viewed at different scales. The way in which each parameter changes with scale is described by its ''β'' function. Correlation functions, which underlie quantitative physical predictions, change with scale according to the Callan–Symanzik equation.
As an example, the coupling constant in QED, namely the elementary charge $e$, has the following ''β'' function:
: $\beta(e) \equiv \mu\,\frac{\mathrm{d}e}{\mathrm{d}\mu} = \frac{e^3}{12\pi^2}$
where $\mu$ is the energy scale under which the measurement of $e$ is performed. This differential equation implies that the observed elementary charge increases as the scale increases. The renormalized coupling constant, which changes with the energy scale, is also called the running coupling constant.
The coupling constant $g$ in quantum chromodynamics, a non-Abelian gauge theory based on the symmetry group $SU(3)$, has the following ''β'' function:
: $\beta(g) = \mu\,\frac{\mathrm{d}g}{\mathrm{d}\mu} = \frac{g^3}{16\pi^2}\left(-11 + \frac{2}{3}N_f\right)$
where $N_f$ is the number of quark flavours. In the case where $N_f \leq 16$ (the Standard Model has $N_f = 6$), the coupling constant $g$ decreases as the energy scale increases. Hence, while the strong interaction is strong at low energies, it becomes very weak in high-energy interactions, a phenomenon known as asymptotic freedom.
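The opposite behaviours of the two couplings can be seen by integrating the one-loop ''β'' functions above numerically; the sketch below uses a simple Euler step in $t = \ln\mu$ with illustrative (not physical) initial values and ranges:

```python
import numpy as np

# Integrate dg/d ln(mu) = beta(g) with a plain Euler scheme.
def run(g0, beta, t_span=(0.0, 5.0), steps=10_000):
    g = g0
    dt = (t_span[1] - t_span[0]) / steps
    for _ in range(steps):
        g += dt * beta(g)   # one Euler step in t = ln(mu)
    return g

beta_qed = lambda e: e**3 / (12 * np.pi**2)                        # QED beta function
beta_qcd = lambda g, nf=6: g**3 / (16 * np.pi**2) * (-11 + 2 * nf / 3)  # QCD, N_f = 6

e_high = run(0.3, beta_qed)   # electric charge at the higher scale
g_high = run(1.0, beta_qcd)   # strong coupling at the higher scale
print(e_high > 0.3, g_high < 1.0)   # QED grows, QCD shrinks with energy
```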
Conformal field theories (CFTs) are special QFTs that admit conformal symmetry. They are insensitive to changes in the scale, as all their coupling constants have vanishing ''β'' function. (The converse is not true, however — the vanishing of all ''β'' functions does not imply conformal symmetry of the theory.) Examples include string theory and $\mathcal{N} = 4$ supersymmetric Yang–Mills theory.
According to Wilson's picture, every QFT is fundamentally accompanied by its energy cut-off $\Lambda$, ''i.e.'' that the theory is no longer valid at energies higher than $\Lambda$, and all degrees of freedom above the scale $\Lambda$ are to be omitted. For example, the cut-off could be the inverse of the atomic spacing in a condensed matter system, and in elementary particle physics it could be associated with the fundamental "graininess" of spacetime caused by quantum fluctuations in gravity. The cut-off scale of theories of particle interactions lies far beyond current experiments. Even if the theory were very complicated at that scale, as long as its couplings are sufficiently weak, it must be described at low energies by a renormalizable effective field theory. The difference between renormalizable and non-renormalizable theories is that the former are insensitive to details at high energies, whereas the latter do depend on them. According to this view, non-renormalizable theories are to be seen as low-energy effective theories of a more fundamental theory. The failure to remove the cut-off $\Lambda$ from calculations in such a theory merely indicates that new physical phenomena appear at scales above $\Lambda$, where a new theory is necessary.
Other theories
The quantization and renormalization procedures outlined in the preceding sections are performed for the free theory and $\phi^4$ theory of the real scalar field. A similar process can be done for other types of fields, including the complex scalar field, the vector field, and the Dirac field, as well as other types of interaction terms, including the electromagnetic interaction and the Yukawa interaction.
As an example, quantum electrodynamics contains a Dirac field $\psi$ representing the electron field and a vector field $A^\mu$ representing the electromagnetic field (photon field). (Despite its name, the quantum electromagnetic "field" actually corresponds to the classical electromagnetic four-potential, rather than the classical electric and magnetic fields.) The full QED Lagrangian density is:
: $\mathcal{L} = \bar\psi\left(i\gamma^\mu D_\mu - m\right)\psi - \frac{1}{4}F_{\mu\nu}F^{\mu\nu}$
where $\gamma^\mu$ are Dirac matrices, $\bar\psi = \psi^\dagger\gamma^0$, $D_\mu = \partial_\mu + ieA_\mu$ is the gauge covariant derivative, and $F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu$ is the electromagnetic field strength. The parameters in this theory are the (bare) electron mass $m$ and the (bare) elementary charge $e$. The first and second terms in the Lagrangian density correspond to the free Dirac field and free vector fields, respectively. The last term describes the interaction between the electron and photon fields, which is treated as a perturbation from the free theories.
Shown above is an example of a tree-level Feynman diagram in QED. It describes an electron and a positron annihilating, creating an off-shell photon, and then decaying into a new pair of electron and positron. Time runs from left to right. Arrows pointing forward in time represent the propagation of positrons, while those pointing backward in time represent the propagation of electrons. A wavy line represents the propagation of a photon. Each vertex in QED Feynman diagrams must have an incoming and an outgoing fermion (positron/electron) leg as well as a photon leg.
Gauge symmetry
If the following transformation to the fields is performed at every spacetime point $x$ (a local transformation), then the QED Lagrangian remains unchanged, or invariant:
: $\psi(x) \to e^{i\alpha(x)}\psi(x),\qquad A_\mu(x) \to A_\mu(x) - \frac{1}{e}\partial_\mu\alpha(x)$
where $\alpha(x)$ is any function of spacetime coordinates. If a theory's Lagrangian (or more precisely the action) is invariant under a certain local transformation, then the transformation is referred to as a gauge symmetry of the theory. Gauge symmetries form a group at every spacetime point. In the case of QED, the successive application of two different local symmetry transformations $e^{i\alpha}$ and $e^{i\alpha'}$ is yet another symmetry transformation $e^{i(\alpha + \alpha')}$. For any $\alpha$, $e^{i\alpha}$ is an element of the $U(1)$ group, thus QED is said to have $U(1)$ gauge symmetry. The photon field $A_\mu$ may be referred to as the $U(1)$ gauge boson.
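The invariance hinges on the covariant derivative $D_\mu\psi$ picking up only the phase $e^{i\alpha}$ under the transformation. This can be checked numerically in a reduced setting, one coordinate $x$ for brevity, with arbitrary test profiles for $\psi$, $A$, and $\alpha$ (all illustrative choices, not physical fields):

```python
import numpy as np

# Check that D = d/dx + i e A transforms covariantly under a local U(1)
# transformation: if psi -> exp(i alpha) psi and A -> A - (1/e) d(alpha)/dx,
# then D psi -> exp(i alpha) D psi.

e, h = 0.9, 1e-5
psi = lambda x: np.exp(1j * x) * (1 + 0.2 * x)   # arbitrary complex field profile
A = lambda x: 0.3 * x + 0.1 * np.sin(x)          # arbitrary gauge potential
alpha = lambda x: 0.7 * x**2                     # arbitrary local phase

def d(f, x):                  # central finite difference
    return (f(x + h) - f(x - h)) / (2 * h)

def D(psi, A, x):             # covariant derivative
    return d(psi, x) + 1j * e * A(x) * psi(x)

x = 0.8
psi_t = lambda x: np.exp(1j * alpha(x)) * psi(x)   # transformed field
A_t = lambda x: A(x) - d(alpha, x) / e             # transformed potential
lhs = D(psi_t, A_t, x)
rhs = np.exp(1j * alpha(x)) * D(psi, A, x)
print(abs(lhs - rhs))   # ~0: D psi acquires exactly the same phase as psi
```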
$U(1)$ is an Abelian group, meaning that the result is the same regardless of the order in which its elements are applied. QFTs can also be built on non-Abelian groups, giving rise to non-Abelian gauge theories (also known as Yang–Mills theories). Quantum chromodynamics, which describes the strong interaction, is a non-Abelian gauge theory with an $SU(3)$ gauge symmetry. It contains three Dirac fields $\psi^i,\ i = 1, 2, 3$ representing quark fields as well as eight vector fields $A^{a,\mu},\ a = 1, \ldots, 8$ representing gluon fields, which are the $SU(3)$ gauge bosons. The QCD Lagrangian density is:
: $\mathcal{L} = i\bar\psi^i\gamma^\mu(D_\mu)^{ij}\psi^j - \frac{1}{4}F^a_{\mu\nu}F^{a,\mu\nu} - m\bar\psi^i\psi^i$
where $D_\mu$ is the gauge covariant derivative:
: $(D_\mu)^{ij} = \delta^{ij}\partial_\mu - ig\,t^{a,ij}A^a_\mu$
where $g$ is the coupling constant, $t^a$ are the eight generators of $SU(3)$ in the fundamental representation ($3\times 3$ matrices),
: $F^a_{\mu\nu} = \partial_\mu A^a_\nu - \partial_\nu A^a_\mu + g f^{abc} A^b_\mu A^c_\nu$
and $f^{abc}$ are the structure constants of $SU(3)$. Repeated indices $i$, $j$, $a$ are implicitly summed over following Einstein notation. This Lagrangian is invariant under the transformation:
: $\psi^i(x) \to U^{ij}(x)\,\psi^j(x),\qquad A_\mu(x) \to U(x)\,A_\mu(x)\,U^\dagger(x) - \frac{i}{g}\left[\partial_\mu U(x)\right]U^\dagger(x)$
where $A_\mu \equiv A^a_\mu t^a$ and $U(x)$ is an element of $SU(3)$ at every spacetime point $x$:
: $U(x) = e^{i\alpha^a(x)\,t^a}$
The preceding discussion of symmetries is on the level of the Lagrangian. In other words, these are "classical" symmetries. After quantization, some theories will no longer exhibit their classical symmetries, a phenomenon called anomaly. For instance, in the path integral formulation, despite the invariance of the Lagrangian density