Linearity of differentiation

In calculus, the derivative of any linear combination of functions equals the same linear combination of the derivatives of the functions; this property is known as linearity of differentiation, the rule of linearity, or the superposition rule for differentiation. It is a fundamental property of the derivative that encapsulates in a single rule two simpler rules of differentiation: the sum rule (the derivative of the sum of two functions is the sum of the derivatives) and the constant factor rule (the derivative of a constant multiple of a function is the same constant multiple of the derivative). Thus it can be said that differentiation is linear, or that the differential operator is a linear operator.
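For example, with \alpha = 3, f(x) = x^2, \beta = 2, and g(x) = \sin x, linearity gives :\frac{d}{dx}(3x^2 + 2\sin x) = 3\frac{d}{dx}(x^2) + 2\frac{d}{dx}(\sin x) = 6x + 2\cos x.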


Statement and derivation

Let f and g be functions, with \alpha and \beta constants. Now consider :\frac{d}{dx}( \alpha \cdot f(x) + \beta \cdot g(x) ). By the sum rule in differentiation, this is :\frac{d}{dx}( \alpha \cdot f(x) ) + \frac{d}{dx}( \beta \cdot g(x) ), and by the constant factor rule in differentiation, this reduces to :\alpha \cdot f'(x) + \beta \cdot g'(x). Therefore, :\frac{d}{dx}(\alpha \cdot f(x) + \beta \cdot g(x)) = \alpha \cdot f'(x) + \beta \cdot g'(x). Omitting the brackets, this is often written as: :(\alpha \cdot f + \beta \cdot g)' = \alpha \cdot f' + \beta \cdot g'.
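The same identity can be checked symbolically with a computer algebra system. The following is a minimal sketch, assuming Python with the sympy library is available; the symbol and function names are illustrative only.

    import sympy as sp

    x, alpha, beta = sp.symbols('x alpha beta')
    f, g = sp.Function('f'), sp.Function('g')

    # Left-hand side: differentiate the linear combination directly.
    lhs = sp.diff(alpha * f(x) + beta * g(x), x)

    # Right-hand side: the same linear combination of the derivatives.
    rhs = alpha * sp.diff(f(x), x) + beta * sp.diff(g(x), x)

    # The difference simplifies to 0, confirming the identity.
    print(sp.simplify(lhs - rhs))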


Detailed proofs/derivations from definition

We can prove the entire linearity principle at once, or we can prove the individual steps (the constant factor rule and the sum rule) separately. Here, both approaches will be shown.

Proving linearity directly also proves the constant factor rule, the sum rule, and the difference rule as special cases. The sum rule is obtained by setting both constant coefficients to 1. The difference rule is obtained by setting the first constant coefficient to 1 and the second constant coefficient to -1. The constant factor rule is obtained by setting either the second constant coefficient or the second function to 0. (From a technical standpoint, the domain of the second function must also be considered; one way to avoid issues is to set the second function equal to the first function and the second constant coefficient equal to 0. One could also define both the second constant coefficient and the second function to be 0, where the domain of the second function is a superset of the domain of the first function, among other possibilities.)

Conversely, if we first prove the constant factor rule and the sum rule, we can prove linearity and the difference rule. Proving linearity is done by defining the first and second functions as two other functions, each multiplied by a constant coefficient. Then, as shown in the derivation of the previous section, we can first use the sum rule while differentiating, and then use the constant factor rule, which yields the conclusion for linearity. To prove the difference rule, the second function can be redefined as another function multiplied by the constant coefficient -1; when simplified, this gives the difference rule for differentiation, as illustrated below.

In the proofs/derivations below, the coefficients a and b are used; they correspond to the coefficients \alpha and \beta above.
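As an illustration of the reductions described above, setting \alpha = 1 and \beta = -1 in the linearity rule immediately yields the difference rule: :(f - g)' = (1 \cdot f + (-1) \cdot g)' = 1 \cdot f' + (-1) \cdot g' = f' - g'.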


Linearity (directly)

Let a, b \in \mathbb{R}. Let f, g be functions. Let j be a function, where j is defined only where f and g are both defined. (In other words, the domain of j is the intersection of the domains of f and g.) Let x be in the domain of j. Let j(x) = af(x) + bg(x). We want to prove that j'(x) = af'(x) + bg'(x).

By definition, we can see that

\begin{align}
j'(x) &= \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} \\
&= \lim_{h \to 0} \frac{(af(x+h) + bg(x+h)) - (af(x) + bg(x))}{h} \\
&= \lim_{h \to 0} \frac{af(x+h) + bg(x+h) - af(x) - bg(x)}{h} \\
&= \lim_{h \to 0} \frac{af(x+h) - af(x) + bg(x+h) - bg(x)}{h} \\
&= \lim_{h \to 0} \frac{a(f(x+h) - f(x)) + b(g(x+h) - g(x))}{h} \\
&= \lim_{h \to 0} \left( \frac{a(f(x+h) - f(x))}{h} + \frac{b(g(x+h) - g(x))}{h} \right) \\
&= \lim_{h \to 0} \left( a\frac{f(x+h) - f(x)}{h} + b\frac{g(x+h) - g(x)}{h} \right)
\end{align}

In order to use the limit law for the sum of limits, we need to know that \lim_{h \to 0} a\frac{f(x+h) - f(x)}{h} and \lim_{h \to 0} b\frac{g(x+h) - g(x)}{h} both individually exist. For these smaller limits, we need to know that \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and \lim_{h \to 0} \frac{g(x+h) - g(x)}{h} both individually exist in order to use the coefficient law for limits. By definition, f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and g'(x) = \lim_{h \to 0} \frac{g(x+h) - g(x)}{h}. So, if we know that f'(x) and g'(x) both exist, we know that \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and \lim_{h \to 0} \frac{g(x+h) - g(x)}{h} both individually exist. This allows us to use the coefficient law for limits to write

:\lim_{h \to 0} a\frac{f(x+h) - f(x)}{h} = a\lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

and

:\lim_{h \to 0} b\frac{g(x+h) - g(x)}{h} = b\lim_{h \to 0} \frac{g(x+h) - g(x)}{h}.

With this, we can go back and apply the limit law for the sum of limits, since we now know that \lim_{h \to 0} a\frac{f(x+h) - f(x)}{h} and \lim_{h \to 0} b\frac{g(x+h) - g(x)}{h} both individually exist. From here, we can directly go back to the derivative we were working on:

\begin{align}
j'(x) &= \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} \\
&\;\;\vdots \\
&= \lim_{h \to 0} \left( a\frac{f(x+h) - f(x)}{h} + b\frac{g(x+h) - g(x)}{h} \right) \\
&= \lim_{h \to 0} \left( a\frac{f(x+h) - f(x)}{h} \right) + \lim_{h \to 0} \left( b\frac{g(x+h) - g(x)}{h} \right) \\
&= a\lim_{h \to 0} \frac{f(x+h) - f(x)}{h} + b\lim_{h \to 0} \frac{g(x+h) - g(x)}{h} \\
&= af'(x) + bg'(x)
\end{align}

Finally, we have shown what we claimed at the beginning: j'(x) = af'(x) + bg'(x).
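This computation can be mirrored with a computer algebra system by applying the limit definition directly. The following is a minimal sketch, assuming Python with the sympy library; the choices a = 3, b = 2, f(x) = x^2, and g(x) = \sin x are illustrative only.

    import sympy as sp

    x, h = sp.symbols('x h')
    a, b = 3, 2          # illustrative coefficients
    f = x**2             # illustrative choice of f
    g = sp.sin(x)        # illustrative choice of g
    j = a * f + b * g    # j(x) = a*f(x) + b*g(x)

    # Derivative of j computed directly from the limit definition.
    j_prime = sp.limit((j.subs(x, x + h) - j) / h, h, 0)

    # The difference from a*f'(x) + b*g'(x) simplifies to 0.
    print(sp.simplify(j_prime - (a * sp.diff(f, x) + b * sp.diff(g, x))))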


Sum

Let f, g be functions. Let j be a function, where j is defined only where f and g are both defined. (In other words, the domain of j is the intersection of the domains of f and g.) Let x be in the domain of j. Let j(x) = f(x) + g(x). We want to prove that j'(x) = f'(x) + g'(x).

By definition, we can see that

\begin{align}
j'(x) &= \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} \\
&= \lim_{h \to 0} \frac{(f(x+h) + g(x+h)) - (f(x) + g(x))}{h} \\
&= \lim_{h \to 0} \frac{f(x+h) + g(x+h) - f(x) - g(x)}{h} \\
&= \lim_{h \to 0} \frac{(f(x+h) - f(x)) + (g(x+h) - g(x))}{h} \\
&= \lim_{h \to 0} \left( \frac{f(x+h) - f(x)}{h} + \frac{g(x+h) - g(x)}{h} \right)
\end{align}

In order to use the law for the sum of limits here, we need to show that the individual limits \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and \lim_{h \to 0} \frac{g(x+h) - g(x)}{h} both exist. By definition, f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and g'(x) = \lim_{h \to 0} \frac{g(x+h) - g(x)}{h}, so the limits exist whenever the derivatives f'(x) and g'(x) exist. So, assuming that the derivatives exist, we can continue the above derivation:

\begin{align}
j'(x) &= \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} \\
&\;\;\vdots \\
&= \lim_{h \to 0} \left( \frac{f(x+h) - f(x)}{h} + \frac{g(x+h) - g(x)}{h} \right) \\
&= \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} + \lim_{h \to 0} \frac{g(x+h) - g(x)}{h} \\
&= f'(x) + g'(x)
\end{align}

Thus, we have shown what we wanted to show, that: j'(x) = f'(x) + g'(x).
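For example, with f(x) = x^2 and g(x) = \cos x, the sum rule gives :(x^2 + \cos x)' = 2x - \sin x.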


Difference

Let f, g be functions. Let j be a function, where j is defined only where f and g are both defined. (In other words, the domain of j is the intersection of the domains of f and g.) Let x be in the domain of j. Let j(x) = f(x) - g(x). We want to prove that j'(x) = f'(x) - g'(x).

By definition, we can see that

\begin{align}
j'(x) &= \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} \\
&= \lim_{h \to 0} \frac{(f(x+h) - g(x+h)) - (f(x) - g(x))}{h} \\
&= \lim_{h \to 0} \frac{f(x+h) - g(x+h) - f(x) + g(x)}{h} \\
&= \lim_{h \to 0} \frac{(f(x+h) - f(x)) - (g(x+h) - g(x))}{h} \\
&= \lim_{h \to 0} \left( \frac{f(x+h) - f(x)}{h} - \frac{g(x+h) - g(x)}{h} \right)
\end{align}

In order to use the law for the difference of limits here, we need to show that the individual limits \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and \lim_{h \to 0} \frac{g(x+h) - g(x)}{h} both exist. By definition, f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and g'(x) = \lim_{h \to 0} \frac{g(x+h) - g(x)}{h}, so these limits exist whenever the derivatives f'(x) and g'(x) exist. So, assuming that the derivatives exist, we can continue the above derivation:

\begin{align}
j'(x) &= \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} \\
&\;\;\vdots \\
&= \lim_{h \to 0} \left( \frac{f(x+h) - f(x)}{h} - \frac{g(x+h) - g(x)}{h} \right) \\
&= \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} - \lim_{h \to 0} \frac{g(x+h) - g(x)}{h} \\
&= f'(x) - g'(x)
\end{align}

Thus, we have shown what we wanted to show, that: j'(x) = f'(x) - g'(x).
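For example, with f(x) = x^3 and g(x) = e^x, the difference rule gives :(x^3 - e^x)' = 3x^2 - e^x.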


Constant coefficient

Let f be a function. Let a \in \mathbb{R}; a will be the constant coefficient. Let j be a function, where j is defined only where f is defined. (In other words, the domain of j is equal to the domain of f.) Let x be in the domain of j. Let j(x) = af(x). We want to prove that j'(x) = af'(x).

By definition, we can see that

\begin{align}
j'(x) &= \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} \\
&= \lim_{h \to 0} \frac{af(x+h) - af(x)}{h} \\
&= \lim_{h \to 0} \frac{a(f(x+h) - f(x))}{h} \\
&= \lim_{h \to 0} a\frac{f(x+h) - f(x)}{h}
\end{align}

Now, in order to use the limit law for constant coefficients to show that \lim_{h \to 0} a\frac{f(x+h) - f(x)}{h} = a\lim_{h \to 0} \frac{f(x+h) - f(x)}{h}, we need to show that \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} exists. However, f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}, by the definition of the derivative. So, if f'(x) exists, then \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} exists. Thus, if we assume that f'(x) exists, we can use the limit law and continue our proof:

\begin{align}
j'(x) &= \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} \\
&\;\;\vdots \\
&= \lim_{h \to 0} a\frac{f(x+h) - f(x)}{h} \\
&= a\lim_{h \to 0} \frac{f(x+h) - f(x)}{h} \\
&= af'(x)
\end{align}

Thus, we have proven that when j(x) = af(x), we have j'(x) = af'(x).
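For example, with a = 5 and f(x) = x^2, the constant factor rule gives :(5x^2)' = 5 \cdot (x^2)' = 5 \cdot 2x = 10x.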



