Stochastic Logarithm

In stochastic calculus, the stochastic logarithm of a semimartingale Y such that Y\neq0 and Y_-\neq0 is the semimartingale X given by dX_t=\frac{dY_t}{Y_{t-}},\quad X_0=0. In layperson's terms, the stochastic logarithm of Y measures the cumulative percentage change in Y.
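As a hedged illustration (not part of the original article), the defining relation dX_t = dY_t/Y_{t-} can be mimicked on a discretely sampled path by accumulating one-step percentage changes; the helper name stochastic_log below is purely illustrative.

    import numpy as np

    # Minimal sketch: approximate the stochastic logarithm of a sampled path y[0..n]
    # by the cumulative sum of one-step percentage changes (y[k+1] - y[k]) / y[k],
    # mirroring dX_t = dY_t / Y_{t-} with X_0 = 0.
    def stochastic_log(y):
        y = np.asarray(y, dtype=float)
        increments = np.diff(y) / y[:-1]                       # Delta Y / Y_-
        return np.concatenate(([0.0], np.cumsum(increments)))  # X_0 = 0

    # Example: a smooth positive path; for a fine grid the result is close to
    # log(Y_t / Y_0), in line with the absolutely continuous case discussed below.
    t = np.linspace(0.0, 1.0, 10_001)
    y = 2.0 + np.sin(3.0 * t)
    x = stochastic_log(y)
    print(x[-1], np.log(abs(y[-1] / y[0])))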


Notation and terminology

The process X obtained above is commonly denoted \mathcal{L}(Y). The terminology ''stochastic logarithm'' arises from the similarity of \mathcal{L}(Y) to the natural logarithm \log(Y): If Y is absolutely continuous with respect to time and Y\neq 0, then ''X'' solves, path-by-path, the differential equation \frac{dX_t}{dt} = \frac{dY_t/dt}{Y_t}, whose solution is X = \log|Y| - \log|Y_0|.


General formula and special cases

* Without any assumptions on the semimartingale Y (other than Y\neq 0, Y_-\neq 0), one has \mathcal{L}(Y)_t = \log\Biggl|\frac{Y_t}{Y_0}\Biggr| +\frac12\int_0^t\frac{d[Y]^c_s}{Y_{s-}^2} +\sum_{s\le t}\Biggl(\frac{\Delta Y_s}{Y_{s-}} - \log\Biggl|1 + \frac{\Delta Y_s}{Y_{s-}}\Biggr|\Biggr),\qquad t\ge0, where [Y]^c is the continuous part of the quadratic variation of Y and the sum extends over the (countably many) jumps of Y up to time t.
* If Y is continuous, then \mathcal{L}(Y)_t = \log\Biggl|\frac{Y_t}{Y_0}\Biggr| +\frac12\int_0^t\frac{d[Y]_s}{Y_s^2},\qquad t\ge0. In particular, if Y is a geometric Brownian motion, then X is a Brownian motion with a constant drift rate (see the numerical sketch after this list).
* If Y is continuous and of finite variation, then \mathcal{L}(Y) = \log\Biggl|\frac{Y}{Y_0}\Biggr|. Here Y need not be differentiable with respect to time; for example, Y can equal 1 plus the Cantor function.
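A hedged numerical illustration of the geometric Brownian motion special case (the drift mu, volatility sigma and grid size below are arbitrary choices, not taken from the article): the discretized stochastic logarithm of a simulated GBM path stays close to the drifted Brownian motion mu*t + sigma*W_t.

    import numpy as np

    # Minimal sketch: for dY = Y (mu dt + sigma dW), the cumulative sum of dY / Y_-
    # should reproduce mu*t + sigma*W_t, i.e. a Brownian motion with constant drift.
    rng = np.random.default_rng(0)
    mu, sigma, T, n = 0.1, 0.3, 1.0, 100_000
    dt = T / n
    dW = rng.normal(0.0, np.sqrt(dt), n)
    W = np.concatenate(([0.0], np.cumsum(dW)))
    t = np.linspace(0.0, T, n + 1)
    Y = np.exp(sigma * W + (mu - 0.5 * sigma**2) * t)            # exact GBM path, Y_0 = 1

    X = np.concatenate(([0.0], np.cumsum(np.diff(Y) / Y[:-1])))  # discretized stochastic log
    print(np.max(np.abs(X - (mu * t + sigma * W))))              # small for a fine grid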


Properties

* Stochastic logarithm is an inverse operation to stochastic exponential: If \Delta X\neq -1, then \mathcal{L}(\mathcal{E}(X)) = X-X_0. Conversely, if Y\neq 0 and Y_-\neq 0, then \mathcal{E}(\mathcal{L}(Y)) = Y/Y_0 (see the numerical sketch after this list).
* Unlike the natural logarithm \log(Y_t), which depends only on the value of Y at time t, the stochastic logarithm \mathcal{L}(Y)_t depends not only on Y_t but on the whole history of Y in the time interval [0,t]. For this reason one must write \mathcal{L}(Y)_t and not \mathcal{L}(Y_t).
* The stochastic logarithm of a local martingale that does not vanish together with its left limit is again a local martingale.
* All the formulae and properties above apply also to the stochastic logarithm of a complex-valued Y.
* The stochastic logarithm can be defined also for processes Y that are absorbed in zero after jumping to zero. Such a definition is meaningful up to the first time that Y reaches 0 continuously.
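The inverse relation between stochastic logarithm and stochastic exponential has an exact discrete-time analogue, sketched below under the assumption that every increment of X exceeds -1; the helper functions are illustrative, not standard library calls.

    import numpy as np

    # Minimal sketch: in discrete time the stochastic exponential is the running
    # product of (1 + Delta X) and the stochastic logarithm is the running sum of
    # Delta Y / Y_-; composing them recovers X - X_0, mirroring L(E(X)) = X - X_0.
    def stochastic_exp(x):
        x = np.asarray(x, dtype=float)
        return np.concatenate(([1.0], np.cumprod(1.0 + np.diff(x))))

    def stochastic_log(y):
        y = np.asarray(y, dtype=float)
        return np.concatenate(([0.0], np.cumsum(np.diff(y) / y[:-1])))

    rng = np.random.default_rng(1)
    x = 5.0 + np.cumsum(rng.normal(0.0, 0.01, 1_000))       # increments stay above -1
    y = stochastic_exp(x)
    print(np.max(np.abs(stochastic_log(y) - (x - x[0]))))   # ~0 up to rounding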


Useful identities

* Converse of the Yor formula: If Y^{(1)}, Y^{(2)} do not vanish together with their left limits, then \mathcal{L}\bigl(Y^{(1)}Y^{(2)}\bigr) = \mathcal{L}\bigl(Y^{(1)}\bigr) + \mathcal{L}\bigl(Y^{(2)}\bigr) + \bigl[\mathcal{L}\bigl(Y^{(1)}\bigr),\mathcal{L}\bigl(Y^{(2)}\bigr)\bigr] (a discrete-time check is sketched after this list).
* Stochastic logarithm of 1/\mathcal{E}(X): If \Delta X\neq -1, then \mathcal{L}\biggl(\frac{1}{\mathcal{E}(X)}\biggr)_t = X_0-X_t+[X]^c_t+\sum_{s\le t}\frac{(\Delta X_s)^2}{1+\Delta X_s}.
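A hedged discrete-time check of the converse Yor formula (helper names are illustrative): with the covariation taken as the running sum of products of increments, the identity holds exactly on sampled paths.

    import numpy as np

    # Minimal sketch: verify L(Y1*Y2) = L(Y1) + L(Y2) + [L(Y1), L(Y2)] path by path.
    def stochastic_log(y):
        y = np.asarray(y, dtype=float)
        return np.concatenate(([0.0], np.cumsum(np.diff(y) / y[:-1])))

    def covariation(a, b):
        return np.concatenate(([0.0], np.cumsum(np.diff(a) * np.diff(b))))

    rng = np.random.default_rng(2)
    y1 = np.exp(np.cumsum(rng.normal(0.0, 0.05, 1_000)))    # positive sample path
    y2 = np.exp(np.cumsum(rng.normal(0.0, 0.05, 1_000)))

    l1, l2 = stochastic_log(y1), stochastic_log(y2)
    lhs = stochastic_log(y1 * y2)
    rhs = l1 + l2 + covariation(l1, l2)
    print(np.max(np.abs(lhs - rhs)))                        # ~0 up to rounding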


Applications

* Girsanov's theorem can be paraphrased as follows: Let Q be a probability measure equivalent to another probability measure P. Denote by Z the uniformly integrable martingale closed by Z_\infty = dQ/dP. For a semimartingale U the following are equivalent:
*# Process U is special under Q.
*# Process U + [U,\mathcal{L}(Z)] is special under P.
*: If either of these conditions holds, then the Q-drift of U equals the P-drift of U + [U,\mathcal{L}(Z)]. (A Monte Carlo sketch of the Brownian special case follows.)
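A minimal Monte Carlo sketch of the Brownian special case (theta and the horizon are arbitrary illustrative choices): take U = W, a P-Brownian motion, and Z = \mathcal{E}(\theta W), so that \mathcal{L}(Z) = \theta W and [U,\mathcal{L}(Z)]_t = \theta t; reweighting by Z_T = dQ/dP should then give W a drift of \theta t under Q.

    import numpy as np

    # Minimal sketch: under P, W_T ~ N(0, T); weighting by Z_T = exp(theta*W_T - theta^2*T/2)
    # turns the mean of W_T into theta*T, the drift predicted by Girsanov's theorem.
    rng = np.random.default_rng(3)
    theta, T, n_paths = 0.5, 1.0, 200_000
    W_T = rng.normal(0.0, np.sqrt(T), n_paths)        # terminal values under P
    Z_T = np.exp(theta * W_T - 0.5 * theta**2 * T)    # stochastic exponential of theta*W at T
    print(np.mean(Z_T * W_T))                         # approx theta * T = 0.5
    print(np.mean(Z_T))                               # sanity check: approx 1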


See also

* Stochastic exponential
* Stochastic calculus