Definitions
Small O: convergence in probability
For a set of random variables ''X''<sub>''n''</sub> and a corresponding set of constants ''a''<sub>''n''</sub> (both indexed by ''n'', which need not be discrete), the notation
:<math>X_n = o_p(a_n)</math>
means that the set of values ''X''<sub>''n''</sub>/''a''<sub>''n''</sub> converges to zero in probability as ''n'' approaches an appropriate limit. Equivalently, ''X''<sub>''n''</sub> = o<sub>''p''</sub>(''a''<sub>''n''</sub>) can be written as ''X''<sub>''n''</sub>/''a''<sub>''n''</sub> = o<sub>''p''</sub>(1), where ''X''<sub>''n''</sub> = o<sub>''p''</sub>(1) is defined as
:<math>\lim_{n \to \infty} P(|X_n| \ge \varepsilon) = 0</math>
for every positive ''ε'' (Bishop et al. 1975).
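For intuition, the following minimal simulation sketch (not part of the original text) estimates P(|''X''<sub>''n''</sub>| ≥ ''ε'') when ''X''<sub>''n''</sub> is the mean of ''n'' independent Uniform(−1, 1) draws; the distribution, the value of ''ε'', the sample sizes and the use of NumPy are illustrative assumptions. The estimated probability shrinks toward zero as ''n'' grows, which is the statement ''X''<sub>''n''</sub> = o<sub>''p''</sub>(1).
<syntaxhighlight lang="python">
# Illustrative Monte Carlo sketch (not from the original article): X_n is the mean
# of n i.i.d. Uniform(-1, 1) draws, which has mean 0.  The estimated probability
# P(|X_n| >= eps) shrinks toward 0 as n grows, i.e. X_n = o_p(1).
# The distribution, eps, sample sizes and seed are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
eps = 0.05           # the fixed positive epsilon from the definition
reps = 10_000        # Monte Carlo repetitions used to estimate the probability

for n in (10, 100, 1_000):
    # `reps` independent copies of X_n, each the mean of n Uniform(-1, 1) samples
    x_n = rng.uniform(-1.0, 1.0, size=(reps, n)).mean(axis=1)
    prob = (np.abs(x_n) >= eps).mean()   # estimate of P(|X_n| >= eps)
    print(f"n = {n:>5}:  P(|X_n| >= {eps}) ~ {prob:.4f}")
</syntaxhighlight>
Since Var(''X''<sub>''n''</sub>) = 1/(3''n''), the same conclusion also follows directly from Chebyshev's inequality.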
Big O: stochastic boundedness
The notation
:<math>X_n = O_p(a_n) \text{ as } n \to \infty</math>
means that the set of values ''X''<sub>''n''</sub>/''a''<sub>''n''</sub> is stochastically bounded. That is, for any ''ε'' > 0, there exists a finite ''M'' > 0 and a finite ''N'' > 0 such that
:<math>P(|X_n/a_n| > M) < \varepsilon \quad \text{for all } n > N.</math>
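Continuing the same illustrative setup (again an assumption, not from the original text), the rescaled quantity ''Z''<sub>''n''</sub> = √''n''·''X''<sub>''n''</sub> no longer converges to zero, but it is stochastically bounded: one fixed ''M'' keeps the exceedance probability below ''ε'' for every ''n'' considered, so ''X''<sub>''n''</sub> = O<sub>''p''</sub>(1/√''n''). A minimal sketch:
<syntaxhighlight lang="python">
# Illustrative Monte Carlo sketch (not from the original article): with X_n the mean
# of n i.i.d. Uniform(-1, 1) draws, Z_n = sqrt(n) * X_n does not shrink to zero,
# but it is stochastically bounded: a single finite M keeps P(|Z_n| > M) below eps
# for every n shown, i.e. X_n = O_p(1 / sqrt(n)).  All numbers are arbitrary choices.
import numpy as np

rng = np.random.default_rng(1)
eps = 0.05
M = 1.5              # candidate bound; Z_n is approximately N(0, 1/3), so this suffices
reps = 10_000

for n in (10, 100, 1_000):
    z_n = np.sqrt(n) * rng.uniform(-1.0, 1.0, size=(reps, n)).mean(axis=1)
    prob = (np.abs(z_n) > M).mean()      # estimate of P(|Z_n| > M)
    print(f"n = {n:>5}:  P(|Z_n| > {M}) ~ {prob:.4f}   (target: < eps = {eps})")
</syntaxhighlight>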
Comparison of the two definitions
The difference between the two definitions is subtle. If one uses the definition of the limit, one gets:
* Big O<sub>''p''</sub>(1): <math>\forall \varepsilon > 0 \quad \exists \delta_\varepsilon, N_\varepsilon \quad \text{such that} \quad P(|X_n| \ge \delta_\varepsilon) < \varepsilon \quad \forall n > N_\varepsilon</math>
* Small o<sub>''p''</sub>(1): <math>\forall \varepsilon > 0, \; \forall \delta > 0 \quad \exists N_{\varepsilon,\delta} \quad \text{such that} \quad P(|X_n| \ge \delta) < \varepsilon \quad \forall n > N_{\varepsilon,\delta}</math>
The difference lies in the ''δ'': for stochastic boundedness, it suffices that there exists one (arbitrarily large) ''δ'' satisfying the inequality, and ''δ'' is allowed to depend on ''ε'' (hence the ''δ''<sub>''ε''</sub>). For convergence, on the other hand, the statement has to hold not just for one but for every (arbitrarily small) ''δ''. In a sense, this means that the sequence must be bounded, with a bound that gets smaller as the sample size increases. It follows that if a sequence is o<sub>''p''</sub>(1), then it is O<sub>''p''</sub>(1), i.e. convergence in probability implies stochastic boundedness. But the reverse does not hold.
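As a concrete illustration that the reverse does not hold (this example is not in the original text), let ''X''<sub>''n''</sub> be a sequence of standard normal random variables, and let Φ denote the standard normal distribution function. For every ''δ'' > 0,
:<math>P(|X_n| \ge \delta) = 2\bigl(1 - \Phi(\delta)\bigr),</math>
which does not depend on ''n''. Given any ''ε'' > 0, choosing ''δ''<sub>''ε''</sub> large enough makes 2(1 − Φ(''δ''<sub>''ε''</sub>)) < ''ε'', so ''X''<sub>''n''</sub> = O<sub>''p''</sub>(1); but for a fixed ''δ'' the probability never tends to zero, so ''X''<sub>''n''</sub> is not o<sub>''p''</sub>(1).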
Example
If <math>(X_n)</math> is a stochastic sequence such that each element has finite variance, then
:<math>X_n = O_p\left(\operatorname{E}(X_n) + \sqrt{\operatorname{Var}(X_n)}\right)</math>
(see Theorem 14.4-1 in Bishop et al.). If, moreover, <math>a_n^{-1}\sqrt{\operatorname{Var}(X_n)}</math> is a null sequence for a sequence <math>(a_n)</math> of real numbers, then <math>a_n^{-1}\bigl(X_n - \operatorname{E}(X_n)\bigr)</math> converges to zero in probability by Chebyshev's inequality: for every ''ε'' > 0,
:<math>P\bigl(|X_n - \operatorname{E}(X_n)| \ge \varepsilon a_n\bigr) \le \frac{\operatorname{Var}(X_n)}{\varepsilon^2 a_n^2} \to 0,</math>
so
:<math>X_n - \operatorname{E}(X_n) = o_p(a_n).</math>
References
* Yvonne M. Bishop, Stephen E. Fienberg, Paul W. Holland (1975, 2007). ''Discrete Multivariate Analysis''. Springer.