RiskMetrics Group
The RiskMetrics variance model (also known as the exponential smoother) was first established in 1989, when Sir Dennis Weatherstone, the new chairman of J.P. Morgan, asked for a daily report measuring and explaining the risks of his firm. In 1992, J.P. Morgan launched the RiskMetrics methodology to the marketplace, making the substantive research and analysis that satisfied Sir Dennis Weatherstone's request freely available to all market participants. The RiskMetrics technical document was revised in 1996. In 1998, as client demand for the group's risk management expertise exceeded the firm's internal risk management resources, the Corporate Risk Management Department was spun off from J.P. Morgan as RiskMetrics Group with 23 founding employees. In 2001, the technical document was revised again as ''Return to RiskMetrics''. In 2006, a new method for modeling risk factor returns was introduced (RM2006). On 25 January 2008, RiskMetrics Group listed on the New York Stock Exchange ...
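As a minimal sketch (added here, not part of the original entry), the exponentially weighted variance recursion that RiskMetrics popularized can be written as \sigma_t^2 = \lambda\,\sigma_{t-1}^2 + (1-\lambda)\,r_{t-1}^2, assuming the conventional daily decay factor \lambda = 0.94; the Python below illustrates it on hypothetical data.

# Sketch of a RiskMetrics-style exponentially weighted moving average
# (EWMA) volatility estimate. lambda_ = 0.94 is the conventional decay
# factor for daily returns; seeding with the first squared return is
# just one simple choice.
def ewma_volatility(returns, lambda_=0.94):
    """Return the EWMA volatility estimate after processing all returns."""
    variance = returns[0] ** 2
    for r in returns[1:]:
        variance = lambda_ * variance + (1.0 - lambda_) * r ** 2
    return variance ** 0.5

daily_returns = [0.001, -0.004, 0.002, -0.011, 0.006]   # hypothetical daily returns
print(ewma_volatility(daily_returns))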


Risk Metric
In the context of risk measurement, a risk metric is the concept quantified by a risk measure. When choosing a risk metric, an agent is picking an aspect of perceived risk to investigate, such as volatility or probability of default.
Risk measure and risk metric
In a general sense, a measure is a procedure for quantifying something; a metric is that which is being quantified. In other words, the method or formula used to calculate a risk metric is called a risk measure. For example, in finance, the volatility of a stock might be calculated in any of the following three ways (as the sketch at the end of this entry illustrates):
* Calculate the sample standard deviation of the stock's returns over the past 30 trading days.
* Calculate the sample standard deviation of the stock's returns over the past 100 trading days.
* Calculate the implied volatility of the stock from some specified call option on the stock.
These are three distinct risk measures. Each could be used to measure the single risk metric volatility.
Examples
* De ...
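As a brief illustration (not from the original entry), the sketch below computes two of the risk measures above for the same risk metric, volatility, on a hypothetical daily-return series; implied volatility is omitted because it requires an option-pricing model.

# Two distinct risk measures for the single risk metric "volatility":
# sample standard deviation over the last 30 and last 100 trading days.
import statistics

def sample_volatility(returns, window):
    """Sample standard deviation of the most recent `window` returns."""
    return statistics.stdev(returns[-window:])

returns = [0.01 * ((-1) ** i) * (1 + i % 5) / 5 for i in range(250)]  # toy data
print(f"30-day volatility:  {sample_volatility(returns, 30):.4f}")
print(f"100-day volatility: {sample_volatility(returns, 100):.4f}")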


Downside Risk
Downside risk is the financial risk associated with losses. That is, it is the risk of the actual return being below the expected return, or the uncertainty about the magnitude of that difference. Risk measures typically quantify downside risk, whereas the standard deviation (an example of a deviation risk measure) measures both upside and downside risk. Specifically, downside risk can be measured either with downside beta or by measuring lower semi-deviation. The statistic ''below-target semi-deviation'', or simply ''target semi-deviation'' (TSV), has become the industry standard.
History
Downside risk was first modeled by Roy (1952), who assumed that an investor's goal was to minimize his or her risk. This mean-semivariance, or downside risk, model is also known as the "safety-first" technique, and considers only the lower deviations of returns from their expected value, that is, the potential losses. Roy's work appeared at about the same time that Harry Markowitz was developing mean-variance theory. ...
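As a minimal sketch (added here, not from the original entry), below-target semi-deviation can be computed as the square root of the average squared shortfall of returns below a chosen target; the target and the return series below are hypothetical.

# Below-target semi-deviation (TSV): root of the mean squared shortfall
# below a chosen target return. Conventions differ on whether to divide
# by all observations or only those below the target; this sketch uses
# all observations.
import math

def target_semideviation(returns, target=0.0):
    shortfalls = [min(r - target, 0.0) ** 2 for r in returns]
    return math.sqrt(sum(shortfalls) / len(returns))

returns = [0.02, -0.01, 0.015, -0.03, 0.005, -0.007]   # hypothetical returns
print(target_semideviation(returns, target=0.0))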


Value-at-Risk
Value at risk (VaR) is a measure of the risk of loss for investments. It estimates how much a set of investments might lose (with a given probability), given normal market conditions, in a set time period such as a day. VaR is typically used by firms and regulators in the financial industry to gauge the amount of assets needed to cover possible losses. For a given portfolio, time horizon, and probability ''p'', the ''p'' VaR can be defined informally as the maximum possible loss during that time after excluding all worse outcomes whose combined probability is at most ''p''. This assumes mark-to-market pricing and no trading in the portfolio. For example, if a portfolio of stocks has a one-day 95% VaR of $1 million, that means there is a 0.05 probability that the portfolio will fall in value by more than $1 million over a one-day period if there is no trading. Informally, a loss of $1 million or more on this portfolio is expected on 1 day out of 20 days (because of the 5% probability) ...
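As an illustrative sketch (added here, not part of the entry), one common way to estimate VaR is historical simulation: sort a sample of one-day portfolio losses and read off the loss quantile at the chosen confidence level. The loss figures below are hypothetical.

# Historical-simulation VaR sketch: an empirical quantile of the loss
# distribution (interpolation conventions vary in practice).
def historical_var(losses, confidence=0.95):
    """Smallest loss such that at most (1 - confidence) of outcomes are worse."""
    ordered = sorted(losses)
    index = min(int(confidence * len(ordered)), len(ordered) - 1)
    return ordered[index]

one_day_losses = [-1.2, 0.4, 2.5, -0.3, 1.1, 0.9, 3.8, -2.0, 0.2, 1.7]  # $ millions, gains negative
print(historical_var(one_day_losses, confidence=0.95))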


Market Exposure
In finance, market exposure (or exposure) is a measure of the proportion of money invested in the same industry sector. For example, a stock portfolio with a total worth of $500,000, of which $100,000 is invested in semiconductor industry stocks, would have a 20% exposure to "chip" stocks.


Black Swan Theory
The black swan theory or theory of black swan events is a metaphor that describes an event that comes as a surprise, has a major effect, and is often inappropriately rationalized after the fact with the benefit of hindsight. The term is based on an ancient saying that presumed black swans did not exist, a saying that was reinterpreted to teach a different lesson after black swans were discovered in Australia. The theory was developed by Nassim Nicholas Taleb, starting in 2001, to explain:
# The disproportionate role of high-profile, hard-to-predict, and rare events that are beyond the realm of normal expectations in history, science, finance, and technology.
# The non-computability of the probability of consequential rare events using scientific methods (owing to the very nature of small probabilities).
# The psychological biases that blind people, both individually and collectively, to uncertainty and to a rare event's massive role in historical affairs.
Taleb's "black swan theory" ...




The Black Swan (Taleb Book)
''The Black Swan: The Impact of the Highly Improbable'' is a 2007 book by Nassim Nicholas Taleb, a former options trader. The book focuses on the extreme impact of rare and unpredictable outlier events, and on the human tendency to find simplistic explanations for these events retrospectively. Taleb calls this the black swan theory. The book covers subjects relating to knowledge, aesthetics, and ways of life, and uses elements of fiction and anecdotes from the author's life to elaborate his theories. It spent 36 weeks on the ''New York Times'' best-seller list. The book is part of Taleb's five-volume series, the ''Incerto'', comprising ''Fooled by Randomness'' (2001), ''The Black Swan'' (2007–2010), ''The Bed of Procrustes'' (2010–2016), ''Antifragile'' (2012), and ''Skin in the Game'' (2018).
Coping with Black Swan events
A central idea in Taleb's book is not to attempt to predict black swan events, but to build robustness to negative events and ...


Nassim Taleb
Nassim Nicholas Taleb (alternatively ''Nessim'' or ''Nissim''; born 12 September 1960) is a Lebanese-American essayist, mathematical statistician, former option trader, risk analyst, and aphorist whose work concerns problems of randomness, probability, and uncertainty. ''The Sunday Times'' called his 2007 book ''The Black Swan'' one of the 12 most influential books since World War II. Taleb is the author of the ''Incerto'', a five-volume philosophical essay on uncertainty published between 2001 and 2018 (of which the best-known books are ''The Black Swan'' and ''Antifragile''). He has been a professor at several universities, serving as a Distinguished Professor of Risk Engineering at the New York University Tandon School of Engineering since September 2008. He has been co-editor-in-chief of the academic journal ''Risk and Decision Analysis'' since September 2014. He has also been a practitioner of mathematical finance, a hedge fund manager, and a derivatives trader, and is ...


Monte Carlo Algorithm
In computing, a Monte Carlo algorithm is a randomized algorithm whose output may be incorrect with a certain (typically small) probability. Two examples of such algorithms are the Karger–Stein algorithm and the Monte Carlo algorithm for the minimum feedback arc set. The name refers to the grand casino in the Principality of Monaco at Monte Carlo, which is well known around the world as an icon of gambling. The term "Monte Carlo" was first introduced in 1947 by Nicholas Metropolis. Las Vegas algorithms are a dual of Monte Carlo algorithms: they never return an incorrect answer, but they may make random choices as part of their work, so the time taken might vary between runs, even with the same input. If there is a procedure for verifying whether the answer given by a Monte Carlo algorithm is correct, and the probability of a correct answer is bounded above zero, then, with probability one, running the algorithm repeatedly while testing the answers will eventually give a correct ...
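As a minimal illustration (added here, not taken from the article), the sketch below implements the Fermat primality test, a classic Monte Carlo algorithm with one-sided error: a "composite" answer is always correct, while a "probably prime" answer may be wrong, and the error probability shrinks as independent trials are repeated.

# Fermat primality test: a Monte Carlo algorithm with one-sided error.
# "Composite" is always correct; "probably prime" can be wrong (and is
# systematically fooled by Carmichael numbers), but for most composites
# the error probability drops roughly as (1/2)^trials with repetition.
import random

def is_probably_prime(n, trials=20):
    if n < 4:
        return n in (2, 3)
    for _ in range(trials):
        a = random.randrange(2, n - 1)      # random witness candidate
        if pow(a, n - 1, n) != 1:           # Fermat's little theorem violated
            return False                     # definitely composite
    return True                              # probably prime

print(is_probably_prime(97))     # True  (97 is prime)
print(is_probably_prime(100))    # False (100 is composite)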


Multivariate Normal Distribution
In probability theory and statistics, the multivariate normal distribution, multivariate Gaussian distribution, or joint normal distribution is a generalization of the one-dimensional (univariate) normal distribution to higher dimensions. One definition is that a random vector is said to be ''k''-variate normally distributed if every linear combination of its ''k'' components has a univariate normal distribution. Its importance derives mainly from the multivariate central limit theorem. The multivariate normal distribution is often used to describe, at least approximately, any set of (possibly) correlated real-valued random variables, each of which clusters around a mean value.
Definitions
Notation and parameterization
The multivariate normal distribution of a ''k''-dimensional random vector \mathbf{X} = (X_1,\ldots,X_k)^{\mathsf{T}} can be written in the following notation:
: \mathbf{X}\ \sim\ \mathcal{N}(\boldsymbol\mu,\, \boldsymbol\Sigma),
or to make it explicitly known that ''X ...
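As an illustrative sketch (not part of the entry), a common way to draw samples from \mathcal{N}(\boldsymbol\mu, \boldsymbol\Sigma) is to transform independent standard normal draws with a Cholesky factor of \boldsymbol\Sigma; the mean vector and covariance matrix below are hypothetical.

# Sampling from a multivariate normal via the Cholesky factorization:
# if z ~ N(0, I) and Sigma = L L^T, then mu + L z ~ N(mu, Sigma).
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.0, 1.0])                      # hypothetical mean vector
Sigma = np.array([[1.0, 0.6],
                  [0.6, 2.0]])                 # hypothetical covariance (positive definite)

L = np.linalg.cholesky(Sigma)                  # lower-triangular factor
z = rng.standard_normal((10000, 2))            # independent standard normal draws
samples = mu + z @ L.T                         # each row ~ N(mu, Sigma)

print(samples.mean(axis=0))                    # should be close to mu
print(np.cov(samples, rowvar=False))           # should be close to Sigma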


Normal Distribution
In statistics, a normal distribution or Gaussian distribution is a type of continuous probability distribution for a real-valued random variable. The general form of its probability density function is
: f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}
The parameter \mu is the mean or expectation of the distribution (and also its median and mode), while the parameter \sigma is its standard deviation. The variance of the distribution is \sigma^2. A random variable with a Gaussian distribution is said to be normally distributed, and is called a normal deviate. Normal distributions are important in statistics and are often used in the natural and social sciences to represent real-valued random variables whose distributions are not known. Their importance is partly due to the central limit theorem. It states that, under some conditions, the average of many samples (observations) of a random variable with finite mean and variance is itself a random variable whose distribution converges to a normal dist ...
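As a small illustration (added here), the density above translates directly into code; the evaluation points and parameters in the example are arbitrary.

# Normal probability density function f(x) for a given mean mu and
# standard deviation sigma, evaluated at a few arbitrary points.
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

for x in (-1.0, 0.0, 1.0):
    print(x, normal_pdf(x, mu=0.0, sigma=1.0))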


Coherent Risk Measure
In the fields of actuarial science and financial economics there are a number of ways that risk can be defined; to clarify the concept, theoreticians have described a number of properties that a risk measure might or might not have. A coherent risk measure is a function that satisfies the properties of monotonicity, sub-additivity, homogeneity, and translational invariance.
Properties
Consider a random outcome X viewed as an element of a linear space \mathcal{L} of measurable functions, defined on an appropriate probability space. A functional \varrho : \mathcal{L} \to \R \cup \{+\infty\} is said to be a coherent risk measure for \mathcal{L} if it satisfies the following properties:
Normalized
: \varrho(0) = 0
That is, the risk when holding no assets is zero.
Monotonicity
: \mathrm{If}\; Z_1,Z_2 \in \mathcal{L} \;\mathrm{and}\; Z_1 \leq Z_2 \;\mathrm{a.s.},\; \mathrm{then}\; \varrho(Z_1) \geq \varrho(Z_2)
That is, if portfolio Z_2 always has better values than portfolio Z_1 under almost all scenarios then the risk of Z_ ...
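As a rough numerical sketch (added here, not part of the entry), the code below checks the monotonicity and sub-additivity properties empirically for expected shortfall computed on a hypothetical set of equally likely scenarios; it illustrates the axioms on data rather than proving that expected shortfall is coherent.

# Empirical check of two coherence properties (monotonicity and
# sub-additivity) for expected shortfall on equally likely scenarios.
# Outcomes are portfolio values (higher is better); the risk measure
# averages the worst tail of outcomes and flips the sign.
import numpy as np

def expected_shortfall(outcomes, alpha=0.05):
    """Average of the worst alpha-fraction of outcomes, reported as a loss."""
    k = max(1, int(np.ceil(alpha * len(outcomes))))
    worst = np.sort(outcomes)[:k]               # the k worst portfolio values
    return -worst.mean()

rng = np.random.default_rng(1)
Z1 = rng.normal(0.0, 1.0, 10000)                # hypothetical portfolio outcomes
Z2 = Z1 + 0.5                                   # dominates Z1 in every scenario

print(expected_shortfall(Z2) <= expected_shortfall(Z1))             # monotonicity
print(expected_shortfall(Z1 + Z2)
      <= expected_shortfall(Z1) + expected_shortfall(Z2))           # sub-additivity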


Philippe Jorion
Philippe Jorion is an author, professor, and risk manager. He is the author of more than 100 publications on the topics of risk management and international finance, and is credited with pioneering the Value at Risk approach to risk management. Jorion's works include ''Financial Risk Manager Handbook'' and ''Value at Risk: The New Benchmark for Managing Financial Risk''. He serves as the Chancellor's Professor of Finance at the Paul Merage School of Business at the University of California, Irvine, and is a managing director at the investment firm PAAMCO, where he heads the Risk Management group. Jorion has received several awards honoring excellence in research and financial writing, including two from the CFA Institute. Jorion holds an MBA and a PhD from the University of Chicago. ...