What is entropy?
Entropy is a quantitative measure of randomness. Like the concept of noise, entropy is used to help model and represent the degree of uncertainty of a random variable, such as the prices of securities in a market.
Financial analysts and market technicians use the concept of entropy to estimate the probability that a predicted price action for a security or market will actually materialize.
- Entropy refers to the degree of randomness or uncertainty pertaining to a market or security.
- Market analysts and technicians use entropy to describe the level of error that can be expected for a particular prediction or strategy.
- Entropy, along with the concepts of noise and volatility, helps explain why markets can seem inefficient or irrational at times.
How Entropy Works
Entropy has long been a source of study and debate among market analysts and traders. It is used in quantitative analysis and can help estimate the probability that a security will move in a certain direction or according to a certain pattern. Volatile securities have higher entropy than stable ones whose prices remain relatively constant. The concept of entropy is explored in Burton Malkiel's "A Random Walk Down Wall Street."
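As a rough illustration of the idea that volatile securities carry higher entropy, Shannon entropy can be estimated from a histogram of daily returns. The sketch below is a minimal example under stated assumptions: the function name, the fixed bin grid on [-0.1, 0.1], and the simulated return parameters are all illustrative choices, not a standard financial formula.

```python
import math
import random

def shannon_entropy(values, lo=-0.1, hi=0.1, bins=20):
    """Estimate Shannon entropy (in bits) from a fixed-grid histogram."""
    width = (hi - lo) / bins
    counts = [0] * bins
    for v in values:
        # Clamp each observation into one of the shared, fixed-width bins.
        idx = min(max(int((v - lo) / width), 0), bins - 1)
        counts[idx] += 1
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

random.seed(42)
# Hypothetical daily returns: a volatile security vs. a comparatively stable one.
volatile = [random.gauss(0.0, 0.03) for _ in range(1000)]
stable = [random.gauss(0.0, 0.005) for _ in range(1000)]

print(round(shannon_entropy(volatile), 2))  # noticeably higher
print(round(shannon_entropy(stable), 2))    # noticeably lower
```

Because both series are binned on the same fixed grid, the wider distribution of the volatile security spreads across more bins and scores higher entropy, matching the intuition in the paragraph above.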
One source of entropy in markets is noise. Noise refers to random, irrational, or misreported activity that confuses, distorts, or misrepresents genuine underlying trends. It often comes from the trading behavior of novice retail investors who trade on excitement, trend chasing, or rumors. The entropy created by market noise can make it difficult for investors to discern what is driving a trend and whether the trend is reversing or merely experiencing short-term volatility.
Entropy as a measure of risk
Like beta and volatility, entropy is used to measure financial risk as a measure of randomness. In the world of finance, risk can be beneficial or detrimental depending on the investor's needs; however, it is generally assumed that taking on higher risk can enhance growth. Investors seeking higher growth are taught to look for high-beta or high-volatility stocks.
Entropy is used in the same way. A security with a high level of entropy is considered riskier than others. Some analysts believe that entropy provides a better model of risk than beta. Entropy, like beta and standard deviation, has been shown to decrease as the number of assets or securities in a portfolio increases.
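The diversification effect described above can be sketched with a small simulation. Everything here is a hypothetical illustration: the equal-weight portfolio, the independent simulated asset returns, and the histogram-based entropy estimate are assumed choices, not a standard industry calculation.

```python
import math
import random

random.seed(7)

def portfolio_entropy(n_assets, n_days=2000, bins=40):
    """Estimated entropy (bits) of an equal-weight portfolio's daily returns."""
    returns = []
    for _ in range(n_days):
        # Independent daily returns for each asset, averaged with equal weights.
        r = sum(random.gauss(0.0, 0.02) for _ in range(n_assets)) / n_assets
        returns.append(r)
    # Histogram on a fixed grid over [-0.1, 0.1].
    width = 0.2 / bins
    counts = [0] * bins
    for r in returns:
        idx = min(max(int((r + 0.1) / width), 0), bins - 1)
        counts[idx] += 1
    total = len(returns)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

# Estimated entropy shrinks as more independent assets are added,
# mirroring the behavior of standard deviation under diversification.
for n in (1, 5, 25):
    print(n, round(portfolio_entropy(n), 2))
```

As with standard deviation, averaging across more independent assets narrows the return distribution, so the portfolio's returns concentrate in fewer bins and the entropy estimate falls.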
In finance, the holy grail has been finding the best way to build a portfolio that exhibits both growth and low drawdowns: in other words, maximum return for the least amount of risk. Much time and energy have been invested in studying data sets and testing many variables. When looking for an edge in portfolio construction, entropy optimization can be very useful. Entropy gives analysts and researchers a way to isolate the randomness, or expected surprise, of a portfolio.
The main problem with the use of entropy is the calculation itself. Among analysts, there are several theories about the best way to apply the concept in computational finance.
For example, in financial derivatives, entropy is used as a way to identify and minimize risk. The traditional Black-Scholes option pricing model assumes that all risk can be hedged; that is, that every risk can be determined and accounted for. This is not always a realistic assumption.
The concept of entropy can be represented by a variable designed to cancel out the randomness created by the underlying security or asset, allowing the analyst to isolate the price of the derivative. In other words, entropy is used as a way of identifying the best variable with which to define risk within a given system or arrangement of financial instruments. The best variable is the one that deviates least from physical reality.
In finance, this can be represented through probabilities and expected values. While the calculation itself is still evolving, the purpose is clear: analysts are using the concept to find better ways to price complex financial instruments.
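A toy sketch of pricing through probabilities and expected values: the scenario probabilities, strike, and discount factor below are made-up numbers for illustration, not market data or a production pricing model.

```python
# Price a simple call option as a probability-weighted expected payoff.
# All inputs are hypothetical: three terminal-price scenarios with assumed
# probabilities, a strike of 100, and a one-period discount factor of 0.98.
scenarios = [
    (0.25, 90.0),   # (probability, terminal price of the underlying)
    (0.50, 100.0),
    (0.25, 110.0),
]
strike = 100.0
discount = 0.98

# Expected payoff of the call: sum of probability * max(price - strike, 0).
expected_payoff = sum(p * max(s - strike, 0.0) for p, s in scenarios)
price = discount * expected_payoff
print(round(price, 2))  # 0.25 * 10 = 2.5 expected payoff, discounted to 2.45
```

Only the top scenario pays off here, so the option's value is the probability-weighted payoff discounted back one period; richer models differ mainly in how those probabilities are chosen, which is exactly where entropy-based methods enter.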