Quantitative analysis in finance refers to the application of mathematical and statistical methods to problems in financial markets and investment management. Professionals in this field are known as quantitative analysts or quants.
Quants typically specialize in areas such as derivative structuring and pricing, risk management, portfolio management, and other finance-related activities. The role is analogous to that of specialists in industrial mathematics working in non-financial industries.[1]
Quantitative analysis often involves examining large datasets to identify patterns, such as correlations among liquid assets or price dynamics, including strategies based on trend following or mean reversion.
Although the original quantitative analysts were "sell side quants" from market maker firms, concerned with derivatives pricing and risk management, the meaning of the term has expanded over time to include those individuals involved in almost any application of mathematical finance, including the buy side.[2] Applied quantitative analysis is commonly associated with quantitative investment management, which includes a variety of methods such as statistical arbitrage, algorithmic trading and electronic trading.
Some of the larger investment managers using quantitative analysis include Renaissance Technologies, D. E. Shaw & Co., and AQR Capital Management.[3]
Quantitative finance started in 1900 with Louis Bachelier's doctoral thesis "Theory of Speculation", which provided a model to price options under a normal distribution. Jules Regnault had already posited in 1863 that stock prices can be modelled as a random walk, suggesting "in a more literary form, the conceptual setting for the application of probability to stock market operations".[4] It was, however, only in the years 1960–1970 that the "merit of [these] was recognized"[4] as options pricing theory was developed.
Harry Markowitz's 1952 doctoral thesis "Portfolio Selection" and its published version was one of the first efforts in economics journals to formally adapt mathematical concepts to finance (mathematics was until then confined to specialized economics journals).[5] Markowitz formalized a notion of mean return and covariances for common stocks, which allowed him to quantify the concept of "diversification" in a market. He showed how to compute the mean return and variance for a given portfolio and argued that investors should hold only those portfolios whose variance is minimal among all portfolios with a given mean return. Thus, although the language of finance now involves Itô calculus, management of risk in a quantifiable manner underlies much of the modern theory.
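As a minimal sketch of these ideas (the mean returns and covariances below are illustrative, not market data), the portfolio mean and variance follow directly from the weight vector, mean-return vector and covariance matrix, and the unconstrained minimum-variance weights have the closed form w* = C^-1 1 / (1' C^-1 1):

    import numpy as np

    # Illustrative (made-up) annualised mean returns and covariance for 3 stocks
    mu = np.array([0.08, 0.12, 0.10])
    cov = np.array([[0.04, 0.01, 0.00],
                    [0.01, 0.09, 0.02],
                    [0.00, 0.02, 0.06]])

    w = np.array([0.4, 0.4, 0.2])   # candidate portfolio weights
    port_mean = w @ mu              # portfolio mean return
    port_var = w @ cov @ w          # portfolio variance

    # Global minimum-variance portfolio (no short-sale constraint)
    ones = np.ones(len(mu))
    w_min = np.linalg.solve(cov, ones)
    w_min /= w_min.sum()

    print(port_mean, port_var)
    print("min-variance weights:", w_min)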
Modern quantitative investment management was first introduced from the research of Edward Thorp, a mathematics professor at New Mexico State University (1961–1965) and University of California, Irvine (1965–1977).[6] Considered the "Father of Quantitative Investing",[6] Thorp sought to predict and simulate blackjack, a card game he played in Las Vegas casinos.[7] He was able to create a system, known broadly as card counting, which used probability theory and statistical analysis to successfully win blackjack games.[7] His research was subsequently used during the 1980s and 1990s by investment management firms seeking to generate systematic and consistent returns in the U.S. stock market.[7] The field has grown to incorporate numerous approaches and techniques; see Outline of finance § Quantitative investing, Post-modern portfolio theory, Financial economics § Portfolio theory.
In 1965, Paul Samuelson introduced stochastic calculus into the study of finance.[8][9] In 1969, Robert Merton promoted continuous stochastic calculus and continuous-time processes. Merton was motivated by the desire to understand how prices are set in financial markets, which is the classical economics question of "equilibrium", and in later papers he used the machinery of stochastic calculus to begin investigation of this issue. At the same time as Merton's work and with Merton's assistance, Fischer Black and Myron Scholes developed the Black–Scholes model, which was awarded the 1997 Nobel Memorial Prize in Economic Sciences. It provided a solution for a practical problem, that of finding a fair price for a European call option, i.e., the right to buy one share of a given stock at a specified price and time. Such options are frequently purchased by investors as a risk-hedging device.
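The resulting call price has a closed form that is easily computed; the sketch below (Python, standard library only) implements the textbook Black–Scholes formula for a European call:

    from math import log, sqrt, exp, erf

    def norm_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def black_scholes_call(S, K, T, r, sigma):
        """Fair price of a European call under Black-Scholes assumptions."""
        d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

    print(black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2))  # ~10.45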
In 1981, Harrison and Pliska used the general theory of continuous-time stochastic processes to put the Black–Scholes model on a solid theoretical basis, and showed how to price numerous other derivative securities, laying the groundwork for the development of the fundamental theorem of asset pricing.[10] The various short-rate models (beginning with Vasicek in 1977), and the more general HJM Framework (1987), relatedly allowed for an extension to fixed income and interest rate derivatives. Similarly, and in parallel, models were developed for various other underpinnings and applications, including credit derivatives, exotic derivatives, real options, and employee stock options. Quants are thus involved in pricing and hedging a wide range of securities - asset-backed, government, and corporate - in addition to classic derivatives; see contingent claim analysis. Emanuel Derman's 2004 book My Life as a Quant helped both to make the role of a quantitative analyst better known outside of finance and to popularize the abbreviation "quant" for a quantitative analyst.[11]
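As an illustration of the short-rate family, a minimal Euler–Maruyama simulation of the Vasicek dynamics dr = a(b - r)dt + sigma dW follows; the parameters are illustrative, not calibrated:

    import numpy as np

    def vasicek_path(r0, a, b, sigma, T, n_steps, rng):
        """Euler-Maruyama simulation of dr = a(b - r)dt + sigma dW."""
        dt = T / n_steps
        r = np.empty(n_steps + 1)
        r[0] = r0
        for i in range(n_steps):
            dW = rng.normal(0.0, np.sqrt(dt))
            r[i + 1] = r[i] + a * (b - r[i]) * dt + sigma * dW
        return r

    rng = np.random.default_rng(0)
    path = vasicek_path(r0=0.03, a=0.5, b=0.04, sigma=0.01, T=5.0, n_steps=500, rng=rng)
    print(path[-1])  # simulated short rate after 5 years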
After the 2008 financial crisis, considerations regarding counterparty credit risk were incorporated into the modelling, which had previously been performed in an entirely "risk neutral world", entailing three major developments; see Valuation of options § Post crisis: (i) option pricing and hedging incorporate the relevant volatility surface (to some extent, equity-option prices have incorporated the volatility smile since the 1987 crash), and banks then apply "surface aware" local- or stochastic volatility models; (ii) the risk neutral value is adjusted for the impact of counterparty credit risk via a credit valuation adjustment, or CVA, as well as various of the other XVA; (iii) for discounting, the OIS curve is used for the "risk free rate", as opposed to LIBOR as previously, and, relatedly, quants must model under a "multi-curve framework" (LIBOR is being phased out, with replacements including SOFR and TONAR, necessitating technical changes to the latter framework, while the underlying logic is unaffected).
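As an illustrative sketch of (ii), a unilateral CVA is often approximated as (1 - R) times the sum of discounted expected exposure weighted by the counterparty's marginal default probabilities; all inputs below are hypothetical:

    import numpy as np

    def cva(expected_exposure, hazard_rate, recovery, discount_rate, times):
        """Unilateral CVA: (1 - R) * sum of discounted expected exposure
        weighted by the counterparty's marginal default probability."""
        times = np.asarray(times)
        ee = np.asarray(expected_exposure)
        surv = np.exp(-hazard_rate * times)            # survival probabilities
        dpd = -np.diff(np.concatenate(([1.0], surv)))  # marginal default probs
        df = np.exp(-discount_rate * times)            # OIS-style discount factors
        return (1.0 - recovery) * np.sum(ee * dpd * df)

    # Hypothetical flat exposure profile on a 5-year trade
    t = np.arange(1, 6)              # year-end grid
    ee = np.full(5, 1_000_000.0)     # expected exposure of 1m per period
    print(cva(ee, hazard_rate=0.02, recovery=0.4, discount_rate=0.03, times=t))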
In sales and trading, quantitative analysts work to determine prices, manage risk, and identify profitable opportunities. Historically this was a distinct activity from trading, but the boundary between a desk quantitative analyst and a quantitative trader is increasingly blurred, and it is now difficult to enter trading as a profession without at least some quantitative analysis education.
Front office work favours speed over quality, with a greater emphasis on solutions to specific problems than on detailed modeling. Front office quants (FOQs) are typically significantly better paid than those in back office, risk, and model validation. Although highly skilled analysts, FOQs frequently lack software engineering experience or formal training, and, bound by time constraints and business pressures, often adopt tactical solutions.
Increasingly, quants are attached to specific desks. Two examples are: XVA specialists, responsible for managing counterparty risk as well as minimizing the capital requirements under Basel III; and structurers, tasked with the design and manufacture of client-specific solutions.
Quantitative analysis is used extensively by asset managers. Some, such as FQ, AQR or Barclays, rely almost exclusively on quantitative strategies, while others, such as PIMCO, BlackRock or Citadel, use a mix of quantitative and fundamental methods.
One of the first quantitative investment funds to launch was based in Santa Fe, New Mexico and began trading in 1991 under the name Prediction Company.[7][12] By the late 1990s, Prediction Company began using statistical arbitrage to secure investment returns, along with other funds of the time, including Renaissance Technologies and D. E. Shaw & Co., both based in New York.[7] Prediction hired scientists and computer programmers from the neighboring Los Alamos National Laboratory to create sophisticated statistical models using "industrial-strength computers" in order to "[build] the Supercollider of Finance".[13][14]
Machine learning models are now capable of identifying complex patterns in financial market data. With the aid of artificial intelligence, investors are increasingly turning to deep learning techniques to forecast and analyze trends in stock and foreign exchange markets.[15] See Applications of artificial intelligence § Trading and investment.
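A toy sketch of the underlying pattern-finding task, using a plain linear model rather than the deep networks described, and synthetic noise rather than market data (so there is no real signal to find and accuracy should hover near 50%):

    import numpy as np

    # Fit a linear model on lagged returns, then score directional
    # accuracy out of sample. Purely illustrative: the "returns" are
    # synthetic i.i.d. noise, standing in for a real price series.
    rng = np.random.default_rng(0)
    r = rng.normal(0, 0.01, size=2_000)             # stand-in return series
    X = np.column_stack([r[2:-1], r[1:-2], r[:-3]]) # three lagged returns
    y = r[3:]                                       # next-period return
    split = len(y) // 2
    beta, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
    pred = X[split:] @ beta
    print("directional accuracy:", np.mean(np.sign(pred) == np.sign(y[split:])))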
Major firms invest large sums in an attempt to produce standard methods of evaluating prices and risk. These differ from front office tools in that Excel is very rare, with most development being in C++, though Java, C# and Python are sometimes used in non-performance-critical tasks. Library quants (LQs) spend more time on modeling, ensuring the analytics are both efficient and correct, though there is tension between LQs and FOQs on the validity of their results. LQs are required to understand techniques such as Monte Carlo methods and finite difference methods, as well as the nature of the products being modeled.
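For illustration, a minimal Monte Carlo pricer for a European call under risk-neutral geometric Brownian motion (production library code would typically be C++, per the above; Python is used here for brevity). Its estimate converges to the Black–Scholes closed-form value:

    import numpy as np

    def mc_call_price(S0, K, T, r, sigma, n_paths=1_000_000, seed=42):
        """Monte Carlo price of a European call under risk-neutral GBM:
        S_T = S0 * exp((r - sigma^2/2) T + sigma sqrt(T) Z)."""
        rng = np.random.default_rng(seed)
        z = rng.standard_normal(n_paths)
        s_t = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
        payoff = np.maximum(s_t - K, 0.0)
        return np.exp(-r * T) * payoff.mean()

    print(mc_call_price(100, 100, 1.0, 0.05, 0.2))  # converges to ~10.45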
Often the highest-paid form of quant, algorithmic trading quants (ATQs) make use of methods taken from signal processing, game theory, the Kelly criterion from gambling, market microstructure, econometrics, and time series analysis.
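For a simple repeated bet won with probability p and paying b-to-1, the Kelly criterion gives the bankroll fraction to stake as f* = (bp - (1 - p))/b; a one-line sketch:

    def kelly_fraction(p, b):
        """Kelly stake for a bet won with probability p paying b-to-1:
        f* = (b*p - (1 - p)) / b."""
        return (b * p - (1.0 - p)) / b

    print(kelly_fraction(p=0.55, b=1.0))  # 0.10: stake 10% of bankroll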
This area has grown in importance in recent years, as the credit crisis exposed holes in the mechanisms used to ensure that positions were correctly hedged; see FRTB. A core technique continues to be value at risk - applying both the parametric and "historical" approaches, as well as conditional value at risk and extreme value theory - while this is supplemented with various forms of stress test, expected shortfall methodologies, economic capital analysis, direct analysis of the positions at the desk level, and, as below, assessment of the models used by the bank's various divisions.
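A minimal sketch of the historical approach (synthetic P&L standing in for a real history): read the relevant quantile of realized losses off as VaR, then average the tail beyond it for expected shortfall:

    import numpy as np

    def hist_var_es(pnl, alpha=0.99):
        """Historical VaR and expected shortfall at confidence level alpha,
        reported as positive loss amounts."""
        losses = -np.asarray(pnl)             # convert P&L to losses
        var = np.quantile(losses, alpha)      # loss exceeded (1-alpha) of the time
        es = losses[losses >= var].mean()     # average loss beyond VaR
        return var, es

    rng = np.random.default_rng(1)
    pnl = rng.normal(0.0, 1_000.0, size=2_500)  # stand-in for historical P&L
    print(hist_var_es(pnl, alpha=0.99))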
After the 2008 financial crisis, it became recognized that quantitative valuation methods were generally too narrow in their approach. An agreed-upon fix adopted by numerous financial institutions has been to improve collaboration.
Model validation (MV) takes the models and methods developed by front office, library, and modeling quantitative analysts and determines their validity and correctness; see model risk. The MV group might well be seen as a superset of the quantitative operations in a financial institution, since it must deal with new and advanced models and trading techniques from across the firm.
Post crisis, regulators now typically talk directly to the quants in the middle office - such as the model validators - and since profits depend heavily on the regulatory infrastructure, model validation has gained in weight and importance relative to the quants in the front office.
Before the crisis, however, the pay structure in all firms was such that MV groups struggled to attract and retain adequate staff, often with talented quantitative analysts leaving at the first opportunity. This gravely impacted corporate ability to manage model risk, or to ensure that the positions being held were correctly valued. An MV quantitative analyst would typically earn a fraction of what quantitative analysts in other groups with a similar length of experience earned. In the years following the crisis, as mentioned, this has changed.
Quantitative developers, sometimes called quantitative software engineers or quantitative engineers, are computer specialists who assist with, implement, and maintain the quantitative models. They tend to be highly specialised language technicians who bridge the gap between software engineers and quantitative analysts. The term is also sometimes used outside the finance industry to refer to those working at the intersection of software engineering and quantitative research.
The non-ergodicity of financial markets and the time dependence of returns are central issues in modern approaches to quantitative trading. Financial markets are complex systems in which traditional assumptions, such as independence and normal distribution of returns, are frequently challenged by empirical evidence.[16][17] Under the non-ergodicity hypothesis, the future returns of an investment strategy operating on a non-stationary system depend on the ability of the algorithm itself to predict the future evolutions to which the system is subject. As discussed by Ole Peters in 2011, ergodicity is a crucial element in understanding economic dynamics,[18] especially in non-stationary contexts. Identifying and developing methodologies to estimate this ability represents one of the main challenges of modern quantitative trading.[19][20] In this perspective, it becomes fundamental to shift the focus from the result of individual financial operations to the individual evolutions of the system.
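Peters' point is often illustrated with a multiplicative coin-flip gamble in which the ensemble average grows while almost every individual trajectory decays; a minimal simulation (the payoffs are the made-up figures standard in expositions of ergodicity economics):

    import numpy as np

    # Each round multiplies wealth by 1.5 (heads) or 0.6 (tails). The
    # ensemble-average growth factor per round is 0.5*1.5 + 0.5*0.6 = 1.05,
    # but the time-average growth factor is sqrt(1.5*0.6) ~ 0.949 < 1:
    # the typical trajectory decays even though the ensemble mean grows.
    rng = np.random.default_rng(0)
    n_agents, n_rounds = 1_000_000, 50
    heads = rng.binomial(n_rounds, 0.5, size=n_agents)
    wealth = 1.5**heads * 0.6**(n_rounds - heads)   # terminal wealth per agent

    print("ensemble mean:", wealth.mean())      # ~1.05**50 ~ 11.5
    print("median agent: ", np.median(wealth))  # ~0.9**25 ~ 0.07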
Operationally, this shift in focus implies that clusters of trades oriented in the same direction offer little value in evaluating the strategy. On the contrary, sequences of trades with alternating buys and sells are much more significant, since they indicate that the strategy is actually predicting a statistically significant number of evolutions of the system.
Because of their backgrounds, quantitative analysts draw from various forms of mathematics: statistics and probability, calculus centered around partial differential equations, linear algebra, discrete mathematics, and econometrics. Some on the buy side may use machine learning. The majority of quantitative analysts have received little formal education in mainstream economics, and often apply a mindset drawn from the physical sciences. Quants use mathematical skills learned from diverse fields such as computer science, physics and engineering; these skills include (but are not limited to) advanced statistics, linear algebra and partial differential equations, as well as solutions to these based upon numerical analysis.
Commonly used numerical methods include finite difference methods, Monte Carlo methods, and related techniques from numerical analysis for solving the resulting pricing equations.
A typical problem for a mathematically oriented quantitative analyst would be to develop a model for pricing, hedging, and risk-managing a complex derivative product. These quantitative analysts tend to rely more on numerical analysis than statistics and econometrics. One of the principal mathematical tools of quantitative finance is stochastic calculus. The mindset, however, is to prefer a deterministically "correct" answer, as once there is agreement on input values and market variable dynamics, there is only one correct price for any given security (which can be demonstrated, albeit often inefficiently, through a large volume of Monte Carlo simulations).
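A sketch of this numerical-analysis mindset: solving the Black–Scholes PDE for a European call with an explicit finite-difference scheme, stepping backwards from expiry. The grid sizes below are chosen to respect the explicit scheme's stability limit, and the result reproduces the closed-form price:

    import numpy as np

    def fd_call_price(S0, K, T, r, sigma, s_max=300.0, m=150, n=20_000):
        """European call via an explicit finite-difference scheme for the
        Black-Scholes PDE. Explicit schemes are only stable for small
        time steps, hence the large n."""
        ds, dt = s_max / m, T / n
        s = np.linspace(0.0, s_max, m + 1)
        v = np.maximum(s - K, 0.0)                 # payoff at expiry
        for j in range(n):
            tau = (j + 1) * dt                     # time remaining after step
            gamma = (v[2:] - 2 * v[1:-1] + v[:-2]) / ds**2
            delta = (v[2:] - v[:-2]) / (2 * ds)
            v[1:-1] += dt * (0.5 * sigma**2 * s[1:-1]**2 * gamma
                             + r * s[1:-1] * delta - r * v[1:-1])
            v[0] = 0.0                             # call worthless at S = 0
            v[-1] = s_max - K * np.exp(-r * tau)   # deep in-the-money boundary
        return np.interp(S0, s, v)

    print(fd_call_price(100, 100, 1.0, 0.05, 0.2))  # ~10.45, matching Black-Scholes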
A typical problem for a statistically oriented quantitative analyst would be to develop a model for deciding which stocks are relatively expensive and which stocks are relatively cheap. The model might include a company's book value to price ratio, its trailing earnings to price ratio, and other accounting factors. An investment manager might implement this analysis by buying the underpriced stocks, selling the overpriced stocks, or both. Statistically oriented quantitative analysts tend to rely more on statistics and econometrics, and less on sophisticated numerical techniques and object-oriented programming. These quantitative analysts tend to enjoy trying to find the best approach to modeling data, accepting that there is no "right answer" until time has passed and one can retrospectively see how the model performed. Both types of quantitative analysts demand a strong knowledge of sophisticated mathematics and computer programming proficiency.
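A minimal sketch of such a relative-value screen (all tickers and figures hypothetical): standardize each accounting ratio across the cross-section, sum the z-scores into a composite cheapness score, and rank:

    import numpy as np

    # Hypothetical cross-section: rank stocks on a composite "cheapness"
    # score built from book-to-price and earnings-to-price, then go long
    # the cheapest names and short the richest.
    tickers = ["AAA", "BBB", "CCC", "DDD", "EEE", "FFF"]
    book_to_price = np.array([0.9, 0.3, 1.2, 0.5, 0.8, 0.2])
    earn_to_price = np.array([0.08, 0.03, 0.10, 0.04, 0.07, 0.02])

    def zscore(x):
        return (x - x.mean()) / x.std()

    score = zscore(book_to_price) + zscore(earn_to_price)  # higher = cheaper
    order = np.argsort(score)
    print("short (expensive):", [tickers[i] for i in order[:2]])
    print("long (cheap):     ", [tickers[i] for i in order[-2:]])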
Quantitative analysis has traditionally been a major source of employment for those with PhD degrees in a quantitative discipline,[21] and analysts often come from mathematics, applied mathematics, physics or engineering backgrounds,[21] learning finance "on the job".
At the same time, the demand for quantitative skills has led to[21] the creation of specialized Masters[22] and PhD courses in financial engineering, mathematical finance and computational finance (as well as in specific topics such as financial reinsurance). In particular, the Master of Quantitative Finance, Master of Financial Mathematics, Master of Computational Finance and Master of Financial Engineering are becoming popular with students and with employers.[22][23]
This has, in parallel, led to a resurgence in demand for actuarial qualifications, as well as commercial certifications such as the CQF. Similarly, the more general Master of Finance (and Master of Financial Economics) increasingly[23] includes a significant technical component. Likewise, masters programs in operations research, computational statistics, applied mathematics and industrial engineering may offer a quantitative finance specialization.
While a quantitative analyst has typically needed[21][22] extensive skills in computer programming - commonly C, C++ and Java, and lately R, MATLAB, Mathematica, and Python - data science and machine learning analysis and methods are increasingly employed.[24][25] As such, graduates of Master's programs in these areas are also hired as quants.