Matteo Marcozzi
The evolving regulatory landscape is becoming increasingly important to quants, but besides having to understand these changes, how else are stricter regulations impacting quants? According to Matteo Marcozzi, Quantitative Risk Specialist at Vontobel, new quants in the industry need to learn fast, while governance modelling would need to adapt to new regulatory demands.
In recent years, the unprecedented computational power and the availability of massive amounts of data from disparate sources have fuelled spectacular progress in machine learning techniques and related methods for all sorts of applications.
The abundance of literature on financial applications of machine learning, from both the academic and the practitioners’ side, demonstrates that the financial industry has been involved in these developments. However, as many experts in the field have warned, a naive use of these techniques may obscure the fundamental assumptions on which the models actually rely and may jeopardise the identification of the significant risk factors.
This issue assumes even greater relevance in light of the post-crisis regulatory developments, which demand consistent and well-structured processes for assessing the adequacy of models.
More generally, the current regulatory landscape has triggered a substantial evolution in the spectrum of skills that quants must have in their toolbox. They should not limit themselves to a sort of solipsistic craftsmanship of sophisticated model features for specific products or numerical routines; they also need a solid grasp of the mechanisms and policies governing model management, both at the level of the single firm’s internal model governance and at the level of the regulatory institutions.
This applies also to junior quants, who should be aware from the beginning of their careers of the regulatory context in which their work takes place and recognise that regulatory changes can deeply impact various technical modelling aspects.
An excellent example of the challenges generated by new regulations is the constellation of the valuation adjustments and the LIBOR reform. In fact, beyond the very practical consequences on the organisational setting and on the allocation of resources of the financial firms, these innovations have been (and still are) producing authentic paradigm shifts in financial modelling.
In particular, as strongly advocated by Brigo, Morini and Pallavicini in the book "Counterparty Credit Risk, Collateral and Funding: With Pricing Cases for All Asset Classes", the consistent incorporation of the valuation adjustments in the pricing systems requires a new "holistic" approach to modelling, where risk cannot be decomposed in orthogonal components.
An outstanding issue in this respect is the coherent treatment of correlations among different risk factors, one example in the above context being the evaluation of wrong-way risk.
Furthermore, sound estimation of correlations even within the same risk factor remains problematic. The literature contains several distinct notions of correlation (instantaneous model correlations, historical correlations, implied correlations, and so on), and these are not rarely used in an inconsistent way.
For instance, in the framework of the pricing and risk management of equity multi-underlying complex structured products via local volatility or stochastic volatility models, the relevant quantity is the instantaneous model correlation, i.e., the correlation among the stochastic drivers of the models. In principle, those values (which are model dependent) would have to be extracted from liquid multi-dimensional instruments such as basket options.
However, since the universe of underlyings is typically very large, no such quotes exist for most pairs of underlyings. In those cases, it is market standard to estimate historical correlations from time series and then to apply an asset-class-wide correlation shift to transform these values into the relevant instantaneous model correlations. Unfortunately, this correlation marking process is often characterised by discretionary choices without any robust modelling assumption behind them.
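The marking process described above can be sketched in a few lines. The sketch below is illustrative only: the choice of estimator (Pearson correlation of daily log returns), the flat additive shift, and the shift value itself are all assumptions standing in for the discretionary choices the text criticises, not a prescribed method.

```python
import numpy as np

def historical_correlation(prices_a, prices_b):
    """Pearson correlation of daily log returns, a common historical estimator."""
    ra = np.diff(np.log(prices_a))
    rb = np.diff(np.log(prices_b))
    return float(np.corrcoef(ra, rb)[0, 1])

def shifted_correlation(hist_corr, shift):
    """Apply a flat asset-class-wide shift, clamped to the admissible range [-1, 1].
    The shift value is a discretionary mark, not a model-implied quantity."""
    return float(np.clip(hist_corr + shift, -1.0, 1.0))

# Toy data: two correlated random walks with target return correlation ~0.7
rng = np.random.default_rng(0)
z = rng.standard_normal((2, 500))
returns_a = 0.01 * z[0]
returns_b = 0.01 * (0.7 * z[0] + np.sqrt(1 - 0.7**2) * z[1])
prices_a = 100 * np.exp(np.cumsum(returns_a))
prices_b = 100 * np.exp(np.cumsum(returns_b))

rho_hist = historical_correlation(prices_a, prices_b)
rho_model = shifted_correlation(rho_hist, shift=0.05)  # 0.05 is purely illustrative
```

Note that even this simple sketch exposes the problem: a flat shift applied pairwise can destroy the positive semi-definiteness of the full correlation matrix, which is exactly the kind of inconsistency a robust modelling framework would have to prevent.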
In conclusion, as demonstrated by the example of correlations’ estimation, I believe that the efforts of the quant community should be devoted to the development of a comprehensive framework where the interplay among risk factors is treated consistently in order to contribute to the stability of the financial system and to meet the high transparency standards of the current regulatory landscape.