Mathematical Finance
Department of Mathematics and Statistics
Texas Tech University
Deep Learning (DL) methods have been transforming computer vision with innovative adaptations
to other domains including climate change.
For DL to pervade Science and Engineering (S&E) applications where risk management is a core component,
well-characterized uncertainty estimates must accompany predictions.
However, S&E observations and model-simulations often follow heavily skewed distributions and
are not well modeled with DL approaches, since they usually optimize a Gaussian, or Euclidean,
likelihood loss. Recent developments in Bayesian Deep Learning (BDL), which attempts to capture
uncertainties from noisy observations (aleatoric) and from unknown model parameters (epistemic),
provide us a foundation. Here we present a discrete-continuous BDL model with Gaussian and lognormal
likelihoods for uncertainty quantification (UQ).
We demonstrate the approach by developing UQ estimates on 'DeepSD', a super-resolution-based
DL model for Statistical Downscaling (SD) in climate applied to precipitation, which follows an
extremely skewed distribution.
We find that the discrete-continuous models outperform a basic Gaussian distribution in terms
of predictive accuracy and uncertainty calibration.
Furthermore, we find that the lognormal distribution, which can handle skewed distributions,
produces quality uncertainty estimates at the extremes.
Such results may be important across S&E, as well as other domains such as finance and economics,
where extremes are often of significant interest.
Furthermore, to our knowledge, this is the first UQ model in SD where both aleatoric and epistemic
uncertainties are characterized.
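As a purely illustrative reading of such a discrete-continuous likelihood, the sketch below writes out per-observation Gaussian and lognormal negative log-likelihoods, plus a Bernoulli-lognormal mixture of the kind often used for precipitation. The function names, the wet/dry threshold `eps`, and the mixture form are assumptions for illustration, not the DeepSD implementation.

```python
import math

def gaussian_nll(y, mu, sigma):
    """Negative log-likelihood of y under N(mu, sigma^2)."""
    return 0.5 * math.log(2 * math.pi * sigma**2) + (y - mu)**2 / (2 * sigma**2)

def lognormal_nll(y, mu, sigma):
    """Negative log-likelihood of y > 0 under Lognormal(mu, sigma^2);
    mu and sigma parameterize the distribution of log(y)."""
    return (math.log(y) + math.log(sigma) + 0.5 * math.log(2 * math.pi)
            + (math.log(y) - mu)**2 / (2 * sigma**2))

def discrete_continuous_nll(y, p_wet, mu, sigma, eps=1e-6):
    """Illustrative mixture loss: a Bernoulli 'wet or dry?' component plus
    a lognormal amount component for wet (y > 0) observations."""
    if y < eps:                       # dry: only the discrete part
        return -math.log(1 - p_wet)
    return -math.log(p_wet) + lognormal_nll(y, mu, sigma)
```

A network predicting `(p_wet, mu, sigma)` per pixel and minimizing this loss models both occurrence and skewed intensity, which a single Gaussian likelihood cannot.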
Environmental, Social, and Governance (ESG) scores measure companies' performance
concerning sustainability and societal impact and are organized on three pillars:
Environmental (E), Social (S), and Governance (G).
These complementary non-financial ESG scores should provide information about the ESG performance and
risks of different companies.
However, the sheer amount of not-yet-published ESG information calls the reliability of ESG scores into question.
To explicitly denote the not yet published information on ESG category scores, a new pillar,
the so-called Missing (M) pillar, is formulated. Environmental, Social, Governance, and Missing (ESGM)
scores are introduced to consider the potential release of new information in the future.
Furthermore, an optimization scheme is proposed to compute ESGM scores, linking them to the companies'
riskiness.
By relying on the data provided by Refinitiv, we show that the ESGM scores strengthen the link
to companies' riskiness.
These new scores could benefit investors and practitioners as ESG exclusion strategies using only
ESG scores might exclude assets with a low score solely because of their missing information and
not necessarily because of a low ESG merit.
Environmental, Social, and Governance (ESG) are non-financial factors that are garnering attention
from investors as they increasingly look to apply these as part of their analysis to identify material
risks and growth opportunities.
Some of this attention is also driven by clients who, now more aware than ever, are demanding that their
money be managed and invested responsibly.
As the interest in ESG grows, so does the need for investors to have access to consumable ESG information.
Since most of it is in text form in reports, disclosures, press releases, and 10-Q filings,
we see a need for sophisticated natural language processing (NLP) techniques for classification tasks for ESG text.
We hypothesize that an ESG domain-specific pre-trained model will help with such tasks, and in this paper we study how to build one.
We explored doing this by fine-tuning BERT’s pre-trained weights using ESG specific text and then further
fine-tuning the model for a classification task.
We were able to achieve accuracy better than the original BERT and baseline models in environment-specific classification tasks.
How does import competition from China affect engagement on ESG initiatives by US corporates?
On the one hand, reduced profitability due to import competition and lagging ESG performance of Chinese exporters
can disincentivize US firms from devoting more resources to ESG initiatives.
On the other hand, the shift from labor-intensive production to capital/technology-intensive
production along with offshoring may improve the US company's ESG performance.
Moreover, US companies have incentives to actively pursue more ESG engagement to differentiate from Chinese imports.
Exploiting a trade policy in which the US Congress granted China Permanent Normal Trade Relations and the resulting change
in expected tariff rates on Chinese imports, we find that greater import competition from China leads to an increase in the
US company's ESG performance.
The improvement primarily stems from "doing more positives" and from greater involvement in environmental initiatives.
Indirect and direct evidence shows that the improvement is not driven by the change in production process or offshoring,
but is consistent with product differentiation.
Our results suggest that the trade shock from China has significant impact on the US company's ESG performance.
We systematically investigate the links between price returns and ESG features.
We propose a cross-validation scheme with random company-wise validation to mitigate the relative
initial lack of quantity and quality of ESG data, which allows us to use most of the latest and
best data to both train and validate our models.
Boosted trees successfully explain a single bit of annual price returns not accounted for in the
traditional market factor.
We check with benchmark features that ESG features do contain significantly more information than
basic fundamental features alone.
The most relevant sub-ESG feature encodes controversies.
Finally, we find opposite effects of better ESG scores on the price returns of small and large
capitalization companies: better ESG scores are generally associated with larger price returns for the latter,
and the reverse holds for the former.
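The "random company-wise validation" idea can be read as grouped cross-validation: all observations of a given company land in the same fold, so no company leaks between train and validation sets. A minimal stdlib sketch with illustrative names (the paper's exact scheme may differ):

```python
import random

def company_wise_folds(companies, n_folds=5, seed=0):
    """Split the unique companies (e.g. tickers) into disjoint validation
    folds, so every row of a company falls into the same fold -- avoiding
    leakage when multiple rows share a company."""
    uniq = sorted(set(companies))
    rng = random.Random(seed)
    rng.shuffle(uniq)
    # round-robin assignment of shuffled companies to folds
    return [set(uniq[i::n_folds]) for i in range(n_folds)]

# usage: rows whose company is in folds[k] form validation set k
tickers = ["AAA", "BBB", "CCC", "DDD", "EEE", "AAA", "BBB"]
folds = company_wise_folds(tickers, n_folds=3)
```

Because each fold holds out whole companies rather than random rows, the latest data for every retained company can still be used for training, which is the stated motivation for the scheme.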
The behaviour of a person is dominated by their ability to process uncertain information
available to them.
When there is a range of alternatives to choose from, the likelihoods assigned by the person
to these different alternatives determine the state of their mind in relation to that
particular choice.
When new information arrives, the person’s perspective changes, generating behavioural dynamics.
To model this behaviour, it is highly effective to use the mathematics of signal processing.
In this scheme, it is then possible to represent (i) reliable information, (ii) noise,
and (iii) disinformation in a unified framework.
Because the approach is designed to characterise the dynamics of people's behaviour,
it is possible to quantify the impact of information control, including effects resulting
from the dissemination of disinformation.
It can be shown that if a decision maker assigns an exceptionally high weight to one of
the alternative realities, then under the Bayesian logic their perception hardly changes
in time, even if the evidence presented indicates that this alternative corresponds to a false reality.
Thus confirmation bias need not be incompatible with Bayesian updating,
contrary to what is widely believed in psychology.
The information-based approach, which originated in financial modelling, also poses new
challenges in stochastic analysis when applied to psychology; these will be discussed briefly.
The talk will be an extended version of an informal article in:
https://theconversation.com/the-mathematics-of-human-behaviour-how-my-new-model-can-spot-liars-and-counter-disinformation-185309
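The claim about extreme priors can be checked with a few lines of Bayesian arithmetic; the numbers below are illustrative, not from the talk.

```python
def posterior(prior, likelihood_ratio):
    """One Bayesian update for a binary hypothesis H vs not-H, where
    likelihood_ratio = P(evidence | H) / P(evidence | not H)."""
    odds = prior / (1 - prior) * likelihood_ratio
    return odds / (1 + odds)

# A decision maker almost certain of one alternative reality ...
belief = 1 - 1e-9
# ... sees 20 pieces of evidence, each twice as likely if that reality is false.
for _ in range(20):
    belief = posterior(belief, 0.5)
# belief is still above 0.99: the perception has hardly moved.
```

With prior odds of roughly a billion to one, twenty halvings of the odds still leave them near a thousand to one, so the posterior barely budges even though every observation argued against the hypothesis.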
We develop a novel approach for the construction of quantile processes governing the stochastic
dynamics of quantiles in continuous time.
Two classes of quantile diffusions are identified: the first, which we largely focus on,
features a dynamic random quantile level and allows for direct interpretation of the resulting
quantile process characteristics such as location, scale, skewness and kurtosis, in terms of the
model parameters.
The second class consists of function-valued quantile diffusions, driven by stochastic parameter
processes that determine the entire quantile function at each point in time.
By the proposed innovative and simple -- yet powerful -- construction method, quantile processes
are obtained by transforming the marginals of a diffusion process under a composite map consisting
of a distribution and a quantile function.
Such maps, analogous to rank transmutation maps, produce the marginals of the resulting quantile process.
We discuss the relationship and differences between our approach and existing methods and
characterisations of quantile processes in discrete and continuous time.
As an example of an application of quantile diffusions, we show how probability measure distortions,
a form of dynamic tilting, can be induced.
Though particularly useful in financial mathematics and actuarial science, examples of which are
given in this work, measure distortions feature prominently across multiple research areas.
Examples include dynamic distributional approximations (statistics), non-parametric and asymptotic
analysis (mathematical statistics), dynamic risk measures (econometrics), behavioural economics,
decision making (operations research), signal processing (information theory), and not least
general risk theory, including applications thereof, for example in the context of climate change.
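A toy instance of the composite-map construction, assuming a Brownian driver and an exponential target quantile function (the general setup in the work allows other diffusions and target distributions):

```python
import math
import random

def std_normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def exponential_quantile(u, lam=1.0):
    """Quantile function of the Exp(lam) distribution."""
    return -math.log(1.0 - u) / lam

def quantile_path(n_steps=1000, dt=1e-3, seed=1):
    """Transform the marginals of a driving Brownian motion Z_t through the
    composite map Q(F(.)): F matches Z_t's marginal (normal, after scaling),
    so F(Z_t) is uniform; applying the target quantile function Q then gives
    a process with exponential marginals."""
    rng = random.Random(seed)
    z, path = 0.0, []
    for k in range(1, n_steps + 1):
        z += rng.gauss(0.0, math.sqrt(dt))
        u = std_normal_cdf(z / math.sqrt(k * dt))  # uniform marginal
        path.append(exponential_quantile(u))       # exponential marginal
    return path
```

Replacing the exponential target with any other quantile function changes the marginal law of the resulting process, which is what makes the construction useful for inducing measure distortions.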
We designed a machine learning algorithm that identifies patterns between ESG profiles and
financial performances for companies in a large investment universe.
The algorithm consists of regularly updated sets of rules that map regions of the high-dimensional
space of ESG features to excess return predictions.
The final aggregated predictions are transformed into scores which allow us to design simple strategies
that screen the investment universe for stocks with positive scores.
By linking the ESG features with financial performances in a non-linear way,
our strategy based upon our machine learning algorithm turns out to be an efficient stock picking tool,
which outperforms classic strategies that screen stocks according to their ESG ratings,
such as the popular best-in-class approach.
Our paper brings new ideas in the growing field of financial literature that investigates
the links between ESG behavior and the economy.
We show indeed that there is clearly some form of alpha in the ESG profile of a company,
but that this alpha can be accessed only with powerful, non-linear techniques such as machine learning.
Bio:
Carmine de Franco is the head of research at Ossiam, an asset management firm specializing in
systematic and quantitative ETFs, located in Paris.
A graduate in Mathematics of the University of Roma II - Tor Vergata and the
University Paris VII - Denis Diderot, he holds a PhD in Probability and a master’s degree in
Financial Random Modelling from the University Paris VII - Denis Diderot.
Carmine joined Ossiam in May 2012 after working for 4 years at the Faculty of Mathematics of the
University of Paris VII (Université Denis Diderot).
His domain of expertise spans from mathematics and probability theory to statistics,
from financial research to the design of investment strategy and cross-assets portfolio construction.
More recently, his research topics have focused on ESG themes, low carbon approaches and biodiversity
in financial investments, machine learning and artificial intelligence.
He is co-author of several research papers on portfolio insurance, modelling and hedging with
stochastic jumps, regime switching models, interest rates, equity, smart beta and factor investing,
ESG, machine learning, Bayesian learning and portfolio construction under uncertainty, carbon and
biodiversity.
Ossiam is a Paris-based asset manager that has focused on quantitative and systematic investment
solutions since 2009, with a distinct vision: providing clear, transparent access to quantitative,
research-based strategies.
Ossiam is an affiliate of Natixis Investment Managers and manages a range of ETFs, open-ended funds,
dedicated funds and mandates across a variety of asset classes and themes.
Ossiam has been a signatory of the UN-supported Principles for Responsible Investment since 2016 and
of the Finance for Biodiversity Pledge since 2021.
As of the end of July 2021, Ossiam had 5 bn EUR in assets under management.

Catastrophe (CAT) bond markets are incomplete and hence carry uncertainty in instrument pricing.
As such, various pricing approaches have been proposed, but none treats the uncertainty in catastrophe
occurrences and interest rates in a sufficiently flexible and statistically reliable way within a
unifying asset pricing framework.
Consequently, little is known empirically about the expected risk-premia of CAT bonds.
The primary contribution of this paper is to present a unified Bayesian CAT bond pricing framework
based on uncertainty quantification of catastrophes and interest rates.
Our framework allows for complex beliefs about catastrophe risks to capture the distinct and common
patterns in catastrophe occurrences, and when combined with stochastic interest rates, yields a
unified asset pricing approach with informative expected risk premia.
Specifically, using a modified collective risk model -- Dirichlet Prior-Hierarchical Bayesian Collective
Risk Model (DP-HBCRM) framework -- we model catastrophe risk via a model-based clustering approach.
Interest rate risk is modeled as a CIR process under the Bayesian approach.
As a consequence of casting CAT pricing models into our framework,
we evaluate the price and expected risk premia of various CAT bond contracts corresponding to
clustering of catastrophe risk profiles.
Numerical experiments on these clusters reveal how CAT bond prices and expected risk premia
relate to claim frequency and loss severity.
This is joint work with Dixon Domfeh and Arpita Chatterjee.
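As a hedged sketch of the interest-rate leg only: a truncated Euler scheme for a CIR short rate and a Monte Carlo zero-coupon bond price. The scheme and all parameter values are illustrative, not the Bayesian calibration used in the paper.

```python
import math
import random

def simulate_cir(r0=0.03, kappa=0.5, theta=0.04, sigma=0.1,
                 dt=1.0 / 252.0, n_steps=252, seed=0):
    """Truncated Euler scheme for the CIR short rate
    dr = kappa*(theta - r) dt + sigma*sqrt(r) dW, clamped at zero so the
    square root stays defined and rates remain nonnegative."""
    rng = random.Random(seed)
    r, path = r0, [r0]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))
        r = max(r + kappa * (theta - r) * dt
                + sigma * math.sqrt(max(r, 0.0)) * dw, 0.0)
        path.append(r)
    return path

# Monte Carlo zero-coupon bond price P(0, 1) = E[exp(-integral of r dt)]
dt = 1.0 / 252.0
paths = [simulate_cir(seed=s) for s in range(200)]
price = sum(math.exp(-sum(p[:-1]) * dt) for p in paths) / len(paths)
```

In the full framework this discounting leg would be combined with the catastrophe-loss model (DP-HBCRM) to price the contingent cash flows of a CAT bond.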
This study examines the impact of ESG ratings on mutual fund holdings, stock returns,
corporate investment, and corporate ESG practices, using panel event studies.
Looking specifically at changes in the MSCI ESG rating, we document that rating downgrades
reduce ownership by mutual funds with a dedicated ESG strategy, while upgrades increase it.
We find a negative long-term response of stock returns to downgrades and a slower and weaker
positive response to upgrades.
Regarding firm responses, we find no significant effect of up- or downgrades on capital
expenditure.
We find that firms adjust their ESG practices following rating changes, but only in the
governance dimension.
These results suggest that ESG rating changes matter in financial markets, but so far have
only a limited impact on the real economy.
Statistical analysis and stochastic interest rate modelling for valuing the future
with implications in climate change mitigation
High future discounting rates favor inaction on present spending, while lower rates advise
more immediate political action. A possible approach to this key issue in the global
economy is to take historical time series of nominal interest rates and inflation,
construct real interest rates from them, and finally obtain the resulting discount
rate according to a specific stochastic model. Extended periods of negative real interest rates,
in which inflation dominates over nominal rates, are commonly observed, occurring in many
epochs and in all countries.
This feature leads us to choose a well-known model in statistical physics,
the Ornstein-Uhlenbeck model, as a basic dynamical tool in which real interest rates
randomly fluctuate and can become negative, even if they tend to revert to a positive mean value.
Covering 14 countries over hundreds of years, we suggest different scenarios and include
an error analysis in order to assess the impact of statistical uncertainty on our results.
We find that only four of the countries have positive long-run discount rates, while the other
ten have negative rates.
Even when one excludes the countries where hyperinflation has occurred, our results support
the need to consider low discounting rates.
The results provided by these fourteen countries significantly increase the priority of
confronting global actions such as climate change mitigation.
We finally extend the analysis by first allowing for fluctuations of the mean level in
the Ornstein-Uhlenbeck model and secondly by considering modified versions of the Feller
and lognormal models.
In both cases, results remain basically unchanged, demonstrating the robustness of
the findings presented.
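The mechanism behind low long-run discount rates can be seen in a toy Monte Carlo version of the model: the relevant quantity averages discount factors, not rates, so negative-rate excursions of the Ornstein-Uhlenbeck process dominate the long-run average. All parameter values below are illustrative, not the calibrated ones.

```python
import math
import random

def ou_path(r0, theta, m, sigma, dt, n_steps, rng):
    """Ornstein-Uhlenbeck real rate: dr = theta*(m - r) dt + sigma dW.
    Unlike square-root models, r may spend long stretches below zero."""
    r, path = r0, [r0]
    for _ in range(n_steps):
        r += theta * (m - r) * dt + sigma * rng.gauss(0.0, math.sqrt(dt))
        path.append(r)
    return path

def effective_discount_rate(m=0.02, theta=0.1, sigma=0.03,
                            horizon=50.0, n_paths=500, seed=0):
    """Long-run rate implied by E[exp(-integral of r dt)]: averaging
    discount factors (Jensen's inequality) drags the effective rate
    below the mean reversion level m."""
    rng = random.Random(seed)
    dt = 0.1
    n_steps = int(horizon / dt)
    mean_df = 0.0
    for _ in range(n_paths):
        path = ou_path(m, theta, m, sigma, dt, n_steps, rng)
        mean_df += math.exp(-sum(path[:-1]) * dt)
    mean_df /= n_paths
    return -math.log(mean_df) / horizon
```

Even with a positive mean level m, the effective long-run discount rate comes out well below m, which is the qualitative driver of the low-discounting conclusion.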
We develop an agent-based simulation of the catastrophe insurance and reinsurance industry
and use it to study the problem of risk model homogeneity.
The model simulates the balance sheets of insurance firms, who collect premiums from clients
in return for insuring them against intermittent, heavy-tailed risks.
Firms manage their capital and pay dividends to their investors, and use either
reinsurance contracts or cat bonds to hedge their tail risk.
The model generates plausible time series of profits and losses and recovers stylized facts,
such as the insurance cycle and the emergence of asymmetric, long-tailed firm-size distributions.
We use the model to investigate the problem of risk model homogeneity.
Under Solvency II, insurance companies are required to use only certified risk models.
This has led to a situation in which only a few firms provide risk models, creating a
systemic fragility to the errors in these models.
We demonstrate that using too few models increases the risk of nonpayment and default
while lowering profits for the industry as a whole.
The presence of the reinsurance industry ameliorates the problem but does not remove it.
Our results suggest that it would be valuable for regulators to incentivize model diversity.
The framework we develop here provides a first step toward a simulation model of the insurance
industry for testing policies and strategies for better capital management.
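A deliberately minimal toy version of the homogeneity mechanism (one Pareto claim per firm-year, no reinsurance, illustrative parameters; not the authors' simulator): when every firm prices with the same downward-biased risk model, defaults can only increase.

```python
import random

def simulate_industry(n_firms=20, n_years=200, alpha=1.5,
                      loading=1.2, model_error=1.0, seed=0):
    """Toy balance-sheet simulation: each firm charges
    premium = loading * (estimated expected loss) and pays one Pareto(alpha)
    claim per year. model_error < 1 means every firm underestimates expected
    losses with the *same* certified risk model. Returns the default count."""
    rng = random.Random(seed)
    true_mean_loss = alpha / (alpha - 1.0)    # Pareto(alpha) mean, scale x_m = 1
    premium = loading * model_error * true_mean_loss
    capital = [5.0 * true_mean_loss] * n_firms
    defaults = 0
    for _ in range(n_years):
        for i in range(n_firms):
            claim = rng.paretovariate(alpha)  # draw even for defaulted firms,
            if capital[i] <= 0.0:             # so runs with different premiums
                continue                      # stay path-by-path comparable
            capital[i] += premium - claim
            if capital[i] <= 0.0:
                defaults += 1
    return defaults
```

Comparing `model_error=1.0` against `model_error=0.7` on the same random seed shows that a shared underestimate of the tail weakly increases the industry-wide default count, the systemic-fragility effect the paper studies at much greater fidelity.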