New articles on Quantitative Finance


[1] 2407.14642

Applying the Nash Bargaining Solution for a Reasonable Royalty II

This paper expands on the concepts presented in Applying the Nash Bargaining Solution for a Reasonable Royalty (arXiv:2005.10158). The goal is to refine the process for determining a reasonable royalty using statistical methods in cases where there is risk and uncertainty regarding each party's disagreement payoffs (opportunity costs) in the Nash Bargaining Solution (NBS). This paper uses a Bayes Cost approach to analyze Case 1, Case 2, and the Original Nash model from the authors' previous work. By addressing risk and uncertainty in this way, the NBS emerges as a more reliable method for estimating a reasonable royalty, consistent with the criteria outlined in Georgia-Pacific factor fifteen.
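For reference, a minimal statement of the two-party NBS underlying this line of work (generic notation, not the paper's): with feasible payoff set S and disagreement payoffs d_1, d_2, the bargained payoffs solve

    \max_{(\pi_1,\pi_2)\in S,\; \pi_i \ge d_i} (\pi_1 - d_1)(\pi_2 - d_2),

and with transferable payoffs each party receives its disagreement payoff plus half of the bargaining surplus, which is what makes the estimation of d_1 and d_2 under risk and uncertainty central to the royalty calculation.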


[2] 2407.14728

An Integral Equation Approach for the Valuation of Finite-Maturity Margin-Call Stock Loans

This paper examines the pricing of margin-call stock loans with finite maturities under the Black-Scholes-Merton framework. In particular, using a Fourier sine transform method, we reduce the partial differential equation governing the price of a margin-call stock loan to an ordinary differential equation, the solution of which can be easily found (in the Fourier sine space) and analytically inverted into the original space. As a result, we obtain an integral representation of the value of the stock loan in terms of the unknown optimal exit prices, which are, in turn, governed by a Volterra integral equation. We can thus break the pricing problem of margin-call stock loans into two steps: 1) finding the optimal exit prices by numerically solving the governing Volterra integral equation, and 2) calculating the values of margin-call stock loans from the obtained optimal exit prices. By validating against and comparing with other available numerical methods, we show that our proposed numerical scheme offers a reliable and efficient way to calculate the service fee of a margin-call stock loan contract, track the contract value over time, and compute the level of stock price above which it is optimal to exit the contract. The effects of the margin-call feature on the loan contract are also examined and quantified.
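The first numerical step, solving a Volterra integral equation, can be illustrated with a generic second-kind equation f(t) = g(t) + \int_0^t K(t,s) f(s) ds solved by the trapezoidal rule; in the paper the unknown is the optimal exit boundary and g, K are those derived there, so the callables below are placeholders only.

    import numpy as np

    def solve_volterra(g, K, T, n):
        """Trapezoidal-rule solver for f(t) = g(t) + int_0^t K(t, s) f(s) ds."""
        t = np.linspace(0.0, T, n + 1)
        h = T / n
        f = np.empty(n + 1)
        f[0] = g(t[0])  # the integral vanishes at t = 0
        for i in range(1, n + 1):
            # trapezoidal weights: h/2 at the endpoints, h in between
            acc = 0.5 * h * K(t[i], t[0]) * f[0]
            acc += h * sum(K(t[i], t[j]) * f[j] for j in range(1, i))
            # move the implicit endpoint term to the left-hand side
            f[i] = (g(t[i]) + acc) / (1.0 - 0.5 * h * K(t[i], t[i]))
        return t, f

    # toy check: f(t) = 1 + int_0^t f(s) ds has exact solution exp(t)
    t, f = solve_volterra(lambda t: 1.0, lambda t, s: 1.0, T=1.0, n=200)
    print(f[-1])  # ~ 2.7183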


[3] 2407.14734

Super-efficiency and Stock Market Valuation: Evidence from Listed Banks in China (2006 to 2023)

This study investigates the relationship between bank efficiency and stock market valuation using an unbalanced panel dataset of 42 listed banks in China from 2006 to 2023. We employ a non-radial, non-oriented, slack-based super-efficiency Data Envelopment Analysis (Super-SBM-UND-VRS) model, which treats non-performing loans (NPLs) as an undesirable output. Our results show that the relationship between super-efficiency and stock market valuation is stronger than that between Return on Assets (ROA) and stock market performance, as measured by Tobin's Q. Notably, the Super-SBM-UND-VRS model yields novel results compared to other efficiency methods, such as the Stochastic Frontier Analysis (SFA) approach and traditional DEA models. Furthermore, our results suggest that bank valuations benefit from decreased ownership concentration, whereas interest rate liberalization has the opposite effect.


[4] 2407.14736

Is the difference between deep hedging and delta hedging a statistical arbitrage?

The recent work of Horikawa and Nakagawa (2024) explains that there exist complete market models in which the difference between the hedging position provided by deep hedging and that of the replicating portfolio is a statistical arbitrage. This raises concerns as it entails that deep hedging can include a speculative component aimed simply at exploiting the structure of the risk measure guiding the hedging optimisation problem. We test whether such finding remains true in a GARCH-based market model. We observe that the difference between deep hedging and delta hedging can be a statistical arbitrage if the risk measure considered does not put sufficient relative weight on adverse outcomes. Nevertheless, a suitable choice of risk measure can prevent the deep hedging agent from including a speculative overlay within its hedging strategy.


[5] 2407.14773

Similarity of Information and Collective Action

We study a canonical collective action game with incomplete information. Individuals attempt to coordinate to achieve a shared goal, while also facing a temptation to free-ride. Consuming more similar information about the fundamentals can help them coordinate, but it can also exacerbate free-riding. Our main result shows that more similar information facilitates (impedes) achieving a common goal when achieving the goal is sufficiently challenging (easy). We apply this insight to show why insufficiently powerful authoritarian governments may face larger protests when attempting to restrict press freedom, and why informational diversity in committees is beneficial when each vote carries more weight.


[6] 2407.14776

National accounting from the bottom up using large-scale financial transactions data: An application to input-output tables

Technical advances have enabled real-time data collection at a large scale, but a lack of standards hampers its economic interpretation. Here, we benchmark a new monthly time series of inter-industrial flows of funds, constructed from aggregated and anonymised real-time payments between UK businesses and covering industries at the 5-digit SIC code level for the period 08/2015 to 12/2023, against established economic indicators, including GDP, input-output tables (IOTs), and stylised facts of granular firm- and industry-level production networks. We supplement the quantitative analyses with conceptual discussions, explaining the caveats of bottom-up payment data and how they differ from national accounts tables. The results reveal strong correlations with GDP and some qualitative consistency with official IOTs and stylised facts. We provide guidance on the interpretation of the data and on areas that require special attention for reliable quantitative research.


[7] 2407.14955

Temptation: Immediacy and certainty

Is an option especially tempting when it is both immediate and certain? I test the effect of risk on the present-bias factor given quasi-hyperbolic discounting. My experimental subjects allocate about thirty to fifty minutes of real-effort tasks between two weeks. I study dynamic consistency by comparing choices made two days in advance of the workday with choices made when work is imminent. My novel design permits estimation of present-bias using a decision with a consequence that is both immediate and certain. I find greater present-bias when the consequence is certain. I offer a methodological remedy for experimental economists.
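For reference, the quasi-hyperbolic (beta-delta) model behind the term "present-bias factor" is standardly written as

    U_t = u(c_t) + \beta \sum_{k \ge 1} \delta^{k} u(c_{t+k}), \qquad 0 < \beta \le 1,

where \beta < 1 captures present bias and \delta is the usual exponential discount factor; the experiment asks whether the estimated \beta differs when the immediate consequence is certain rather than risky.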


[8] 2407.15038

Explainable AI in Request-for-Quote

In the contemporary financial landscape, accurately predicting the probability of filling a Request-For-Quote (RFQ) is crucial for improving market efficiency for less liquid asset classes. This paper explores the application of explainable AI (XAI) models to forecast the likelihood of RFQ fulfillment. By leveraging advanced algorithms including Logistic Regression, Random Forest, XGBoost and Bayesian Neural Tree, we are able to improve the accuracy of RFQ fill rate predictions and generate the most efficient quote price for market makers. XAI serves as a robust and transparent tool for market participants to navigate the complexities of RFQs with greater precision.
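As a minimal, hypothetical illustration of the prediction task only (the features, data, and model choices below are invented for the sketch and are not the paper's), a fill-probability classifier can be fit on quote-level features and queried for the probability that an RFQ is filled:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5_000
    # hypothetical RFQ features: quote distance from mid (bps), notional, recent hit rate
    X = np.column_stack([
        rng.normal(2.0, 1.0, n),
        rng.lognormal(0.0, 1.0, n),
        rng.uniform(0.0, 1.0, n),
    ])
    # synthetic fill indicator: tighter quotes and higher hit rates fill more often
    logit = 1.5 - 0.8 * X[:, 0] - 0.1 * X[:, 1] + 2.0 * X[:, 2]
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    model = LogisticRegression().fit(X, y)
    p_fill = model.predict_proba([[1.0, 2.0, 0.6]])[0, 1]  # fill probability for one RFQ
    print(round(p_fill, 3))

In practice the quoted price would be chosen by trading off such a fill probability against the margin earned if filled; the explainability layer discussed in the paper is not part of this sketch.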


[9] 2407.15105

Weak convergence implies convergence in mean within GGC

We prove that weak convergence within the class of generalized gamma convolution (GGC) distributions implies convergence of the mean values. We use this fact to show the robustness of the expected-utility-maximizing optimal portfolio under an exponential utility function when return vectors are modelled by hyperbolic distributions.
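In symbols, our reading of the result: if X_n and X all have GGC distributions and X_n converges weakly to X, then \mathbb{E}[X_n] \to \mathbb{E}[X]; it is this passage from distributional convergence to convergence of means that underpins the robustness of the exponential-utility optimal portfolio.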


[10] 2407.15147

Industry Dynamics with Cartels: The Case of the Container Shipping Industry

I investigate how explicit cartels, known as "shipping conferences", in the global container shipping market facilitated the formation of one of the largest globally integrated markets through the entry, exit, and shipbuilding investment of shipping firms. Using novel data, I develop a structural model and find that the cartels shifted shipping prices by 20-50% and encouraged firms' entry and investment. In the counterfactual, I find that cartels would increase producer surplus while slightly decreasing consumer surplus, and may thereby increase social welfare by encouraging firms' entry and shipbuilding investment. This suggests that industrial policies controlling prices and quantities in the early stages of a new industry may not always be harmful. Investigating hypothetical allocation rules supporting large or small firms, I find that the actual rule, based on tonnage shares, best maximizes social welfare.


[11] 2407.15339

Deep Learning for Economists

Deep learning provides powerful methods to impute structured information from large-scale, unstructured text and image datasets. For example, economists might wish to detect the presence of economic activity in satellite images, or to measure the topics or entities mentioned in social media, the congressional record, or firm filings. This review introduces deep neural networks, covering methods such as classifiers, regression models, generative AI, and embedding models. Applications include classification, document digitization, record linkage, and methods for data exploration in massive-scale text and image corpora. When suitable methods are used, deep learning models can be cheap to tune and can scale affordably to problems involving millions or billions of data points. The review is accompanied by a companion website, EconDL, with user-friendly demo notebooks, software resources, and a knowledge base that provides technical details and additional applications.


[12] 2407.15509

The increase in the number of low-value transactions in international trade

This paper documents a new feature of international trade: the increase in the number of low-value transactions. Using Spanish data, we show that the share of low-value transactions in the total number of transactions increased from 9% to 61% in exports and from 14% to 54% in imports between 1997 and 2023. The increase in the number of low-value trade transactions is related to the rise in e-commerce and direct-to-customer sales facilitated by online retail platforms. In the case of exports, the increase in the number of low-value transactions is also explained by the fast-fashion strategy followed by clothing firms.


[13] 2407.15532

Large-scale Time-Varying Portfolio Optimisation using Graph Attention Networks

Apart from assessing individual asset performance, investors in financial markets also need to consider how a set of firms performs collectively as a portfolio. Whereas traditional Markowitz-based mean-variance portfolios are widespread, network-based optimisation techniques have built upon these developments. However, most studies exclude firms at risk of default and remove any firms that drop off indices over a certain time. This is the first study to incorporate such risky firms and use all of the firms in portfolio optimisation. We propose and empirically test a novel method that leverages Graph Attention Networks (GATs), a subclass of Graph Neural Networks (GNNs). GNNs, as deep learning-based models, can exploit network data to uncover nonlinear relationships. Their ability to handle high-dimensional features and accommodate customised layers for specific purposes makes them particularly appealing for large-scale problems such as mid- and small-cap portfolio optimisation. This study utilises 30 years of data on mid-cap firms, creating graphs of firms using distance correlation and the Triangulated Maximally Filtered Graph approach. These graphs are the inputs to a GAT model that we train using custom layers which impose weight and allocation constraints and a loss function derived from the Sharpe ratio, thus directly maximising portfolio risk-adjusted returns. This new model is benchmarked against a network-characteristic-based portfolio, a mean-variance-based portfolio, and an equal-weighted portfolio. The results show that the portfolio produced by the GAT-based model outperforms all benchmarks and is consistently superior to other strategies over a long period, while also being informative of market dynamics.
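A minimal sketch of the Sharpe-ratio loss idea with a long-only, fully invested allocation layer (generic PyTorch under our own assumptions; it is not the authors' GAT architecture, graph construction, or constraint set):

    import torch

    def sharpe_loss(scores, returns, eps=1e-8):
        """Negative Sharpe ratio of a portfolio whose weights are a softmax
        of model scores (long-only, weights summing to one)."""
        w = torch.softmax(scores, dim=0)   # allocation-constraint layer
        port = returns @ w                 # portfolio return series
        return -port.mean() / (port.std() + eps)

    # toy usage: optimise raw scores directly on simulated returns
    torch.manual_seed(0)
    returns = 0.01 * torch.randn(250, 20)  # 250 periods, 20 assets
    scores = torch.zeros(20, requires_grad=True)
    opt = torch.optim.Adam([scores], lr=0.05)
    for _ in range(100):
        opt.zero_grad()
        sharpe_loss(scores, returns).backward()
        opt.step()

In the paper the scores would come from the GAT applied to the firm graph; here they are free parameters purely to show that minimising this loss maximises the in-sample Sharpe ratio.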


[14] 2407.15536

Calibrating the Heston Model with Deep Differential Networks

We propose a gradient-based deep learning framework to calibrate the Heston option pricing model (Heston, 1993). Our neural network, henceforth deep differential network (DDN), learns both the Heston pricing formula for plain-vanilla options and the partial derivatives with respect to the model parameters. The price sensitivities estimated by the DDN are not subject to the numerical issues that can be encountered in computing the gradient of the Heston pricing function. Thus, our network is an excellent pricing engine for fast gradient-based calibrations. Extensive tests on selected equity markets show that the DDN significantly outperforms non-differential feedforward neural networks in terms of calibration accuracy. In addition, it dramatically reduces the computational time with respect to global optimizers that do not use gradient information.
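A minimal sketch of the differential-training idea, penalising errors in both the price and its derivatives with respect to the network inputs (generic PyTorch with synthetic targets standing in for Heston prices and sensitivities; not the authors' architecture or data):

    import torch
    import torch.nn as nn

    net = nn.Sequential(nn.Linear(5, 64), nn.Tanh(),
                        nn.Linear(64, 64), nn.Tanh(),
                        nn.Linear(64, 1))

    def differential_loss(x, price_target, grad_target, w=1.0):
        """Fit the price and its gradient w.r.t. the inputs (here standing in
        for Heston parameters and contract terms)."""
        x = x.requires_grad_(True)
        price = net(x)
        grads, = torch.autograd.grad(price.sum(), x, create_graph=True)
        return ((price.squeeze(-1) - price_target) ** 2).mean() \
               + w * ((grads - grad_target) ** 2).mean()

    # toy usage with placeholder targets
    torch.manual_seed(0)
    x = torch.rand(256, 5)
    price_target = x.sum(dim=1)        # placeholder "prices"
    grad_target = torch.ones(256, 5)   # placeholder sensitivities
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(200):
        opt.zero_grad()
        differential_loss(x, price_target, grad_target).backward()
        opt.step()

Once trained this way, the network's analytic sensitivities can be fed to a gradient-based calibrator in place of numerical differentiation of the Heston formula, which is the use the abstract describes.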


[15] 2407.15755

Income, health, and cointegration

Data for many nations show a long-run increase, over many decades, of income, indexed by GDP per capita, and population health, indexed by mortality or life expectancy at birth (LEB). However, the short-run and long-run relationships between these variables have been interpreted in different ways, and many controversies are still open. Some authors have claimed that the causal relationships between population health and income can be discovered using cointegration models. We show, however, that empirically testing a cointegration relation between LEB and GDP per capita is not a sound method to infer a causal link between health and income. For a given country it is easy to find computer-generated data or time series of real observations, related or unrelated to the country, that according to standard methods are also cointegrated with the country's LEB. More generally, given a trending time series, it is easy to find other series, observational or artificial, that appear cointegrated with it. Thus, standard cointegration methodology cannot distinguish whether cointegration relationships are spurious or causal.
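For context, the relation being tested is the standard one: two I(1) series x_t and y_t are cointegrated if some linear combination y_t - \beta x_t is stationary. The abstract's point is that this purely statistical property can hold between LEB and essentially any other trending series, so finding it carries no causal information on its own.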


[16] 2407.15757

Willingness to Pay for an Electricity Connection: A Choice Experiment Among Rural Households and Enterprises in Nigeria

Rural electrification initiatives worldwide frequently encounter financial planning challenges due to a lack of reliable market insights. This research delves into the preferences and marginal willingness to pay (mWTP) for upfront electricity connections in rural and peri-urban areas of Nigeria. We investigate discrete choice experiment data gathered from 3,599 households and 1,122 small and medium-sized enterprises (SMEs) across three geopolitical zones of Nigeria, collected during the 2021 PeopleSuN project survey phase. Employing conditional logit modeling, we analyze these data to explore preferences and the marginal willingness to pay for an electricity connection. Our findings show that households prioritize nighttime electricity access, while SMEs place a higher value on daytime electricity. When comparing improvements in electricity capacity to medium or high capacity, SMEs exhibit a sharp increase in willingness to pay for high capacity, while households value the two options more evenly. Preferences for the electricity source vary among SMEs, but households display a reluctance towards diesel generators and a preference for grid or solar solutions. Moreover, households with older heads express greater aversion to connection fees, and male-headed households show a stronger preference for nighttime electricity compared to their female-headed counterparts. The outcomes of this study yield pivotal insights for tailoring electrification strategies for rural Nigeria, emphasizing the importance of considering the diverse preferences of households and SMEs.
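For reference, in a conditional logit with a utility index that is linear in attributes and cost, the marginal willingness to pay for attribute k is typically recovered as the coefficient ratio

    \mathrm{mWTP}_k = -\beta_k / \beta_{\mathrm{cost}},

where \beta_k is the coefficient on the attribute (e.g., nighttime availability or capacity level) and \beta_{\mathrm{cost}} is the coefficient on the upfront connection fee (generic notation; the paper's exact specification may differ).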


[17] 2407.15766

Analyzing selected cryptocurrencies' spillover effects on global financial indices: Comparing risk measures using conventional and eGARCH-EVT-Copula approaches

This study examines the interdependence between cryptocurrencies and international financial indices, such as MSCI World and MSCI Emerging Markets. We compute the value at risk (VaR), expected shortfall (ES), and range value at risk (RVaR) and investigate the dynamics of risk spillover. We employ a hybrid approach to derive these risk measures that integrates GARCH models, extreme value theory (EVT) models, and copula functions. This framework uses a bivariate portfolio approach involving cryptocurrency data and traditional financial indices. To estimate these risks for the portfolio structures, we employ symmetric and asymmetric GARCH models and tail-flexible EVT models as marginals to model the distribution of each return series, and apply different copula functions to connect the pairs of marginal distributions into a multivariate distribution. The empirical findings indicate that the eGARCH-EVT-based copula model adeptly captures intricate dependencies, surpassing conventional methodologies such as historical simulation and the t-distributed parametric method in VaR estimation. At the same time, the historical simulation method proves superior for ES, and the t-distributed parametric method outperforms for RVaR. Finally, the Diebold-Yilmaz approach is applied to compute risk spillovers between four sets of asset sequences. The results imply that cryptocurrencies exhibit substantial spillover effects among themselves but minimal impact on other assets. From this, it can be concluded that cryptocurrencies offer diversification benefits but do not provide hedging advantages within an investor's portfolio. Our results underline RVaR's superiority over ES with regard to regulatory arbitrage and model misspecification. The conclusions of this study will benefit investors and financial market professionals who aspire to understand digital currencies as a novel asset class and to attain clarity on regulatory arbitrage.
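As a minimal illustration of the three risk measures on simulated losses (plain empirical estimators only; the paper's estimates come from the GARCH-EVT-copula pipeline, which this sketch does not implement):

    import numpy as np

    def var_es_rvar(losses, alpha=0.95, beta=0.99):
        """Empirical VaR, ES and range VaR from a vector of losses.

        VaR_alpha : alpha-quantile of the losses
        ES_alpha  : mean loss beyond VaR_alpha
        RVaR      : mean loss between the alpha- and beta-quantiles
        """
        var_a = np.quantile(losses, alpha)
        var_b = np.quantile(losses, beta)
        es = losses[losses >= var_a].mean()
        rvar = losses[(losses >= var_a) & (losses <= var_b)].mean()
        return var_a, es, rvar

    # toy usage: heavy-tailed simulated losses standing in for a portfolio
    rng = np.random.default_rng(0)
    print(var_es_rvar(rng.standard_t(df=4, size=100_000)))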


[18] 2407.14573

Trading Devil Final: Backdoor attack via Stock market and Bayesian Optimization

Since the advent of generative artificial intelligence, companies and researchers have rushed to develop their own generative models, whether commercial or not. Despite the large number of users of these powerful new tools, there is currently no intrinsically verifiable way to explain, from the ground up, what happens when LLMs (large language models) learn; this is particularly true of models built on automatic speech recognition systems, which must rely on huge amounts of data collected from across the web to produce fast and efficient results. In this article, we develop a backdoor attack called MarketBackFinal 2.0, based on acoustic data poisoning. MarketBackFinal 2.0 builds mainly on modern stock market models and demonstrates the possible vulnerabilities of speech-based transformers that may rely on LLMs.


[19] 2407.15016

Rethinking Digitalization and Climate: Don't Predict, Mitigate

Digitalization is a core component of the green transition. Today's focus is on quantifying and predicting the climate effects of digitalization through various life-cycle assessments and baseline scenario methodologies. Here we argue that this is a mistake. Most attempts at prediction are based on three implicit assumptions: (a) the digital carbon footprint can be quantified, (b) business-as-usual with episodic change leading to a new era of stability, and (c) investments in digitalization will be delivered within the cost, timeframe, and benefits described in their business cases. We problematize each assumption within the context of digitalization and argue that the digital carbon footprint is inherently unpredictable. We build on the uncertainty literature to show that even if you cannot predict, you can still mitigate. On that basis, we propose to rethink practice on the digital carbon footprint from prediction to mitigation.


[20] 2407.15388

A new paradigm of mortality modeling via individual vitality dynamics

The significance of mortality modeling extends across multiple research areas, including life insurance valuation, longevity risk management, life-cycle hypothesis, and retirement income planning. Despite the variety of existing approaches, such as mortality laws and factor-based models, they often lack compatibility or fail to meet specific research needs. To address these shortcomings, this study introduces a novel approach centered on modeling the dynamics of individual vitality and defining mortality as the depletion of vitality level to zero. More specifically, we develop a four-component framework to analyze the initial value, trend, diffusion, and sudden changes in vitality level over an individual's lifetime. We demonstrate the framework's estimation and analytical capabilities in various settings and discuss its practical implications in actuarial problems and other research areas. The broad applicability and interpretability of our vitality-based modeling approach offer an enhanced paradigm for mortality modeling.
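A generic way to write down such a model, purely to illustrate the four components named above (this is not the authors' specification): vitality starts at an initial level, drifts, diffuses, and jumps, and death is the first passage to zero,

    V_0 = v_0, \qquad dV_t = -\mu(t)\,dt + \sigma(t)\,dW_t - dJ_t, \qquad \tau = \inf\{t \ge 0 : V_t \le 0\},

where v_0 is the initial vitality, \mu the downward trend, W a Brownian motion driving diffusion, J a jump process capturing sudden changes, and \tau the individual's random time of death.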


[21] 2407.15715

Cryptoeconomics and Tokenomics as Economics: A Survey with Opinions

This paper surveys products and studies on cryptoeconomics and tokenomics from an economic perspective, as these terms remain (i) ill-defined and (ii) disconnected from economic disciplines. We first suggest that they can be novel when integrated; we then conduct a literature review and case studies on consensus building for decentralization and on token value for autonomy. Integration requires simultaneous consideration of strategic behavior, spamming, Sybil attacks, free-riding, marginal cost, marginal utility, and stabilizers. This survey is the first systematization of knowledge on cryptoeconomics and tokenomics, and aims to bridge the contexts of economics and blockchain.