This PhD thesis presents an investigation into the analysis of financial returns using mixture models, focusing on mixtures of generalized normal distributions (MGND) and their extensions. The study addresses several critical issues encountered in the estimation process and proposes innovative solutions to enhance accuracy and efficiency. Chapter 2 focuses on the MGND model and its estimation via the expectation conditional maximization (ECM) and generalized expectation maximization (GEM) algorithms. A thorough exploration reveals a degeneracy issue when estimating the shape parameter, and several algorithms are proposed to overcome it. Chapter 3 extends the theoretical perspective by applying the MGND model to several stock market indices; a two-step approach is proposed for identifying turmoil days and estimating returns and volatility. Chapter 4 introduces the constrained mixture of generalized normal distributions (CMGND), enhancing interpretability and efficiency by imposing constraints on parameters, and simulation results highlight the benefits of constrained parameter estimation. Finally, Chapter 5 introduces generalized normal distribution hidden Markov models (GND-HMMs), which capture the dynamic nature of financial returns. This manuscript contributes to the statistical modelling of financial returns by offering flexible, parsimonious, and interpretable frameworks. The proposed mixture models capture complex patterns in financial data, thereby facilitating more informed decision-making in financial analysis and risk management.
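As a concrete anchor for the models above, the following minimal Python sketch (not the thesis code; the parameter names pi, mu, alpha, beta are assumptions) evaluates the MGND density and the E-step responsibilities shared by EM, ECM and GEM variants.

```python
# A minimal sketch, assuming the standard generalized normal parameterisation
# f(x; mu, alpha, beta) = beta / (2 alpha Gamma(1/beta)) exp(-(|x-mu|/alpha)^beta).
import numpy as np
from scipy.special import gammaln

def gnd_logpdf(x, mu, alpha, beta):
    """Log-density of the generalized normal distribution."""
    return (np.log(beta) - np.log(2 * alpha) - gammaln(1.0 / beta)
            - (np.abs(x - mu) / alpha) ** beta)

def mgnd_loglik(x, pi, mu, alpha, beta):
    """Log-likelihood of a K-component MGND for a sample x of shape (n,)."""
    comp = np.log(pi)[None, :] + gnd_logpdf(x[:, None], mu, alpha, beta)
    m = comp.max(axis=1, keepdims=True)          # log-sum-exp for stability
    return float(np.sum(m.squeeze() + np.log(np.exp(comp - m).sum(axis=1))))

def responsibilities(x, pi, mu, alpha, beta):
    """E-step posterior component probabilities for each observation."""
    comp = np.log(pi)[None, :] + gnd_logpdf(x[:, None], mu, alpha, beta)
    comp -= comp.max(axis=1, keepdims=True)
    w = np.exp(comp)
    return w / w.sum(axis=1, keepdims=True)

x = np.random.default_rng(0).standard_normal(500)
print(mgnd_loglik(x, np.array([0.5, 0.5]), np.array([0.0, 0.0]),
                  np.array([1.0, 2.0]), np.array([2.0, 1.0])))
```

Maximising this likelihood in the shape parameters beta is where the degeneracy discussed in Chapter 2 arises.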
In the current context of accelerated globalization and digitalization, financial markets are becoming more complex and uncertain, and identifying and preventing economic risks has become central to maintaining the stability of the financial system. Traditional risk identification methods are often limited because they struggle to cope with the multi-level, dynamically changing relationships in financial networks. With the rapid development of financial technology, graph neural networks (GNNs), an emerging deep learning method, have shown great potential in financial risk management. A GNN can map transaction behaviors, financial institutions, individuals, and their interactions into a graph structure and, through embedded representation learning, capture latent patterns and abnormal signals in financial data. Using this technology, financial institutions can extract valuable information from complex transaction networks, identify in a timely manner the hidden risks or abnormal behaviors that may cause systemic risk, optimize decision-making processes, and improve the accuracy of risk warnings. This paper explores an economic risk identification algorithm based on GNNs, aiming to provide financial institutions and regulators with more intelligent technical tools that help maintain the security and stability of the financial market. Improving the efficiency of economic risk identification through such innovative techniques is expected to strengthen the risk resistance of the financial system and lay the foundation for a robust global financial system.
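To make the mechanism concrete, here is a minimal NumPy sketch of the message-passing step underlying GNNs on a toy transaction graph; the adjacency matrix, node features, and anomaly readout are illustrative assumptions, not the paper's algorithm.

```python
# A minimal sketch of one graph-convolution step, H' = ReLU(D^-1/2 (A+I) D^-1/2 H W);
# the toy graph (4 accounts, edges = transactions) is an assumption for illustration.
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step with self-loops and symmetric normalisation."""
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)        # who transacts with whom
X = rng.normal(size=(4, 3))                      # per-account features
W1, W2 = rng.normal(size=(3, 8)), rng.normal(size=(8, 4))

H = gcn_layer(A, gcn_layer(A, X, W1), W2)        # two rounds of message passing
scores = np.linalg.norm(H, axis=1)               # toy per-account anomaly score
print(scores)
```

Each layer mixes an account's features with those of its transaction neighbours, which is how embedded representations come to encode network-level risk signals.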
The objective of the paper is to price weather derivative contracts based on temperature and precipitation as underlying climate variables. We use a neural network approach combined with time series forecasting to value the Pacific Rim index in Toronto and Chicago.
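As a stylized illustration of the pricing target (not the paper's method), the sketch below Monte Carlo-values a call on a cumulative-average-temperature index; the AR(1) temperature dynamics, strike, and tick size are all assumptions invented for illustration.

```python
# A minimal sketch, assuming a toy AR(1) model for daily average temperature
# and a call payoff on the cumulative-average-temperature (CAT) index.
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_days = 10_000, 30
mean_temp, phi, sigma = 18.0, 0.7, 2.0           # seasonal mean, persistence, noise

temps = np.empty((n_paths, n_days))
temps[:, 0] = mean_temp + sigma * rng.standard_normal(n_paths)
for t in range(1, n_days):
    temps[:, t] = (mean_temp + phi * (temps[:, t - 1] - mean_temp)
                   + sigma * rng.standard_normal(n_paths))

cat_index = temps.sum(axis=1)                    # CAT = sum of daily averages
strike, tick, r, T = 550.0, 20.0, 0.03, n_days / 365
payoff = tick * np.maximum(cat_index - strike, 0.0)
price = np.exp(-r * T) * payoff.mean()           # discounted expected payoff
print(f"call on CAT index ~ {price:.2f}")
```

A neural network forecaster would replace the AR(1) block here, feeding simulated temperature paths into the same payoff computation.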
The aim of this study is to estimate the implications of the consumption tax reform, as outlined in the version of the Projeto de Lei Complementar (PLP) 68 approved by the Brazilian Chamber of Deputies in July 2024, for the reference rate and for the distribution of the tax burden among households.
This paper introduces and examines numerical approximation schemes for computing risk budgeting portfolios associated with positive homogeneous and sub-additive risk measures. We employ Mirror Descent algorithms to determine the optimal risk budgeting weights in both deterministic and stochastic settings, establishing convergence along with an explicit non-asymptotic quantitative rate for the averaged algorithm. A comprehensive numerical analysis follows, illustrating our theoretical findings across various risk measures, including standard deviation, Expected Shortfall, deviation measures, and Variantiles, and comparing the performance with that of the standard stochastic gradient descent method recently proposed in the literature.
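For a concrete instance of the scheme, the sketch below runs entropic Mirror Descent for risk budgeting under the standard-deviation risk measure; the covariance matrix, budgets, step size, and iteration count are illustrative assumptions rather than the paper's tuned settings.

```python
# A minimal sketch: minimise R(y) - sum_i b_i log(y_i) with the multiplicative
# (entropic mirror) update, then normalise y to obtain risk budgeting weights.
import numpy as np

def risk_budgeting_md(Sigma, b, eta=0.05, n_iter=5000):
    y = np.ones(len(b))
    for _ in range(n_iter):
        R = np.sqrt(y @ Sigma @ y)               # R(y) = sqrt(y' Sigma y)
        grad = Sigma @ y / R - b / y             # gradient of the objective
        y = y * np.exp(-eta * grad)              # mirror step keeps y > 0
    return y / y.sum()

Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
b = np.array([0.5, 0.3, 0.2])                    # target risk budgets
w = risk_budgeting_md(Sigma, b)
rc = w * (Sigma @ w) / np.sqrt(w @ Sigma @ w)    # realised risk contributions
print(w, rc / rc.sum())                          # rc/rc.sum() should match b
```

At the fixed point the multiplicative update leaves y unchanged exactly when each asset's risk contribution equals its budget b_i, which is the defining property of a risk budgeting portfolio.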
In this paper, we introduce a novel pricing model for Uniswap V3, built upon stochastic processes and the Martingale Stopping Theorem. This model innovatively frames the valuation of positions within Uniswap V3. We further conduct a numerical analysis and examine the sensitivities through Greek risk measures to elucidate the model's implications. The results underscore the model's significant academic contribution and its practical applicability for Uniswap liquidity providers, particularly in assessing risk exposure and guiding hedging strategies.
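The payoff being valued can be stated compactly with standard Uniswap V3 liquidity math; the sketch below marks a position to market and approximates one Greek by finite differences. The liquidity, price range, and bump size are illustrative assumptions, not the paper's stochastic model.

```python
# A minimal sketch of a Uniswap V3 position's holdings and mark-to-market
# value (in units of token Y), using the standard concentrated-liquidity math.
import math

def v3_position_value(P, pa, pb, L=1.0):
    """Value at price P of a position with liquidity L on the range [pa, pb]."""
    sa, sb, sp = math.sqrt(pa), math.sqrt(pb), math.sqrt(P)
    if P <= pa:                                  # below range: all token X
        x, y = L * (1 / sa - 1 / sb), 0.0
    elif P >= pb:                                # above range: all token Y
        x, y = 0.0, L * (sb - sa)
    else:                                        # in range: mix of both
        x, y = L * (1 / sp - 1 / sb), L * (sp - sa)
    return x * P + y

# finite-difference delta, a numerical stand-in for the paper's Greeks
P, h = 1.0, 1e-4
delta = (v3_position_value(P + h, 0.8, 1.25) -
         v3_position_value(P - h, 0.8, 1.25)) / (2 * h)
print(v3_position_value(P, 0.8, 1.25), delta)
```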
Rising market power threatens competition and decreases consumer welfare. To date, a few studies have shown that global firm-level markups are increasing, but there is scant evidence on the channels of this change. This study investigates the causal impact of takeovers on markups and related firm-level outcomes in European manufacturing in 2007-2021. Interestingly, findings suggest that takeovers aimed at vertical integration strategies are procompetitive because they result in lower markups (0.7%) and higher sales (2.9%). The effects grow as time passes from the takeover event, and they increase with the number of subsidiaries the parent has already integrated. Notably, we do not find a significant impact on markups for horizontal integration strategies once we control for cherry-picking by acquirers. Our results on vertical takeovers point to strategies aimed at eliminating double profit margins on input markets: lower markups increase sales, spreading fixed costs and exploiting economies of scale. Several checks on methods and sample composition effects confirm our central tenets. Finally, we reconnect with the debate initiated by the U.S. Vertical Merger Guidelines (2020; 2023), in which the presumption of harm after vertical deals has been softened to account for procompetitive effects, while the discussion of potential harms remains open.
It is well established that the home advantage (HA), the phenomenon that on average the local team performs better than the visiting team, exists in many sports. In response to the COVID-19 outbreak, spectators were banned from football stadiums, which we leverage as a natural experiment to examine the impact of stadium spectators on the HA. Using data from the first division of the German Bundesliga for seasons 2016/17 to 2023/24, we are the first to focus on a longer time horizon, considering not only the first but all three seasons subject to spectator regulations, as well as the two subsequent seasons without them. We confirm previous studies regarding the disappearance of the HA in the last nine matches of season 2019/20; this drop materialised almost entirely through a reduction in home goals. The HA in season 2020/21 (with a spectator ban during most matches) was very close to that of the pre-COVID-19 season 2018/19, indicating that teams became accustomed to the absence of spectators. For season 2021/22, with varying spectator regulations, we detect a U-shaped relationship between the HA and the stadium utilisation rate, with the HA increasing considerably for matches with medium stadium utilisation, which is associated with a larger difference in running distance between the home and away teams.
Tax administrative cost reduction is an economically and socially desirable goal for public policy. This article proposes total administrative cost as a percentage of total tax revenue as a vivid measure, also useful for cross-jurisdiction comparisons. Statistical data, surveys and a novel estimation approach demonstrate that Germany's 2021 tax administrative costs likely exceeded 20% of total tax revenue, indicating a need for improvement in Germany's taxation system, and in the many jurisdictions with similar tax regimes. In addition, this article outlines possible reasons for and implications of the seemingly high tax administrative burden, as well as possible solutions.
We study the distributional implications of uncertainty shocks by developing a model that links macroeconomic aggregates to the US distributions of earnings and consumption. We find that, initially, the fraction of low-earning workers decreases while the share of households reporting low consumption increases; at longer horizons, the fraction of low-income workers increases, but the consumption distribution reverts to its pre-shock shape. The first phase thus reduces income inequality and increases consumption inequality, whereas in the second phase income inequality rises while the effects on consumption inequality dissipate. Finally, we introduce Functional Local Projections and show that they yield similar results.
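For readers unfamiliar with local projections, the scalar-outcome sketch below shows the regression-per-horizon idea that Functional Local Projections extend to entire distributions; the simulated shock and outcome series are assumptions for illustration only.

```python
# A minimal sketch of local projections: for each horizon h, regress y_{t+h}
# on the shock at time t by OLS; the slope traces out the impulse response.
import numpy as np

rng = np.random.default_rng(2)
T, H = 400, 12
shock = rng.standard_normal(T)
y = np.zeros(T)                                  # toy outcome with a lagged response
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + 0.5 * shock[t] + 0.3 * rng.standard_normal()

irf = []
for h in range(H + 1):
    Y = y[h:]                                    # y_{t+h}
    X = np.column_stack([np.ones(T - h), shock[:T - h]])
    beta = np.linalg.lstsq(X, Y, rcond=None)[0]  # OLS per horizon
    irf.append(beta[1])                          # response at horizon h
print(np.round(irf, 3))
```

In the functional variant, the scalar y_{t+h} is replaced by an entire cross-sectional distribution (here, of earnings or consumption) projected on the uncertainty shock.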
Advancements in large language models (LLMs) have renewed concerns about AI alignment - the consistency between human and AI goals and values. As various jurisdictions enact legislation on AI safety, the concept of alignment must be defined and measured across different domains. This paper proposes an experimental framework to assess whether LLMs adhere to ethical and legal standards in the relatively unexplored context of finance. We prompt nine LLMs to impersonate the CEO of a financial institution and test their willingness to misuse customer assets to repay outstanding corporate debt. Beginning with a baseline configuration, we adjust preferences, incentives and constraints, analyzing the impact of each adjustment with logistic regression. Our findings reveal significant heterogeneity in the baseline propensity for unethical behavior of LLMs. Factors such as risk aversion, profit expectations, and regulatory environment consistently influence misalignment in ways predicted by economic theory, although the magnitude of these effects varies across LLMs. This paper highlights both the benefits and limitations of simulation-based, ex post safety testing. While it can inform financial authorities and institutions aiming to ensure LLM safety, there is a clear trade-off between generality and cost.
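The analysis step can be illustrated schematically: the sketch below fits a logistic regression of a binary misuse decision on the experiment's factors, using simulated responses whose effect directions merely echo those reported; none of this is the paper's data.

```python
# A minimal sketch, assuming simulated LLM decisions; the factor names and
# coefficients in the toy data-generating process are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2000
risk_aversion = rng.uniform(0, 1, n)             # adjusted preference
profit_expectation = rng.uniform(0, 1, n)        # adjusted incentive
regulation = rng.integers(0, 2, n)               # constraint on/off
logit = -0.5 - 2.0 * risk_aversion + 2.5 * profit_expectation - 1.5 * regulation
misuse = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([risk_aversion, profit_expectation, regulation])
model = LogisticRegression().fit(X, misuse)
print(model.coef_, model.intercept_)             # signs recover the toy DGP
```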
This study examines conversational business analytics, an approach that utilizes AI to address the technical competency gaps that hindered end users from effectively using traditional self-service analytics. By facilitating natural language interactions, conversational business analytics aims to enable end users to independently retrieve data and generate insights. The analysis focuses on Text-to-SQL as a representative technology for translating natural language requests into SQL statements. Using models grounded in expected utility theory, the study identifies conditions under which conversational business analytics, through partial or full support, can outperform delegation to human experts. The results indicate that partial support, which focuses solely on information generation by AI, is viable when the accuracy of AI-generated SQL queries exceeds a defined threshold. In contrast, full support includes not only information generation but also validation through explanations provided by the AI, and requires sufficiently high validation effectiveness to be reliable. However, user-based validation presents challenges, such as misjudgment and rejection of valid SQL queries, which may limit the effectiveness of conversational business analytics. These challenges underscore the need for robust validation mechanisms, including improved user support, automated processes, and methods for assessing quality independently of end users' technical competencies.
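The accuracy-threshold result for partial support can be illustrated with a stylized expected-utility comparison; the payoff parameters B (value of a correct insight), C (cost of acting on a wrong query) and D (cost of delegating to an expert) below are invented for illustration, not the paper's calibration.

```python
# A minimal sketch: partial support (AI generates SQL, user acts on it) beats
# delegation to an expert once query accuracy p clears a payoff-driven threshold.
B, C, D = 100.0, 60.0, 30.0                      # assumed payoffs

def eu_partial(p):                               # expected utility of partial support
    return p * B - (1 - p) * C

eu_delegate = B - D                              # expert assumed accurate

p_star = (B + C - D) / (B + C)                   # solves eu_partial(p) = eu_delegate
print(f"partial support preferred once accuracy p > {p_star:.2f}")
print(eu_partial(0.9) > eu_delegate)             # e.g. 90% accuracy suffices here
```

Setting eu_partial(p) equal to the delegation payoff gives the closed-form threshold p* = (B + C - D) / (B + C); full support would additionally discount for imperfect user validation.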
Thiele's differential equation explains the change in prospective reserve and plays a fundamental role in safe-side calculations and other types of actuarial model comparisons. This paper presents a 'model-lean' version of Thiele's equation with the novel feature that it supports any canonical insurance model, irrespective of the model's intertemporal dependence structure. The basis for this is a canonical and path-wise model construction that simultaneously handles discrete and absolutely continuous modeling regimes. Comparison theorems for differing canonical insurance models follow directly from the resulting stochastic backward equations. The elegance with which these comparison theorems handle non-equivalence of probability measures is one of their major advantages over previous results.
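For orientation, the classical special case that the model-lean version generalises is the textbook Thiele ODE for a single-life policy; the notation below (interest rate r(t), continuous premium rate \pi(t), mortality intensity \mu(t), death benefit b(t)) follows standard actuarial conventions rather than the paper's.

```latex
% Classical Thiele equation for the prospective reserve V(t), with terminal
% condition V(T) = 0 for a pure term insurance contract.
\[
  \frac{\mathrm{d}}{\mathrm{d}t} V(t)
    = r(t)\,V(t) + \pi(t) - \mu(t)\,\bigl(b(t) - V(t)\bigr),
  \qquad V(T) = 0 .
\]
```

The reserve grows with interest and premium income and is depleted by the expected benefit outgo net of the reserve released on death; the paper's stochastic backward equations recover this balance path-wise for arbitrary canonical models.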