Endogeneity can severely bias coefficient estimates, even to the point of reversing their sign. It produces contradictory research results, which makes it difficult to adequately test individual hypotheses and theories in corporate finance (CF). For practitioners, such as company valuation consultants, these model problems prevent them from obtaining the most reliable estimates in the interests of the customer. The aim of this study is to review the endogeneity problem in CF and the ways of addressing it. We illustrate the methods found in the systematic review with an empirical example. The paper explains the reasons for this problem from an econometric point of view, with examples from CF, and describes econometric methods of dealing with it. Our systematic literature review shows that dynamic panel models, in particular the Blundell-Bond method, are most often used to account for endogeneity in CF studies. We then verify this conclusion empirically. To detect endogeneity, we used the Hausman test, an endogeneity test, and the analysis of the correlation matrix, including the saved regression residuals. Eliminating endogeneity step by step, we conclude that neither the Blundell-Bond method nor fixed-effects regression is always optimal for dealing with endogeneity in CF. Instrumental-variables two-stage least squares (IV 2SLS) turned out to be the most appropriate method for estimating the cost of capital model while eliminating endogeneity. In addition, the estimates of the cost of capital model, which analyzes the impact of non-financial reporting, have been improved.
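A minimal sketch of this workflow, using the linearmodels package; the variable names, the instrument (industry_esg_mean) and the data file are hypothetical placeholders, not the study's actual specification.

```python
# Hedged sketch of the endogeneity workflow described above.
import pandas as pd
import statsmodels.api as sm
from linearmodels.iv import IV2SLS

df = pd.read_csv("panel.csv")                      # assumed firm-year panel
exog = sm.add_constant(df[["size", "leverage"]])   # assumed controls

# Residual-based diagnostic: first-stage residuals of the suspect regressor
# are added to the structural equation; a significant coefficient on them
# signals endogeneity (a regression form of the Hausman test).
first = sm.OLS(df["esg_disclosure"], exog.join(df["industry_esg_mean"])).fit()
aug = exog.join(df[["esg_disclosure"]]).assign(fs_resid=first.resid)
print(sm.OLS(df["cost_of_capital"], aug).fit().tvalues["fs_resid"])

# IV 2SLS estimation with the suspect regressor instrumented.
iv = IV2SLS(df["cost_of_capital"], exog,
            df["esg_disclosure"],           # endogenous regressor
            df["industry_esg_mean"]).fit()  # assumed instrument
print(iv.wu_hausman())                      # Wu-Hausman endogeneity test
print(iv)
```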
We consider a guaranteed deterministic approach to discrete-time super-replication: coverage of contingent claims on options must be guaranteed for all possible asset-price scenarios. Price increments during each period are assumed to be contained in a priori specified compacta that depend on the price history. A game problem is stated and reduced to solving the corresponding Bellman–Isaacs equation. Numerical algorithms for solving the Bellman–Isaacs equation on a discrete lattice are considered, and results of a numerical experiment are reported for various model specifications.
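To give a flavor of such lattice algorithms, here is a minimal one-dimensional backward-induction sketch for the Bellman–Isaacs equation: at each step the hedger minimizes over hedge ratios while the "market" maximizes over price increments in the compactum. The payoff, the grids and the history-independent compactum K are illustrative assumptions, not the paper's model specification.

```python
# Toy backward induction for the Bellman-Isaacs equation on a price lattice.
import numpy as np

S = np.linspace(50.0, 150.0, 201)       # price lattice
K = np.linspace(-2.0, 2.0, 21)          # compactum of admissible price increments
H = np.linspace(-1.0, 1.0, 41)          # candidate hedge ratios
T = 5                                   # number of periods
v = np.maximum(S - 100.0, 0.0)          # terminal payoff: call with strike 100

for t in range(T):
    # v_{t+1}(s + y) for every lattice point s and increment y (clamped at grid edges)
    v_next = np.interp(S[:, None] + K[None, :], S, v)
    # hedger takes inf over h of the sup over y in K of v_{t+1}(s + y) - h*y
    costs = v_next[:, None, :] - H[None, :, None] * K[None, None, :]
    v = costs.max(axis=2).min(axis=1)

print(np.interp(100.0, S, v))           # guaranteed super-replication price at s = 100
```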
Researchers have been improving credit scoring models for decades. The reason is simple: increasing the predictive ability of scoring even by a small amount can save a financial institution from significant losses. As a result, many researchers have concluded that ensembles of classifiers, or aggregated scorings, have greater performance. However, on unbalanced samples ensembles outperform the base classifiers only by thousandths of a percent.
This article suggests building an aggregated scoring. Unlike previously proposed aggregated scorings, its base classifiers are focused on identifying different types of borrowers. The purpose of this study is to illustrate the effectiveness of such scoring aggregation on real unbalanced data.
We use a single performance measure as the effectiveness indicator: the area under the ROC curve (AUC). The DeLong, DeLong, and Clarke-Pearson test is used to assess the statistical significance of the difference between two or more AUCs. In addition, this study uses a logistic default model (logistic regression) applied to data from companies' financial statements. Such a model is usually focused on identifying default borrowers. To obtain a scoring aimed at non-default borrowers, we use a modified Kemeny median, which the authors devised to rank companies by credit ratings. Both scores are aggregated by logistic regression.
Our data cover most Russian banks that operated, and defaulted, between 01.07.2010 and 01.07.2015. The sample of banks is highly unbalanced: the concentration of defaults is about 5%. The aggregation, however, is carried out only for banks that have several credit ratings. As a result, we find that aggregating classifiers based on different types of information significantly improves the discriminatory power of scoring even on an unbalanced sample.
This aggregated scoring, and the approach to its construction, could be applied in financial institutions as part of credit risk assessment, as well as an auxiliary decision-making tool owing to the relatively high interpretability of these scores.
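A minimal sketch of the aggregation step, assuming two base scores are already available (one tuned to defaulters, one built from credit ratings): the scores are combined by logistic regression and compared by AUC. The synthetic data and the ~5% default rate are simplifications, and the DeLong test would require a separate implementation.

```python
# Toy aggregation of two base scores by logistic regression, scored by AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
y = (rng.random(n) < 0.05).astype(int)       # ~5% default concentration, as in the data
score_default = y + rng.normal(0, 1.0, n)    # toy base score aimed at defaulters
score_rating = y + rng.normal(0, 1.5, n)     # toy score built from credit ratings

X = np.column_stack([score_default, score_rating])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)

agg = LogisticRegression().fit(X_tr, y_tr)   # aggregation by logistic regression
p = agg.predict_proba(X_te)[:, 1]

print("AUC base 1 :", roc_auc_score(y_te, X_te[:, 0]))
print("AUC base 2 :", roc_auc_score(y_te, X_te[:, 1]))
print("AUC agg    :", roc_auc_score(y_te, p))
```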
There are many models for estimating a yield curve from bond market quotes. These models are well suited to developed markets with high liquidity and readily available market data. However, this is not always the case for developing markets, which are characterized by infrequent trading, heterogeneous liquidity, and frequent missing data.
In this article we review the existing and theoretically possible solutions to the problems arising in the process of yield curve construction in developing markets. Our review shows that all these problems can be effectively tackled by adapting traditional yield curve models to the observed liquidity level of a developing market.
Heterogeneous liquidity can be addressed by introducing liquidity-based weights into a yield curve model and by removing observations with atypical liquidity from the dataset. To solve the missing-data problem, we suggest using dynamic yield curve models or recreating missing observations with the help of a supplementary model. In special cases, when there are not enough bond issues on the market, we recommend simplifying the yield curve model and using data from other markets (e.g., the derivatives market).
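As a sketch of the first remedy, liquidity-based weights can enter the curve-fitting objective directly; below is a toy weighted Nelson-Siegel fit in which the maturities, yields, and volume-based weights are all assumed for illustration.

```python
# Illustrative liquidity-weighted Nelson-Siegel fit.
import numpy as np
from scipy.optimize import least_squares

def nelson_siegel(params, t):
    b0, b1, b2, tau = params
    x = t / tau
    return b0 + b1 * (1 - np.exp(-x)) / x + b2 * ((1 - np.exp(-x)) / x - np.exp(-x))

t = np.array([0.5, 1, 2, 3, 5, 7, 10, 15])       # maturities in years (assumed)
y = np.array([0.071, 0.073, 0.076, 0.078, 0.080, 0.081, 0.082, 0.083])
volume = np.array([5.0, 20.0, 1.0, 15.0, 30.0, 2.0, 25.0, 0.5])  # trading volumes

w = np.sqrt(volume / volume.sum())               # liquidity-based weights
res = least_squares(lambda p: w * (nelson_siegel(p, t) - y),
                    x0=[0.08, -0.01, 0.0, 2.0])
print(res.x)                                     # fitted Nelson-Siegel parameters
```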
The article may be of great use for market practitioners who operate in developing bond markets, as well as for quants engaged in the construction of yield curves. It also serves as a starting point for further academic research in the area of term structure modelling in illiquid bond markets.
A guaranteed deterministic problem setting of super-replication with discrete time is considered: the aim of hedging a contingent claim is to ensure coverage of the possible payout under an option contract for all admissible scenarios. These scenarios are given by means of a priori given compacts that depend on the price prehistory: the increments of the (discounted) price at each moment of time must lie in the corresponding compacts. The reference probability measure, common in financial mathematics, is not needed. In this framework we arrive at a discrete-time control problem under uncertainty, which has a game-theoretic interpretation. The capital required at some moment of time to cover the contingent liability over the interval up to expiration satisfies the Bellman–Isaacs equations for both pure and mixed strategies. A stochastic description of the price dynamics thus arises when considering mixed strategies of the “market,” such that the conditional distributions of price increments given the price history have supports contained in the corresponding compacts. In the present paper, under the assumption of no trading constraints, we prove that if there are no arbitrage opportunities, then an equilibrium holds for mixed strategies with the conditional distribution of price increments concentrated on a finite set, and we show that optimal strategies of the “market” can be found among the “risk-neutral” strategies.
Estimates of the term structure of interest rates depend heavily on the quality of the market data from which they are constructed. Estimated rates can be incorrect due to observation errors and omissions in the data. The usual way to deal with the heteroskedasticity of observation errors is to introduce weights into the fitting procedure, but there is currently no consensus in the literature about the choice of such weights. We introduce a non-parametric, bootstrap-based method of injecting observation errors drawn from the empirical distribution into the model data, which allows us to compare different weighting schemes without implicitly favoring one of the contesting models – a common design flaw in comparison studies. We use government bonds of several countries, with examples of both liquid and illiquid bond markets. We show that realistic observation errors can greatly distort the estimated yield curve. Moreover, using different weights or other modifications of accounting for observation errors in bond price data does not always improve the term structure estimates, and often only worsens the situation. Based on our comparison, we advise using either equal weights or weights proportional to the inverse duration in practical applications.
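A toy sketch of the bootstrap idea, assuming the empirical pricing errors from an initial fit are given: resampled errors are added to model prices and each weighting scheme is scored on the perturbed data. The numbers and the simple linear "curve" are stand-ins for the paper's actual refitting procedure.

```python
# Toy bootstrap comparison of weighting schemes for curve fitting.
import numpy as np

rng = np.random.default_rng(42)
errors = np.array([-0.15, 0.02, 0.30, -0.08, 0.11, -0.22, 0.05, 0.18])
model_prices = np.array([99.1, 100.4, 101.7, 98.9, 102.3, 97.5, 100.0, 103.2])
durations = np.array([0.5, 1.2, 2.1, 3.0, 4.8, 6.5, 8.9, 11.4])

def refit_and_score(noisy, w):
    """Weighted linear fit of price on duration, scored by RMSE against the
    noiseless model prices (a stand-in for refitting the yield curve)."""
    A = np.column_stack([np.ones_like(durations), durations])
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * sw[:, None], noisy * sw, rcond=None)
    return np.sqrt(np.mean((A @ coef - model_prices) ** 2))

schemes = {"equal": np.ones_like(durations), "inverse duration": 1.0 / durations}
for name, w in schemes.items():
    scores = [refit_and_score(model_prices +
                              rng.choice(errors, size=errors.size, replace=True), w)
              for _ in range(1000)]                      # bootstrap replications
    print(name, round(float(np.mean(scores)), 4))
```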
We consider a guaranteed deterministic formulation of the super-replication problem in discrete time: find a guaranteed coverage of a contingent claim on an option under all possible scenarios. These scenarios are specified by a priori given compacta that depend on the price history: the price increments at each instant must lie in the corresponding compacta. We assume the presence of trading constraints and the absence of transaction costs. The problem is posed in a game-theoretic setting and leads to the Bellman–Isaacs equations in both pure and mixed “market” strategies. In the present article, we investigate the sensitivity of the solutions to small perturbations of the compacta that describe the price uncertainty over time, and we propose numerical methods that take the problem's specific features into account.
This paper studies the form of the instantaneous impact cost function in a financial market with transaction costs via an axiomatic approach. We show that several kinds of convexity of the cost function are equivalent to the corresponding properties of the price impact functions. The results clarify the implicit assumptions made when selecting a particular form of the cost function and can be used when choosing the correct portfolio optimization framework.
The article shows how a Bayesian approach to yield curve estimation can be implemented in a non-parametric framework with automatic smoothing derived from the data. It also briefly illustrates the benefits of this approach using real data.
The article uses an infinite-dimensional (function-space) approach to inverse problems. Numerical calculations are performed using a Markov chain Monte Carlo (MCMC) algorithm with several settings tuned to ensure good performance. The model explicitly uses bid-ask spreads to account for observation errors and provides automatic smoothing based on them.
The non-parametric framework captures the complex shapes of the zero-coupon curves of emerging markets. The Bayesian approach assesses the accuracy of the estimates, which is crucial for some applications. Examples of estimation results are given for three different bond markets: a liquid one (Germany), a moderately liquid one (China), and an illiquid one (Russia).
The results show that an infinite-dimensional Bayesian approach to term structure estimation is feasible. Market practitioners can use this approach to better understand the term structure of interest rates. For example, they can now supplement their non-parametric term structure estimates with Bayesian confidence intervals, enabling them to assess the statistical significance of their results.
The model does not require parameters to be set during estimation. It has its own hyperparameters, but these are selected at the model-configuration stage.
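A toy sketch of the approach, assuming a discretized curve, a Gaussian likelihood with a bid-ask-based noise level, and a curvature-penalizing smoothness prior: random-walk Metropolis produces posterior samples from which Bayesian confidence bands are read off. All numbers are illustrative, not the paper's actual model.

```python
# Toy random-walk Metropolis over a discretized zero-coupon curve.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.5, 10, 20)                     # maturity grid
y_obs = 0.03 + 0.01 * np.log1p(t) + rng.normal(0, 0.002, t.size)
sigma = 0.002                                    # noise level, e.g. from bid-ask spreads

def log_post(f):
    loglik = -0.5 * np.sum(((y_obs - f) / sigma) ** 2)
    logprior = -0.5 * np.sum(np.diff(f, 2) ** 2) / 1e-6  # curvature penalty (smoothing)
    return loglik + logprior

f, samples = y_obs.copy(), []
for i in range(20000):
    prop = f + rng.normal(0, 2e-4, f.size)       # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(f):
        f = prop                                 # Metropolis accept
    if i % 20 == 0:
        samples.append(f.copy())

band = np.percentile(samples, [2.5, 97.5], axis=0)   # pointwise Bayesian band
print(band[:, :3])
```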
In this short note we show that a weak version of Bernstein’s characterization of the normal distribution implies the local integrability of a measurable solution of the Cauchy functional equation; the linearity of a solution of the Cauchy functional equation is an easy consequence of its local integrability. In its turn, this weak version of Bernstein’s theorem can be derived from Cauchy’s theorem.
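For reference, standard formulations of the two results connected in the note (not the note's exact weak version) read:

```latex
\emph{Cauchy's functional equation:} find all $f\colon\mathbb{R}\to\mathbb{R}$ with
\[
  f(x+y) = f(x) + f(y) \qquad \text{for all } x, y \in \mathbb{R};
\]
every locally integrable solution is linear, $f(x) = cx$.

\emph{Bernstein's characterization:} if $X$ and $Y$ are independent random variables
such that $X+Y$ and $X-Y$ are also independent, then $X$ and $Y$ are normally
distributed.
```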
For the discrete-time super-replication problem, a guaranteed deterministic formulation is proposed: the problem is to ensure the cheapest coverage of the contingent claim on an American option under all admissible scenarios. These scenarios are set by a priori defined compacts depending on the price history; the price increment at each moment of time must lie in the corresponding compact. The market is considered without trading constraints and transaction costs. The problem statement is game-theoretic in nature and leads directly to Bellman–Isaacs equations of a special form. In the present study, we estimate the modulus of continuity of uniformly continuous solutions, including the Lipschitz case.
The paper considers parametric hedging of non-parallel shifts in the yield curve. For determining capital requirements and for stress testing, the Basel Committee recommends taking into account the risk of non-parallel interest rate shifts (Basel Committee on Banking Supervision, 2016). As of April 2017, only one Russian bank took this risk into account when calculating interest rate risk, and one more was developing a methodology (Central Bank of Russia, 2017). We use several term structure models for hedging non-parallel interest rate shifts. The study uses a 5-year span of Russian bond market data and assesses the effectiveness of the hedging approaches with VaR and MAE.
The novelty of the work lies in the application of different term structure models, most of which have not previously been used for parametric hedging. We also present an original methodology for assessing hedging effectiveness. This is the first such study conducted on the Russian bond market.
Cross-validation shows that, within the parametric hedging problem, the Nelson-Siegel model (and its shortened version), as well as the Svensson and Cox-Ingersoll-Ross models, give better results than the generally accepted Fisher-Weil duration. The results of this work have practical significance for fixed income managers.
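To make the idea of parametric hedging concrete, the sketch below matches a portfolio's sensitivities to the three Nelson-Siegel factors (level, slope, curvature) with positions in three hedging bonds. The flat curve, the cash flows, the fixed tau, and the zero-coupon hedge instruments are illustrative assumptions, not the paper's setup.

```python
# Toy Nelson-Siegel factor hedging: neutralize all three factor sensitivities.
import numpy as np

tau = 2.0
def loadings(t):
    """Nelson-Siegel factor loadings L_k(t) for level, slope, curvature."""
    x = t / tau
    return np.array([np.ones_like(t),
                     (1 - np.exp(-x)) / x,
                     (1 - np.exp(-x)) / x - np.exp(-x)])

def factor_sens(times, cfs, y):
    """dP/d(beta_k) = -sum_i t_i * CF_i * exp(-y*t_i) * L_k(t_i)."""
    disc = np.exp(-y * times)
    return -(loadings(times) * (times * cfs * disc)).sum(axis=1)

y_flat = 0.08                                             # flat curve for illustration
port_t = np.array([1.0, 2.0, 5.0, 10.0])
port_cf = np.array([8.0, 8.0, 8.0, 108.0])                # assumed coupon bond portfolio
sens_p = factor_sens(port_t, port_cf, y_flat)

# Three zero-coupon hedging bonds; columns are their factor sensitivities.
hedge_t = np.array([1.0, 5.0, 10.0])
S = np.stack([factor_sens(np.array([t]), np.array([100.0]), y_flat)
              for t in hedge_t], axis=1)
w = np.linalg.solve(S, -sens_p)                           # factor-neutral positions
print(w)
```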