Publications

We report the results of an investigation of momentum and contrarian effects on cryptocurrency markets. The investigated investment strategies involve 100 cryptocurrencies (out of more than 1200 listed as of November 2017) with the largest market capitalization and a 14-day average daily volume exceeding a given threshold. Investment portfolios are constructed under different assumptions regarding the portfolio reallocation period, the width of the ranking window, the number of cryptocurrencies in the portfolio, and percentage transaction costs. The performance is benchmarked against: (1) equally weighted and (2) market-cap-weighted investments in all of the ranked assets, as well as against buy-and-hold strategies based on (3) the S&P500 index and (4) the Bitcoin price. Our results show a clear and significant dominance of the short-term contrarian effect over both the momentum effect and the benchmark portfolios. The information ratio for the contrarian strategies often reaches double-digit values, depending on the assumed reallocation period and the width of the ranking window. Additionally, we observe significant diversification potential for all cryptocurrency portfolios relative to the S&P500 index.
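
The core ranking-and-reallocation step can be sketched as below. This is a minimal illustration only: the weekly rebalance, the ten-asset portfolio, the file name and the absence of transaction costs and volume filters are assumptions of the sketch, not the paper's settings.

```python
import pandas as pd

def contrarian_weights(prices: pd.DataFrame, ranking_window: int = 7,
                       n_assets: int = 10) -> pd.DataFrame:
    """Equal weights on the n_assets worst performers over the ranking window.

    prices: daily close prices, one column per cryptocurrency.
    A momentum variant would instead select the n_assets best performers.
    """
    past_ret = prices.pct_change(ranking_window)              # trailing returns
    losers = past_ret.rank(axis=1, ascending=True) <= n_assets
    weights = losers.astype(float)
    return weights.div(weights.sum(axis=1), axis=0).fillna(0.0)

# usage sketch: rebalance weekly and hold the weights until the next rebalance
# prices = pd.read_csv("crypto_prices.csv", index_col=0, parse_dates=True)
# w = contrarian_weights(prices).resample("W").last().reindex(prices.index).ffill()
# strategy_returns = (w.shift(1) * prices.pct_change()).sum(axis=1)
```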

The main aim of this paper was to formulate and analyze machine learning methods tailored to the specifics of strategy parameter optimization. The most important problems are the sensitivity of strategy performance to small parameter changes and the numerous local extrema distributed irregularly over the solution space. The methods were designed to significantly shorten computation time without a substantial loss of strategy quality. Their efficiency was compared for three different pairs of assets in the case of a moving average crossover system. The methods operated on in-sample data containing 20 years of daily prices between 1998 and 2017. The problem was presented for three two-asset portfolios: in the first case the strategy traded the SPX and DAX index futures, in the second the AAPL and MSFT stocks, and in the third the HGF and CBF commodity futures. The major hypothesis verified in this paper is that machine learning methods select strategies with an evaluation criterion close to the highest one, but in significantly lower execution time than the Exhaustive Search.
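
For reference, the moving average crossover system and the Exhaustive Search baseline against which the machine learning methods are compared might look roughly like the sketch below; the parameter grid and the mean-daily-return criterion are illustrative assumptions.

```python
import itertools
import numpy as np
import pandas as pd

def crossover_returns(close: pd.Series, fast: int, slow: int) -> pd.Series:
    """Daily strategy returns: long when the fast MA is above the slow MA, flat otherwise."""
    fast_ma = close.rolling(fast).mean()
    slow_ma = close.rolling(slow).mean()
    position = (fast_ma > slow_ma).astype(float).shift(1)   # trade on the next bar
    return position * close.pct_change()

def exhaustive_search(close: pd.Series, grid=range(5, 200, 5)):
    """Evaluate every (fast, slow) pair on the grid and keep the best by mean daily return."""
    best, best_score = None, -np.inf
    for fast, slow in itertools.combinations(grid, 2):      # fast < slow by construction
        score = crossover_returns(close, fast, slow).mean()
        if score > best_score:
            best, best_score = (fast, slow), score
    return best, best_score
```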

The paper presents a new approach to optimizing automatic transactional systems. We propose a multi-stage technique which enables us to find investment strategies beating the market. Additionally, new measures of combined risk and return are applied in the optimization process. Moreover, we define new elements of a risk control system based on volatility measures and consecutive signal confirmation. As a result, we formulate three complex investment systems which maximize returns and simultaneously minimize risk in comparison with all alternative investments considered (IR=2, Maximum Drawdown<21%, Maximum Loss Duration=0.75 year). Our analysis is based on historical daily data (1998-2010, in- and out-of-sample periods) for index and commodity futures. The systems are then reoptimized and reallocated every half-year in order to include the most recent financial data. Finally, we show the results for a joint model consisting of our three systems.
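
The combined risk-and-return measures quoted in parentheses can be computed from an equity curve along the following lines; these are common textbook definitions and may differ in detail from those used in the paper.

```python
import numpy as np
import pandas as pd

def performance_stats(equity: pd.Series, periods_per_year: int = 252) -> dict:
    """Annualised return/risk ratio, maximum drawdown and maximum loss duration."""
    returns = equity.pct_change().dropna()
    ann_ret = returns.mean() * periods_per_year
    ann_vol = returns.std() * np.sqrt(periods_per_year)
    drawdown = equity / equity.cummax() - 1.0
    underwater = drawdown < 0
    # longest run of consecutive periods spent below a previous equity peak
    runs = underwater.groupby((~underwater).cumsum()).sum()
    return {
        "IR": ann_ret / ann_vol,
        "MaxDrawdown": drawdown.min(),
        "MaxLossDuration_years": runs.max() / periods_per_year,
    }
```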

The main aim of this paper is to analyse the efficiency of BTC mining under current market conditions. After a thorough analysis of the initial assumptions concerning (1) the price of the mining machine and its effective amortization period, (2) the difficulty and hash rate of the BTC network, (3) BTC transaction fees, and (4) energy costs, we find that BTC mining is currently not profitable, except in some rare cases. The main reason for this is the fast and unpredictable increase in the difficulty of the BTC network over time, which results in a decreasing share of our mining machines in the BTC network hash rate. The research is augmented with a detailed sensitivity analysis of mining efficiency with respect to the initial parameter assumptions, which shows that the conditions for BTC mining to be efficient and profitable are very challenging to meet.
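
A back-of-the-envelope version of the profitability calculation can be sketched as follows; the constant-difficulty assumption and all parameter names are placeholders of the sketch, not the values assumed in the paper.

```python
def daily_mining_profit(hash_rate_ths: float,        # miner hash rate, TH/s
                        network_hash_ehs: float,      # network hash rate, EH/s
                        block_reward_btc: float,      # block subsidy plus average fees
                        btc_price_usd: float,
                        power_kw: float,               # miner power draw, kW
                        energy_usd_per_kwh: float,
                        machine_cost_usd: float,
                        amortization_days: float) -> float:
    """Expected daily mining profit in USD under a constant-difficulty assumption."""
    blocks_per_day = 144                               # roughly one block every 10 minutes
    share = hash_rate_ths / (network_hash_ehs * 1e6)   # 1 EH/s = 1e6 TH/s
    revenue = share * blocks_per_day * block_reward_btc * btc_price_usd
    energy_cost = power_kw * 24 * energy_usd_per_kwh
    amortization = machine_cost_usd / amortization_days
    return revenue - energy_cost - amortization
```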

  • Piotr Arendarski, Paweł Misiewicz, Mariusz Nowak, Tomasz Skoczylas, and Robert Wojciechowski (2014), Generalized Momentum Asset Allocation Model, Working Papers 2014-30, Faculty of Economic Sciences, University of Warsaw

In this paper we propose the Generalized Momentum Asset Allocation Model (GMAA). GMAA is a new approach to constructing an optimal portfolio, based on a close examination of the distribution of asset returns. It tries to capture certain market phenomena and use the information they contain as predictors of future returns. The model is validated using MSCI indexes, with the MSCI World Index as a benchmark. We find the results rather promising, as we managed to significantly reduce portfolio volatility and obtain a stable path of cumulative portfolio returns. Our model outperforms the benchmark in terms of Information Ratio or Maximum Drawdown. A detailed sensitivity analysis conducted at the end of the paper shows that the strategy is sensitive to a few optimization parameters, so further research may be required.

In this paper a new ARCH-type volatility model is proposed. The Range-based Heterogeneous Autoregressive Conditional Heteroskedasticity (RHARCH) model draws inspiration from the Heterogeneous Autoregressive Conditional Heteroskedasticity model presented by Muller et al. (1995), but employs more efficient, range-based volatility estimators instead of simple squared returns in the conditional variance equation. In the first part of this research, range-based volatility estimators (such as the Parkinson or Garman-Klass estimators) are reviewed, followed by the derivation of the RHARCH model. In the second part, the RHARCH model is compared with selected ARCH-type models, with particular emphasis on forecasting accuracy. All models are estimated using data on EURPLN spot rate quotations. The results show that the RHARCH model often outperforms return-based models in terms of predictive ability in both the in-sample and out-of-sample periods. The properties of the standardized residuals are also very encouraging in the case of the RHARCH model.
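
As a hedged illustration of the idea, assuming a HAR-style specification, the conditional variance equation could replace squared returns with a range-based estimate such as Parkinson's; the sketch below is a schematic reconstruction, not the exact RHARCH equation from the paper.

```latex
% Parkinson range-based variance estimate for day t (H_t, L_t: daily high and low)
\hat{\sigma}^2_{P,t} = \frac{\left(\ln H_t - \ln L_t\right)^2}{4\ln 2}

% schematic HAR-style conditional variance built from daily, weekly and
% monthly averages of the range-based estimates
\sigma^2_t = \omega
  + \alpha_d\,\hat{\sigma}^2_{P,t-1}
  + \alpha_w\,\frac{1}{5}\sum_{i=1}^{5}\hat{\sigma}^2_{P,t-i}
  + \alpha_m\,\frac{1}{22}\sum_{i=1}^{22}\hat{\sigma}^2_{P,t-i}
```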

We suggest that the term structure of volatility futures (e.g. VIX futures) shows a clear pattern of dependence on the current level of the VIX index. At low levels of VIX (below 20) the term structure is strongly upward sloping; at high VIX levels (over 30) it is strongly downward sloping. We use these features to better predict future volatility and index futures. We begin by introducing quantitative measures of the volatility term structure (VTS) and the volatility risk premium (VRP). We then use them to estimate the distance between the actual value and the fair (model) value of the VTS. We find that this distance has significant predictive power for volatility futures and index futures, and we use this feature to design a simple strategy investing in VIX index futures and the S&P500.
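
One possible quantitative measure of the VTS, and a threshold rule built on it, can be sketched as follows; the choice of the first and fourth futures contracts, the scaling by spot VIX and the thresholds are illustrative assumptions of the sketch.

```python
import pandas as pd

def vts_slope(vix_spot: pd.Series, vix_fut_1m: pd.Series,
              vix_fut_4m: pd.Series) -> pd.Series:
    """Simple volatility-term-structure measure: long end minus short end, scaled by spot VIX."""
    return (vix_fut_4m - vix_fut_1m) / vix_spot

def vts_signal(slope: pd.Series, upper: float = 0.05, lower: float = -0.05) -> pd.Series:
    """-1 = short VIX futures in steep contango, +1 = long in backwardation, 0 otherwise."""
    signal = pd.Series(0, index=slope.index)
    signal[slope > upper] = -1
    signal[slope < lower] = 1
    return signal
```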

  • Juliusz Jabłecki, Ryszard Kokoszczyński, Paweł Sakowski, Robert Ślepaczuk, and Piotr Wójcik (2014), Options delta hedging with no options at all, Working Papers 2014-27, Faculty of Economic Sciences, University of Warsaw.

The speed of adjustment of a delta-hedged options exposure depends on realized and implied market volatility. We observe that by consistently hedging long and short positions in options we can eventually end up with pure exposure to volatility without any options in the portfolio at all. The results of such an arbitrage strategy are based only on the speed of adjustment of delta-hedged option positions. More specifically, they rely on the interrelation between realized volatility levels calculated over various time intervals (from daily to intraday frequency). Theoretical intuition enables us to solve the puzzle of the optimal frequency of hedge adjustment and its influence on hedging efficiency. We present the results of a simple hedging strategy based on the consistent hedging of a portfolio of options for various worldwide equity indices.
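
A minimal sketch of the delta-hedging mechanics the strategy rests on (Black-Scholes call delta and a periodically re-hedged position); the per-step re-hedging, zero interest rate and the use of a single volatility for marking are simplifying assumptions of the sketch.

```python
import numpy as np
from scipy.stats import norm

def bs_call_delta(S: float, K: float, T: float, sigma: float, r: float = 0.0) -> float:
    """Black-Scholes delta of a European call."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    return norm.cdf(d1)

def delta_hedged_pnl(path: np.ndarray, K: float, T: float, sigma: float,
                     dt: float = 1 / 252) -> float:
    """P&L of a long call hedged by shorting delta shares, re-hedged every step.

    `path` is a historical or simulated price path sampled every dt years;
    the initial premium and discounting are ignored for brevity.
    """
    pnl = 0.0
    for i in range(len(path) - 1):
        tau = max(T - i * dt, 1e-8)
        delta = bs_call_delta(path[i], K, tau, sigma)
        pnl += -delta * (path[i + 1] - path[i])      # hedge leg
    payoff = max(path[-1] - K, 0.0)
    return pnl + payoff
```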

This paper investigates the changes in investment portfolio performance after including VIX. We apply different models for optimal portfolio selection (Markowitz and Black-Litterman), both with and without the possibility of short sales. We also use various assets, data frequencies, and investment horizons to get a comprehensive picture of the robustness of our results. Investment strategies including VIX futures do not always deliver higher returns or higher Sharpe ratios over the period 2006-2013, and their performance is quite sensitive to changes in model parameters. However, including VIX significantly increases returns in almost all cases during the recent financial crisis. This result clearly emphasizes the potential gains of having such an asset in the portfolio in periods of very high volatility in financial markets.
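
A compact sketch of the mean-variance building block used in the comparison: minimum-variance weights with and without short sales. The Black-Litterman step and any VIX-futures return assumptions are omitted here for brevity.

```python
import numpy as np
from scipy.optimize import minimize

def min_variance_weights(cov: np.ndarray, allow_short: bool = True) -> np.ndarray:
    """Minimum-variance portfolio weights summing to one."""
    n = cov.shape[0]
    if allow_short:
        # closed form: w = Sigma^{-1} 1 / (1' Sigma^{-1} 1)
        inv = np.linalg.solve(cov, np.ones(n))
        return inv / inv.sum()
    # no-short-sale case: numerical optimization with non-negativity bounds
    res = minimize(lambda w: w @ cov @ w, np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return res.x
```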

  • Juliusz Jabłecki, Ryszard Kokoszczyński, Paweł Sakowski, Robert Ślepaczuk, and Piotr Wójcik (2014), Simple heuristics for pricing VIX options, Working Papers 2014-25, Faculty of Economic Sciences, University of Warsaw.

The article presents a simple parameterization of the volatility surface for options on the S&P 500 volatility index, VIX. Specifically, we document the following features of VIX implied volatility: (i) VIX at-the-money (ATM) implied volatility correlates strongly with the volatility skew in S&P 500 options; (ii) VIX ATM implied volatility declines exponentially with options’ time to expiry; (iii) a SABR-type model can be used to model the smile observed in VIX options. These observations lead to simple heuristics for quoting prices (in terms of implied volatility) of VIX options with almost arbitrary strike and expiry, obtaining values that are reasonably close to market levels.

Forecasting time series of high-frequency asset returns is an essential part of modern trading systems. Forward Stagewise Additive Modelling (FSAM) is a powerful machine learning method that has been successfully applied to function estimation tasks. It is an iterative procedure that fits an additive expansion in a set of basis functions; at each step, the basis function that best fits the current residuals is added to the expansion. The goal of this paper is to propose an extension to the standard FSAM method and empirically demonstrate its efficiency in the domain of high-frequency financial time-series modelling. The two principal components of the extended method are a hyperbolic tangent transformation of the first-level residuals, which addresses the issue of robust estimation, and a linear bias correction of the final model. Results on the GBP/USD foreign exchange, S&P500, and GOLD spot-price time series at five-minute and one-hour frequencies suggest that for the majority of problems the proposed method performs better out-of-sample than traditional linear and non-linear time-series modelling methods and their ensemble versions, as well as an adaptation of FSAM found in the literature.
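
A minimal sketch of an extended FSAM procedure under stated assumptions: regression stumps as basis functions, a tanh transform of the residuals at each stage, and a final linear bias correction. The choice of basis functions, learning rate and scale parameter are made for the sketch, not taken from the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

def fit_fsam(X, y, n_stages=200, learning_rate=0.1, tanh_scale=1.0):
    """Forward stagewise additive model: each stage fits a stump to the current residuals.

    Residuals are passed through tanh to damp the influence of outliers
    (robust-estimation extension); a final linear regression corrects any
    systematic bias in the additive model's output.
    """
    pred = np.zeros(len(y))
    stages = []
    for _ in range(n_stages):
        residuals = np.tanh((y - pred) / tanh_scale) * tanh_scale
        stump = DecisionTreeRegressor(max_depth=1).fit(X, residuals)
        pred += learning_rate * stump.predict(X)
        stages.append(stump)
    bias_fix = LinearRegression().fit(pred.reshape(-1, 1), y)
    return stages, bias_fix

def predict_fsam(stages, bias_fix, X, learning_rate=0.1):
    pred = sum(learning_rate * s.predict(X) for s in stages)
    return bias_fix.predict(pred.reshape(-1, 1))
```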

This paper explores cointegration among three of the most popular soft agricultural commodities (corn, soya and wheat) and its potential usefulness for dynamic asset allocation strategies. Johansen tests indicate that the natural logarithms of weekly prices of corn, soya and wheat futures are cointegrated and that two cointegrating vectors exist. Formal tests show that the estimated long-run relationship is stable even beyond the estimation sample. We use the obtained results to create simple trading rules and verify their profitability. The risk-adjusted abnormal returns of the trading strategies appear significant based on the Sharpe ratio criterion, and they are weakly correlated with the stock market.
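
The test and a simple trading rule built on the estimated long-run relationship can be sketched as follows; restricting attention to the first cointegrating vector and the z-score thresholds are illustrative assumptions of the sketch.

```python
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

def spread_signal(log_prices: pd.DataFrame, entry: float = 2.0, exit: float = 0.5) -> pd.Series:
    """Trade the stationary combination implied by the first cointegrating vector.

    log_prices: weekly log prices of corn, soya and wheat futures (one column each).
    Positions are taken only when the standardized spread is beyond the entry band.
    """
    result = coint_johansen(log_prices, det_order=0, k_ar_diff=1)
    beta = result.evec[:, 0]                       # first cointegrating vector
    spread = log_prices @ beta
    z = (spread - spread.mean()) / spread.std()
    signal = pd.Series(0, index=log_prices.index)
    signal[z > entry] = -1                         # short the spread when it is rich
    signal[z < -entry] = 1                         # long the spread when it is cheap
    signal[z.abs() < exit] = 0                     # stay flat inside the exit band
    return signal
```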

We identified 4500 US stocks with year-end losses of 50 percent or more during the 2001-2011 period. We screened these “falling knives” for financial strength to promote a greater likelihood of recovery and to minimize survivorship bias, adding constraints on Altman Z-Scores, the debt/equity ratio, and the current ratio to our data set. We use a GARCH-in-mean model to control for the risk of the strategies. The results show a consistent improvement in the risk-standardized return profiles of the strategies in comparison with a buy-and-hold strategy.
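
The screening step can be sketched as a simple filter over fundamentals; the column names and cutoff values below are placeholders rather than the thresholds used in the paper.

```python
import pandas as pd

def screen_falling_knives(fundamentals: pd.DataFrame,
                          min_z: float = 1.8,
                          max_debt_equity: float = 1.0,
                          min_current_ratio: float = 1.5) -> pd.DataFrame:
    """Keep stocks that fell 50%+ over the year but look financially sound.

    Expected columns: 'yearly_return', 'altman_z', 'debt_equity', 'current_ratio'.
    """
    mask = (
        (fundamentals["yearly_return"] <= -0.5)
        & (fundamentals["altman_z"] >= min_z)
        & (fundamentals["debt_equity"] <= max_debt_equity)
        & (fundamentals["current_ratio"] >= min_current_ratio)
    )
    return fundamentals[mask]
```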

In this paper, we compared several Black option pricing models applying different measures of volatility, examining the Black model with historical (BHV), implied (BIV), and several different types of realized (BRV) volatility. The main objective of the study was to find the best model, that is, the model that predicts the actual market price with the minimum error. High-frequency (HF) data and bid-ask quotes (instead of transactional data) for the Warsaw Stock Exchange (WSE) were used to avoid the problem of non-synchronous trading and to increase the number of observations. Several error statistics and the percentage of price overpredictions (OP) confirmed the initial intuition that the BIV model is the best, the BHV model is the second best, and the BRV model is the least efficient of those studied.
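
For reference, the Black (1976) formula for a European call on a futures price, into which the competing volatility estimates (BHV, BIV, BRV) are substituted, can be sketched as below; the numerical inputs in the usage comment are arbitrary.

```python
import numpy as np
from scipy.stats import norm

def black76_call(F: float, K: float, T: float, sigma: float, r: float) -> float:
    """Black (1976) price of a European call on a futures contract."""
    d1 = (np.log(F / K) + 0.5 * sigma ** 2 * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return np.exp(-r * T) * (F * norm.cdf(d1) - K * norm.cdf(d2))

# e.g. price the same option with historical, implied and realized volatility
# prices = {name: black76_call(F=2400.0, K=2450.0, T=0.25, sigma=s, r=0.01)
#           for name, s in {"BHV": 0.18, "BIV": 0.21, "BRV": 0.16}.items()}
```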

Option pricing models are the main subject of many research papers prepared both in academia and in the financial industry. Using high-frequency data for Nikkei225 index options, we check the properties of option pricing models with different assumptions concerning the volatility process (historical, realized, implied, stochastic or based on a GARCH model). In order to relax the continuous dividend payout assumption, we use the Black model for pricing options on futures instead of the Black-Scholes-Merton model. The results are presented separately for 5 classes of moneyness ratio and 5 classes of time to maturity in order to reveal patterns in option pricing and to check the robustness of our results. The Black model with implied volatility (BIV) comes out as the best one, while we obtain the highest average pricing errors for the Black model with realized volatility (BRV). As a result, we do not see any additional gain from using more complex and time-consuming models (SV and GARCH models). Additionally, we describe the liquidity of the Nikkei225 options market and try to compare our results with a detailed study for the emerging market of WIG20 index options.

The main idea of this research is to check the efficiency of the Black option pricing model on the basis of HF emerging market data. However, liquidity constraints – a typical feature of an emerging derivatives market – place severe limits on conducting such a study. That is the reason why Kokoszczynski et al. (2010) conducted their earlier research on midquote data, treating them as potential transactional data, and reached some intriguing conclusions about implementing different volatility processes in the Black option model. Nevertheless, taking into account that midquotes need not represent market prices as faithfully as transactional data do, in this paper we compare the results of the research conducted on HF transactional and midquote data. This comparison shows that the results do not differ significantly between the two approaches and that the BIV model significantly outperforms the other models, with the BRV model producing the worst results. Additionally, we discuss the liquidity issue in the context of an emerging derivatives market. Finally, after the exclusion of spurious outliers, we observe significant patterns in option pricing that are not visible in the raw data.

This paper compares option pricing models based on the Black model (Black, 1976), focusing especially on the volatility models implied in the pricing process. We calculated the Black model with historical (BHV), implied (BIV) and several different types of realized (BRV) volatility (additionally searching for the optimal interval Δ and the parameter n, the memory of the process). Our main intention was to find the best model, i.e. the one that predicts the actual market price with minimum error. We focused on HF data and bid-ask quotes (instead of transactional data) in order to avoid the problem of non-synchronous trading and to increase the significance of our research through a larger number of observations. After calculating several error statistics (RMSE, HMAE and HRMSE) and additionally the percentage of price overpredictions, the results confirmed our initial intuition that BIV is the best model, BHV the second best, and BRV the least efficient of them. The division of our database into different classes of moneyness ratio and TTM enabled us to observe distinct differences between the compared pricing models. Additionally, focusing on the same pricing model with different volatility processes leads to the conclusion that a point estimate, rather than an averaged RV process, is the main reason for the high errors and instability of valuation in a high-volatility environment. Finally, we were able to detect “spurious outliers” and explain their effect and origin owing to the multi-dimensional comparison of the pricing error statistics.
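
One common set of definitions of the error statistics listed above, with C_i denoting the observed market price and Ĉ_i the model price; the exact variants used in the paper may differ.

```latex
\mathrm{RMSE}  = \sqrt{\frac{1}{N}\sum_{i=1}^{N} \left(C_i - \hat{C}_i\right)^2}, \qquad
\mathrm{HMAE}  = \frac{1}{N}\sum_{i=1}^{N} \left|\frac{C_i - \hat{C}_i}{C_i}\right|, \qquad
\mathrm{HRMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} \left(\frac{C_i - \hat{C}_i}{C_i}\right)^2}
```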

This paper focuses on the volatility of financial markets, one of the most important issues in finance, especially with regard to modeling high-frequency data. Risk management, asset pricing and option valuation are the areas where the concept of volatility estimators (consistent, unbiased and maximally efficient) is of crucial concern. Our intention was to find the best estimator of true volatility, taking into account the latest investigations in the finance literature. Based on the methodology presented in Parkinson (1980), Garman and Klass (1980), Rogers and Satchell (1991), Yang and Zhang (2000), Andersen et al. (1997, 1998, 1999a, 1999b), Hansen and Lunde (2005, 2006b) and Martens (2007), we computed various model-free volatility estimators and compared them with the classical volatility estimator most often used in financial models. In order to reveal the information set hidden in high-frequency data, we utilized the concepts of realized volatility and realized range. Calculating our estimators, we carefully focused on Δ (the interval used in the calculation), n (the memory of the process) and q (the scaling factor for scaled estimators). Our results revealed that the appropriate selection of Δ and n plays a crucial role for the estimator's efficiency as well as its accuracy. Having nine estimators of volatility, we found that for the optimal n (measured in days) and Δ (in minutes) we obtain the most efficient estimator. Our findings confirmed that the best estimator should include information contained not only in closing prices but in the price range as well (range estimators). More importantly, we focused on the properties of the formula itself, independently of the interval used, comparing estimators with the same Δ, n and q parameters. We observed that the formula of the volatility estimator is not as important as the selection of the optimal parameters n and Δ. Finally, we focused on the asymmetry between market turmoil and adjustments of volatility, and we stressed the implications of our results for well-known financial models which use the classical volatility estimator as their main input variable.
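
A sketch of several of the reviewed estimators computed from daily OHLC bars and intraday closes; annualisation, scaling factors and the handling of overnight returns are simplifying assumptions of the sketch.

```python
import numpy as np
import pandas as pd

def parkinson(df: pd.DataFrame) -> pd.Series:
    """Parkinson (1980): daily variance estimate from the high-low range."""
    return (np.log(df["high"] / df["low"]) ** 2) / (4 * np.log(2))

def garman_klass(df: pd.DataFrame) -> pd.Series:
    """Garman-Klass (1980): combines the range with the open-to-close return."""
    hl = np.log(df["high"] / df["low"]) ** 2
    co = np.log(df["close"] / df["open"]) ** 2
    return 0.5 * hl - (2 * np.log(2) - 1) * co

def rogers_satchell(df: pd.DataFrame) -> pd.Series:
    """Rogers-Satchell (1991): allows for a non-zero drift."""
    u = np.log(df["high"] / df["open"])
    d = np.log(df["low"] / df["open"])
    c = np.log(df["close"] / df["open"])
    return u * (u - c) + d * (d - c)

def realized_variance(intraday_close: pd.Series) -> float:
    """Realized variance: sum of squared intraday log returns over one day."""
    r = np.log(intraday_close).diff().dropna()
    return float((r ** 2).sum())
```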

  • Robert Ślepaczuk and Grzegorz Zakrzewski (2009), Emerging versus developed volatility indices: the comparison of the VIW20 and VIX indices, Working Paper, Faculty of Economic Sciences, University of Warsaw

Modeling the volatility of financial markets is one of the most significant issues in contemporary finance, especially with regard to analyzing high-frequency data. Accurate quantification and forecasting of volatility are of immense importance in risk management (VaR models, stress testing and worst-case scenarios), models of the capital market and option valuation techniques. What we show in this paper is a methodology for calculating a volatility index for the Polish capital market (VIW20, an index anticipating the expected volatility of the WIG20 index). The methods presented are based on the VIX index (VIX White Paper, 2003) and enriched with the modifications necessary to reflect the character of the Polish options market. Quoted on the CBOE, the VIX index is currently regarded as the best measure of capital investment risk, illustrating the level of fear and emotion of market participants. The volatility index combines realized volatility and implied volatility which, using the methodology of Derman et al. (1999) and reconstructing the volatility surface, reflects both the volatility smile and its term structure. The research is carried out using high-frequency (tick) data for options on the WIG20 index for the period November 2003 – May 2007, i.e. starting with the introduction of options by the Warsaw Stock Exchange. All additional simulations are carried out using data gathered in the years 1998-2008. Having analyzed the VIW20 index in detail, we observed its characteristic behavior during periods of strong market turmoil. We also analyze the influence of VIW20 and VIX index-based instruments both on the construction of a minimum-risk portfolio and on the quality of derivatives portfolio management, in which volatility risk and liquidity risk play a key role. The main objective of this paper is to provide foundations for introducing appropriate volatility indices and volatility-based derivatives, paying attention to the crucial methodological changes necessary when one considers the strong market inefficiencies in emerging countries. Since the introduction of appropriate instruments will enable active management of risks that are currently unhedgeable, it will significantly contribute to the development of these markets over time. In the summary we additionally point to the benefits the Warsaw Stock Exchange might obtain from being one of the few emerging markets with appropriately quantified investment risk and the derivatives to manage it.
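
For reference, the model-free variance computation at the heart of the CBOE VIX methodology (and hence of a VIW20-type index) can be written as below, where F is the forward index level, K_0 the first strike below F, ΔK_i the strike spacing, Q(K_i) the mid-quote of the out-of-the-money option at strike K_i, R the risk-free rate and T the time to expiration; the published index interpolates this quantity between the two nearest expirations to a 30-day horizon.

```latex
\sigma^2 = \frac{2}{T}\sum_i \frac{\Delta K_i}{K_i^2}\, e^{RT} Q(K_i)
         \;-\; \frac{1}{T}\left(\frac{F}{K_0} - 1\right)^2,
\qquad
\mathrm{VIX} = 100 \cdot \sqrt{\sigma^2_{30\text{-day}}}
```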

This paper focuses on one of the most heavily tested issues in contemporary finance, the efficient market hypothesis (EMH). The existing evidence in the literature is ambiguous: for some markets the departure from efficiency is observed only when high-frequency (HF) data are analysed. We therefore verify the EMH basing our analysis on 5-minute data for WIG20 index futures quoted on the Warsaw Stock Exchange (WSE). We use robust regression, which assigns higher weights to better-behaved observations, in order to verify the existence of daily and hourly effects. Our results indicate that both a day-of-the-week effect and an hour-of-the-day effect are present. More importantly, we find a strong open-jump effect for all days except Wednesday and a positive day effect for Monday. As regards the hour-of-the-day effect, we observe a positive, persistent and significant open-jump effect and an end-of-session effect. These results confirm our initial hypothesis that the Polish stock market is not efficient in the informational sense.
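
The calendar-effect test can be sketched as a robust regression of intraday returns on day-of-week dummies; the use of Huber weights via statsmodels' RLM mirrors the idea of down-weighting badly behaved observations, though the exact estimator and dummy specification used in the paper may differ.

```python
import pandas as pd
import statsmodels.api as sm

def day_of_week_effects(returns: pd.Series):
    """Robust regression of 5-minute returns on day-of-week dummies (Monday is the base).

    `returns` must have a DatetimeIndex; an hour-of-the-day test would use
    `returns.index.hour` instead of `dayofweek`.
    """
    dummies = pd.get_dummies(returns.index.dayofweek, prefix="dow", drop_first=True)
    dummies.index = returns.index
    X = sm.add_constant(dummies.astype(float))
    model = sm.RLM(returns, X, M=sm.robust.norms.HuberT())
    return model.fit().summary()
```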