Uncertainty-Aware Lookahead Factor Models for Improved Quantitative Investing

Jul 12, 2020

About

On a periodic basis, publicly traded companies are required to report fundamentals: financial data such as revenue, earnings, debt, etc., providing insight into the company’s financial health. Quantitative finance research has identified several factors—computed features of the reported data—that have been demonstrated in retrospective analysis to outperform market averages. In this paper, we first show through simulation that if we could (clairvoyantly) select stocks using factors calculated on future fundamentals (via oracle), then our portfolios would far outperform a standard factor approach. Motivated by this analysis, we train MLP and LSTM neural networks to forecast future fundamentals based on a trailing window of five years. We propose lookahead factor models to act upon these predictions, plugging the predicted future fundamentals into traditional factors. Finally, we incorporate uncertainty estimates from both neural heteroscedastic regression and a dropout-based heuristic, demonstrating gains from adjusting our portfolios to avert risk. In a retrospective analysis using an industry-grade stock portfolio simulator (backtester), we show simultaneous improvement in annualized return and Sharpe ratio (a common measure of risk-adjusted returns). Specifically, the simulated annualized return for the uncertainty-aware model is 17.7%.
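
The two uncertainty estimates mentioned in the abstract can be illustrated with a short sketch. The code below is a minimal, hypothetical PyTorch example, not the authors' implementation: it pairs a small forecasting network with (a) a heteroscedastic Gaussian negative log-likelihood, where the network predicts a per-example variance, and (b) Monte Carlo dropout, where repeated stochastic forward passes provide a spread-based uncertainty heuristic. The names HeteroscedasticMLP, gaussian_nll, and mc_dropout_uncertainty are invented for illustration.

    import torch
    import torch.nn as nn

    class HeteroscedasticMLP(nn.Module):
        """Toy forecaster: maps a window of past fundamentals to a mean and
        a log-variance for one future fundamental (hypothetical example)."""
        def __init__(self, in_dim, hidden=64, p_drop=0.1):
            super().__init__()
            self.body = nn.Sequential(
                nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p_drop),
                nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            )
            self.mean_head = nn.Linear(hidden, 1)
            self.logvar_head = nn.Linear(hidden, 1)

        def forward(self, x):
            h = self.body(x)
            return self.mean_head(h), self.logvar_head(h)

    def gaussian_nll(mean, logvar, target):
        # Heteroscedastic regression loss: Gaussian negative log-likelihood
        # with a variance predicted separately for every example.
        return (0.5 * (logvar + (target - mean) ** 2 / logvar.exp())).mean()

    @torch.no_grad()
    def mc_dropout_uncertainty(model, x, n_samples=30):
        # Dropout-based heuristic: keep dropout active at inference time and
        # use the spread of repeated predictions as an uncertainty estimate.
        model.train()
        preds = torch.stack([model(x)[0] for _ in range(n_samples)])
        return preds.mean(0), preds.std(0)

Under a setup like this, either uncertainty signal could be used to down-weight or exclude stocks whose forecast fundamentals are least certain, which is the risk-averse portfolio adjustment the abstract describes.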

About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.
