Data > Information > Insight > Action

by Mario Fisher and Mayuresh Kulkarni | 5 Nov 2023 | Investments, Old Mutual, Q4 2023

Mayuresh Kulkarni, Quantitative Analyst, and Mario Fisher, Head of Quantitative Research, Old Mutual Investment Group

In today’s world, there is an explosion in the amount and types of data available to the average investor. The consequence is a greater need for tools that can process large volumes of data and convert them into consistently good insights quickly enough to beat the markets.

Data, information, insight and action are guiding principles for quantitative investors, reminders of what we should be spending our time on. All the statistical models, team meetings, financial reports, market news and everything else that we get bombarded with daily as investment professionals can be reduced elegantly to these four words. And the flow between these concepts matters, because it is only by moving from one stage to the next that raw data becomes an informed decision.

Dimensionality of data

Investing on a global scale requires processing a lot of data. This can be trade data, such as price and volume; fundamental data, such as company financials; metadata, such as sector, industry or country classifications; topical data, such as news articles and editorials; and novel sources of data, such as satellite images of shopping mall parking lots or credit card receipts. Different analyses and backtests need different levels and types of data. The dimensions of the data can change with the investible universe, and multiple funds may require overlapping dimensions of data. Ultimately, raw data is usually unusable and requires processing.

Using the momentum style as an example: calculating momentum for global stocks requires thousands of data points per day. Momentum implies that the stocks that have done well in the recent past (the past three to twelve months) will continue to do well over the next period (say, the next month). The data requirement grows substantially once we want to test the momentum hypothesis over history. A high-quality, fast proprietary database, coupled with robust data engineering, is needed to ensure fast and reliable data.
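
As a minimal sketch of what that calculation looks like (assuming a daily pandas price series and the common convention of skipping the most recent month), a 12-1 momentum signal for a single stock could be computed as follows:

```python
import pandas as pd

def momentum_12_1(prices: pd.Series) -> float:
    """12-1 momentum for one stock: the return from roughly twelve
    months ago to roughly one month ago, skipping the most recent
    month. `prices` is a hypothetical daily price series indexed by
    date, with at least a year of history assumed."""
    price_1m_ago = prices.iloc[-21]    # ~21 trading days in a month
    price_12m_ago = prices.iloc[-252]  # ~252 trading days in a year
    return price_1m_ago / price_12m_ago - 1.0
```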

Distilling the data

In quantitative investment strategies, getting from data to information means processing the data. In the case of the momentum factor, we may need to compare the prices of stocks today with their prices three (or six, or twelve) months ago. In a global investing universe, this means the same calculation is done on thousands of stocks every day. The sheer volume of data handled in these calculations means that simple tools like Microsoft Excel won’t cut it. We need programming languages that let us automate the calculations and perform them quickly.
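
One way this automation might look, assuming a hypothetical wide pandas DataFrame of daily prices (dates as rows, stocks as columns), is a single vectorised function that scores the whole universe at once:

```python
import pandas as pd

def momentum_signals(prices: pd.DataFrame,
                     lookback_days: int = 252,
                     skip_days: int = 21) -> pd.Series:
    """Score every stock in the universe in one pass.
    `prices`: daily prices with dates as rows and stocks as columns
    (hypothetical layout; enough history for the lookback is assumed)."""
    # Price roughly one month ago divided by price roughly a year ago
    raw = prices.iloc[-skip_days] / prices.iloc[-lookback_days] - 1.0
    # Rank cross-sectionally so scores are comparable across stocks
    return raw.rank(pct=True)
```

The same function covers ten stocks or ten thousand, where the spreadsheet equivalent would be thousands of formulas maintained by hand.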

The database that stores the raw data can also store the calculated data, giving a single place of storage and less chance of error. This single source of truth also means that data is not transferred between team members in files that can easily be modified. Calculations can be performed as soon as data becomes available, and the calculated data can be made available on demand.
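
A minimal sketch of that idea, using a hypothetical SQLite database with illustrative `prices` and `signals` tables (not a production schema):

```python
import sqlite3
import pandas as pd

def update_signals(db_path: str = "research.db") -> None:
    """Read raw prices from the shared database, compute the momentum
    signal, and write the result back to the same database, so every
    team member reads the same calculated data."""
    with sqlite3.connect(db_path) as conn:
        # Raw data: one row per (date, stock) closing price
        prices = pd.read_sql("SELECT date, stock, close FROM prices", conn)
        wide = prices.pivot(index="date", columns="stock", values="close").sort_index()

        # Calculated data: 12-1 momentum per stock as of the latest date
        signal = wide.iloc[-21] / wide.iloc[-252] - 1.0
        out = signal.rename("momentum").reset_index()
        out["as_of"] = wide.index[-1]

        # Derived data is stored alongside the raw data it came from
        out.to_sql("signals", conn, if_exists="append", index=False)
```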

Data over dogma

Cliff Asness, founder of AQR Capital Management, once told a story about discussing the concept of ‘momentum’ in equity markets with his PhD supervisor, Eugene Fama, known for the Efficient Markets Hypothesis. Although Fama is a strong proponent of value investing, his response was straightforward: “If it’s in the data, write the paper.” This anecdote underscores the significance of empirical evidence in research, even when it challenges existing theories.

So far in this article, we have taken it for granted that equity momentum exists and works. The way to test it is to move to the third step of the guiding principles: using data from different countries and time periods and running checks that confirm it. Gaining insights from data is one of the most important steps we can take as investment professionals. Formulating deep, meaningful insights from data helps us understand market conditions and position our portfolios better. A further example is work done on a global cycle indicator, which is updated automatically to provide insights for portfolio positioning based on the economic phase the indicator currently points to. These insights are all continuously tested and updated automatically, with the system operating like a growing knowledge bank of innovative research and insights.
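
As a rough illustration of such a check (the long-format layout and column names are assumptions for the sketch), one could compare the subsequent returns of high- and low-momentum stocks within each country and year:

```python
import pandas as pd

def momentum_spread(panel: pd.DataFrame) -> pd.DataFrame:
    """`panel` is long-format data with one row per (date, stock) and
    columns 'date', 'country', 'momentum' (the signal) and 'fwd_return'
    (the following month's return). Returns the average return spread
    between the top and bottom momentum quintiles, by country and year."""
    panel = panel.copy()
    panel["year"] = pd.to_datetime(panel["date"]).dt.year
    # Quintile buckets of the momentum signal within each date
    panel["bucket"] = panel.groupby("date")["momentum"].transform(
        lambda s: pd.qcut(s, 5, labels=False)
    )
    top = panel[panel["bucket"] == 4]
    bottom = panel[panel["bucket"] == 0]
    spread = (
        top.groupby(["country", "year"])["fwd_return"].mean()
        - bottom.groupby(["country", "year"])["fwd_return"].mean()
    )
    return spread.unstack("country")  # rows: years, columns: countries
```

If momentum is real rather than a quirk of one market or one decade, the spread should be positive across most countries and years.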

Creating cutting-edge technology

Given the scope of data that investors need to process, a proprietary system of cutting-edge technology is necessary for analysts and portfolio managers to gain a competitive advantage. At a minimum, investment teams should be supported by well-structured databases, robust analytics engines and a front-end that any team member can access from any device. This gives them the best chance of making the right decisions about investing their clients’ capital. Portfolio positioning decisions can then be taken after distilling large amounts of information quickly and accurately, with systems that can be modified as needed to produce market-beating returns in their portfolios.
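
As a simple illustration of the front-end idea (the endpoint, database and schema here are hypothetical), calculated signals could be exposed through a small web service that any team member can query from a browser or script:

```python
import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)
DB_PATH = "research.db"  # the shared research database (illustrative name)

@app.route("/signals/<stock>")
def latest_signal(stock: str):
    """Return the most recently calculated signal for one stock, so a
    portfolio manager can query it from any device."""
    with sqlite3.connect(DB_PATH) as conn:
        row = conn.execute(
            "SELECT as_of, momentum FROM signals "
            "WHERE stock = ? ORDER BY as_of DESC LIMIT 1",
            (stock,),
        ).fetchone()
    if row is None:
        return jsonify({"error": f"no signal for {stock}"}), 404
    return jsonify({"stock": stock, "as_of": row[0], "momentum": row[1]})

if __name__ == "__main__":
    app.run()
```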

The idea of data-driven decisions comes to life through this approach and the only limitation is our own imagination.

Mario Fisher
Head of Quantitative Research at Old Mutual Investment Group
Mayuresh Kulkarni
Quantitative Analyst at Old Mutual Investment Group