Time For An Alternative Data Reality Check


The following is a guest article from Ankit Sahni, President & Head of Research at Exante Data Inc, a firm providing alternative data and analysis to large macro hedge funds and asset managers.

Buzzwords such as “big data” and “machine learning” have permeated the investing world in a significant way in recent years. But, while most in the market are clearly excited about the possibility of incorporating these in their investment process, actual implementation is much easier said than done.

We often get asked why (and how) some firms are able to handle this transition better than others – how some manage to use data analysis as a source of competitive advantage while others struggle to do so, despite throwing significant amounts of money at it.

The overall answer lies in accepting the new, multi-faceted reality and adapting to it – and in realizing that there is no silver-bullet solution or shortcut for revamping any firm’s investment process. Each firm and investment process is different in terms of starting point, flexibility and resources, and should be handled accordingly.

But we have found that there are some facts that apply quite broadly, and need to be internalized at the outset.

Dis-economies of scale

The first such fact is that the regulatory environment and organizational inertia have created a world with dis-economies of scale in incumbent organizations’ ability to build new data and expertise. This is exacerbated by the advent of cloud computing and easy access to capital for startups, which give small, nimble companies a level playing field in accessing crucial technological infrastructure. Unencumbered by organizational inertia, these startups are then able to look for truly innovative solutions to market questions.

The best incumbent firms have adjusted to this by building specific core competencies in-house and working with reliable external providers to outsource the rest.

Is alternative data truly big and revolutionary?

Another key shift in this new paradigm is to simultaneously acknowledge the power and limitations of data analysis in finance. In some respects, the debates around the advent of new technology can become somewhat like those around religion – investors either believe it will solve all their problems or they want to shun it altogether, with both groups unable to see things from the other’s perspective.

Not surprisingly, the truth is somewhere in the middle.

While ‘big data’ proponents (including us) are quick to tout the power of machine learning algorithms and advances in other fields, financial markets are different.

Any algorithmic output can only be as good as the input data it sees, both in terms of size and quality. In finance, there is a shortage of truly “big” fundamental data, especially in the macro space. Economies move slowly, with significant short-term volatility. As an example, if we take all the US economic data releases since 1914 – data that has had significant market impact over the years – the combined history falls well short of even the loosest definitions of ‘big data’, and does not even register relative to the data sets for which modern machine learning algorithms are designed. Even tick-by-tick market data pales in comparison.
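To make that scale gap concrete, here is a rough back-of-envelope sketch in Python. The series count and comparison figures are illustrative assumptions, not precise statistics – the only point is the order of magnitude.

```python
# Rough back-of-envelope sketch: all counts below are illustrative assumptions,
# chosen only to show the order-of-magnitude gap described above.

YEARS = 105                 # roughly 1914 to the late 2010s
MONTHLY_SERIES = 50         # assumed number of widely watched monthly US releases

macro_observations = YEARS * 12 * MONTHLY_SERIES
imagenet_images = 1_200_000          # ImageNet-1k training set, for comparison
web_scale_examples = 1_000_000_000   # assumed order of magnitude of web-scale text/click corpora

print(f"US macro observations (assumed):  ~{macro_observations:,}")   # ~63,000
print(f"ImageNet training images:         ~{imagenet_images:,}")
print(f"Web-scale corpora (order of mag): ~{web_scale_examples:,}")
```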

Even beyond that, financial markets and economies are dynamic, in terms of the evolution of various interactions and interrelationships. Almost by definition, simple observable relationships in markets have a limited half-life once they become widely understood. In other words, the markets game is different from Chess or Go, because the rules of the game are not even remotely static. This violates the basic (implicit) assumption in all of machine learning – that the future operates with the same rules as the past.
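A toy simulation illustrates what shifting rules mean in practice. The betas and regimes below are entirely made up; the only purpose is to show how a relationship fitted on past data can stop working once the regime changes.

```python
# Toy illustration (not any firm's actual method): a relationship that "worked"
# in the training sample stops working once the regime shifts.
import numpy as np

rng = np.random.default_rng(0)

# Regime 1: signal x predicts returns y with a positive beta
x_old = rng.normal(size=500)
y_old = 0.8 * x_old + rng.normal(scale=0.5, size=500)

# Regime 2: the relationship has decayed and flipped after becoming widely known
x_new = rng.normal(size=500)
y_new = -0.2 * x_new + rng.normal(scale=0.5, size=500)

beta, intercept = np.polyfit(x_old, y_old, deg=1)   # "learn" on the past

def r2(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

print(f"In-sample R^2 (old regime):     {r2(y_old, beta * x_old + intercept):.2f}")
print(f"Out-of-sample R^2 (new regime): {r2(y_new, beta * x_new + intercept):.2f}")
```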

Man and machine: Learning from each other

The implication of all of this is that blindly throwing ever more complex machine learning algorithms at fancy-sounding data sets is a clear recipe for disaster. However, not using the power of new data and algorithms at all is also a mistake.

The right choice is almost always more subtle. High quality data and analytics, as they exist today, are best equipped to supplement (not supplant) existing investment processes that rely on human creativity.

Financial markets and the economy are complex, dynamic puzzles with ever-changing rules. The current state of machine learning and AI does not get us close to solving these. Indeed, it seems possible (likely) that this is not a truly solvable puzzle at all.

The reality to accept here is that the combined skills of man and machine are what is likely to come closest, with both learning from each other.

More specifically, the key is to use advanced analytics to improve the quality of data provided to human decision makers – i.e. to combine the relative strengths of man and machine. A machine’s ability to scan large sets of disparate data dispassionately and flag outliers is almost without parallel. But the human ability to view these outliers in the right context is where true insight is generated.
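A minimal sketch of that division of labor might look like the following; the series, window and threshold are hypothetical. The machine flags statistical outliers, and a human analyst decides which flags represent genuine shifts worth acting on.

```python
# Minimal sketch: the machine scans a series dispassionately and flags
# statistical outliers; a human then reviews the flags in context.
# The series, window and threshold are hypothetical.
import numpy as np
import pandas as pd

def flag_outliers(series: pd.Series, window: int = 60, threshold: float = 3.0) -> pd.DataFrame:
    """Flag points more than `threshold` rolling standard deviations from the rolling mean."""
    mean = series.rolling(window).mean()
    std = series.rolling(window).std()
    z = (series - mean) / std
    flagged = series[z.abs() > threshold]
    return pd.DataFrame({"value": flagged, "z_score": z[flagged.index]})

# Hypothetical daily indicator with two injected shocks for illustration
rng = np.random.default_rng(1)
values = rng.normal(size=1000)
values[[400, 750]] += 8.0                     # artificial outliers
series = pd.Series(values, index=pd.date_range("2015-01-01", periods=1000, freq="B"))

print(flag_outliers(series))                  # candidates handed to a human analyst
```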

Such combinations of humans and algorithms are the best source of alpha, and the way forward (at least for now).


About Author

Ankit Sahni is the President & Head of Research at Exante Data. Prior to Exante, Ankit was the Head of US Strategy at Prologue Capital, a global macro hedge fund, and a Macro Strategist at Nomura Securities. At Exante, Ankit and the team analyze market-relevant macro data and build tools and frameworks to create reasoned answers to the most complex market questions. The firm employs market analysts, economists, programmers and data scientists to empower market participants by providing the highest quality proprietary data and analysis. Founded in 2016, the firm already counts a sizable number of the top names in the global macro space as clients.
