Benjamin Graham Meets Advanced Analytics


The following article is by Gabe Lowy, a former sell-side technology analyst who is founder of TechTonics Advisors, a provider of strategic communication for technology companies.

Benjamin Graham’s The Intelligent Investor is considered by many, including his disciple Warren Buffett, to be the finest book ever written on the subject of investing. At the heart of Graham’s thesis is the question, “Can investors uncover intrinsic value in specific securities if markets are truly efficient?” The challenges of stock selection – and of outperforming the market – lie in this question.

If the market is efficient, then all of the factors driving every company’s past and current performance – as well as expectations for its future results – are already known and priced in. Yet markets are as volatile as they have ever been. It would follow, then, that these price swings are driven by new information that is not widely known or anticipated.

Some would suggest that today’s price swings are caused by algorithmic trading, which did not exist in Graham’s time. While computer-based trading does account for at least half of current volume, most algorithms are driven by models created by individuals who define specific input variables. Those variables are extensively researched and refined, and the resulting models can produce outsized returns. But when the data the models are built on is inaccurate and their assumptions mirror one another, the result can be flash crashes.

Another explanation for stock price movements is the shift toward indexing – which has grown to 35% of assets in stock mutual funds and ETFs from 25% five years ago. Yet the underlying composition of many indexes and ETFs is also determined by individuals. These too are based on sound research principles applied to various input variables. However, as we’ve recently seen, they are not a safe haven during market volatility and could potentially exacerbate wild swings.

So with thousands of participants analyzing markets, sectors and individual securities, the price of a security would reasonably be expected to reflect the consensus opinion. Under the efficient market theory, choosing one security (or index fund or ETF) over another would be based on personal bias toward certain factors or on expectations of future performance. The more popular a security is, the more widely known the factors underpinning its price movements would be.

Could price movements then be attributable simply to randomness or plain luck?

Or is there another explanation?

Despite vast amounts of resources devoted to researching ever-growing data sources, the majority of professional money managers consistently underperform the market. Over the past ten years, over 75% of active U.S. equity funds underperformed the S&P Composite 1500, a percentage similar to previous decades.

A key to Graham’s approach is that intelligent investing requires an “adequate knowledge” of how securities have performed under different circumstances. And although he says that “nothing on Wall Street can be counted on to occur exactly in the same way as it happened before”, Graham suggests that some of these circumstances are likely to recur.

The problem is that most fund managers cannot accurately or consistently identify the input variables that drive portfolio performance. Thus, while the costs for data sources and management continue to rise, the return on that “research” continues to decline. Most portfolios are constructed without “adequate knowledge”.

In the era of indexing and ETFs, Graham’s approach of finding neglected sectors and securities selling at discounts to their net asset value is as valid as ever. But while the principles of securities analysis remain intact, the number of input variables that need to be researched and analyzed has exploded. And the speed at which these data points are generated and travel is unprecedented. Aggregating and analyzing all this data has grown beyond the scope of human capabilities – even one as brilliant as Benjamin Graham.

Adding Advanced Analytics to the Equation

The technologies behind advanced analytics, such as parallel processing, machine learning and data visualization, make it possible for fund managers to remain true to their investment style. They can now ingest, process and analyze massive data sets in much shorter time. Factors driving performance can be rapidly identified, so that higher quality data is used to validate models and test strategies.

Advanced analytics can improve portfolio performance while reducing research costs and lifting productivity. Qualitative fundamental knowledge is complemented by objective quantitative analysis for a 360-degree view of securities. Fund managers gain a deeper understanding of how securities behaved in past scenarios, allowing more accurate forecasts of how they might perform when similar conditions appear. Modern technology also allows faster allocation and rebalancing, with more accurate data identifying key performance variables.

In addition to better performance, advanced analytics can improve product and service quality and ensure adherence to GRC (governance, risk and compliance) mandates. It facilitates a shift in business models from high-cost, low-value proprietary products to low-cost, high-value fiduciary services. Firms can gain deeper insights into customer preferences to develop tailored solutions. Coming MiFID II and “best interest” regulations will disrupt product research and delivery processes. Firms can use advanced analytics to monitor patterns in customer behavior, as well as in the behavior of internal managers and traders.

Combining Benjamin Graham’s investment principles with modern technology tools can help firms differentiate by focusing on longer-term strategies and outcomes. They can still incorporate indexing and ETFs to meet demand, but with a deeper understanding of the underlying securities. Fund managers can uncover intrinsic value in neglected markets or securities faster and more accurately.

Managers who evolve will achieve the margin of safety Graham speaks of by basing decisions on “simple and definite arithmetic reasoning from statistical data”. That and the courage of conviction will determine their success.
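Graham’s “simple and definite arithmetic reasoning from statistical data” can be illustrated with his well-known rule of thumb from The Intelligent Investor: a defensive stock’s price should not exceed 15 times earnings or 1.5 times book value, which combine into the so-called Graham number. The figures below are a hypothetical example, not a recommendation.

```python
from math import sqrt

def graham_number(eps: float, book_value_per_share: float) -> float:
    """Graham's screen: P/E <= 15 and P/B <= 1.5 imply
    price^2 <= 15 * 1.5 * EPS * BVPS = 22.5 * EPS * BVPS."""
    return sqrt(22.5 * eps * book_value_per_share)

def margin_of_safety(price: float, estimated_value: float) -> float:
    """Discount of the market price to the estimated value."""
    return 1 - price / estimated_value

# Hypothetical company: EPS of $4.00, book value of $30.00 per share,
# currently trading at $36.00.
value = graham_number(4.0, 30.0)      # sqrt(22.5 * 4 * 30) = sqrt(2700)
mos = margin_of_safety(36.0, value)
print(round(value, 2), round(mos, 3))
```

At those assumed inputs the stock trades roughly 30% below its Graham-number value – the kind of definite, checkable arithmetic Graham favored over forecasts.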


About Author

Gabriel Lowy is the founder of TechTonics Advisors, a strategic communications consultancy for technology companies that bridges vision, strategy, portfolio and markets with customers and investors to build value for all stakeholders. TechTonics synthesizes complex concepts and technologies into clearly understood and actionable value messages. During the previous 15 years, Gabe was a multiple award-winning technology analyst at several Wall Street firms. He began his career with Andersen Consulting. Gabe has published over 1,000 reports and articles on different sectors and companies in the technology industry. Follow Gabe on Twitter @GabrielLowy1 and learn more at
