Upbeat Equities Environment

April 23rd, 2014

Earnings reports from major investment banks are showing continued improvement in the equities environment during the first quarter of 2014.  The longer term trend also suggests a more stable context for equity research.

Quarterly equities growth

Bank of America, Morgan Stanley, Credit Suisse, and JP Morgan have reported sequential growth in their equities revenues, and all except JP Morgan posted higher equities revenues than in the first quarter of 2013.  Citigroup was the outlier.  Although its overall earnings report was well received, its equities business was down 12% relative to the comparable quarters.

Goldman Sachs' equity revenues were also down in aggregate, which it attributed to the sale of its reinsurance business and lower derivatives revenues.  However, its commission revenues were up 4% compared to the first quarter of last year and 11% sequentially.

The average for the banks reporting so far shows slight growth over the first quarter of 2013 and a robust 21% uptick from the fourth quarter.  Still to report: Barclays, Deutsche Bank and UBS.

Commissions as a component of equities

Unlike its competitors, Goldman provides detail on the components of its equities revenues.  Its report of equity commission revenues excludes the ‘noise’ from other equities products less directly related to investment research.  This is a reminder that although overall cash equities revenues are the best metric we have, they are not a perfect proxy for commission revenues.

The negative press surrounding high frequency trading has led to speculation that banks may be under pressure to close their dark pools.  During the first quarter conference call, Goldman’s CFO told analysts that there were “no strategic plans” to close its Sigma X pool.  Closure of dark pools would negatively impact overall equities revenues, but not the commission revenues tied to investment research.

Equities trend

Looking at the longer term trend in equities revenues for one of the top equities houses, Credit Suisse, we see a near doubling of equities revenues in the run-up to the financial crisis, followed by an equally quick decline.  By 2011, Credit Suisse’s equity revenues were back to 2005 levels.  However, since then equities revenues have stabilized and showed moderate growth (10%) in 2013.  (Note: 1 CHF = US$1.13 and has trended around US$1.)

The quarterly revenue picture is more complex, but it paints a similarly encouraging picture of an improving near-term trend.  The quarterly revenues also show decreasing volatility, which is the trend across all the major investment banks.

The quarterly pattern also suggests some seasonality to the revenues with stronger revenues earlier in the year and weaker results later in the year.

Conclusion

Overall, it is premature to break out the champagne.  Not all banks have reported yet, so the picture could alter.  More importantly, past experience tells us that equity revenues are volatile and a strong first quarter does not necessarily presage a strong year.  Nevertheless, the overall trend suggests a more stable equities environment, at least for the leading equities players, which is a positive for the research segment.


Big Data and Investment Research: Part 2

April 21st, 2014

Last week we wrote about why buy-side firms are considering adding “big data” analytic techniques to their research process.  This week, we will investigate the impact that “big data” is now having on the sell-side and alternative research industry.

The “Big Data” Evolution

As we mentioned last week, the term “big data,” when applied to investment research, refers to the practice of regularly collecting, storing and analyzing huge amounts of structured and unstructured data sources to generate predictive insights, with the sole purpose of generating consistent investment returns.

Despite pronouncements to the contrary, this is not really a new phenomenon in the research business, as most of the best sell-side and alternative research firms have been doing just this for many years — albeit to a more limited extent.  Countless research analysts have conducted formal channel checks, surveys, or shopping bag counts; licensed unique commercial data feeds; warehoused this data; and analyzed these unique data points as part of their research process.  In fact, some research franchises, like Midwest Research and its many spinoffs, were widely known for this type of data-driven research process.

However, in recent years, as computing power has grown, computer storage costs have fallen, and the internet has exponentially increased the availability of both structured and unstructured data, both the capability and the interest in expanding the data-driven research model have increased among buy-side research consumers and the third-party research producers that serve them.

History of “Big Data” Investment Research

As mentioned in last week’s article, one of the first third-party “big data” firms to serve institutional investors was Majestic Research, founded in 2002, which collected, warehoused, and analyzed a number of unique data sets to identify investable signals for asset managers.  However, Majestic Research found this business model difficult to maintain, as the quantitatively oriented hedge funds that initially used the service felt the predictive edge of the analysis deteriorated as more investors used it.  In other words, Majestic Research could not scale its research business.

In response, the firm decided to hire fundamental securities analysts who could leverage its proprietary data and statistical analysis to produce data-driven fundamental research.  It found the market for this type of research was much broader than pure quantitative investors, and these buy-side clients were less worried that the data was too widely distributed.  They valued the insight Majestic provided more than the underlying data that led to that insight.  As discussed last week, Majestic was acquired by ITG in 2010, and this data-driven research has been the foundation of ITG’s research product ever since.

However, other firms saw that the Majestic model could enhance the value of the traditional fundamental research product.  The largest single “big data” research initiative was rolled out by Morgan Stanley in 2008 with its AlphaWise unit.  Initially, AlphaWise conducted customized primary research for hedge funds and asset managers on a fee basis, including market research, web bots, and expert surveys.  Eventually, however, AlphaWise morphed into a unique data-driven research product (Morgan Stanley calls it evidence based) that the firm’s clients could access, built on hundreds of terabytes of proprietary survey and web-scraped data.

Then in 2009, San Francisco-based alternative research firm DISCERN started up to build a “big data” research model to meet the varied needs of different types of institutional investors.  As mentioned last week, DISCERN is an institutional equity research firm covering a wide range of sectors, including specialty pharmaceuticals, financial services, energy, and real estate investment trusts.  Its research is based on the statistical analysis of huge amounts of structured and unstructured data to identify unique investment signals, which it then overlays with contextual insights from experienced sell-side analysts to identify key trends, market inflection points, and other potential investable opportunities.  DISCERN provides a wide range of unbundled services, allowing buy-side clients to license its data, the predictive signals it has developed, or the extensive research it generates.

Consequences for the Research Industry

So, does the adoption of “big data” by these three research firms have any long-term consequence for the research industry?  We think so.  Clearly, these three firms have found extensive market acceptance of their data-centric research products.  Based on our own analysis, DISCERN has been one of the faster-growing alternative research firms in the industry over the past few years.  In addition, research produced by the AlphaWise team has been some of the most highly demanded research produced by Morgan Stanley.

If investment research firms want to compete with growing research firms like Majestic (now ITG), Morgan Stanley’s AlphaWise, or DISCERN, they are going to have to make significant changes to their traditional research businesses.  They will need to invest in building (or renting) scalable data warehouses and other technical infrastructure.  In addition, they will need to start thinking about what public data they want to collect; what commercial data they want to license; and more importantly, what proprietary data they should create by conducting longitudinal surveys, implementing web scraping technology, or utilizing other primary research techniques.  They will also need to hire data scientists or analysts skilled in building predictive analytics to work alongside their traditional analysts with deep company or industry expertise.

In addition, they will need to think through their business model and product set.  Do they want to sell only access to their analysts and the research reports they write, as they did in the past?  Or do they want to unbundle their research process and allow customers to cherry-pick exactly what they want, including data, investment signals, full research reports, custom work, etc.?

And of course, how do they want their clients to pay for this research – through hard dollar subscriptions, CSAs, trading, or some combination of the above? It is interesting to note that currently, all three of the companies mentioned earlier in this article, ITG, Morgan Stanley and DISCERN enable their clients to pay for these “big data” research services by trading through their trading desks.

Winners & Losers in “Big Data” Research?

Given the significant financial investment required in people, data, and technology, we suspect the obvious winners in rolling out “big data” research services are likely to be the sell-side investment banks.  Clearly, many of these firms have produced data-driven research in the past, based on the analysts they have hired.  The move to a “big data” focus will really be a commitment on the part of the firm to basing all of its investment research on a deep statistical analysis of underlying data sources.

However, it is important to note that many sell-side firms will not choose to make the switch to a “big data” based research process, nor will everyone that tries to do so succeed.  One of the major impediments to success is a bank’s unwillingness to make the financial commitment necessary to succeed.  Certainly, the downturn in equity commissions over the past five years and the struggle to make equity research pay could convince many management teams that an investment in “big data” research is just too risky, with no obvious payoff.  Another reason some firms will fail in their big data efforts is an inability to adapt the firm’s culture to this new approach to investment research.

So, can alternative research firms succeed in developing “big data” research services?  We think so, though we do not think it will be a quick road to success.  In the past, both Majestic Research and DISCERN were alternative research firms that became successful in this space, though in each case, the firms developed their coverage and their product offering incrementally rather than trying to roll out broad coverage at the outset.

Similarly, we suspect that other alternative research firms will be successful in the future by initially focusing on a limited number of sectors, based on a discrete set of proprietary data they have collected and a few predictive signals they have developed.  These firms will be able to expand their businesses as they gain commercial traction by adding new sectors, more proprietary data, and additional analytic signals.

Another possible winner in the “big data” research space could be existing data aggregators or predictive analytics providers who decide to move up the value chain by hiring investment research analysts to add context and investment insight to their data products.  Unfortunately, we suspect that very few data aggregators or analytics providers will take the risk of stepping outside their domain expertise to enter the investment research business.

Summary

In our view, the acceptance of “big data” techniques in the investment research industry is a foregone conclusion.  Buy-side investors have increasingly exhibited their appetite for unique datasets and for data-driven research over the past decade.  To meet this hunger, a number of sell-side and alternative research firms have well-developed efforts under way, and we suspect that a few significant new entrants are likely to announce initiatives in 2014.

The real question remaining is how existing research firms plan to respond to this trend.  As we mentioned last week, the adoption of “big data” techniques enables a firm to develop a proprietary edge, whether through the collection of unique datasets or the development of proprietary signals that have predictive value.  We believe that as more research firms adopt a “big data” oriented research process, it will become increasingly hard for traditional research firms with no discernible proprietary edge to compete.  The days when a research firm could succeed solely on the basis of having a cadre of analysts with deep industry experience might be over.

 


High Frequency Research: How HFT Impacts Investment Research

April 16th, 2014

The hubbub over high frequency trading (HFT) has implications for investment research: the future direction of HFT will directly shape research, and the Michael Lewis phenomenon is a cautionary tale for the research industry.

Low carb trading

It is estimated that HFT accounts for about half of trading in the US (down from two-thirds).  From the perspective of research, HFT is low-carb trading.  In other words, it doesn’t pay the research bills because it is too low margin and HFT firms don’t use Wall Street research.

From the HFT perspective, research is one of the obstacles to HFT growth.  In Michael Lewis’s controversial new book, the founder of IEX, a new exchange designed to counteract the evils of HFT, was challenged by buy-side inertia in directing trades because of the need to pay for research.  According to Lewis, the buy-side traders were outraged by HFT predatory tactics, yet continued to send orders to the HFT-infested dark pools operated by Wall Street banks like lambs to the slaughter.  All because of bank research.

The rise of high frequency research

What Lewis does not mention is that HFT has declined from its heyday, when it accounted for two-thirds of US market trading.  In 2009, high-frequency traders moved about 3.25 billion shares a day.  In 2012, it was 1.6 billion a day, according to a Bloomberg BusinessWeek article.  Volumes have declined because fierce competition among HFT firms has compressed margins: average profits have fallen from about a tenth of a penny per share to about a twentieth of a penny.
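
To put those figures in perspective, here is a back-of-envelope sketch using the approximate numbers cited above (the per-share profit figures are industry estimates, not exchange data):

    # Rough estimate of aggregate daily HFT profits, using the approximate
    # volume and per-share profit figures cited above.
    shares_2009 = 3.25e9   # shares moved by high-frequency traders per day, 2009
    shares_2012 = 1.6e9    # shares moved per day, 2012
    profit_2009 = 0.001    # ~a tenth of a penny per share, in dollars
    profit_2012 = 0.0005   # ~a twentieth of a penny per share, in dollars

    daily_2009 = shares_2009 * profit_2009   # about $3.25 million per day
    daily_2012 = shares_2012 * profit_2012   # about $0.8 million per day

    print(f"2009: ${daily_2009/1e6:.2f}M/day, 2012: ${daily_2012/1e6:.2f}M/day")
    print(f"Implied decline: {1 - daily_2012/daily_2009:.0%}")   # roughly 75%

The combination of lower volumes and thinner margins implies aggregate daily HFT profits fell by roughly three-quarters over the period.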

Lewis misses the fundamental point about HFT: it is simply automated trading.  Yes, HFT involves predatory practices, but that is not its core.  The core is computerized trading.  An unnerving aspect of HFT given surprisingly short shrift by Lewis is the frequency of flash crashes, which occur regularly but so quickly that humans can’t detect them.  Check out this excellent TED talk on the subject:  http://youtu.be/V43a-KxLFcg

For researchers, it is worth noting the current direction of HFT: high frequency research (HFR).  HFTs are using sophisticated programs to analyze news wires and headlines to instantly interpret implications. Some are scanning Twitter feeds, as evidenced by the Hash Crash, the sudden selloff that followed the Associated Press’s hacked Twitter account reporting explosions at the White House.  We can expect further developments and innovations in HFR as algorithms get more sophisticated.
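
As a rough illustration of what a high frequency research program does, the sketch below scores incoming headlines against keyword lists and emits a crude directional signal per ticker.  The keyword lists, the regular expression, and the sample headline are all invented for the example; real HFR systems rely on far more sophisticated language models, entity recognition, and source-credibility weighting.

    import re

    # Toy keyword lists; production systems use trained language models and
    # latency-optimized news feeds rather than simple word matching.
    BULLISH = {"beats", "raises", "upgrade", "buyback", "record"}
    BEARISH = {"misses", "cuts", "downgrade", "explosion", "recall"}
    CASHTAG = re.compile(r"\$([A-Z]{1,5})\b")   # $TICKER-style symbols

    def score_headline(headline: str) -> dict:
        """Return a crude directional score for each ticker in a headline."""
        words = set(re.findall(r"[a-z']+", headline.lower()))
        score = len(words & BULLISH) - len(words & BEARISH)
        return {ticker: score for ticker in CASHTAG.findall(headline)}

    # Hypothetical example:
    print(score_headline("$ACME beats estimates, raises full-year guidance"))
    # {'ACME': 2}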

Much has been written about the automation of trading and the declining role of human traders.  The automation of research is yet to be written.

A cautionary tale

One last point.  Flash Boys is a compelling story of an outsider who uncovers HFT abuses and works to counteract them.  While it makes for a great read, it stretches belief that Brad Katsuyama was the first to discover the pernicious effects of HFT.

Like many other aspects of Wall Street, HFT was yet another open secret (although perhaps not understood in the exacting detail that Brad pursues).  Can you think of another generally accepted quirk that applies to research, just waiting for the next Michael Lewis tome?

HFT & Research

While the Wall Street banks have used HFT to augment cash equities revenues, that game is declining.  HFT is fundamentally hostile to traditional bank research.  Its trades don’t pay research bills, and ultimately HFT leads to a very different form of research that sends chills down the spine of every fundamental analyst.

Wall Street offers opportunities for talented writers to prosper by spotlighting commonly accepted idiosyncrasies in the markets, or for talented politicians seeking greater fame.  Will Soft Boys be the next Lewis opus?


Big Data and Investment Research: Part 1

April 14th, 2014

One of the key themes in the institutional investment research business this year is the growing importance of warehousing and generating meaningful signals and insights from “big data”.  This week we will discuss the impact this trend is starting to have on the internal investment research process at buy-side firms and next week we will write about how these developments will transform the sell-side and independent research business.

So What is Big Data?

The term “big data” when applied to investment research is one which is used by countless journalists, vendors, and consultants, though few really agree on a definition.  We see “big data” as the regular collection, storage and analysis of numerous structured and unstructured data sources to generate predictive insights and consistent investment returns.

One of the first real “big data” firms to serve the U.S. institutional investor was Majestic Research, founded in 2002 by Seth Goldstein and Tony Berkman.  Majestic Research entered into exclusive licenses with proprietary third-party data providers and obtained data freely from the web, enabling it to generate data-driven insights on sales trends in industries such as telecommunications, real estate, and airlines.  The firm was sold to ITG in 2010 for $56 mln.

 

As shown in IBM’s well-known “Four V’s of Big Data” slide, many analysts break “big data” down into four dimensions: Volume, Velocity, Variety, and Veracity.  However, the team at Integrity Research believes that one additional “V”, Validity, should also be added when looking at “big data” from an institutional investor’s perspective.

Applying the 5 V’s to the Buy-Side

As you might guess, Integrity Research has interacted with many buy-side investors who either have already started a “big data” initiative or who are considering implementing one in the near-term.  So, what issues should buy-side investors be aware of as they plan to develop a “big data” effort to enhance their current investment research processes?

Volume: Consistent with the term “big data,” one of the obvious characteristics of any big data initiative is the volume of data that investors must be prepared to collect and analyze.  According to the IBM figures, 2.3 trillion gigabytes of data are created every day, with estimates suggesting that 43 trillion gigabytes will be created by 2020.  Consequently, buy-side investors looking to develop a big data strategy must be prepared to warehouse and analyze huge amounts of data – considerably more than they have ever worked with in the past.

Velocity: Not only is the volume of data huge, but most big data initiatives require that investors analyze this data in real-time to identify meaningful signals.  Fortunately, most buy-side investors are used to working with real-time data.

Variety: One of the key characteristics of “big data” initiatives is the variety of data types that buy-side investors can collect, including both structured and unstructured data.  A few of the major external data types that we have identified for buy-side clients include data from public sources, social media sites, crowd sourcing efforts, various transaction types, sensors, commercial industry sources, primary research vendors, exchanges and market data vendors.

Veracity: All investors understand the problem of poor data quality when trying to build a reliable research process.  Clearly, this becomes an exponentially more difficult issue for buy-side investors as they try to identify and ingest terabytes of data from numerous public and private sources, all of which have different data collection and cleansing processes.  Consequently, investors often have to implement sophisticated data quality checks to make sure that the data they warehouse is reasonably accurate.
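
A minimal sketch of the kind of automated quality checks described above, applied to an ingested vendor time series (the column names and thresholds are illustrative assumptions, not a standard):

    import pandas as pd

    def basic_veracity_checks(df: pd.DataFrame,
                              value_col: str = "value",
                              date_col: str = "date") -> dict:
        """Run simple sanity checks on an ingested vendor time series."""
        issues = {}
        issues["missing_values"] = int(df[value_col].isna().sum())
        issues["duplicate_dates"] = int(df[date_col].duplicated().sum())
        # Flag extreme observations (beyond 5 standard deviations) as suspect.
        z = (df[value_col] - df[value_col].mean()) / df[value_col].std()
        issues["outliers"] = int((z.abs() > 5).sum())
        # Flag gaps of more than 5 calendar days between observations.
        gaps = pd.to_datetime(df[date_col]).sort_values().diff().dt.days
        issues["stale_gaps"] = int((gaps > 5).sum())
        return issues

In practice, each failed check would either block the load or route the affected records to a review queue before the data reaches the warehouse.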

Validity: One important concern for buy-side investors when deciding what data to acquire and/or collect is whether that data is actually useful in helping predict the movement of security or asset prices.  Warehousing irrelevant data only increases cost and complexity without contributing value to the research process.  Consequently, buy-side investors need to think clearly through the potential validity of a dataset before it is acquired.

Big Data Benefits for the Buy Side

So why are buy-side investors starting to jump on the big data bandwagon?  Is it a fad, or is it a long-term trend for institutional investors?  In our mind, the adoption of “big data” methods for investing is merely the next logical step for investors looking to create a way to generate consistent returns.

One of the most obvious benefits of rolling out a “big data” initiative is that it enables investors to create a systematic, repeatable research process, versus an investment process that is overly reliant on specific individuals.  Clearly, this has been the benefit of the quantitative investment models used by some asset managers for years.  What is really interesting is that a number of traditionally qualitative investors are now looking into “big data” techniques as an overlay to their primary investment strategy.

A related benefit of implementing a “big data” project is the ability for buy-side investors to use the data they are warehousing to develop proprietary predictive signals for individual stocks, sectors, ETFs or other market indices, which can help generate consistent returns.  In fact, an investor’s ability to develop predictive signals is often limited only by their ingenuity in finding existing datasets, their willingness and skill in building new proprietary datasets, and their creativity in analyzing this data.
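
As a stylized example of what such a signal might look like, the sketch below z-scores year-over-year growth in a hypothetical warehoused data series (say, web-scraped transaction counts per company) across the stock universe each month.  The panel structure, column name, and index names are assumptions for illustration.

    import pandas as pd

    def yoy_growth_signal(panel: pd.DataFrame) -> pd.Series:
        """
        panel: DataFrame with a MultiIndex of ('ticker', 'date') at monthly
        frequency and a 'transactions' column (e.g. web-scraped order counts).
        Returns a cross-sectionally z-scored year-over-year growth signal.
        """
        # Year-over-year growth, computed separately for each ticker.
        growth = (panel["transactions"]
                  .groupby(level="ticker")
                  .pct_change(periods=12))
        # Z-score each date's cross-section so the signal is comparable over time.
        return (growth.groupby(level="date")
                      .transform(lambda x: (x - x.mean()) / x.std())
                      .rename("yoy_signal"))

The resulting series could then be tested against forward returns to see whether the dataset has any predictive value, which is the Validity question discussed above.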

Adopting a “big data” driven research process should also lower research risk for institutional investors relative to many traditional primary-research-driven approaches.  Clearly, an investor is unlikely to receive and trade on illicit inside information when using “big data” techniques.

Costs of Implementing Buy-Side Big Data Initiatives

Of course, adopting a “big data” research program is not without significant costs for a buy-side firm.  A few of these costs include:

Obviously, firms that have never implemented a significant quantitative investment strategy are unlikely to have the expertise or knowledge to effectively implement a “big data” program.  This includes the expertise in finding unique data sources, technically integrating these data sources, cleaning and validating this data, warehousing huge volumes of data, analyzing this data, developing predictive signals, etc.  Consequently, buy-side firms looking to build “big data” initiatives will be forced to hire different types of talent than they have ever hired before.  Some of these professionals will need data integration skills, advanced analytics and predictive analysis skills, complex event processing skills, rule management skills, and experience with business intelligence tools.  Unfortunately, the current supply of high quality data scientists is considerably smaller than the exploding demand for their skills.

Hiring workers with these new skill sets is also likely to create a different issue for buy-side firms, and this is a management and corporate culture issue.  Clearly, these new employees will often need to be managed differently than their peers given their skills, experiences and personalities.  Consequently, finding managers who can effectively manage and motivate these new employees will be critical in recruiting, developing and keeping this talent.

Of course, one of the most significant costs of implementing a “big data” initiative at a buy-side firm is the upfront and ongoing financial investment required to be successful.  Not only does the firm have to hire the right talent (discussed previously), but they also have to acquire and/or build the right technical infrastructure, and they need to identify and acquire the right data.  In some instances, buy-side firms also need to invest in “creating” unique proprietary time series (e.g. by conducting longitudinal surveys or employing other primary research techniques) which will also require specialized know-how and a significant financial investment.

Alternatives to Building Big Data Programs In-House

Does this mean that only the largest buy-side firms have the management, technical or financial resources to successfully implement a “big data” program?  Well, the answer is yes and no.  If a buy-side firm wants to build this type of program in-house, then it will take considerable resources to pull off.  However, if a buy-side firm is willing to outsource some of this initiative to other partners, then it is possible to build a “big data” program more cost effectively.

In fact, there are a growing number of vendors who can provide buy-side investors with various components of a “big data” research program such as sourcing and aggregating unique data sets, leveraging a data warehouse for both external and proprietary data, and even building custom signals for investors.

One such vendor is DISCERN, a San Francisco-based research firm which collects large amounts of publicly available information.  The firm was founded in 2010 by former Lehman Brothers IT analyst Harry Blount.  Besides producing data-driven research, DISCERN has also leveraged cloud-based machine-learning algorithms and persistent queries to automate data aggregation, enhance data discovery and visualization, signal decision-makers about new business developments, and deliver masses of disparate data in a manageable format.

In addition to DISCERN, a number of new vendors have sprung up in recent years providing asset managers with aggregated sources of structured and unstructured public data, access to proprietary industry or transaction data covering various sectors, or investment focused signals based on various social media sources.  

Summary

As we mentioned earlier, adopting a “big data” research strategy is not without its own issues and related costs for any buy-side investor considering this course of action, including deciding whether to “build or buy”, acquiring the relevant know-how, finding appropriate management, and resourcing the project sufficiently to attain success.

Despite these issues, the use of “big data” research techniques is likely to transform a significant part of the investment community in the coming years as the buy-side looks to implement a research process which produces repeatable and consistent returns, and which does so at a lower risk than traditional approaches.

In our minds, one of the most exciting aspects of this trend is discovering what new data vendors, infrastructure providers, and analytic tool suppliers spring up to meet the growing buy-side demand to more easily and cost effectively adopt “big data” as an integral part of their research process.

 


Financial Social Media Coming of Age?

April 9th, 2014

Gnip, a data provider specializing in social media feeds, released a whitepaper saying that mining social media for financial market insights is three years behind branding analysis, and is reaching a tipping point.  Social media traffic relating to financial markets is growing exponentially.  Social media analysis is spreading from equity markets to foreign exchange, energy and commodities markets.  Gnip argues that financial analysis of social media is reaching a point of accelerating adoption equivalent to the earlier adoption of social media branding analysis.

Founded in 2008, Gnip provides data feeds of various social media sources.  Gnip was the first to distribute Twitter data and Twitter archives and has added feeds from Foursquare, WordPress, Facebook and others.  Its primary client base utilizes its feeds for brand analysis, but Gnip sees growth in financial data mining of social media.

Growth in financial analysis

Early adopters of social data analytics in finance were a small set of hedge funds and high frequency traders.  In 2013, a few things happened to increase financial market interest in social media:

  • The SEC blessed social media outlets for corporate announcements in compliance with Regulation Fair Disclosure.
  • The Hash Crash in April took 140 points off the Dow in two minutes after the AP Twitter account was hacked and tweeted about explosions in the White House.
  • A tweet by Carl Icahn boosted Apple’s stock price, adding $12.5 billion to the company’s market value.

We also noted that a tweet by a Hedgeye analyst caused Kinder Morgan Inc (KMI) shares to drop 6 percent taking $4 billion off the company’s market capitalization.

Financial discussions on social media have grown over the past three years, in part thanks to the “Cashtag”.  Cashtagging is the convention of adding a “$TICKER(s)” tag to content to associate the discussion with tradable equities.  According to Gnip, in comparable periods from 2011 to 2014, cashtagged conversations on Twitter around Russell 1000 securities increased more than 550%, reaching several million messages per quarter.  Use of cashtags has expanded beyond equities to FX, futures, and commodities.
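
Extracting cashtags from raw message text is simple to sketch; the sample tweets below are invented for illustration, and a real pipeline would also need to handle symbol collisions and non-equity tags:

    import re
    from collections import Counter

    CASHTAG = re.compile(r"\$([A-Za-z]{1,6})(?![A-Za-z])")   # $TICKER convention

    tweets = [
        "Strong quarter from $AAPL, guidance moving higher",      # hypothetical
        "Rotating out of $XOM and into $CVX ahead of earnings",   # hypothetical
    ]

    counts = Counter(tag.upper() for t in tweets for tag in CASHTAG.findall(t))
    print(counts)   # Counter({'AAPL': 1, 'XOM': 1, 'CVX': 1})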

A tipping point?

Gnip claims that financial analysis of social media is now entering the second stage of a product S-curve, the stage of rapid adoption.  It argues that adoption is about three years behind the use of social media data for branding analysis, which accelerated around 2010.

Gnip is seeing two primary use cases for financial analysis of social media.  One is to mine social media (mainly Twitter) for news.  Bloomberg and Thomson Reuters have added filtered data from Twitter and StockTwits to their platforms.  News oriented startups include Eagle Alpha, Hedge Chatter, Market Prophit and Finmaven.

The second use case is to apply analytics to social media to create scores, signals and other derived data from Twitter or other social media.  These companies include Social Market Analytics, Contix, Eagle Alpha, Market Prophit, Infinigon, TheySay, Knowsis, Dataminr, PsychSignal and mBlast.

Our take

Although Gnip has a clear incentive to talk its book, there is no question that the intersection of social media and finance is growing.  However, there are still some formidable barriers to be breached before social media becomes a mainstream financial markets vehicle.  One is regulatory.  Although the SEC condoned the use of social media for company announcements, the use of social media for the distribution of sell-side stock research is still problematic, not least because of required disclosures.  More importantly, most producers of sell-side research (the banks) strive to control and measure access to their research.

Which isn’t to say that social media won’t ultimately disrupt the current research business model.   Academic studies suggest that crowd-sourced models such as Seeking Alpha and SumZero are outperforming Street research.   Deutsche Bank’s Quantitative Research Group recently validated the use of Estimize’s crowd-sourced estimates data in quantitative research models.  However, it is difficult to disrupt a business model subsidized by client commissions totaling over $10 billion globally.

Difficult, but not impossible.  Financial analysis of social media will continue to grow, investors will increasingly mine big data, whether derived from social media or other sources, and crowd sourcing of financial analysis will increase.  The tipping point, however, is still ahead of us and has some obstacles to surmount.


JPMorgan’s Ambition To Boost Cash Equities Biz

April 7th, 2014

Despite being the largest US bank, JPMorgan has found itself lagging behind its biggest rivals in the cash equities business.  Based on its annual investor day presentation, the bank is looking to turn this around by continuing to make investments in electronic trading and hedge fund services, hoping this will help the bank become a Top 3 firm in cash equities.

Background

According to research firm Coalition Ltd, in 2013 JPMorgan ranked first in investment banking, equities origination, fixed income, commodities and currencies, while coming in second for derivatives products.

Unfortunately, JPM ranked sixth globally in cash equities in 2013, reflecting a fifth-place rank in North America, a seventh-place rank in EMEA and an eighth-place finish in Asia.

Most industry experts attribute JPM’s struggles in cash equities to weakness in electronic equities trading.  While most of JPM’s competitors have been investing heavily in this area in recent years, JPM focused instead on sales and equity research.  Only recently has JPM started investing in boosting its electronic trading capabilities.

A study released recently by Tabb Group backed up this contention.  In a survey of 58 heads of equity trading at both long-only investors and hedge funds in Europe, the UK and the US, Tabb found that they rated UBS, Morgan Stanley and Credit Suisse as the best providers of electronic equity trading systems.  JPMorgan was not mentioned.

Strategy To Turn Around This Weakness

In JPMorgan’s recent investor’s day presentation, Mike Cavanagh and Daniel Pinto, co-heads of the firm’s investment bank, said the bank plans to ‘strengthen its equities position this year.’

Part of the bank’s strategy to accomplish this would be to make continued investments in prime brokerage, electronic trading, equities (including equity research), and OTC Clearing and Collateral Management (particularly in EMEA).

However, some at the bank suggest that JPM just needs to get more aggressive with customers, making sure it gets paid for the services it provides, something its competitors have been much better at than JPM.

Tim Throsby, global head of equities in London, explained this issue: “In the past, those top players in this space were very aggressive in having persistent, repeated conversations with their clients, saying, ‘What are you giving us?’  That was never our approach. That’s something that other firms were far more effective at earlier than we were.”

Why Bother?

However, some might ask whether it makes sense for JPM to try to boost its position in cash equities.  After all, the cash equities business has suffered from falling volumes and squeezed profit margins in recent years.

In fact, some European banks have cut back on their cash equities businesses given these tough market dynamics.  For example, Barclays, Nomura, and UniCredit have all reduced their equities businesses in the past six months.

Despite these trends, JPM management feels that the equities business can be profitable, particularly for the top players, with the top three firms in cash equities generating handsome profits.

JPM’s Ultimate Goal

Throsby summed up JPMorgan’s goal for its cash equities business, saying “One of the big themes for the firm is to bring cash equities up to that same level [as the rest of the bank].  There are some very obvious areas where we are working to improve, and where we think there is a great opportunity.  Third or fourth place would be a realistic medium-term ambition, with second or third in the long term.”

Moving from sixth to second in cash equities is a pretty big ambition, but given what JPMorgan has done with the rest of its business, you probably don’t want to bet against them.  We will just have to wait and see.

 


Morningstar Buys ByAllAccounts

April 2nd, 2014

Morningstar Inc., a leading research provider to the retail channel, announced it has acquired ByAllAccounts, Inc., a provider of account aggregation data to financial advisors, asset managers and wealth managers.  The purchase price is $28 million, described as ‘relatively paltry’, although ByAllAccounts’ revenues were not disclosed.

ByAllAccounts

ByAllAccounts allows advisors to view accounts held at rival custodians or banks and to provide clients with aggregated analysis of their total portfolios, irrespective of which portion of the portfolios the advisors control.  The firm has connections with 4,300 custodians and 40 platform and service providers, facilitating the aggregation of account data.

Founded in 1999, the company was sold to State Street in 2004 and then bought back in 2008 for $5 million after the firm had languished at State Street.  The company received a $5 million capital infusion in 2010.

The company has 60 employees and around 1,500 clients which are mostly RIAs and financial advisors.

Our take

Envestnet, a publicly traded competitor to Morningstar in the RIA space, was reportedly also pursuing ByAllAccounts, according to an article in RIABiz.  The acquisition will give a boost to Morningstar’s RIA software, Morningstar Office, which was reportedly languishing from a lack of differentiation.  It will also give Morningstar insights into RIA asset flows and holdings.

We are estimating a revenue multiple between 2.5x and 3x.  An average of $150,000 in revenues per person for the 60 ByAllAccounts employees would put total revenues around $9 million, for a multiple in the neighborhood of 3x.
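
The arithmetic behind that estimate, as a quick sketch (the revenue-per-employee figure is our assumption):

    # Back-of-envelope revenue multiple for the ByAllAccounts acquisition.
    purchase_price = 28_000_000       # reported purchase price, USD
    employees = 60
    revenue_per_employee = 150_000    # assumed average revenue per employee

    estimated_revenue = employees * revenue_per_employee    # $9.0 million
    implied_multiple = purchase_price / estimated_revenue   # ~3.1x

    print(f"Estimated revenue: ${estimated_revenue/1e6:.1f}M; "
          f"implied multiple: {implied_multiple:.1f}x")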


Estimize Caps Off Three Big Wins With $1.2 mln Funding

March 31st, 2014

Last week saw some great upsets in the NCAA Division I Men’s basketball tournament (Go Flyers and Huskies!).  However, crowd-sourced earnings estimate provider Estimize experienced its own March Madness last week, with three notable wins, capped off by a $1.2 mln funding round that reflects the numerous advances the firm has made over the past eighteen months.

Company Background

Founded in June 2011, Estimize is an open financial estimates platform based on the belief that the “wisdom of the crowds” is more accurate than estimates provided purely by sell-side analysts. The firm was established by Leigh Drogen, a former quantitative hedge fund analyst, who felt that Wall Street analysts are not correctly incented to provide accurate and unbiased earnings estimates due to inherent conflicts within the investment banking business model.

Currently Estimize collects revenue and earnings estimates from 4,232 contributors, including professional analysts from the buy-side and sell-side, as well as private investors and even students.  The consensus estimates collected by Estimize cover more than 900 U.S. publicly traded stocks on a quarterly basis.

The core consensus data set generated by Estimize is open and free to everyone, regardless of an individual’s contribution level.  The firm currently generates revenue by selling professional investors access to its dataset via an API.  At present, a handful of clients pay for access to the API, with almost two dozen more testing the data now.  The company is also in talks to feed its data to a few financial media platforms. Management explains that Estimize’s revenue has grown 100% each quarter.

Deutsche Bank Report Validates Estimize Data

One of the first wins experienced by Estimize last week was an extremely encouraging report published by Deutsche Bank’s Quantitative Research Group validating the use of Estimize data in quantitative research models.  Following are a few of the findings from the Deutsche Bank report:

  • DB found that the Estimize consensus forecasts more accurately identified earnings surprises than the less timely IBES estimates – a factor that led to capturing more of the stock price move post earnings announcement.
  • DB also found that as the number of contributors to the Estimize consensus increased, the forecast accuracy relative to IBES also increased.  For example, EPS estimates for stocks with more than 20 Estimize contributors are more accurate than the IBES consensus two-thirds of the time.
  • DB also found, much to their surprise, that the EPS prediction accuracy of Estimize contributors who were finance professionals underperformed the accuracy of non-professionals.
  • In addition, DB found that combining the Estimize estimates of professionals and non-professionals increased the accuracy of these estimates above the accuracy of any one of the groups alone.

In wrapping up, the quantitative group at Deutsche Bank concluded, “In conclusion we found multiple benefits to using the Estimize dataset; especially in the case of short-term applications in which accuracy is essential. Another interesting byproduct of the analysis was the power of crowdsourcing. We found that some of the value-added in the Estimize dataset was due to the ‘wisdom of crowds’ effect as more predictions give way to greater accuracy. Moreover, the diversity of the contributors provides a greater spectrum of information which can potentially improve investment strategies based on estimates.”
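
A simplified sketch of the kind of head-to-head accuracy comparison DB describes, here measured as the share of observations where one consensus lands closer to reported EPS than the other (the inputs and method are placeholders, not DB's actual methodology):

    import pandas as pd

    def closer_consensus_rate(estimize: pd.Series,
                              ibes: pd.Series,
                              actual: pd.Series) -> float:
        """Fraction of observations where the Estimize consensus is closer
        to reported EPS than the IBES consensus (exact ties are excluded)."""
        err_estimize = (estimize - actual).abs()
        err_ibes = (ibes - actual).abs()
        decided = err_estimize != err_ibes
        return float((err_estimize[decided] < err_ibes[decided]).mean())

A value around two-thirds for stocks with 20 or more Estimize contributors would correspond to the DB finding cited above.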

Launch of Mergerize.com

Another major development for Estimize last week was the launch of a new financial information product called Mergerize which provides crowd-sourced expectations for mergers and acquisitions.

At its most basic, Mergerize (www.mergerize.com) is a platform where people can predict whether a company will be bought by another company, or whether it will buy another company, and at what price.  Drogen explained, “It’s all the same set up as Estimize, but it’s more about if Company X will buy Company Y for X dollars.”  The platform will eventually provide M&A expectations on more than 4,800 public companies, private companies, and even start-ups.

Drogen, a former buy-side quant analyst explained the potential value of this M&A expectations data, “How might the resulting data be used by traders? Well for the quants, I expect that they will use Mergerize data to exclude certain stocks from their trading at certain times given that there is outsized risk associate with M&A transactions in their models. If Mergerize data can mitigate this risk by putting a “no trade” tag on certain stocks, it should be very valuable. For discretionary traders, specifically of the event driven persuasion, having a set of M&A expectations should allow them to take advantage of their own beliefs if they see that they are significantly different from the crowd. If the market feels a deal will take place, and you believe there’s a premium built into a stock because of that expectation, but you don’t think that deal is going to happen, then you probably believe at some point that premium is going to come out, and you should be short. My bet is also that there will be a high correlation between the velocity of people predicting a certain deal and the implied volatility of the target in the deal. I’m sure traders will figure out how to utilize that correlation.”
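
As a toy illustration of the “no trade” overlay Drogen describes, the sketch below drops names from a model’s tradable universe when the number of crowd-sourced predictions that a stock is a takeover target crosses a threshold.  The data structure and threshold are invented for the example.

    def filter_universe(universe, target_predictions, min_predictions=10):
        """
        universe: iterable of tickers the model would otherwise trade.
        target_predictions: dict mapping ticker -> number of crowd predictions
                            that the name will be acquired.
        Returns the universe with high-M&A-risk names excluded.
        """
        flagged = {t for t, n in target_predictions.items() if n >= min_predictions}
        return [t for t in universe if t not in flagged]

    # Hypothetical example:
    print(filter_universe(["AAA", "BBB", "CCC"], {"BBB": 25, "CCC": 3}))
    # ['AAA', 'CCC']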

Of course, Mergerize.com is a brand new expectations service and doesn’t offer many of the features found on Estimize.com, nor does it have the volume of estimates of that site.  So far, a total of 38 predictions have been made, focusing on 26 predicted targets and 28 potential acquirers.  However, Drogen is hopeful about the new service, explaining “Over time as we see the platform grow more resources will be put towards it.”

$1.2 mln Funding Round Completed

Capping off a great week, Estimize closed a $1.2 mln Series A-1 funding round, at approximately 3 times the valuation it received in its previous funding round 18 months earlier, reflecting the progress the company has made on building out its product, data set, and team.

The financing round included the firm’s current venture investors from Contour Venture Partners and Longworth Venture Partners.  It also included a syndicate of angel investors put together by ValueStream Labs, along with individual angels Brian Finn (former CEO of Credit Suisse), Mike Towey (Director of Research at Susquehanna), and Jason Finger (Founder SeamlessWeb).

Drogen explained his excitement about the angel syndicate put together by ValueStream Labs, “It has always been my goal to do a “crowdfunding” round at Estimize. Our platform is reliant on a network effect, and we want as many people as possible who use Estimize to be incented to grow our community with us. And while our government has made strides in opening up the legal framework for crowdfunding, it’s still touch and go in many respects when it comes to accredited vs non accredited investors. The AngelList syndicate structure gets us half way, and has allowed us to bring in 20+ amazing investors from across the financial ecosystem in a clean way for our cap table.”

Summary

The past few weeks have been particularly noteworthy for “crowd-sourced” investment research sites, including SeekingAlpha, and now Estimize.  While few firms have fully embraced the view that the “wisdom of the crowds” can be even more valuable to investors than traditional investment research and analyst estimates, a small but growing number of studies have shown that this clearly could be the case.

In our mind, what is particularly exciting is the extension of the “crowd-sourcing” model to quantify new areas of “investor expectations” like M&A activity.  Only time will tell if this data will be accurate and useful for investors.  However, whether the Mergerize product is successful or not, we think professional investors will become increasingly more willing to consider using non-traditional research data from “crowd-sourced” investment research businesses like Estimize, Seeking Alpha, StockTwits, or other new data and research vendors that are likely to spring up in the next few years.  Clearly, “crowd-sourcing” is here to stay.

 


Gerson Lehrman 2.0

March 26th, 2014

Earlier this month, Gerson Lehrman Group, the world’s leading expert network, introduced a new logo and revamped its website. The change is symptomatic of the resurgence of expert networks since the onset of the insider trading investigations.

Gerson Lehrman, reportedly coming off a year of record revenues in 2013, is rebranding itself as GLG, not to be confused with GLG Partners, the $30 billion hedge fund based in London. Besides a state-of-the-art new website, it will be moving its headquarters into state-of-the-art new offices at One Grand Central Place in Manhattan this summer. And it is hiring furiously. It would not be surprising if the long deferred IPO were announced later this year.

GLG Tech

With the new website comes a new url: www.glg.it. Note the .it domain. No, Gerson Lehrman has not moved to Italy. It doesn’t even have an office in Italy. More likely the domain is meant to evoke ‘Information Technology’, seeking to position GLG as a tech company.

Aspiring to be viewed as a tech company is not a new phenomenon for GLG. During its last rebranding in 2011, the firm launched a social media site named G+ which was going to be the new social media face of Gerson Lehrman: a LinkedIn with gravitas or, more accurately, a business-oriented version of Quora. However, Google did not take kindly to the G+ brand, and GLG changed it to ‘High Table’ in 2012. With the latest rebranding this month, GLG has quietly killed www.hightable.com, redirecting the url to its main site.

The GLG research website still lives on (www.glgresearch.com) highlighting GLG’s events, offering website information in 12 different languages and featuring GLG’s research store which allows purchase of one-off reports and conference transcripts. However, parts of the GLG research site already redirect to www.GLG.it so it may just be a matter of time before it goes away also.

Expert Membership Network

The other key facet of GLG’s rebranding is to distance it from being an expert network. Gerson Lehrman is no longer an expert network, it is a membership: “GLG is building the world’s largest membership for professional learning and expertise, so clients in every sector can interact with experts to gain insight.” Oops, experts slipped in there. The guideline for writing GLG’s copy appears to require sparing references to experts or networks and never ever put the two words together.

The irony here is that being an expert network, and more especially being an expert network during the insider trading investigation, has ultimately been good for Gerson Lehrman. After its planned sale to Goldman Sachs cratered over liability issues in 2004, GLG began aggressively beefing up its compliance platform. By 2008, the stringency of GLG’s compliance platform was becoming a competitive issue, with smaller, nimbler expert networks touting their quicker response times (and implicitly their laxer compliance regimes).

Consolidation

Gerson Lehrman had an even bigger problem as pricing was pressured by greater competition. GLG created a large price umbrella, typically charging over $1000 per hour for consultations, and there were few barriers to entry for new expert networks. By 2010, there were around 50 expert networks globally, and the consultation fees were dropping to $800 per hour. Then in late 2010, the insider trading investigations went public and an expert network competitor, Primary Global, was deeply implicated in the scandal.

Suddenly, the words ‘expert network’ raised the blood pressure of buy-side compliance officers and GLG became the safe choice.  GLG was not immune from the overall decline in expert network usage by asset managers immediately following the Primary Global scandal, but it picked up market share.  And the number of competitors shrank by more than half.  Now, as usage of expert networks has rebounded, GLG is a key beneficiary.

Going forward

GLG is hiring aggressively.  With a staff of 850, it is already an order of magnitude larger than most other expert networks.  It is advertising over 70 new jobs as it continues to scale.  Three quarters of the new jobs are in the US, reflecting in part the centralization of many of its back office operations in Austin, Texas, but it is also hiring in China, India (where much of its expert sourcing occurs), the UK, Ireland and Singapore.

Overall, we think the new rebranding is more successful than its previous brandings, and reflects the confidence engendered by the resurgence of its business (and big bucks spent on high end designers). The end goal is an IPO exit for its private equity investors, Silver Lake Partners and Bessemer Venture Partners. The exit was delayed by the insider trading scandals, but now that GLG has resumed a growth path it is just a matter of time before GLG files, perhaps later this year.

 

Earlier this month, Gerson Lehrman Group, the world’s leading expert network, introduced a new logo and revamped its website. The change is symptomatic of the resurgence of expert networks since the onset of the insider trading investigations.

Gerson Lehrman, reportedly coming off a year of record revenues in 2013, is rebranding itself as GLG, not to be confused with GLG Partners, the $30 billion hedge fund based in London. Besides a state-of-the-art new website, it will be moving its headquarters into state-of-the-art new offices at One Grand Central Place in Manhattan this summer. And it is hiring furiously. It would not be surprising if the long deferred IPO were announced later this year.

GLG 2.0

With the new website comes a new url: www.glg.it. Note the .it domain. No, Gerson Lehrman has not moved to Italy. It doesn’t even have an office in Italy. More likely the domain is meant to evoke ‘Information Technology’, seeking to position GLG as a tech company.

Aspiring to be viewed as a tech company is not a new phenomenon for GLG. During its last rebranding in 2011, the firm launched a social media site named G+ which was going to be the new social media face of Gerson Lehrman: a LinkedIn with gravitas or, more accurately, a professional version of Quora. However, Google did not take kindly to the G+ brand, and GLG changed it to ‘High Table’ in 2012. With the latest rebranding this month, GLG has quietly killed www.hightable.com, redirecting the url to its main site. http://www.integrity-research.com/cms/2011/05/10/rebranding-gerson-lehrman/ http://www.integrity-research.com/cms/2012/05/15/gerson-agonistes/

The GLG research website still lives on (www.glgresearch.com) highlighting GLG’s events, offering website information in 12 different languages and featuring GLG’s research store which allows purchase of one-off reports and conference transcripts. However, parts of the GLG research site already redirect to www.GLG.it so it may just be a matter of time before it goes away also.

Not an expert network

The other key facet of GLG’s rebranding is to distance it from being an expert network. Gerson Lehrman is no longer an expert network, it is a membership: “GLG is building the world’s largest membership for professional learning and expertise, so clients in every sector can interact with experts to gain insight.” Oops, experts slipped in there. The guideline for writing GLG’s copy appears to require sparing references to experts or networks and never ever put the two words together.

The irony here is that being an expert network, and more especially being an expert network during the insider trading investigation, has ultimately been good for Gerson Lehrman. After its planned sale to Goldman Sachs cratered over liability issues in 2005, GLG began aggressively beefing up its compliance platform. By 2008, the stringency of GLG’s compliance platform was becoming a competitive issue, with smaller, nimbler expert networks touting their quicker response times (and implicitly their laxer compliance regimes).

Consolidation

Gerson Lehrman had an even bigger problem as pricing was pressured by greater competition. GLG created a large price umbrella, typically charging over $1000 per hour for consultations, and there were few barriers to entry for new expert networks. By 2010, there were around 50 expert networks globally, and the consultation fees were dropping to $800 per hour. Then in late 2010, the insider trading investigations went public and an expert network competitor to GLG, Primary Global, was deeply implicated in the scandal.

Suddenly, the words ‘expert network’ raised the blood pressure of buy-side compliance officers and GLG became the safe choice. GLG was not immune from the overall decline in expert networks by asset managers immediately following the Primary Global scandal, but it picked up market share. And the number of competitors shrank by more than half. Now, as usage of expert networks has rebounded, GLG is a key beneficiary. http://www.integrity-research.com/cms/2013/11/11/resurrection-of-expert-networks/

Going forward

GLG is hiring aggressively. With a staff of 850, it is already an order of magnitude larger than most other expert networks. It is advertising over 70 new jobs as it continues to scale. Three quarters of the new jobs are in the US, reflecting in part the centralization of many of its back office operations in Austin Texas, but it is also hiring in China, India (where much of its expert sourcing occurs), UK, Ireland and Singapore.

Overall, we think the latest rebranding is more successful than its previous efforts and reflects the resurgence of GLG's business. The end goal is an IPO exit for its private equity investors, Silver Lake Partners and Bessemer Venture Partners. The exit was delayed by the insider trading scandals, but now that GLG has resumed a growth path, it is just a matter of time before the firm files, perhaps later this year.


Subscribe to Integrity ResearchWatch by Email or  in an RSS/XML reader

Crowd Sourced Stock Ideas Beat Wall Street Analysts

March 24th, 2014

A recent academic study revealed that stock ideas published on the internet investor forum Seeking Alpha outperformed the research written by Wall Street analysts and articles published by financial news services over the past seven years, supporting belief in the "wisdom of the crowds".

Background of the Study

Last week, the Wall Street Journal published an article highlighting a recent academic study, Wisdom of Crowds: The Value of Stock Opinions Transmitted Through Social Media, which found that stock recommendations published on the internet-based investor forum SeekingAlpha.com predicted individual stock returns, as well as earnings surprises, beyond what is provided by Wall Street analyst reports and financial news articles.

The study, which will be published shortly in the Review of Financial Studies, was written by Hailiang Chen, Prabuddha De, Byoung-Hyoun Hwang, and Yu (Jeffrey) Hu of City University of Hong Kong, Purdue University, and Georgia Institute of Technology.  The researchers analyzed the sentiment of approximately 100,000 articles and related comments posted on SeekingAlpha.com between 2005 and 2012.

The study measured the percentage of positive and negative words in SeekingAlpha articles and the associated comments on a particular stock, and tracked the performance of that stock after the article and comments were posted.  To eliminate the short-term impact of investors reacting immediately to published articles, the researchers began tracking share-price performance 48 hours after each article was published.

In addition, the researchers tried to determine whether the articles on SeekingAlpha were primarily moving the market rather than predicting it.  They did this by also analyzing whether the sentiment of SeekingAlpha articles and the associated comments was highly correlated with earnings surprises.
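To make the mechanics concrete, the sketch below shows how a negative-word-fraction sentiment score and a 48-hour tracking lag might be computed. This is not the authors' code; the word list, article fields, and helper names are hypothetical stand-ins for the study's actual sentiment dictionary and data.

```python
from datetime import datetime, timedelta

# Hypothetical negative word list; the study used a standard finance
# sentiment dictionary, but any word list could be substituted here.
NEGATIVE_WORDS = {"loss", "decline", "risk", "weak", "miss", "downgrade"}

def negative_word_fraction(text: str) -> float:
    """Fraction of words in the text that appear in the negative word list."""
    words = text.lower().split()
    if not words:
        return 0.0
    negatives = sum(1 for w in words if w.strip(".,!?") in NEGATIVE_WORDS)
    return negatives / len(words)

def tracking_window_start(published_at: datetime) -> datetime:
    """Start measuring returns 48 hours after publication, skipping the
    immediate market reaction to the article itself."""
    return published_at + timedelta(hours=48)

# Example usage with a hypothetical article record
article = {
    "ticker": "XYZ",
    "published_at": datetime(2012, 3, 1, 9, 30),
    "body": "Weak guidance and a revenue miss raise downgrade risk.",
}
sentiment = negative_word_fraction(article["body"])
start = tracking_window_start(article["published_at"])
print(f"Negative-word fraction: {sentiment:.3f}; track returns from {start}")
```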

Findings of the Study

What the researchers discovered was that the more positive the articles and comments on a specific stock, the more likely that stock was to perform better than similar stocks over the next several months.  Conversely, stocks with negative articles and comments underperformed similar stocks in the future.

In fact, the study concluded that, on average, someone investing according to the sentiment of the articles and comments and holding positions for three months would consistently beat the market.  This finding held up even when a number of variables were controlled for, including Wall Street analyst recommendations, upgrades/downgrades, earnings surprises, and the sentiment of traditional financial news articles.
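As a purely stylized illustration of that finding (not the study's methodology), one could imagine ranking stocks by their aggregate article-and-comment sentiment and holding the most positively covered names for three months. The tickers, scores, and function names below are hypothetical.

```python
from typing import Dict, List

def rank_by_sentiment(scores: Dict[str, float], top_n: int = 10) -> List[str]:
    """Return the top_n tickers with the lowest fraction of negative words
    (i.e., the most positively toned coverage)."""
    return sorted(scores, key=scores.get)[:top_n]

# Hypothetical aggregate sentiment per ticker: fraction of negative words,
# averaged over articles and comments during the formation period.
aggregate_sentiment = {"AAA": 0.012, "BBB": 0.047, "CCC": 0.025, "DDD": 0.060}

# A naive strategy in the spirit of the study's finding: buy the most
# positively covered names and hold each position for three months.
HOLDING_PERIOD_MONTHS = 3
portfolio = rank_by_sentiment(aggregate_sentiment, top_n=2)
print(f"Buy {portfolio} and hold for {HOLDING_PERIOD_MONTHS} months")
```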

In addition, the study found that the aggregate opinion of SeekingAlpha articles and comments had a strong correlation with earnings surprises, suggesting that the articles did not merely move stock prices but had some predictive power.

“We find that the fraction of negative words in SA articles and comments strongly predict subsequent scaled earnings surprises,” says the study. “The earnings-surprise predictability suggests that the opinions expressed in SA articles and comments indeed provide value-relevant information (beyond that provided by financial analysts).”

The study also found that SeekingAlpha articles and comments predicted stock returns over every time-frame examined: three months, six months, one year and three years.

Lastly, the study concluded that in cases of broad-based disagreement between the authors of articles and the commenters on those articles, the sentiment of the community was more accurate than that of the authors in predicting future stock price performance and earnings surprises.

Some Issues with Seeking Alpha

Despite the support that SeekingAlpha and the crowdsourcing of investment ideas received from this recent academic study, a few observers have been quick to point out the dangers that sites like SeekingAlpha could create for individual investors.

In an article published this past weekend in Barron’s, called Seeking Alpha Needs to Take Stock of its Policies, John Kimelman wrote that several of SeekingAlpha’s policies could lead to sloppy analysis or even market manipulation by its contributors.

Kimelman wrote, “But Seeking Alpha has also received some troubling press in recent weeks, exposing problems inherent in the site’s policy of allowing anonymous contributors. The problems, I contend, also stem from less than stringent editorial standards that need to be tightened up so that Seeking Alpha can do a better job resisting stock manipulators who see the site as an easy mark.”

In the article, Kimelman discusses a recent instance in which SeekingAlpha removed at least a half-dozen favorable articles about Galena Biopharma after concluding that the anonymous writers of these articles weren't being truthful about their identities.  After investigating the issue, SeekingAlpha discovered that one individual had written five articles under different pseudonyms.

Unfortunately, a number of Galena insiders sold shares of the stock, which may have gained value, at least in part, because of the favorable articles.

Kimelman concludes that by protecting the anonymity of its authors, SeekingAlpha is overlooking the dangers that could result from this policy, such as enabling a writer to publish sloppy or one-sided analysis, or even allowing a single writer to try to manipulate stock prices by publishing overly bullish or bearish articles under phony aliases.

The second troubling policy Kimelman highlights is that SeekingAlpha asks its writers to disclose whether they are long or short a stock when they write about it.  Unfortunately, the forum has no way to tell whether the writer is telling the truth, an issue that calls into question whether an article is biased.

Kimelman concludes his article by explaining, “In response to some of my questions, [Eli] Hoffman [the Editor in Chief of SeekingAlpha] says that his site is taking steps to improve the quality of the site, including a new compensation system that rewards quality rather than just page-views. But no steps are being taken at this time to limit the use of anonymity.  I would urge them to reconsider that point.”

Summary

In our view, the recent study clearly points out the value that crowdsourcing investment ideas from a site such as SeekingAlpha can provide to individual investors.  In fact, Yu (Jeffrey) Hu, associate professor at Georgia Tech's Scheller College of Business and one of the authors of the academic study, pointedly explained, "Seeking Alpha is the only platform to date that we have shown can predict individual stock returns."

However, the recent Barron's article also highlights how sites like SeekingAlpha can be abused by bad actors trying to profit from the impact the site has on stock prices, a risk that could leave some investors vulnerable to manipulation.

Given this, perhaps the most important role that SeekingAlpha and other crowd-sourced investment research sites should play, at least at present, is as sources of interesting ideas that investors can use to start their own research process.

 

Subscribe to Integrity ResearchWatch by Email or  in an RSS/XML reader