High Frequency Research: How HFT Impacts Investment Research

April 16th, 2014

The hubbub over high frequency trading (HFT) has implications for investment research: the future direction of HFT will shape research directly, and the Michael Lewis phenomenon is a cautionary tale for the research industry.

Low carb trading

It is estimated that HFT accounts for about half of trading in the US (down from two-thirds).  From the perspective of research, HFT is low-carb trading.  In other words, it doesn’t pay the research bills because it is too low margin and HFT firms don’t use Wall Street research.

From the HFT perspective, research is one of the obstacles to HFT growth.  In Michael Lewis’s controversial new book, the founder of IEX, a new exchange designed to counteract the evils of HFT, was challenged by buy-side inertia in directing trades because of the need to pay for research.  According to Lewis, the buy-side traders were outraged by HFT predatory tactics, yet continued to send orders to the HFT-infested dark pools operated by Wall Street banks like lambs to the slaughter.  All because of bank research.

The rise of high frequency research

What Lewis does not mention is that HFT has declined from its heyday, when it accounted for two-thirds of US market trading.  In 2009, high-frequency traders moved about 3.25 billion shares a day. In 2012, it was 1.6 billion a day, according to a Bloomberg BusinessWeek article.  Volumes have fallen because fierce competition among HFT firms has compressed margins: average profits have dropped from about a tenth of a penny per share to a twentieth of a penny.
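To put the margin squeeze in perspective, here is a back-of-the-envelope calculation using the volume and per-share profit figures cited above (both are rough industry estimates, not exact data):

```python
# Rough industry-wide HFT profit estimate from the figures cited above.
# Per-share profits are approximate averages, not precise data points.

def daily_profit(shares_per_day, profit_per_share):
    """Industry-wide daily profit in dollars."""
    return shares_per_day * profit_per_share

peak = daily_profit(3.25e9, 0.001)    # 2009: ~a tenth of a penny per share
now = daily_profit(1.6e9, 0.0005)     # 2012: ~a twentieth of a penny

print(f"2009: ${peak:,.0f}/day")   # 2009: $3,250,000/day
print(f"2012: ${now:,.0f}/day")    # 2012: $800,000/day
```

Even on these crude numbers, aggregate daily profits fell by roughly three-quarters: half the volume at half the margin.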

Lewis misses the fundamental point with HFT: it is simply automated trading.  Yes, HFT has predatory practices, but that is not the core.  The core is computerized trading.  An unnerving aspect of HFT given surprisingly short shrift by Lewis is the frequency of flash crashes, which occur regularly but so quickly that humans can’t detect them.  Check out this excellent TED talk on HFT:  http://youtu.be/V43a-KxLFcg

For researchers, it is worth noting the current direction of HFT: high frequency research (HFR).  HFTs are using sophisticated programs to analyze news wires and headlines to instantly interpret implications. Some are scanning Twitter feeds, as evidenced by the Hash Crash, the sudden selloff that followed the Associated Press’s hacked Twitter account reporting explosions at the White House.  We can expect further developments and innovations in HFR as algorithms get more sophisticated.
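As a purely illustrative sketch of the simplest form of headline scanning: real HFR systems use trained natural-language models operating in microseconds, and the keyword lists below are invented for the example.

```python
# Toy headline scanner: the crudest form of "high frequency research".
# Real systems use sophisticated NLP; these keyword sets are illustrative only.

NEGATIVE = {"explosion", "crash", "fraud", "downgrade", "miss"}
POSITIVE = {"beat", "upgrade", "buyback", "record", "approval"}

def headline_score(headline: str) -> int:
    """Return +1 (bullish), -1 (bearish), or 0 from keyword matching."""
    words = set(headline.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return (score > 0) - (score < 0)

print(headline_score("Breaking: explosion reported at White House"))  # -1
```

A system like this, wired directly to an order gateway, is exactly how a hacked AP tweet can move the market before any human reads it.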

Much has been written about the automation of trading and the declining role of human traders.  The automation of research is yet to be written.

A cautionary tale

One last point.  Flash Boys is a compelling story of an outsider who uncovers HFT abuses and works to counteract them.   While it makes for a great read, it strains credulity that Brad Katsuyama was the first to discover the pernicious effects of HFT.

Like many other aspects of Wall Street, HFT was yet another open secret (although perhaps not understood in the exacting detail that Brad pursues).  Can you think of another generally accepted quirk that applies to research, just waiting for the next Michael Lewis tome?

HFT & Research

While the Wall Street banks have used HFT to augment cash equities revenues, that game is declining.  HFT is fundamentally hostile to traditional bank research.  Its trades don’t pay research bills, and ultimately HFT leads to a very different form of research that sends chills down the spine of every fundamental analyst.

Wall Street offers opportunities for talented writers (or talented politicians seeking greater fame) to prosper by spotlighting commonly accepted idiosyncrasies in the markets.  Will Soft Boys be the next Lewis opus?

Subscribe to Integrity ResearchWatch by Email or  in an RSS/XML reader

Big Data and Investment Research: Part 1

April 14th, 2014

One of the key themes in the institutional investment research business this year is the growing importance of warehousing and generating meaningful signals and insights from “big data”.  This week we will discuss the impact this trend is starting to have on the internal investment research process at buy-side firms and next week we will write about how these developments will transform the sell-side and independent research business.

So What is Big Data?

The term “big data”, when applied to investment research, is used by countless journalists, vendors, and consultants, though few really agree on a definition.  We see “big data” as the regular collection, storage and analysis of numerous structured and unstructured data sources to generate predictive insights and consistent investment returns.

One of the first real “big data” firms to serve the U.S. institutional investor was Majestic Research, founded in 2002 by Seth Goldstein and Tony Berkman.  Majestic Research entered into exclusive licenses with proprietary third-party data providers and obtained data freely from the web, enabling it to generate data-driven insights on sales trends in industries such as telecommunications, real estate, and airlines.  The firm was sold to ITG in 2010 for $56 mln.

 

As you can see from the well-known IBM slide above, many analysts break “big data” down into four dimensions – Volume, Velocity, Variety, and Veracity.  However, the team at Integrity Research believes that one additional “V”, Validity, should also be added when looking at “big data” from an institutional investor’s investment perspective.

Applying the 5 V’s to the Buy-Side

As you might guess, Integrity Research has interacted with many buy-side investors who either have already started a “big data” initiative or who are considering implementing one in the near-term.  So, what issues should buy-side investors be aware of as they plan to develop a “big data” effort to enhance their current investment research processes?

Volume: Consistent with the term “big data”, one of the obvious characteristics of any big data initiative is the volume of data that investors must be prepared to collect and analyze.  As you can see from the slide above, 2.3 trillion gigabytes of data are created every day, with estimates suggesting that 43 trillion gigabytes of data will be created by 2020.  Consequently, buy-side investors looking to develop a big data strategy must be prepared to warehouse and analyze huge amounts of data – considerably more than they have ever worked with in the past.

Velocity: Not only is the volume of data huge, but most big data initiatives require that investors analyze this data in real-time to identify meaningful signals.  Fortunately, most buy-side investors are used to working with real-time data.

Variety: One of the key characteristics of “big data” initiatives is the variety of data types that buy-side investors can collect, including both structured and unstructured data.  A few of the major external data types that we have identified for buy-side clients include data from public sources, social media sites, crowd sourcing efforts, various transaction types, sensors, commercial industry sources, primary research vendors, exchanges and market data vendors.

Veracity: All investors understand the problem of poor data quality when trying to build a reliable research process.  Clearly, this becomes an exponentially more difficult issue for buy-side investors as they try to identify and ingest terabytes of data from numerous public and private sources, all of which have different data collection and cleansing processes.  Consequently, investors often have to implement sophisticated data quality checks to make sure that the data they warehouse is reasonably accurate.
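A minimal sketch of what such quality checks might look like; the field names and the plausibility threshold are hypothetical:

```python
# Minimal data-quality gate for an incoming time series.
# Field names and the range threshold are hypothetical examples.

def quality_checks(records):
    """Yield only records that pass basic veracity checks."""
    seen = set()
    for rec in records:
        if rec.get("value") is None:          # completeness check
            continue
        if not (0 <= rec["value"] < 1e12):    # plausible-range check
            continue
        key = (rec["date"], rec["source"])    # de-duplication check
        if key in seen:
            continue
        seen.add(key)
        yield rec

raw = [
    {"date": "2014-04-01", "source": "vendorA", "value": 42.0},
    {"date": "2014-04-01", "source": "vendorA", "value": 42.0},  # duplicate
    {"date": "2014-04-02", "source": "vendorA", "value": None},  # missing
]
clean = list(quality_checks(raw))
print(len(clean))  # 1
```

Production pipelines layer many more checks (schema validation, outlier detection, cross-source reconciliation) on top of this basic pattern.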

Validity: One important concern for buy-side investors when deciding what data they want to acquire and/or collect is whether this data is actually useful in helping predict the movement of securities or asset prices.  Warehousing irrelevant data only increases cost and complexity without contributing value to the research process.  Consequently, buy-side investors need to think through the potential validity of a dataset before it is acquired.

Big Data Benefits for the Buy Side

So why are buy-side investors starting to jump on the big data bandwagon?  Is it a fad, or is it a long-term trend for institutional investors?  In our mind, the adoption of “big data” methods for investing is merely the next logical step for investors looking to create a way to generate consistent returns.

One of the most obvious benefits of rolling out a “big data” initiative is enabling investors to create a systematic, repeatable research process, rather than an investment process that is overly reliant on specific individuals.  Clearly, this has been the benefit of the quantitative investment models used by some asset managers for years.  What is really interesting is that a number of traditionally qualitative investors are now looking into “big data” techniques as an overlay to their primary investment strategy.

A related benefit of implementing a “big data” project is the ability for buy-side investors to develop proprietary predictive signals from the data they are warehousing for individual stocks, sectors, ETFs or other market indices which can help generate consistent returns.  In fact, an investor’s ability to develop predictive signals is often limited only by their ingenuity in finding existing datasets, their willingness and skill in building new proprietary datasets, and their creativity in analyzing this data.
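As one hypothetical illustration, a warehoused alternative-data series (say, weekly transaction counts for a retailer) could be turned into a crude signal by z-scoring the latest reading against its trailing history; the data and threshold below are invented:

```python
# Hypothetical example: turn a warehoused alt-data series into a signal
# by z-scoring the latest reading against its trailing history.

from statistics import mean, stdev

def zscore_signal(series, threshold=1.0):
    """Return +1 / -1 / 0 from the latest value's z-score vs. history."""
    history, latest = series[:-1], series[-1]
    z = (latest - mean(history)) / stdev(history)
    if z > threshold:
        return 1      # activity unusually strong: bullish
    if z < -threshold:
        return -1     # activity unusually weak: bearish
    return 0

weekly_transactions = [100, 102, 98, 101, 99, 100, 130]
print(zscore_signal(weekly_transactions))  # 1
```

Real signal research is of course far more involved (backtesting, decay analysis, transaction-cost modeling), but the core idea is the same: a proprietary series, systematically scored.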

Adopting a “big data” driven research process should also carry lower research risk for institutional investors than many traditional primary research driven approaches.  Clearly, an investor is unlikely to receive and trade on illicit inside information when using “big data” techniques.

Costs of Implementing Buy-Side Big Data Initiatives

Of course, adopting a “big data” research program is not without significant costs for a buy-side firm.  A few of these costs include:

Obviously, firms that have never implemented a significant quantitative investment strategy are unlikely to have the expertise or knowledge to effectively implement a “big data” program.  This includes expertise in finding unique data sources, technically integrating those sources, cleaning and validating the data, warehousing huge volumes of it, analyzing it, and developing predictive signals.  Consequently, buy-side firms looking to build “big data” initiatives will be forced to hire different types of talent than they have ever hired before.  Some of these professionals will need data integration skills, advanced analytics and predictive analysis skills, complex event processing skills, rule management skills, and experience with business intelligence tools.  Unfortunately, the current supply of high quality data scientists is considerably smaller than the exploding demand for their skills.

Hiring workers with these new skill sets is also likely to create a different issue for buy-side firms, and this is a management and corporate culture issue.  Clearly, these new employees will often need to be managed differently than their peers given their skills, experiences and personalities.  Consequently, finding managers who can effectively manage and motivate these new employees will be critical in recruiting, developing and keeping this talent.

Of course, one of the most significant costs of implementing a “big data” initiative at a buy-side firm is the upfront and ongoing financial investment required to be successful.  Not only does the firm have to hire the right talent (discussed previously), but they also have to acquire and/or build the right technical infrastructure, and they need to identify and acquire the right data.  In some instances, buy-side firms also need to invest in “creating” unique proprietary time series (e.g. by conducting longitudinal surveys or employing other primary research techniques) which will also require specialized know-how and a significant financial investment.

Alternatives to Building Big Data Programs In-House

Does this mean that only the largest buy-side firms have the management, technical or financial resources to successfully implement a “big data” program?  Well, the answer is yes and no.  If a buy-side firm wants to build this type of program in-house, then it will take considerable resources to pull off.  However, if a buy-side firm is willing to outsource some of this initiative to other partners, then it is possible to build a “big data” program more cost effectively.

In fact, there are a growing number of vendors who can provide buy-side investors with various components of a “big data” research program such as sourcing and aggregating unique data sets, leveraging a data warehouse for both external and proprietary data, and even building custom signals for investors.

One such vendor is DISCERN, a San Francisco-based research firm which collects large amounts of publicly available information.  The firm was founded in 2010 by former Lehman Brothers IT analyst Harry Blount.  Besides producing data-driven research, DISCERN has also leveraged cloud-based machine-learning algorithms and persistent queries to automate data aggregation, enhance data discovery and visualization, signal decision-makers about new business developments, and deliver masses of disparate data in a manageable format.

In addition to DISCERN, a number of new vendors have sprung up in recent years providing asset managers with aggregated sources of structured and unstructured public data, access to proprietary industry or transaction data covering various sectors, or investment focused signals based on various social media sources.  

Summary

As we mentioned earlier, adopting a “big data” research strategy is not without its own issues and related costs for any buy-side investor considering this course of action, including deciding whether to “build or buy”, acquiring the relevant know-how, finding appropriate management, and resourcing the project sufficiently to attain success.

Despite these issues, the use of “big data” research techniques is likely to transform a significant part of the investment community in the coming years as the buy-side looks to implement a research process which produces repeatable and consistent returns, and which does so at a lower risk than traditional approaches.

In our minds, one of the most exciting aspects of this trend is discovering what new data vendors, infrastructure providers, and analytic tool suppliers spring up to meet the growing buy-side demand to more easily and cost effectively adopt “big data” as an integral part of their research process.

 


Financial Social Media Coming of Age?

April 9th, 2014

Gnip, a data provider specializing in social media feeds, released a whitepaper saying that mining social media for financial market insights is three years behind branding analysis, and is reaching a tipping point.  Social media traffic relating to financial markets is growing exponentially.  Social media analysis is spreading from equity markets to foreign exchange, energy and commodities markets.  Gnip argues that financial analysis of social media is reaching a point of accelerating adoption equivalent to the earlier adoption of social media branding analysis.

Founded in 2008, Gnip provides data feeds of various social media sources.   Gnip was the first to distribute Twitter data and Twitter archives and has added feeds from Foursquare, WordPress, Facebook and others.  Its primary client base uses its feeds for brand analysis, but Gnip sees growth in financial data mining of social media.

Growth in financial analysis

Early adopters of social data analytics in finance were a small set of hedge funds and high frequency traders.  In 2013, a few things happened to increase financial market interest in social media:

  • The SEC blessed social media outlets for corporate announcements in compliance with Regulation Fair Disclosure.
  • The Hash Crash in April took 140 points off the Dow in two minutes after the AP Twitter account was hacked and tweeted about explosions in the White House.
  • A tweet by Carl Icahn added $12.5 billion to Apple’s market value.

We also noted that a tweet by a Hedgeye analyst caused Kinder Morgan Inc (KMI) shares to drop 6 percent, taking $4 billion off the company’s market capitalization.

Financial discussions on social media have grown over the past three years, in part thanks to the “cashtag”.  Cashtagging is the convention of adding a “$TICKER” tag to content to associate the discussion with tradable equities.  According to Gnip, in comparable periods from 2011 to 2014, cashtagged conversations on Twitter around Russell 1000 securities increased more than 550%, reaching several million messages per quarter.  Use of cashtags has expanded beyond equities to FX, futures, and commodities.
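Extracting cashtags from a message is a short regular expression; the minimal sketch below ignores edge cases such as "$BRK.B":

```python
import re

# Cashtags: a "$" followed by an uppercase ticker, e.g. "$AAPL".
# This simple pattern skips edge cases like "$BRK.B" or lowercase tags.
CASHTAG = re.compile(r"\$([A-Z]{1,5})\b")

def cashtags(message: str):
    """Return the list of ticker symbols cashtagged in a message."""
    return CASHTAG.findall(message)

print(cashtags("Long $AAPL and $GOOG, watching $TSLA into earnings"))
# ['AAPL', 'GOOG', 'TSLA']
```

Aggregating these tags across millions of messages is what lets vendors map raw social chatter onto tradable securities.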

A tipping point?

Gnip claims that financial analysis of social media is now entering the second stage of a product S-curve, the stage of rapid adoption.  It argues that adoption is about three years behind the use of social media data for branding analysis, which accelerated around 2010.

Gnip is seeing two primary use cases for financial analysis of social media.  One is to mine social media (mainly Twitter) for news.  Bloomberg and Thomson Reuters have added filtered data from Twitter and StockTwits to their platforms.  News oriented startups include Eagle Alpha, Hedge Chatter, Market Prophit and Finmaven.

The second use case is to apply analytics to social media to create scores, signals and other derived data from Twitter or other social media.  These companies include Social Market Analytics, Contix, Eagle Alpha, Market Prophit, Infinigon, TheySay, Knowsis, Dataminr, PsychSignal and mBlast.

Our take

Although Gnip has a clear incentive to talk its book, there is no question that the intersection of social media and finance is growing.   However, there are still some formidable barriers to be breached before social media becomes a mainstream financial markets vehicle.  One is regulatory.  Although the SEC condoned use of social media for company announcements, the use of social media for the distribution of sell-side stock research is still problematic, not least because of required disclosures.  More importantly, most producers of sell-side research (the banks) strive to control and measure access to their research.

Which isn’t to say that social media won’t ultimately disrupt the current research business model.   Academic studies suggest that crowd-sourced models such as Seeking Alpha and SumZero are outperforming Street research.   Deutsche Bank’s Quantitative Research Group recently validated the use of Estimize’s crowd-sourced estimates data in quantitative research models.  However, it is difficult to disrupt a business model subsidized by client commissions totaling over $10 billion globally.

Difficult, but not impossible.  Financial analysis of social media will continue to grow, investors will increasingly mine big data, whether derived from social media or other sources, and crowd sourcing of financial analysis will increase.  The tipping point, however, is still ahead of us and has some obstacles to surmount.


JPMorgan’s Ambition To Boost Cash Equities Biz

April 7th, 2014

Despite being the largest US bank, JPMorgan has found itself lagging behind its biggest rivals in the cash equities business.  Based on its annual investor day presentation, the bank is looking to turn this around by continuing to make investments in electronic trading and hedge fund services, hoping this will help the bank become a Top 3 firm in cash equities.

Background

According to research firm Coalition Ltd, in 2013 JPMorgan ranked first in investment banking, equities origination, fixed income, commodities and currencies, while coming in second for derivatives products.

Unfortunately, JPM ranked sixth globally in cash equities in 2013, reflecting a fifth-place rank in North America, a seventh-place rank in EMEA and an eighth-place finish in Asia.

Most industry experts attribute JPM’s struggles in cash equities to weakness in electronic equities trading.  While most of JPM’s competitors have been investing heavily in this area in recent years, JPM focused instead on sales and equity research.  Only recently has JPM started investing in boosting its electronic trading capabilities.

A study released recently by Tabb Group backed up this contention.  In a survey of 58 head equity traders of both long only investors and hedge funds in Europe, the UK and the US, Tabb found that they rated UBS, Morgan Stanley and Credit Suisse as the best providers of electronic equity trading systems.  JPMorgan was not mentioned.

Strategy To Turn Around This Weakness

In JPMorgan’s recent investor day presentation, Mike Cavanagh and Daniel Pinto, co-heads of the firm’s investment bank, said the bank plans to ‘strengthen its equities position this year.’

Part of the bank’s strategy to accomplish this would be to make continued investments in prime brokerage, electronic trading, equities (including equity research), and OTC Clearing and Collateral Management (particularly in EMEA).

However, some at the bank suggest that JPM just needs to get more aggressive with customers making sure they get paid for the services they provide – a move that their competitors have been much better at than JPM.

Tim Throsby, global head of equities in London, explained this issue: “In the past, those top players in this space were very aggressive in having persistent, repeated conversations with their clients, saying, ‘What are you giving us?’  That was never our approach. That’s something that other firms were far more effective at earlier than we were.”

Why Bother?

However, some might ask whether it makes sense for JPM to try and boost its position in cash equities.  After all, the cash equities business has suffered from falling volumes and squeezed profit margins in recent years.

In fact some European banks cut back on their cash equities businesses given these tough market dynamics.  For example, Barclays, Nomura, and UniCredit have all reduced their equities businesses in the past six months.

Despite these trends, JPM management feel that the equities business can be profitable, particularly for the top players in the business, with the top three firms in cash equities generating handsome profits.

JPM’s Ultimate Goal

Throsby summed up JPMorgan’s goal for its cash equities business, saying “One of the big themes for the firm is to bring cash equities up to that same level [as the rest of the bank].  There are some very obvious areas where we are working to improve, and where we think there is a great opportunity.  Third or fourth place would be a realistic medium-term ambition, with second or third in the long term.”

That’s a pretty big ambition, moving from sixth to second in cash equities, but given what JPMorgan has done with the rest of its business, you probably don’t want to bet against them.  We will just have to wait and see.

 


Morningstar Buys ByAllAccounts

April 2nd, 2014

Morningstar Inc., a leading research provider to the retail channel, announced it has acquired ByAllAccounts, Inc., a provider of account aggregation data to financial advisors, asset managers and wealth managers.  The purchase price is $28 million, described as ‘relatively paltry’ although ByAllAccounts’ revenues were not disclosed.

ByAllAccounts

ByAllAccounts allows advisors to view accounts held at rival custodians or banks and provide clients with aggregated analysis of total portfolios, irrespective of which portion of the portfolios advisors control.  The firm has connections with 4,300 custodians and 40 platform and service providers, facilitating the aggregation of account data.

Founded in 1999, the company was sold to State Street in 2004 and then bought back in 2008 for $5 million after the firm had languished at State Street.  The company received a $5 million capital infusion in 2010.

The company has 60 employees and around 1,500 clients, mostly RIAs and financial advisors.

Our take

Envestnet, a publicly traded competitor to Morningstar in the RIA space, was reportedly also pursuing ByAllAccounts, according to an article in RIABiz.  The acquisition will give a boost to Morningstar’s RIA software, Morningstar Office, which was reportedly languishing from a lack of differentiation.  It will also give Morningstar insights into RIA asset flows and holdings.

We are estimating a revenue multiple between 2.5x and 3x.  An average of $150,000 in revenues per person for the 60 ByAllAccounts employees would put total revenues around $9 million, for a multiple in the neighborhood of 3x.
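The arithmetic behind that estimate (revenue per employee is our assumption, not a disclosed figure):

```python
# Back-of-the-envelope revenue multiple for the ByAllAccounts deal.
# Revenue per employee is an assumed industry average, not disclosed data.

purchase_price = 28e6                    # announced purchase price
employees = 60
revenue_per_employee = 150_000           # assumption

est_revenue = employees * revenue_per_employee   # estimated total revenue
multiple = purchase_price / est_revenue

print(f"Estimated revenue: ${est_revenue / 1e6:.1f} mln")  # $9.0 mln
print(f"Implied multiple: {multiple:.1f}x")                # 3.1x
```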


Estimize Caps Off Three Big Wins With $1.2 mln Funding

March 31st, 2014

Last week saw some great upsets in the NCAA Division I Men’s basketball tournament (Go Flyers and Huskies!). However, crowd-sourced earnings estimate provider Estimize experienced its own March Madness last week, with three notable wins, capped off by a $1.2 mln funding round reflecting the numerous advances the firm has made over the past eighteen months.

Company Background

Founded in June 2011, Estimize is an open financial estimates platform based on the belief that the “wisdom of the crowds” is more accurate than estimates provided purely by sell-side analysts. The firm was established by Leigh Drogen, a former quantitative hedge fund analyst, who felt that Wall Street analysts are not correctly incented to provide accurate and unbiased earnings estimates due to inherent conflicts within the investment banking business model.

Currently Estimize collects revenue and earnings estimates from 4,232 contributors, including professional analysts from the buy-side and sell-side, as well as private investors and even students.  The consensus estimates collected by Estimize cover more than 900 U.S. publicly traded stocks on a quarterly basis.

The core consensus data set generated by Estimize is open and free to everyone, regardless of an individual’s contribution level.  The firm currently generates revenue by selling professional investors access to its dataset via an API.  At present, a handful of clients pay for access to the API, with almost two dozen more testing the data now.  The company is also in talks to feed its data to a few financial media platforms. Management says Estimize’s revenue has grown 100% each quarter.

Deutsche Bank Report Validates Estimize Data

One of the first wins experienced by Estimize last week was an extremely encouraging report published by Deutsche Bank’s Quantitative Research Group validating the use of Estimize data in quantitative research models.  Following are a few of the findings from the Deutsche Bank report:

  • DB found that the Estimize consensus forecasts more accurately identified earnings surprises than the less timely IBES estimates – a factor that led to capturing more of the stock price move post earnings announcement.
  • DB also found that as the number of contributors to the Estimize consensus increased, the forecast accuracy relative to IBES also increased.  For example, EPS estimates for stocks with more than 20 Estimize contributors are more accurate than the IBES consensus two-thirds of the time.
  • DB also found, much to its surprise, that Estimize contributors who were finance professionals produced less accurate EPS predictions than non-professionals.
  • In addition, DB found that combining the estimates of professionals and non-professionals produced greater accuracy than either group achieved alone.
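The kind of accuracy comparison DB ran can be sketched as follows; the EPS figures below are invented for illustration, not taken from the report:

```python
# Toy version of the accuracy comparison: which consensus came closer
# to actual EPS?  All numbers below are made up for illustration.

def more_accurate(actual, estimize, ibes):
    """Return which consensus had the smaller absolute forecast error."""
    return "estimize" if abs(actual - estimize) < abs(actual - ibes) else "ibes"

quarters = [
    # (actual EPS, Estimize consensus, IBES consensus)
    (1.05, 1.03, 0.98),
    (0.87, 0.90, 0.88),
    (2.10, 2.08, 2.01),
]
wins = sum(more_accurate(a, e, i) == "estimize" for a, e, i in quarters)
print(f"Estimize closer in {wins}/{len(quarters)} quarters")
```

Run over the full estimate history, a tally like this is what produces headline statistics such as "more accurate two-thirds of the time."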

In wrapping up, the quantitative group at Deutsche Bank wrote, “In conclusion we found multiple benefits to using the Estimize dataset; especially in the case of short-term applications in which accuracy is essential. Another interesting byproduct of the analysis was the power of crowdsourcing. We found that some of the value-added in the Estimize dataset was due to the ‘wisdom of crowds’ effect as more predictions give way to greater accuracy. Moreover, the diversity of the contributors provides a greater spectrum of information which can potentially improve investment strategies based on estimates.”

Launch of Mergerize.com

Another major development for Estimize last week was the launch of a new financial information product called Mergerize, which provides crowd-sourced expectations for mergers and acquisitions.

At its most basic, Mergerize (www.mergerize.com) is a platform where people can guess whether a company will be bought by another company, or whether it will buy another company, and at what price.  Drogen explained, “It’s all the same set up as Estimize, but it’s more about if Company X will buy Company Y for X dollars.”  The platform will eventually provide M&A expectations on more than 4,800 public companies, private companies, and even start-ups.

Drogen, a former buy-side quant analyst, explained the potential value of this M&A expectations data: “How might the resulting data be used by traders? Well for the quants, I expect that they will use Mergerize data to exclude certain stocks from their trading at certain times given that there is outsized risk associated with M&A transactions in their models. If Mergerize data can mitigate this risk by putting a ‘no trade’ tag on certain stocks, it should be very valuable. For discretionary traders, specifically of the event driven persuasion, having a set of M&A expectations should allow them to take advantage of their own beliefs if they see that they are significantly different from the crowd. If the market feels a deal will take place, and you believe there’s a premium built into a stock because of that expectation, but you don’t think that deal is going to happen, then you probably believe at some point that premium is going to come out, and you should be short. My bet is also that there will be a high correlation between the velocity of people predicting a certain deal and the implied volatility of the target in the deal. I’m sure traders will figure out how to utilize that correlation.”

Of course, Mergerize.com is a brand new expectations service and doesn’t offer many of the features found on Estimize.com, nor does it have that site’s volume of estimates.  So far, a total of 38 predictions have been made, focusing on 26 predicted targets and 28 potential acquirers.  However, Drogen is hopeful about the new service, explaining, “Over time as we see the platform grow more resources will be put towards it.”

$1.2 mln Funding Round Completed

Capping off a great week, Estimize closed a $1.2 mln Series A-1 funding round at approximately 3 times the valuation it received in its previous funding round 18 months earlier, reflecting the progress the company has made in building out its product, data set, and team.

The financing round included the firm’s current venture investors from Contour Venture Partners and Longworth Venture Partners.  It also included a syndicate of angel investors put together by ValueStream Labs, along with individual angels Brian Finn (former CEO of Credit Suisse), Mike Towey (Director of Research at Susquehanna), and Jason Finger (Founder SeamlessWeb).

Drogen explained his excitement about the angel syndicate put together by ValueStream Labs, “It has always been my goal to do a “crowdfunding” round at Estimize. Our platform is reliant on a network effect, and we want as many people as possible who use Estimize to be incented to grow our community with us. And while our government has made strides in opening up the legal framework for crowdfunding, it’s still touch and go in many respects when it comes to accredited vs non accredited investors. The AngelList syndicate structure gets us half way, and has allowed us to bring in 20+ amazing investors from across the financial ecosystem in a clean way for our cap table.”

Summary

The past few weeks have been particularly noteworthy for “crowd-sourced” investment research sites, including Seeking Alpha and now Estimize.  While few firms have fully embraced the view that the “wisdom of the crowds” can be even more valuable to investors than traditional investment research and analyst estimates, a small but growing number of studies have shown that this could well be the case.

In our mind, what is particularly exciting is the extension of the “crowd-sourcing” model to quantify new areas of “investor expectations” like M&A activity.  Only time will tell whether this data will be accurate and useful for investors.  However, whether the Mergerize product is successful or not, we think professional investors will become increasingly willing to consider using non-traditional research data from “crowd-sourced” investment research businesses like Estimize, Seeking Alpha, StockTwits, or other new data and research vendors that are likely to spring up in the next few years.  Clearly, “crowd-sourcing” is here to stay.

 

Subscribe to Integrity ResearchWatch by Email or in an RSS/XML reader

Gerson Lehrman 2.0

March 26th, 2014

Earlier this month, Gerson Lehrman Group, the world’s leading expert network, introduced a new logo and revamped its website. The change is symptomatic of the resurgence of expert networks since the onset of the insider trading investigations.

Gerson Lehrman, reportedly coming off a year of record revenues in 2013, is rebranding itself as GLG, not to be confused with GLG Partners, the $30 billion hedge fund based in London. Besides the new website, it will be moving its headquarters into state-of-the-art new offices at One Grand Central Place in Manhattan this summer. And it is hiring furiously. It would not be surprising if the long-deferred IPO were announced later this year.

GLG Tech

With the new website comes a new url: www.glg.it. Note the .it domain. No, Gerson Lehrman has not moved to Italy. It doesn’t even have an office in Italy. More likely the domain is meant to evoke ‘Information Technology’, seeking to position GLG as a tech company.

Aspiring to be viewed as a tech company is not a new phenomenon for GLG. During its last rebranding in 2011, the firm launched a social media site named G+ which was going to be the new social media face of Gerson Lehrman: a LinkedIn with gravitas or, more accurately, a business-oriented version of Quora. However, Google did not take kindly to the G+ brand, and GLG changed it to ‘High Table’ in 2012. With the latest rebranding this month, GLG has quietly killed www.hightable.com, redirecting the url to its main site.

The GLG research website still lives on (www.glgresearch.com) highlighting GLG’s events, offering website information in 12 different languages and featuring GLG’s research store which allows purchase of one-off reports and conference transcripts. However, parts of the GLG research site already redirect to www.GLG.it so it may just be a matter of time before it goes away also.

Expert Membership Network

The other key facet of GLG’s rebranding is to distance it from being an expert network. Gerson Lehrman is no longer an expert network; it is a membership: “GLG is building the world’s largest membership for professional learning and expertise, so clients in every sector can interact with experts to gain insight.” Oops, experts slipped in there. The guideline for writing GLG’s copy appears to require sparing references to experts or networks, and to never put the two words together.

The irony here is that being an expert network, and especially being an expert network during the insider trading investigations, has ultimately been good for Gerson Lehrman. After its planned sale to Goldman Sachs cratered over liability issues in 2004, GLG began aggressively beefing up its compliance platform. By 2008, the stringency of GLG’s compliance platform was becoming a competitive issue, with smaller, nimbler expert networks touting their quicker response times (and implicitly their laxer compliance regimes).

Consolidation

Gerson Lehrman had an even bigger problem as pricing came under pressure from greater competition. GLG had created a large price umbrella, typically charging over $1,000 per hour for consultations, and there were few barriers to entry for new expert networks. By 2010, there were around 50 expert networks globally, and consultation fees were dropping toward $800 per hour. Then in late 2010, the insider trading investigations went public, and an expert network competitor, Primary Global, was deeply implicated in the scandal.

Suddenly, the words ‘expert network’ raised the blood pressure of buy-side compliance officers, and GLG became the safe choice. GLG was not immune from the overall decline in expert network usage by asset managers immediately following the Primary Global scandal, but it picked up market share. And the number of competitors shrank by more than half. Now, as usage of expert networks has rebounded, GLG is a key beneficiary.

Going forward

GLG is hiring aggressively. With a staff of 850, it is already an order of magnitude larger than most other expert networks. It is advertising over 70 new jobs as it continues to scale. Three-quarters of the new jobs are in the US, reflecting in part the centralization of many of its back-office operations in Austin, Texas, but it is also hiring in China, India (where much of its expert sourcing occurs), the UK, Ireland and Singapore.

Overall, we think the new rebranding is more successful than its previous efforts, and reflects the confidence engendered by the resurgence of its business (and the big bucks spent on high-end designers). The end goal is an IPO exit for its private equity investors, Silver Lake Partners and Bessemer Venture Partners. The exit was delayed by the insider trading scandals, but now that GLG has resumed a growth path, it is just a matter of time before GLG files, perhaps later this year.

 


Crowd Sourced Stock Ideas Beat Wall Street Analysts

March 24th, 2014

A recent academic study revealed that the stock market ideas published on the internet investor forum Seeking Alpha outperformed the research written by Wall Street analysts and the articles published by financial news services over the past seven years, supporting the belief in the “wisdom of the crowds”.

Background of the Study

Last week, the Wall Street Journal published an article highlighting a recent academic study, Wisdom of Crowds: The Value of Stock Opinions Transmitted Through Social Media, which found that stock market recommendations published on the internet-based investor forum SeekingAlpha.com predicted individual stock returns, as well as earnings surprises, beyond what is provided by Wall Street analyst reports and financial news articles.

The study, which will be published shortly in the Review of Financial Studies, was written by Hailiang Chen, Prabuddha De, Byoung-Hyoun Hwang, and Yu (Jeffrey) Hu of City University of Hong Kong, Purdue University, and the Georgia Institute of Technology.  The researchers analyzed the sentiment of approximately 100,000 articles and related comments posted on SeekingAlpha.com between 2005 and 2012.

The study evaluated the percentage of positive or negative words in SeekingAlpha articles and the associated comments on a particular stock, and tracked the performance of that stock after the article and comments were posted.  In order to eliminate the short-term impact of investors immediately responding to published articles, the researchers started tracking the performance of these share prices 48 hours after the article was published.
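The scoring step described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the researchers’ actual code: the word list and function name are hypothetical stand-ins, and the real study used a full finance-oriented lexicon rather than a handful of words.

```python
# Illustrative sketch: score an article's sentiment as the fraction of its
# words that appear on a negative-word list. The tiny word list below is a
# stand-in for a full finance lexicon.
import re

NEGATIVE_WORDS = {"loss", "decline", "weak", "miss", "downgrade", "risk"}

def negative_fraction(text: str) -> float:
    """Return the fraction of words in `text` found on the negative list."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return sum(w in NEGATIVE_WORDS for w in words) / len(words)

article = "Revenue decline and margin weakness point to downside risk."
print(round(negative_fraction(article), 3))  # 2 of 9 words -> 0.222
```

The study would compute this score for each article and its comments, then relate the score to the stock’s return starting 48 hours after publication.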

In addition, the researchers tried to determine whether the articles on SeekingAlpha were primarily moving the market rather than predicting it.  They did this by also analyzing whether the sentiment of SeekingAlpha articles and the associated comments was highly correlated with earnings surprises.

Findings of the Study

What the researchers discovered was that the more positive the articles and comments were on a specific stock, the more likely that stock was to perform better than similar stocks over the next several months.  Conversely, stocks with negative articles and comments underperformed similar stocks in the future.

In fact, the study concluded that, on average, someone investing according to the sentiment of the articles and comments and holding positions for three months would consistently beat the market.  This finding held up even when a number of variables were controlled for, including Wall Street analyst recommendations, upgrades/downgrades, earnings surprises, and the sentiment of traditional financial news articles.

In addition, the study found that the aggregate opinion of SeekingAlpha articles and comments had a strong correlation with earnings surprises, suggesting that the articles did not merely move stock prices but in fact had some predictive power.

“We find that the fraction of negative words in SA articles and comments strongly predict subsequent scaled earnings surprises,” says the study. “The earnings-surprise predictability suggests that the opinions expressed in SA articles and comments indeed provide value-relevant information (beyond that provided by financial analysts).”

The study also found that SeekingAlpha articles and comments predicted stock returns over every time-frame examined: three months, six months, one year and three years.

Lastly, the study concluded that in cases where there is broad-based disagreement between the authors of articles and the community’s comments to these articles, the sentiment of the community was more accurate than the authors in predicting future stock price performance and earnings surprises.

Some Issues with Seeking Alpha

Despite the support that SeekingAlpha and crowdsourcing of investment ideas received from this recent academic study, a few have been quick to point out the dangers that sites like SeekingAlpha could create for individual investors.

In an article published this past weekend in Barron’s, called Seeking Alpha Needs to Take Stock of its Policies, John Kimelman wrote that several of SeekingAlpha’s policies could lead to sloppy analysis or even market manipulation by its contributors.

Kimelman wrote, “But Seeking Alpha has also received some troubling press in recent weeks, exposing problems inherent in the site’s policy of allowing anonymous contributors. The problems, I contend, also stem from less than stringent editorial standards that need to be tightened up so that Seeking Alpha can do a better job resisting stock manipulators who see the site as an easy mark.”

In the article, Kimelman discusses a recent instance in which SeekingAlpha removed at least a half dozen favorable articles about Galena Biopharma after concluding that the anonymous writers of these articles weren’t being truthful about their identities.  After investigating the issue, SeekingAlpha discovered that one individual had written five articles under different pseudonyms.

Unfortunately, a number of Galena insiders sold shares of the stock, which may have gained value, at least in part, due to the favorable articles.

Kimelman concludes that by protecting the anonymity of its authors, SeekingAlpha is overlooking the dangers that could result from this policy, such as enabling a writer to publish sloppy or one-sided analysis, or even allowing a single writer to try to manipulate stock prices by publishing overly bullish or bearish articles under phony aliases.

The second troubling policy Kimelman highlights is that SeekingAlpha asks its writers to disclose whether they are long or short a stock when they write an article.  Unfortunately, the forum has no way to tell whether the writer is telling the truth, an issue which could call into question whether the article is biased.

Kimelman concludes his article by explaining, “In response to some of my questions, [Eli] Hoffman [the Editor in Chief of SeekingAlpha] says that his site is taking steps to improve the quality of the site, including a new compensation system that rewards quality rather than just page-views. But no steps are being taken at this time to limit the use of anonymity.  I would urge them to reconsider that point.”

Summary

In our mind, the recent study clearly points out the value that crowd-sourcing of investment ideas from a site such as SeekingAlpha can provide to individual investors.  In fact, Yu (Jeffrey) Hu, associate professor at Georgia Tech’s Scheller College of Business and one of the authors of the academic study, pointedly explained that “Seeking Alpha is the only platform to date that we have shown can predict individual stock returns.”

However, the recent Barron’s article also highlights how sites like SeekingAlpha can be abused by bad actors trying to profit from the impact the site has on stock prices – a fact that could put some investors at risk of being manipulated.

Given this, perhaps the most important role that SeekingAlpha and other crowd-sourced investment research sites should play (at least at present) is as excellent sources of interesting ideas that investors can use to start their research process.

 


Readership Survey Results

March 19th, 2014

A big thank you to all who completed the readership survey we conducted in January of this year.  ResearchWatch will celebrate its 10-year anniversary later this year, and the feedback from readers is helping us prioritize future improvements.

Reader Profile

Most readers are directly involved with research, either on the sell-side (68%) or the buy-side (15%).  The largest group of readers is in a management role (39%), followed by research (35%), sales (16%), trading (3%), commission management (2%) and other (5%).

Nearly half of readers (48%) are with independent research providers, 10% with investment banks, another 10% with agency brokers, 15% are on the buy-side, 5% with market data vendors and 12% with other types of firms.

Three-quarters of readers are U.S. based, with the remainder primarily comprised of readers from the UK (9%), continental Europe (5%), Asia (4%), Canada (2%), Latin America (2%) and rest of world (2%).

Most readers read the blog on Mondays when the weekly summary email is sent.  Readers use a wide variety of news sources, with the most popular being Bloomberg, the Wall Street Journal, Financial Times, and Institutional Investor.

Interests

Readers are interested in most varieties of research, but those of most interest are fundamental equity research (18%), industry research (15%) and expert networks (12%).

Readers prefer topics involving general research industry news, acquisitions involving research firms, and commissions related to research payments.

Guidance

We asked readers for feedback on what they valued about ResearchWatch.  Generally, readers like the consistent coverage of a specific niche, investment research, which is only periodically covered by the broader media.  Readers appreciate that ResearchWatch is relevant to their business, that it demonstrates an understanding of research industry dynamics, and they value the commentary (although they don’t always agree with it).

We also asked for advice on improvements.  Overall, readers would like to see more content more frequently.  Many respondents asked for more coverage of the UK, Europe and Asia.  There were excellent suggestions for specific topics, such as people moves, business models, sales and marketing activities, pricing, what buy-siders want, and more.

Some readers feel ResearchWatch has been too negative about the business environment and trends in the industry while others have the view that we hold back in our commentary at times.  All very helpful!

Going forward

The reader input has been tremendously valuable to us and we are taking it all on board.  We have started working on ways to expand the coverage and content of ResearchWatch, without disrupting what readers currently value.

Part of this will involve encouraging readers to share ideas, thought pieces and success stories with the broader community of readers.  If you are interested in contributing, let us know.  ResearchWatch can be a forum for a more diverse set of voices besides ours.

We will revert with more specific plans as they congeal.  In the meantime, thanks to our loyal readers for all their great feedback.


SEC Strikes Again With New SAC Insider Trading Charge

March 17th, 2014

Last week the Securities and Exchange Commission filed civil charges against yet another former SAC Capital analyst for insider trading, suggesting that the government is not done with its investigation of Steven Cohen’s hedge fund.

Background of the Case

In its recent complaint, the SEC alleged that Ronald N. Dennis, a former analyst at SAC affiliate CR Intrinsic Investors, traded on material nonpublic information he received from two other hedge fund analysts in the shares of Dell (DELL) and Foundry Networks (FDRY) during 2008 and 2009.

Dennis purportedly received illegal inside information about Dell’s financial performance from former Diamondback Capital analyst Jesse Tortora.  Separately, Dennis received an illegal tip about the impending acquisition of Foundry Networks by Brocade Communications Systems from Matthew Teeple, an analyst at a San Francisco-based hedge fund.  Both Tortora and Teeple have previously been charged by the SEC with insider trading.

The SEC also alleged that Dennis traded in Dell shares ahead of earnings announcements in 2008 and 2009, enabling CR Intrinsic and SAC Capital to generate profits or avoid losses totaling $3.2 million on their positions in Dell stock.

In addition, the SEC alleged that Dennis received information from Teeple about Brocade’s upcoming acquisition of Foundry Networks in July 2008, prompting him to purchase Foundry stock for CR Intrinsic before the news of the acquisition was made public.  This trade enabled CR Intrinsic to generate approximately $550,000 in profits.

Steve Cohen Identified

The SEC’s complaint mentions a number of SAC employees with whom Dennis reportedly shared the inside information he had received.  These include “Portfolio Manager A,” who the SEC alleges purchased 120,000 shares of Foundry stock for CR Intrinsic after Dennis gave him the tip; “Portfolio Manager B,” who the SEC alleges traded Dell shares based on inside information Dennis provided; and “Portfolio Manager C,” who the SEC alleges bought 500,000 shares of Dell in August 2009 after speaking with Portfolio Manager B, who had received inside information from Dennis.

While the SEC complaint did not identify the three SAC employees Dennis allegedly shared his illegal tips with, people familiar with the matter say that Portfolio Manager A is likely to be Alec Shutze, Dennis’s former boss at CR Intrinsic; Portfolio Manager B is Eric Gerster, who became Dennis’s boss after Shutze left the firm; and Portfolio Manager C is SAC founder Steven Cohen.

Settlement Agreed

In response to the SEC’s allegations, Dennis agreed, without admitting or denying the charges, to pay $95,351 in disgorgement of profits, $12,632.34 in prejudgment interest, and a $95,351 penalty.  Dennis also agreed to be permanently barred from working in the securities industry.  The settlement is subject to court approval.
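For readers tallying the figures, the three components above sum to Dennis’s total payment. This is a simple check on the reported numbers, not part of the SEC’s filing:

```python
# Sum the three components of Dennis's settlement as reported by the SEC.
disgorgement = 95_351.00
prejudgment_interest = 12_632.34
penalty = 95_351.00

total = disgorgement + prejudgment_interest + penalty
print(f"${total:,.2f}")  # prints $203,334.34
```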

Sanjay Wadhwa, senior associate director of the SEC’s New York Regional Office, explained the settlement: “Like several others before him at S.A.C. Capital and its affiliates, Dennis violated the insider trading laws when he exploited confidential information about public companies, in this case Dell and Foundry, to unjustly benefit the firms and enrich himself.  His actions have cost him the privilege of working in the hedge fund industry ever again.”

The case is SEC v. Dennis, U.S. District Court, Southern District of New York, No. 14-01746.

Summary

Almost exactly one year after the SEC announced a record $600 million insider trading settlement with SAC Capital affiliate CR Intrinsic Investors, the SEC brought yet another civil case against a former CR Intrinsic analyst for insider trading.  While the settlement of this case was not surprising, what the case clearly shows is that the SEC is far from done with its insider trading investigation of the firm, or of the hedge fund community in general.

As we have mentioned numerous times in the past, the federal criminal and civil authorities collected hundreds of hours of wiretaps, have thousands of e-mail communications, and have dozens of cooperating witnesses on hand as part of the “Perfect Hedge” investigation.  Most experts we have spoken with believe this alone is sufficient evidence for the DOJ and SEC to continue bringing insider trading cases for the next few years.

In our minds, this should be sufficient legal pressure for the asset management industry to continue to clean up its act.  It probably also means that hedge funds will continue to hire legal and compliance professionals to help protect the firms from bad actors and potential insider trading risk.

 
