The End Of 'Orderly And Fair Markets'

Capitalism may have bested communism a few decades ago, but exactly how our economic system allocates society’s scarce resources is now undergoing its first serious transformation since the NYSE’s founding fathers met under the buttonwood tree in 1792.  Technology, complexity and speed have already transformed how stocks trade; but, as ConvergEx's Nick Colas notes, the real question now is what role these forces will play in long-term capital formation and allocation.  Rookie mistakes like the Twitter-hack flash crash may be easy to deride, but make no mistake, Colas reminds us: the changes that started with high-frequency and algorithmic trading are just the first step toward an entirely different process of determining stock prices.  The only serious challenge this metamorphosis is likely to face is a notable crash of the still-developing system and resultant regulation back to more strictly human-based processes.


Via Nick Colas, ConvergEx,

Last week’s Twitter hack and resultant mini-crash in U.S. stocks made for a few minutes of confusion and several days of humorous commentary.  It is, however, also a sign of the times.  That a system of communication where Justin Bieber, Lady Gaga and Katy Perry are the lead dogs could encroach on “orderly and fair markets” should force some discussion of where U.S. stock markets are heading.  “I crashed the market and I liked it…” isn’t a top 40 song. Yet…

Let’s begin with a few basics about what role equity markets are supposed to play in a modern capitalist economic system:

  • Stock markets exist to determine the value of publicly held companies using all legally available information that may impact their future cash flows and strategic positions.  There are a lot of items on this menu of value drivers, from prevailing interest rates to the quality of a given management team.  The benefits to any individual or group of investors of predicting enough of these inputs correctly on a consistent basis are, of course, sizable.
  • The resulting prices are signals to the broader economic system about the ongoing value of the company’s business model.  Well-considered companies with high valuations can purchase the assets of laggard enterprises with low valuations and operate them better.  Poorly run businesses fail, freeing up society’s intellectual and physical capital to pursue better uses.  Private equity can purchase underperforming assets as well, hopefully to re-engineer the business and return some or all of it to better health.
  • Through the IPO process, equity markets allow promising enterprises to tap a large pool of capital that allows for further growth.  Once public, they can make acquisitions for stock and reward productive employees with real ownership in the business.
  • Stock grants to management in well-run businesses – old and new – can allow the operation to entice the best available intellectual capital to join the firm and maximize their focus while employed there.  This ideally allows society to ensure that the scarcest of all economic resources – people – flow to their best possible uses.
  • Stock prices provide valuable economic signals to society at large – labor, capital, government and all other economic actors.  This, by the way, is why the Federal Reserve and other central banks care so much about rising equity values.
  • Public ownership of private capital allows society to “spread the wealth around” through the individual ownership of equity assets.  Individuals can buy diversified portfolios of public companies and earn returns on their capital that mirror the trends in return on invested capital for the businesses they own.
  • Yes, I know it’s fashionable to pick apart these idealistic characteristics at the moment.  At the same time, free market capitalism has a reasonable track record over history for improving the lives of large chunks of the human race.  So until we get a race of incorruptible philosopher-kings in the mix, this structure is the best thing going.
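The valuation role described in the first bullet – a stock’s worth as the present value of its expected future cash flows, discounted at a required rate of return – can be made concrete with a toy calculation.  This is a minimal sketch for illustration only; every number below (cash flows, growth rate, discount rate) is hypothetical, and real equity analysis layers many more inputs on top of this skeleton.

```python
# Toy discounted-cash-flow (DCF) valuation: the present value of a stream of
# expected future cash flows plus a terminal value. All figures are made up.
cash_flows = [100.0, 110.0, 121.0]   # expected cash flows, years 1-3 ($mm)
terminal_growth = 0.02               # assumed perpetual growth after year 3
discount_rate = 0.08                 # assumed required rate of return

# Present value of the explicitly forecast cash flows
pv = sum(cf / (1 + discount_rate) ** t
         for t, cf in enumerate(cash_flows, start=1))

# Gordon-growth terminal value at end of year 3, discounted back to today
terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
pv += terminal / (1 + discount_rate) ** len(cash_flows)
```

Note how sensitive the result is to the discount rate and terminal growth assumptions – which is precisely why so many of the “value drivers” on the menu above (interest rates, management quality) feed into those two numbers.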

The Twitter hack is the most notable example of “how” this process has changed, specifically in the real-time valuation of U.S. stocks.  To understand where we are, you first need to parse out the important changes that have occurred in stock trading over the last 10+ years.  Over that period, market structure moved from having one dominant exchange for a given stock (IBM on the New York Stock Exchange, Microsoft on the NASDAQ) to a highly fragmented system of multiple “exchanges” – pools of buy and sell orders managed by very fast computers.

Technology – very fast, efficient, even ruthless – is therefore the backbone of the modern U.S. equity market.  It should be no surprise that this development is not without controversy.  The multitude of trading venues takes some pretty amazing processing horsepower to keep in sync.  The computer code needed to arbitrage prices among them needs to be highly efficient and operate faster than a human can literally blink an eye.  And the physical plant – cutting edge servers located near the data centers for the major exchanges, connected by microwave or high-speed data lines – costs billions to build and maintain.

With all this infrastructure in place, the logical question is “What’s next?”  To answer that larger question, we need some additional context:

Point #1: Human-based active equity management has had a tough slog since the Financial Crisis, especially as it relates to U.S. stocks.  Money flows out of domestic equity mutual funds number in the hundreds of billions, and there hasn’t been a three-month period of positive flows in years.  That’s a significant development because this base of assets historically funded much of Wall Street’s traditional single-stock research efforts.  Sell-side analysts still have their role, to be sure, but over the last decade the job has become more that of a concierge for management meetings and conferences than of a single-source expert on the investment merits of individual stocks.

Hedge funds do still gather assets, but their investment process values internal resources and evaluation over traditional broker-supplied research.  This group is the primary customer for newer products, ranging from satellite imagery of store parking lots to cyber-tracking of online web companies’ traffic patterns.  They also have the luxury of focusing their efforts on just a few investment ideas rather than needing to cover the entire investment waterfront.

Point #2: Passive management of U.S. equity assets continues to grow in popularity.  Nowhere is this more visible than in the ever increasing asset base of U.S.-listed exchange traded funds, where $39 billion of the total year-to-date ETF money flows of $65 billion have gone straight into domestic equities.  The single most popular domestic stock ETF by this measure is a relative newcomer: the iShares MSCI Minimum Volatility Index Fund, which only launched in October 2011.  The investment goal for the index underpinning this product is to provide equity returns in the context of low overall price volatility.  Everything is done with mathematical analysis, rather than having a team of human analysts make stock-by-stock evaluations of potential investments with the help of Wall Street research.  That’s a very different approach from the old-school methods of managing money anchored in human judgment, to be sure.  How this approach does over time is anyone’s guess.  But it is a useful signpost for how money management as a business is changing.
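To give a flavor of what “everything is done with mathematical analysis” means in practice, here is a minimal sketch of an unconstrained minimum-variance weighting computed from a covariance matrix of returns.  This is illustrative only: the return data are simulated, and the actual MSCI Minimum Volatility methodology is considerably more involved (optimization constraints, sector and turnover caps, a full estimation universe).

```python
import numpy as np

# Simulated daily returns for three hypothetical stocks with
# low, medium and high volatility (all numbers are made up)
rng = np.random.default_rng(42)
vols = np.array([0.01, 0.02, 0.04])
returns = rng.normal(0.0003, vols, size=(250, 3))

# Sample covariance matrix of the return series
cov = np.cov(returns, rowvar=False)

# Closed-form unconstrained minimum-variance weights:
# w = (C^-1 * 1) / (1' * C^-1 * 1), so weights sum to one
ones = np.ones(3)
inv = np.linalg.inv(cov)
w = inv @ ones / (ones @ inv @ ones)

# Resulting portfolio volatility (daily)
port_vol = np.sqrt(w @ cov @ w)
```

No analyst judgment enters anywhere in that calculation: the lowest-volatility stock simply ends up with the largest weight, and the portfolio’s volatility comes out at or below that of any single holding.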

Point #3: We work with a variety of quantitatively based investment managers at ConvergEx, and their appetite for research skews strongly to datasets and real-time indicators of business fundamentals.  In an increasingly open world, and thanks to our ever-increasing reliance on technology, there is no shortage of new resources to feed a numbers-based investment discipline.

A few examples serve to highlight this growing field.

  • One research provider I know has permissioned access to the contents of hundreds of thousands of individual email accounts.  Any personal identifiers are stripped out before they get the data, but what’s left is a very useful amalgam of information about buying trends for everything from online retailers to video content providers.
  • Another product we’ve seen recently offers up essentially real-time satellite imagery of retail store parking lots around the country.  Forget walking the mall; you can count cars parked next to 200 representative stores for your favorite retailer. And if U.S. retail is too prosaic, they can get you images of virtually any place on the planet, cloud cover permitting. 
  • Lastly, one product which has been out for several years scours the websites of every airline around the world to see how much these businesses are charging for tickets and how full the flights are getting.  It’s not a perfect source of information – you need a handle on cost structures to know earnings – but it is a great starting point to understand near term business fundamentals for a very volatile sector.

The upshot here is that the technology of market structure – large, expensive and complex – is looking for a dance partner, and our ever more tech-based society is increasingly able to play that role.  The critical questions raised by this seemingly inevitable direction are pretty straightforward:

  • Can anything change this glide path to an ever more technology-based system of stock analysis?  I can only think of one: a very large system failure that causes a recession in the U.S.  We’ve seen individual brokerage firms teeter on the brink of failure after a systems glitch caused a large trading loss.  The momentum behind the current migration to technology-based data analysis in order to assess stock prices is strong.  At this point, only regulation can likely reverse it.  And regulation in the U.S. only comes after large systemic failures – never before.
  • How will capital markets assess stock prices in 5, 10 or 20 years?  There’s a saying in the tech world, especially among hackers: “Information wants to be free.”  On Wall Street, information is supposed to be expensive, since it can be used to generate profits.  Now that technology is being used to generate fundamental insights into corporate performance and the direction of stocks, which approach will win?
  • Over the next decade, as investors in U.S. stocks continue to expand their use of datasets and online resources to analyze securities, information will likely continue to be expensive.  We’re still in the early days of this transition, after all.  Thousands of institutional investors still use analyst-based resources and personal judgment to allocate capital and this process isn’t going away.  Many are adapting as the world changes, offering up new sources of investment insight through an effective hybrid approach. 
  • Over the long term – and I am talking decades here – it does seem inevitable that the analytical function behind assessing stock prices will change dramatically.  Will equity research based solely on human judgment go the way of the neighborhood bookstore, a quaint anachronism in an ever more connected world?  I can’t help but think that the answer is “Probably.”

Will capital markets become more efficient as a result?  If computerized algorithms with access to petabytes of real-time fundamental data can perform single stock analysis efficiently and without human biases, shouldn’t we welcome the development?  Stock market volatility might diminish with all that finely parsed data, and capital markets could become more accurately predictive of future corporate profits and strategic outcomes.

At the same time, technology has a way of creating distance between humans just as much as it can bring them together.  Just ask any parent with a college-aged child for the ratio of text messages to actual phone calls from their offspring.  Will an equity market running on algorithmic autopilot serve to tie the managers of capital (senior executives) to the ultimate owners (shareholders) as robustly as one dominated by flesh-and-blood money managers?  It seems a stretch to think so.

