With the debate over a Tobin tax increasingly appearing on the front pages of the mainstream media, we thought we would revisit a theme discussed previously on Zero Hedge: whether the predominant trading paradigm, High Frequency Trading, reduces trading costs as its proponents claim. We had previously proposed some broad estimates quantifying the cost of HFT; our results received the stern condemnation of those whose livelihood depends on microsecond speculative scalping of pennies on the dollar, thousands if not millions of times a day, under the guise of "liquidity provisioning." It is precisely these people who are most opposed to a trading tax, as it would make most HFT-based strategies prohibitively expensive. And while the final form of such a tax is still to be determined, it is only reasonable to expect it will be most punitive to those who do the opposite of allocating capital on a long-term basis, and thus most detrimental to those parties who merely churn stocks under the "noble" pretense of providing liquidity. That this liquidity is provided only in names that are already the most liquid (and, of course, those above the $1 rebate-collection threshold) is often ignored.
Furthermore, we have historically been very critical of VWAP algorithms, pitched by the likes of Goldman Sachs in their REDI trading platform as the core trading strategy for non-dark liquidity and trades on open exchanges (for those who prefer exposure to Goldman's far more lucrative principal bid/ask in a dark pool venue, Goldman offers Sigma X).
A recent research report by Quantitative Services Group presents one of the first empirical studies confirming our hypothesis that in its most frequent incarnation, the VWAP algorithm, HFT not only fails to reduce trading costs but in fact increases them, while adding a further destabilizing factor: it leaves the initial trade exposed to major predatory algo risk courtesy of advanced adverse-selection-sniffing algorithms. The combination of these two has a far more adverse impact on overall per-trade P&L than an incremental tax, as the associated implementation shortfalls and liquidity charges can quickly overtake even the "draconian" 0.25% per-trade tax currently proposed (a number which will likely be lower in the final version of the bill, and which will likely not affect the retail daytraders who, for some reason, seem to be the most vocal opponents of DeFazio's proposal). Yet this embedded toll is considered normal because its trade-off is a purported improvement in liquidity. However, as much of the literature has demonstrated, HFT strategies do not provide liquidity equally across the equity landscape: only the top 10% of all names, usually those that already have substantial natural flow, are the ones that "benefit" from HFT involvement. Why the Tobin tax outcry is not focused far more sternly on precisely this externality leech is very much unclear.
And the tradeoff is that while in the Tobin tax scenario at least the money could be used to replenish some of the already empty vaults of the US government, the HFT toll accrues purely to the benefit of Wall Street's, and Chicago's, fat cats. Which is why you can expect the Wall Street - D.C. lobby to scream bloody murder as the existing "market toll" status quo is threatened.
As the conclusion to the QSG report finds, the "results indicate that there is a significant difference in the costs and trading velocity of VWAP algorithms when compared to Arrival Price algorithms, especially when applied to low price stocks. The Tracking Error (PWP) measure confirms most VWAP algorithms are challenged to beat a 10% PWP benchmark. This result is consistent with both the negative impact of certain HFT strategies and the possibility of positive momentum in institutional order flow." The punchline, that "the average impacts for VWAP algorithms are nearly double those of Arrival Price algorithms," explains why all HFT proponents (and certainly opponents) would be well advised to read the paper's findings: "This is a significant revelation to proponents of VWAP algorithms as a low impact strategy that flies under the predatory HFT radar." And as people react much better to actually quantified tradeoffs, here is the opportunity cost of VWAP for the select group of study participants: "The incentives to take action to prevent such impact costs are compelling. If the impact costs realized by the VWAP algorithms were reduced to the level of the Arrival Price algorithms, the clients in this study would have reduced trading costs by $35 million."
Below are more of the paper's findings. First, QSG recapitulates why HFT has become such a prevalent topic in 2009. This part should come as no surprise to Zero Hedge readers, who have seen this topic dissected here, starting in March 2009, before it appeared anywhere else in the MSM or the blogosphere (highlights ours throughout).
HFT has dominated the securities trading industry headlines for much of 2009. In reviewing the portrayal of this activity by the industry’s media sources, we discovered a mixture of concerned curiosity and outright disdain for HFT from outspoken representatives of the buy-side and agency-only execution providers. Recently, support for the practice has emerged from a few buy-side firms and many of HFT’s more public practitioners. Industry blogs and self styled ‘white papers’ have gotten the attention of Congress, and the SEC is considering regulatory changes, already taking steps to curb an HFT-related activity called ‘Flash Orders’. The uproar surrounding HFT has been created largely by anecdotal evidence from traders and the soaring profits reported by the HFT operations of a few firms forced to disclose their results to the public. Unfortunately, there is little empirical evidence on the effect that high-frequency trading has on traditional institutional trading desks that are employing non-HFT strategies. In this report, we leverage QSG’s proprietary tick-based transaction cost attribution methodology to reveal empirical evidence confirming that significant increases in market impact costs are being experienced by certain types of institutional size trades. This report, the first in a series, will focus on a surprising segment experiencing HFT impact, executed through automated algorithms in liquid US stocks - VWAP targeted orders.
For those still new to the concept of HFT, QSG provides a good and brief overview of the evolution of HFT strategies:
The definition of high-frequency trading is both broad and evolving. As the name suggests, it generally involves trading strategies that rely on rapid, large scale order executions facilitated by advanced computing and communications technology. The significant data, order routing and communications infrastructure that supports HFT strategies are designed to virtually eliminate execution delays or ‘latency’. At its most basic level, the capacity to execute instantaneously creates the ability to arbitrage prices across execution venues, of which greater than 40 exist in the US alone. This profit making opportunity is then augmented by two additional strategies that are pursued exclusively or in combination: electronic market making and statistical arbitrage.
Electronic market makers create two-sided markets with the goal of profiting from the spread between the prices at which they buy and sell. Through changes in regulation, increases in volume, the introduction of decimalization and the automation of the trading floor, market making has significantly evolved over the last decade. The challenges to the business model of market makers include dramatically narrowed bid/ask spreads, fragmented markets and an increasing number of competitors. Today, market makers rely on sophisticated statistical models and trading algorithms that have automated the previously manual decisions as trading flows have increased dramatically and executions are measured in milliseconds. In addition, exchanges and ECN’s provide incentives for electronic market makers to provide liquidity by offering a rebate on trades, usually about $0.20 per 100 shares. Firms like Global Electronic Trading Company LLC (GETCO), Tradebot Systems Inc., Hudson River Trading LLC and Wolverine Trading LLC are examples of large electronic market making firms.
The other broad category of HFT profits come from statistical arbitrage (stat arb). Similar to electronic market making activities, stat arb shops using HFT strategies leverage a combination of low-latency trading technologies and an automated decision engine. The statistical patterns that these strategies exploit generally occur intra-day and don’t require overnight positions. Statistical arbitrage strategies in a high-frequency setting analyze price, volume, depth of book and trading velocity patterns to identify exploitable price trends. These price trends can be very short and shallow, created by a temporary liquidity imbalance, or they can be much longer in term and quite large, created by a large institutional buy/sell order. In the former, the role of the HFT stat-arb strategy is very similar to the market maker’s role. In the latter, the role of the HFT stat-arb strategy is very similar to trading ahead of the institutional order. In such cases, the stat-arb profits are not necessarily derived from a spread premium and rebate; they profit by ‘surfing’ the price moves available by trading alongside the institutional orderflow. Identifying such orderflow can shift the probability of profit in favor of the HFT and in some cases create a significant liquidity imbalance of its own. These circumstances have the potential to dramatically increase institutional trading costs.
The last point is critical to explaining the explosion of child algos in recent years, whose sole specialty is chopping up large orders into small, statistically non-significant order flow, which presumably does not raise the red flag for stat arb frontrunners. Alternatively, controlling the primary venues in which the child orders are created (i.e., Goldman's REDI) hands the keys of the kingdom to whoever is in charge of the primary child order splitting algorithm as nobody is as able to reverse engineer it as those who create it, and update it, often many times during the day.
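The parceling logic described above can be illustrated with a minimal sketch. This is a hypothetical example, not any actual broker's algorithm: it shows one simple way a parent order might be chopped into randomized child orders so that the resulting flow does not leave a uniform, easily detected statistical footprint.

```python
import random

def slice_order(parent_qty, n_children, jitter=0.3, seed=None):
    """Split a parent order into randomized child orders.

    Randomizing child sizes (rather than sending identical lots) is one
    simple way an execution algo can try to avoid the regular statistical
    footprint that pattern-sniffing stat-arb strategies look for.
    Hypothetical sketch -- real child-order logic also randomizes
    timing, venue, and order type.
    """
    rng = random.Random(seed)
    base = parent_qty / n_children
    # Perturb each child size by up to +/- jitter of the base size.
    raw = [base * (1 + rng.uniform(-jitter, jitter)) for _ in range(n_children)]
    # Rescale so the children sum back to the parent quantity.
    scale = parent_qty / sum(raw)
    children = [int(round(q * scale)) for q in raw]
    children[-1] += parent_qty - sum(children)  # absorb rounding drift
    return children

children = slice_order(100_000, 40, seed=1)
assert sum(children) == 100_000
```

Note that uniform slicing would be trivially detectable; the jitter is the whole point, and a production algorithm would apply the same idea to inter-arrival times as well as sizes.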
The one factor consistent across all HFT strategies is that they benefit from increased volumes and micro-second execution advantages. The majority of profits are made on razor thin margins. For example, market makers are often executing trades with gross margins of 0.05% or nominally between one and two-tenths of a penny. Of course most of the volume in equity markets is concentrated in the largest capitalization stocks with the narrowest spreads and the lowest price volatility. These characteristics attract the attention of HFT strategies; essentially, trading begets trading in these names. Non-HFT trades (now thought to be less than 50% of overall equity trading) are necessary for HFT strategies to flourish. In order to manage the trading velocity required by HFT strategies, firms must often submit and cancel thousands of orders per second while simultaneously monitoring the market data that drives their automated trading systems. To get the broadest and fastest access, many HFT firms subscribe to enhanced market center data feeds like NASDAQ’s ITCH or BATS’ FASTPITCH, in place of feeds from the slower consolidated Securities Information Processor (SIP). In addition to the expensive data feeds, these firms have to invest heavily in their internal systems and data processing engines in order to monitor and process data on thousands of securities simultaneously. [so yeah, all it takes is an i7 and a $29.95/month program to run a HFT system, right]. The time frames under consideration are so short that many of these firms have taken the step to co-locate their technology inside the buildings housing the exchanges.
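The razor-thin economics quoted above are easy to sanity-check. The per-share capture and rebate figures below come from the passage (one to two tenths of a penny gross, about $0.20 per 100 shares in rebates); the daily share volume is purely an illustrative assumption of ours, not a figure from the study.

```python
# Back-of-the-envelope electronic market-maker economics using the
# figures quoted above. Daily share volume is an assumed, illustrative
# number -- not from the QSG report.
spread_capture_per_share = 0.0015   # $/share, midpoint of 1-2 tenths of a penny
rebate_per_share = 0.20 / 100       # $0.20 per 100 shares of liquidity rebate
shares_per_day = 50_000_000         # assumption for illustration

gross = shares_per_day * (spread_capture_per_share + rebate_per_share)
print(f"gross revenue/day: ${gross:,.0f}")
```

The point of the arithmetic is that margins measured in hundredths of a percent only work at enormous volume, which is why, as the report notes, HFT concentrates in the highest-volume names and "trading begets trading."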
Most notably, we would like to draw the attention of Senator Kaufman to the following observation by QSG, as it best represents the current nature of the market, which for months has seen a confluence of declining overall volume coupled with ever increasing stat-arb market dominance. And there is nothing HFT proponents can say that invalidates this statement:
Much of the media reports surrounding HFT have been focused on the potentially negative impact of such strategies. Proponents of the strategies usually point to the dramatic increases in volume (often equated with liquidity) and declining average bid/ask spreads. However, the collapse in average trade sizes, which dramatically increases the overall number of executions, is often overlooked. This increased trading velocity generates both increased profit opportunities for market makers and greater signaling opportunities for stat-arb strategies.
Another frequent question is just how deeply HFT has penetrated traditional institutional order flow:
US equity volumes have grown dramatically in recent years, much of it driven by the emergence of HFT strategies. Notable is the significant overall increase in volume and the number of trades, especially following 2005 (see Figure 1).
A critical observation concerns who is trading with whom now that HFT accounts for up to 70% of trades. The conclusion is surprising in its simplicity: at least 20% of the order flow (the increment above 50%) has nothing to do with providing liquidity, thereby refuting all claims that HFT is a purely altruistic strategy seeking merely to benefit all market participants. If that were the case, HFT would max out and plateau at 50%.
The prevailing estimates of the daily volume impact of HFT strategies now range between 50% and 70% of daily volume in the US. Of course, as HFT strategy estimates cross the 50% barrier, it is clear that HFT participants are increasingly trading amongst themselves. If electronic market makers are primarily providing liquidity, taking the other side of natural orderflow, then clearly stat-arb strategies that are demanding liquidity (as they seek to identify and exploit orderflow patterns) would account for HFT strategies pressing through the 50% barrier.
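The "trading amongst themselves" point above is pure arithmetic, and a short sketch makes the bound explicit: every trade has two sides, so once HFT's share of volume exceeds 50%, the excess over the non-HFT share must, at minimum, be HFT trading against HFT.

```python
def min_hft_vs_hft_share(hft_share):
    """Lower bound on the fraction of volume that is HFT-vs-HFT,
    given HFT's overall share of volume.

    Every trade has a buyer and a seller. Non-HFT flow totals
    (1 - hft_share), so at most that much HFT volume can be matched
    against natural orders; anything beyond it must have an HFT firm
    on both sides of the trade.
    """
    non_hft = 1.0 - hft_share
    return max(0.0, hft_share - non_hft)

# At the 70% participation estimate cited above, at least 40% of volume
# is HFT trading with other HFT; at exactly 50%, the bound is zero.
assert round(min_hft_vs_hft_share(0.70), 2) == 0.40
assert min_hft_vs_hft_share(0.50) == 0.0
```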
While electronic market makers are supposedly ‘providing liquidity when it would not otherwise be available’, what are the effects of their presence when natural liquidity is present? The ‘interpositioning’ effects of high-frequency market makers can cause inflated volumes in very short periods of time. Instead of a single transaction occurring between two natural sides, a market maker’s speed in submitting and cancelling orders may cause two or three trades to occur. The volume inflation and reduction in average trade size that results from these activities especially influences participation-based trading strategies as these algorithms are designed to track market volume. It may also encourage trading algorithms to cut execution sizes into even smaller lots, significantly increasing the number of trades, leading to a new source of exploitable ‘information’ and increased exposure to ‘adverse tick’ risk.
It’s well known that sophisticated stat-arb models routinely monitor market data and the depth of limit order books to detect asymmetries in trading interests. The goal is to exploit and profit from them before the flows reverse and larger traders have a chance to finish their orders. These HFT strategies increase the costs of completing institutional trades and often introduce ‘adverse selection’ as orders are completed in names that are moving contrary to the institutional trader’s investment goals. Our study seeks to illustrate the role of high-frequency trading in the implicit transaction costs associated with participation-based algorithms.
QSG proceeds to provide the results of its analysis:
The data driving the analysis is QSG client executions, including data on the trading algorithm employed, between January 1, 2009 and October 23, 2009. Our sample set included more than 95,000 orders and represented greater than $30 billion in executed value. To further illustrate the topic, we analyzed the trades from September 10 through October 23, 2009, using the ‘adverse selection’ analysis methodology introduced by Henri Waelbroek et al. in a recent study from Pipeline Financial and AllianceBernstein titled "Adverse Selection vs. Opportunistic Savings in Dark Aggregators" [a study presented previously on Zero Hedge].
The T-Cost Pro tick-based attribution methodology matches fill-level client execution data and trade and quote data using a proprietary matching algorithm. Once the data is synchronized, the T-Cost Pro system calculates the cumulative ‘Liquidity Charge’ or footprint that resulted from the client executions. This impact cost is separated from the price impact, or ‘Timing Consequence’, of the competing trades that are responsible for the remainder of the price drift over the execution period. The technique considers the impact made by each individual execution in the order and accumulates the impact throughout the life of the order.
The sum of the cumulative Liquidity Charge and the Timing Consequence equals the Implementation Shortfall. This attribution allows us to investigate costs at a level that extends well beyond what we can do with the traditional benchmark measures and is particularly useful in illustrating the performance characteristics of algorithmic trading strategies. By separating costs into these two elements we can examine both the liquidity management characteristics and the price trend reaction characteristics of an algorithm. This technique is also better suited for the complex challenge of contextualizing ‘Best Execution’ analysis.
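The identity described above, Implementation Shortfall = Liquidity Charge + Timing Consequence, can be sketched in a few lines. QSG's actual tick-level T-Cost Pro methodology is proprietary; this is only a simplified illustration of the decomposition idea, proxying the order's own footprint by how far each fill prints from the prevailing mid and treating the residual as price drift.

```python
def attribute_shortfall(arrival_px, fills, side=+1):
    """Toy decomposition in the spirit of the attribution described above:

        Implementation Shortfall = Liquidity Charge + Timing Consequence

    `fills` is a list of (shares, fill_price, mid_at_fill) tuples;
    side=+1 for buys, -1 for sells. All costs in dollars. The 'Liquidity
    Charge' here is the cumulative distance of fills from the mid (the
    order's own footprint); 'Timing Consequence' is the residual drift.
    This is an illustrative sketch, not QSG's proprietary method.
    """
    shares = sum(q for q, _, _ in fills)
    avg_px = sum(q * p for q, p, _ in fills) / shares
    shortfall = side * (avg_px - arrival_px) * shares
    liquidity_charge = sum(side * (p - m) * q for q, p, m in fills)
    timing = shortfall - liquidity_charge
    return shortfall, liquidity_charge, timing

# A buy order arriving at $10.00, filled twice as the mid drifts up:
fills = [(1_000, 10.02, 10.01), (1_000, 10.06, 10.04)]
sf, lc, tc = attribute_shortfall(10.00, fills)
```

Separating the two terms is what lets the study attribute low-priced-stock VWAP costs to the algorithm's own footprint rather than to adverse drift from competing trades.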
And here is a good explanation of why the cheaper stocks have gotten, courtesy of the 2008 market crash, the greater the participation of HFT in the broader market has become.
2008’s steep decline in stock prices of large capitalization stocks has introduced an interesting phenomenon. Since most stocks are quoted in penny increments, the minimum tick size for most stocks is also $0.01. This introduces an interesting profit margin bias for HFT strategies. Since profits accrue to HFT on a share basis, low priced, high turnover stocks often have the potential to provide improved profit margins. To account for this we divided our trade dataset by stock price, splitting it into stocks less than/greater than $10 for all trades less than 5% of the day’s volume. We suspect that HFT activity will reflect the margin bias and be more intense in the lower priced stocks. As a preliminary evaluation of this assumption, we measured the Market Trade Velocity (MTV) in the stocks traded in each sample set. We found that stocks less than $10 had 11% more executions per second on average than stocks greater than $10.
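The margin bias described above is simple arithmetic: a fixed one-penny tick is a much larger fraction of a low stock price, so the same per-share capture is worth far more in relative terms on cheap stocks.

```python
def one_tick_bps(price, tick=0.01):
    """Value of capturing a single tick, in basis points of the stock price."""
    return tick / price * 10_000

# The same $0.01 capture is worth four times as much, relatively, on a
# $5 stock as on a $20 stock -- the profit-margin bias the study describes.
assert round(one_tick_bps(5.00), 6) == 20.0
assert round(one_tick_bps(20.00), 6) == 5.0
```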
Getting back to quantifying the cost of VWAP:
To ease comparisons we categorized the algorithms into two groups, VWAP algorithms and Arrival Price (Implementation Shortfall) algorithms. VWAP algorithms are engineered to execute in-line with market volumes through time, while Arrival Price algorithms are designed to execute near the Arrival Price with less regard for targeted volume patterns and time intervals. Arrival Price algorithm orderflow is often more concentrated, occurring earlier in the execution period. These algorithms are purported to have higher levels of ‘market impact’ than VWAP algorithms, which have reputations for executing at smaller interval volume participation rates.
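The distinction between the two algorithm families can be sketched as a scheduling difference. Both functions below are illustrative assumptions, not any vendor's implementation: the VWAP scheduler spreads the parent order in proportion to an expected intraday volume curve, while the arrival-price style schedule concentrates executions early (the geometric decay is an arbitrary illustrative choice).

```python
def vwap_schedule(total_shares, volume_curve):
    """Allocate a parent order across time buckets in proportion to an
    expected intraday volume profile -- the defining idea of a VWAP
    algorithm. `volume_curve` holds expected volume weights per bucket
    (hypothetical; real curves are estimated from historical data)."""
    total_vol = sum(volume_curve)
    alloc = [int(total_shares * v / total_vol) for v in volume_curve]
    alloc[-1] += total_shares - sum(alloc)  # absorb rounding drift
    return alloc

def front_loaded_schedule(total_shares, n_buckets, decay=0.5):
    """Arrival-price style schedule: concentrate executions early, near
    the arrival price, decaying geometrically (an illustrative choice)."""
    weights = [decay ** i for i in range(n_buckets)]
    return vwap_schedule(total_shares, weights)

# Typical U-shaped intraday volume (open- and close-heavy), 6 buckets:
curve = [3, 2, 1, 1, 2, 4]
print(vwap_schedule(60_000, curve))       # heaviest at open and close
print(front_loaded_schedule(60_000, 6))   # heaviest at the open
```

The trade-off the study quantifies follows directly: the VWAP schedule stretches the order across the whole day in many small strikes, which is exactly what maximizes its exposure to adverse ticks and pattern detection.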
Here is what the study uncovered:
The first set of results is for the subset of data where we calculated both the T-Cost Pro attribution measures and the participation weighted metrics. The metrics are presented in percentage terms (basis points), so it should be noted that transaction costs in smaller price stocks will tend to be greater than in higher price stocks; this separation improves the quality of our comparisons (Figure 3).
It is significant that the VWAP costs are larger for all measures during this period.
This is contrary to the perception that VWAP implementations are an efficient way to reduce trading costs, especially market impact. While the average liquidity charge difference between Arrival Price and VWAP algorithms for the greater than $10 category isn’t nominally large, it is more than 50% greater in relative terms. In the less than $10 category, the performance difference is striking and significant across all three metrics. The idea that the footprint created by VWAP strategies could be almost three times that of the Arrival Price algorithms has large ramifications for the automated strategies and the possible influence of HFT. Of particular importance in the less than $10 subset is the comparison of Liquidity Charge to Implementation Shortfall. Given that the measured Liquidity Charge is greater than the total Implementation Shortfall, it is clear that the algorithm’s own impact on price is the driver of these transaction costs. These comparisons to the participation weighted benchmark indicate that the Arrival Price algorithms add value in the less than $10 group and slightly underperform for the larger price stocks. The large positive Tracking Error for the VWAP algorithms shows that these orders on average underperform a 10% participation weighted VWAP by 13 bps in the less than $10 subset. QSG’s attribution to Liquidity Charge and Implementation Shortfall suggests that this cost is not due to price drift over the execution period but rather the VWAP algorithm’s own impact on price.
Of most interest in Tables 2 & 3 is the sharp increase in trading velocity (strike participation) for VWAP algorithms in the less than $10 subset compared to those in the greater than $10 subset. The average trade duration only increases by 13%, while the average number of child order executions (strikes) increases by 170%. This is also reflected in the Average Strike Participation, which increases by a factor of 1.75 for Arrival Price algorithms and by a factor of 4 for VWAP algorithms. The much larger number of executions related to VWAP strategies for similar sized orders and trade durations indicates the hyperactive parceling activities of these algorithms in the less than $10 subset of trades. We have found some VWAP algorithms to execute multiple times per second to keep up with volume, exposing the order to the additional ‘adverse tick’ risk that drives trading costs.
While both algorithm categories executed near the market average trade size for trades in stocks greater than $10, the values diverge in the low priced stocks. Arrival Price algorithms registered an average trade size advantage (109% of the market average), while the VWAP algorithms show a gap (96%) in the measure. These statistics, showing greater strike participation and smaller execution sizes, are indicative of higher velocity trading, which both attracts and is caused by high-frequency order flow. Importantly, it is apparent that VWAP algorithms incur a much greater percentage of adverse ticks than the market does during their trading interval, especially when compared to Arrival Price algorithms in the dataset for trades in stocks less than $10.
And the final nail in the coffin of the "HFT reduces trading costs" argument:
When the analysis period is expanded to 2009 year-to-date, it’s clear that the trend in high cost VWAP orders persists, especially in the lower price stock category.
Here is the conclusion that Sen. Kaufman should take straight to the SEC, asking why the Goldman study, prepared exclusively to perpetuate REDI and Sigma X's monopoly over critical firm revenue streams, does not touch upon it at all:
While the average cumulative Liquidity Charge is over 60% greater for VWAP algorithms compared to Arrival Price algorithms in the high price category, they increase greater than 110% in the low price segment. As was discovered previously, the average number of strikes per order increased dramatically for the VWAP algorithms in the less than $10 subset while the trade duration remained nearly constant. In context of Liquidity Charge, this is compelling evidence of adverse execution quality for VWAP algorithms, especially when considering the average adverse tick ratios of 20% for VWAP algorithms year-to-date. The fact that the VWAP adverse tick ratios are three to nine times larger than those of the Arrival Price algorithms is of great concern as these values are consistent with systematic liquidity imbalance biases and predatory competition. In the year-to-date analysis period, we again found all cost and tick metrics averages for algorithms in the same subset to be significantly different at a 5% level.
Our results indicate that there is a significant difference in the costs and trading velocity of VWAP algorithms when compared to Arrival Price algorithms, especially when applied to low price stocks. The Tracking Error (PWP) measure confirms most VWAP algorithms are challenged to beat a 10% PWP benchmark. This result is consistent with both the negative impact of certain HFT strategies and the possibility of positive momentum in institutional orderflow. This exposes the limitation of such benchmarks in their ability to separate an order’s price impact from price drift associated with other trades over the trading interval. The limitation is overcome by the T-Cost Pro Liquidity Charge measure which is capable of isolating the cumulative impact of each order, revealing that the average impacts for VWAP algorithms are nearly double those of Arrival Price algorithms. This is a significant revelation to proponents of VWAP algorithms as a low impact strategy that flies under the predatory HFT radar.
The details of the study uncover an important artifact from today’s trading environment: increased order parceling has three negative ramifications. First, more ‘strikes’, or executions per order, increase a client’s exposure to adverse ticks and this tick risk translates into higher impact costs. Second, more strikes increase the chances of leaving a statistical footprint that can be exploited by the ‘tape reading’ HFT algorithms. Third, should HFT strategies identify the order and begin to trade in anticipation of the orderflow, this will begin a positive feedback loop that can significantly change an algorithm’s behavior and invite even more predatory orderflow.
The incentives to take action to prevent such impact costs are compelling. If the impact costs realized by the VWAP algorithms were reduced to the level of the Arrival Price algorithms, the clients in this study would have reduced trading costs by $35 million. This study also highlights the value of rigorous algorithmic evaluation measures. The proper trading analysis measures empower equity managers to retake control of the execution process with confidence, avoiding the errors of anecdotal decision making unsupported by the facts of an increasingly challenging trading landscape.
Aside from the critical observation, provided for the first time in a formal research paper, that such staple HFT strategies as VWAP do nothing to improve implementation shortfalls, and in fact generate not only adverse executions but also invite numerous additional negative stat-arb initiated feedback loops, it is likely that as more HFT cost-benefit analyses become public, the general population will soon realize that the already embedded tolling features of HFT far surpass the potential costs associated with a Tobin tax. Furthermore, as a Tobin tax is a certain way to eliminate the vast majority of the non market-making features of HFT (those which generate the adverse impacts to institutional traders), it is becoming evident that one way to restore normalcy to an environment in which one HFT strategy trades almost exclusively with another, in the process adversely impacting the ever fewer non-algo based traders, is precisely such a transaction tax. At the end of the day, we suspect the cost of a Tobin tax will be materially lower than the HFT toll payments discussed above, especially in a variant that does not impact day trading enthusiasts.
To be sure, a Tobin equivalent is not the only way to remove the adverse HFT baggage accumulated over the years. Another proposal involves the periodic (instead of continuous) clearing of trades. Since periodic clearing eliminates the possibility of line-jumping by predatory and block-sniffing algorithms, as all trades clear at once at a set interval, it would be a comparably viable approach to eliminating the parasitic features of HFT. Given the governmental financial windfall associated with a Tobin tax, however, we believe the political agenda will be much more focused on the Tobin proposal. Yet if that were to encounter substantial populist and Wall Street lobbyist resistance, periodic clearing is precisely what Senator Kaufman should be espousing as the way to fix the computerized mess that US capital markets have become.
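The mechanics of periodic clearing are worth spelling out, since they show why it neutralizes speed advantages. The sketch below is a simplified call (batch) auction, an illustrative assumption rather than any exchange's actual rulebook: orders accumulated over the interval clear at a single price chosen to maximize executed volume, so arriving microseconds earlier confers no benefit.

```python
def call_auction_price(bids, asks):
    """Uniform clearing price for a batch (call) auction.

    `bids` and `asks` are lists of (limit_price, qty). The clearing
    price is the candidate price that maximizes executed volume; all
    matched orders trade at that single price, which is why a batch
    design removes the payoff to line-jumping. Simplified sketch --
    real auction rules add tie-breaking and imbalance handling.
    """
    candidates = sorted({p for p, _ in bids} | {p for p, _ in asks})
    best_price, best_volume = None, 0
    for px in candidates:
        demand = sum(q for p, q in bids if p >= px)  # buyers willing at px
        supply = sum(q for p, q in asks if p <= px)  # sellers willing at px
        volume = min(demand, supply)
        if volume > best_volume:
            best_price, best_volume = px, volume
    return best_price, best_volume

# Hypothetical book accumulated over one clearing interval:
bids = [(10.05, 300), (10.03, 200), (10.01, 500)]
asks = [(10.00, 400), (10.02, 300), (10.04, 200)]
px, vol = call_auction_price(bids, asks)
```

Because every order in the batch transacts at the same price, there is no queue position to jump and no stale quote to pick off, which is precisely the anti-predatory property the proposal relies on.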