The Gateway to Algorithmic and Automated Trading

Alpha in the machine

Published in Automated Trader Magazine Issue 35 Winter 2015

Marco Fasoli started using artificial intelligence and machine learning systems while completing a PhD in natural sciences at Cambridge University. He went on to become a co-founder of Titian Global Investments, applying advanced predictive technologies to financial markets as the firm's managing partner. And he has since acquired another title: co-founder of QuantBridge, a joint venture between Titian and Thayer Brook that aims to become a strategic hub for quant talent. Automated Trader talks to Fasoli about the systems that guide QuantBridge's trading decisions and finds out about his future plans for technology development, which he believes could cause a major shake-up in the retail investor space.

QuantBridge's Marco Fasoli talks to Automated Trader at the Royal Automobile Club in London, UK.

AT: How did your scientific background lead you to financial innovation in markets?

Marco Fasoli: Doing my PhD was really the first time I became exposed to artificial intelligence and machine learning, primarily in the context of looking at large bodies of data and trying to identify certain kinds of signals amid a lot of noise.

Subsequently, I spent a number of years in the investment banking world and realised that what had been artificial intelligence and machine learning in my academic days had become mainstream and was increasingly being applied in industry to a variety of mission-critical phenomena - including weather forecasting, real-time information retrieval and the management of very complex systems such as electricity grids.

I came across a large number of software companies that had developed very interesting applications being successfully rolled out in industry. Having become familiar with capital markets, it was obvious to me that there was a gap in the financial industry for some of these innovations that had been deployed so successfully elsewhere.

So I got together with a group of people I have known and worked with for many years, with experience ranging across computer science, operational research, financial engineering and capital markets, and we set out to see if we could refine some of those technologies and apply them to financial markets.

AT: How do you compare to the other few companies in this space?

MF: One of our characteristics is that our ideas and thinking process really come from outside the financial industry - from science, operational research and computer science.

In that sense, we think that is quite an important distinction relative to many other systematic or quantitative-based approaches that have emerged more from the world of finance. We do think it is important to have a clean-slate mentality when you are looking at these opportunities with less of the baggage that can be typical of people that have been steeped in the financial industry for many years.

For example, when we look at managing risk, we look at risk in a completely different way than finance professionals traditionally have. When we first started to look at opportunities for deploying these technologies in the financial industry, we were struck by the fact that the most prevalent way of managing risk in financial markets is to look at variations of returns - standard deviation and other volatility-based measures.

AT: Why was that a surprising discovery?

MF: If you think of real industry, say an electricity network or grid, what really matters for them, ultimately reflecting what matters for the end customers, is the risk of a blackout. You don't really care how the volume or throughput of electricity changes over time. What you really want to do is minimise, to the maximum extent possible, the risk of a blackout. We look at financial markets in a similar way.

From a risk perspective, the thing that is most important for investors is: how much money am I going to lose? Investors care about variations with respect to the downside. They want to make decent money out of the strategy they are invested in, of course, but just as importantly, they don't want to lose money.

AT: How is that reflected in the design of your risk systems?

MF: We constructed our risk management systems to minimise capital loss risk, as opposed to what most risk management systems in financial markets do - minimise the variation of returns, or volatility. We have taken a completely different approach, asking how risk is managed in other industries with mission-critical applications: what are the best ways of reducing that risk, and what is the risk that really matters for the customer? That is why the example of an electricity blackout makes sense, because in the end that is what investors or clients care about. They want to be able to put the kettle on when they need to and don't want to lose power.


AT: And is that where your artificial intelligence system fits in?

MF: We spent a number of years developing our software systems, initially focused on the predictive component. It is all 100% machine-learning based. Essentially, we have software that is able to change its own structure as a function of what it sees in the market. The software is self-learning and adaptive: based on what it sees, it is designed to learn the behaviour of the market over time and to improve its own performance by changing its own structure. It self-generates its own parameters and input variables without the need for us to specify any type of rule up front. In a sense, it is not based on preset rules; the rules are generated by the systems themselves based on what they see.

The first question we asked ourselves was: is it possible to predict the movement of the markets?

AT: That is a big question - what did you find?

MF: It is challenging, and of course carries risk, but it is possible in the same sense that people can forecast, say, the weather - provided your systems meet a number of important requirements to have a reasonable chance of success. The first is that the forecasting horizon has to be relatively short because of the phenomenon of forecast degradation: your initial conditions change because markets are non-stationary. In practice it's a bit like the weather - beyond about a week, forecasts degrade relatively quickly.

Our systems have a daily forecast horizon, although they can also work really well intraday. But the strategies that we have implemented in a live environment on the predictive side all use a 24-hour forecast horizon.

Secondly, you have to have systems that are able to strip out as much as possible the noise that exists in markets. There are recurring relationships between the past and the future that can be identified but it is challenging to do so because these relationships are often very complex and non-linear.

We have concluded that those relationships do exist, but (they exist) over the short term and are often submerged by a lot of noise. So the systems need ways of processing the data and cleaning the data effectively. It is very important to have a high signal to noise ratio.

Thirdly, the systems need to be adaptive and self- learning (because) they need to be able to cope with changes in the fundamental behaviour of markets they look at. What might be a good way of predicting today may not be good tomorrow.

And the last thing we found is that we tend to get much better results with a market specific focus.

AT: Can you tell me how that translates into your trading strategies?

MF: Each market is traded by a separate set of 600 systems, and each of those looks only at that one market. Let's take gold as an example. We have 600 self-learning adaptive systems, and each one generates its own input variables, parameters and predictive algorithms as a function of what it sees - a little bit like having a proprietary trading desk. Imagine 600 intelligent traders all trading gold, and every day each has to make a prediction for what will happen the following day. They all go through the same analytical process, but each is given a slightly different tool kit. Some will focus on analysing the behaviour of gold in the most recent past, others will look at the last five years, and some will decide for themselves how far back to look, but the way they do the analysis is the same: cleaning the data, self-generating the input variables, the parameters and the predictive outcome.

On top of those 600 traders, you have a head trader - another self-learning machine that sits on top, looks at the underlying behaviour of the 600 and selects which of them it will consider as finalists for the prediction. Typically between 25 and 50 systems are chosen. Their signals are then synthesised into a single signal - buy or sell. This process is an intelligent synthesis, not a trivial majority rule.
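The two-layer design described above - many independent predictors plus a meta-level "head trader" that picks finalists and synthesises one signal - can be sketched as follows. This is an illustrative toy, not QuantBridge's system: the random predictions, the `hit_rate` ranking criterion and the hit-rate-weighted vote are all assumptions standing in for the proprietary self-learning machinery.

```python
import random

random.seed(7)

N_SYSTEMS, N_FINALISTS = 600, 25  # figures quoted in the interview (25-50 finalists)

# Each "trader" is a hypothetical stand-in: a daily prediction in {+1, -1}
# (buy/sell) plus a recent hit rate the head trader can use to rank it.
systems = [
    {"prediction": random.choice([1, -1]),
     "hit_rate": random.uniform(0.4, 0.6)}
    for _ in range(N_SYSTEMS)
]

def head_trader_signal(systems, k=N_FINALISTS):
    """Select the k best-performing systems and synthesise one buy/sell signal,
    weighting each finalist's vote by its recent hit rate rather than taking
    a simple majority."""
    finalists = sorted(systems, key=lambda s: s["hit_rate"], reverse=True)[:k]
    score = sum(s["prediction"] * s["hit_rate"] for s in finalists)
    return 1 if score >= 0 else -1  # +1 = buy, -1 = sell

signal = head_trader_signal(systems)
print("BUY" if signal == 1 else "SELL")
```

In a real implementation the selection and weighting would themselves be learned and updated daily; the point here is only the architecture - a pool of independent forecasters filtered and combined by a meta-layer.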

AT: Where are your systems finding alpha?

MF: What is interesting is that the 600 systems trading gold operate completely independently from the systems trading silver, heating oil or crude, and what we find is that there is very low correlation between the trading systems themselves. For example, the trading systems that trade crude oil and those that trade heating oil have a correlation of around zero - a daily correlation, measured over five years now.

If you look at the daily correlation over the same five years, the underlying markets themselves have a correlation of about 0.9 - they are very highly correlated. So between the systems there are periods of high and low, even negative, correlation. Essentially what our systems are doing is looking to capture alpha in the short term.
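The effect described - two markets that are themselves highly correlated, yet whose trading strategies show near-zero correlation - is easy to demonstrate with synthetic data. In this sketch the "crude" and "heating oil" return series share a common factor (an assumption for illustration), while each market's daily long/short signal is independent, standing in for the two separate sets of predictive systems.

```python
import random
import statistics

random.seed(0)

def corr(x, y):
    """Pearson correlation of two equal-length daily return series."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (statistics.pstdev(x) * statistics.pstdev(y) * len(x))

n = 1250  # roughly five years of daily observations
common = [random.gauss(0, 1) for _ in range(n)]
crude = [c + random.gauss(0, 0.4) for c in common]  # two markets driven
heat = [c + random.gauss(0, 0.4) for c in common]   # by a shared factor

# Independent daily long/short signals for each market.
sig_a = [random.choice([1, -1]) for _ in range(n)]
sig_b = [random.choice([1, -1]) for _ in range(n)]

strat_a = [s * r for s, r in zip(sig_a, crude)]
strat_b = [s * r for s, r in zip(sig_b, heat)]

print(round(corr(crude, heat), 2))       # markets: highly correlated
print(round(corr(strat_a, strat_b), 2))  # strategy returns: near zero
```

Independent signals flip the sign of each market's return on different days, which is what decorrelates the strategy returns even though the underlying markets move together.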

Our systems are able to capture some alpha in terms of relative behaviour between the two markets. It also means that our strategies across the systems tend to be a lot less directional than most systematic strategies. Unlike our systems that self-generate their own rules for each specific market on a daily basis, most systematic programs tend to apply the same set of fixed rules across all markets.

AT: Other than commodities, what other asset classes are you trading?

MF: We have a diversified strategy that looks at equities, commodities and fixed income. FX, on the predictive side, we find tends not to be as receptive, because we think there is a lot of noise compared to the other markets. I also think recent monetary policy has made it much more difficult for quantitative approaches to identify alpha opportunities there. We also have a long-only adaptive asset allocation strategy traded on a basket of highly liquid ETFs covering equity indices, equity sectors, treasuries, corporate bonds, commodities and cash, across which our dynamic engine shifts capital on a weekly basis.

AT: The predictive side of QuantBridge is focused on the institutional market, but you also mentioned that your strategies could provide solutions for the retail space as well?

MF: Over the last few years, we have been looking at the risk management side. What we wanted to address was the issue of how to manage risk in financial markets and the risk of a capital loss for investors. So we have developed a 'smart allocator', which is now fully deployed in one strategy.

Smart allocator systems are designed to minimise the risk of capital loss whilst delivering a minimum level of return at the portfolio level. They don't look at the underlying instruments on an individual basis; they look at the interrelationships across the instruments. The allocator seeks to determine the optimal weight to attribute to each instrument in the portfolio so as to achieve two things: to deliver a minimum level of portfolio return and, at the same time, to minimise the risk of capital loss for the portfolio as a whole.

This is very different from the predictive systems in that the real heavy lifting in terms of value add is on the risk minimisation.

The approach also differs significantly from the allocation strategies used by most modern managers and fund managers in several ways. First of all, it is dynamic. The strategies we have been implementing, including our ETF smart adaptive allocator, have a weekly rebalancing frequency. Most asset allocation strategies tend to have much, much longer rebalancing horizons.

And secondly, the actual weighting is driven by the risk minimisation engine, which seeks to minimise the risk of capital loss (drawdown) for the portfolio while seeking to deliver a minimum level of portfolio return. This differs fundamentally from traditional asset allocation models, which are based on assumptions rooted in modern portfolio theory: that returns are normally distributed, that risk is measured by standard deviation, and that correlations across different assets remain constant over time.

We believe those assumptions are fundamentally flawed.

AT: Why isn't everyone doing something different then?

MF: It is an optimisation exercise which is actually very complex because, although traditional measures such as volatility are easy to quantify, the risk of capital loss as a process is actually very hard to model.

If you are going to lose 5% or 10% in your portfolio, there are many ways to do it, and some ways are "better" than others. A risk minimisation exercise centred on capital loss demands that you have a way of measuring and managing the shape of the drawdown as it evolves.

There are infinite ways of losing the same amount of money, and having a handle on how to measure that capital loss risk is what we have come up with. It is not perfect, but it reliably approximates capital loss as a measure by looking at several different functions that relate to it.
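The distinction between volatility and capital loss risk can be made concrete. The sketch below constructs two hypothetical return paths containing exactly the same daily values - so their standard deviation is identical - yet one interleaves losses with gains while the other clusters them, producing a far deeper drawdown. The paths are invented for illustration; only the maximum-drawdown definition is standard.

```python
import statistics

def max_drawdown(returns):
    """Largest peak-to-trough decline of the compounded equity curve,
    as a fraction of the peak."""
    equity, peak, mdd = 1.0, 1.0, 0.0
    for r in returns:
        equity *= 1 + r
        peak = max(peak, equity)
        mdd = max(mdd, 1 - equity / peak)
    return mdd

# Same 100 daily returns, differently ordered: identical volatility,
# very different capital loss experience.
alternating = [0.01, -0.01] * 50          # losses interleaved with gains
clustered = [0.01] * 50 + [-0.01] * 50    # the same losses back to back

assert statistics.pstdev(alternating) == statistics.pstdev(clustered)
print(round(max_drawdown(alternating), 4))  # shallow drawdown
print(round(max_drawdown(clustered), 4))    # deep drawdown
```

A volatility-based risk measure cannot tell these two paths apart; a drawdown-based one can, which is the sense in which "the shape of the drawdown" matters and not just the dispersion of returns.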

We think that this type of product and functionality is a good match with what retail oriented investors, as well as institutional investors, are looking for.

An adaptive allocator that allocates dynamically across a basket of very liquid assets and sectors in a long-only strategy, using highly liquid global ETFs and generating effective diversification, is, we think, very appealing because it can generate stable returns with tightly controlled capital loss in a way that traditional passive strategies - multi-ETF or multi-asset strategies - aren't able to do. It delivers "smart beta" functionality by combining low-cost investment instruments with adaptive technologies.

AT: What about transaction costs?

MF: Transaction costs are very low. They have a very low impact on the adaptive allocator, as the average holding period is four weeks, and are slightly higher on the predictive side, where the average holding period is about seven days.

AT: Are products like this appropriate for the retail side?

MF: Well, the strategy is long only and the universe is made up of 17 of the most liquid ETFs, covering bonds, equity indices, equity sectors and commodities. In bonds we have corporate bonds and treasuries. The allocator can also hold up to a certain amount of cash. Every week it looks at the market behaviour of the different instruments, typically selects anywhere between 8 and 11 of those ETFs and weights them. The following week it reassesses.
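A single weekly rebalance of the kind described - pick 8 to 11 of the 17 ETFs, weight them, and reserve some cash - can be sketched as below. The scoring function, the score-proportional weighting and the fixed cash fraction are hypothetical placeholders; the interview does not disclose how the allocator actually scores or weights instruments.

```python
import random

random.seed(1)

ETFS = [f"ETF_{i:02d}" for i in range(17)]  # 17-instrument universe, per the interview

def weekly_allocation(scores, k_min=8, k_max=11, cash=0.10):
    """One illustrative rebalance: keep the top-k scored instruments,
    weight them in proportion to score, and hold the rest in cash.
    Selection count falls in the 8-11 range quoted in the interview."""
    k = random.randint(k_min, k_max)
    picks = sorted(scores, key=scores.get, reverse=True)[:k]
    total = sum(scores[p] for p in picks)
    weights = {p: (1 - cash) * scores[p] / total for p in picks}
    weights["CASH"] = cash
    return weights

# Stand-in scores (e.g. a momentum or risk-adjusted measure would go here).
scores = {etf: random.random() for etf in ETFS}
w = weekly_allocation(scores)
print(len(w) - 1, "ETFs selected")
```

Run weekly, the allocator would recompute `scores` from fresh market data and produce a new `weights` dictionary, shifting capital across the basket as described.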

For the long only adaptive asset allocation strategy, the worst intra-month decline experienced in October 2014 was only -0.66% compared to the -6.15% of the S&P. Past performance not indicative of future results. Source: QuantBridge


AT: So what's next for you?

MF: We have formed a joint venture with Thayer Brook, called QuantBridge, where strategies coming from different types of technologies are grouped together and new strategies are developed, essentially a strategic hub where quant traders and technologists partner together. Over time all the various strategies might also be grouped together into a new multi-strategy product.

We think the industry has turned radically, and that the single strategy approach doesn't really work or at least is not sufficient.

Thayer Brook is an established manager with a 14-year track record, has the backing of the Japanese group Mizuho, and shares a similar mindset regarding the evolution of the space.

AT: How big will the combined entity be?

MF: Collectively the combined entity has seven strategies occupying diverse positions within the spectrum of trading methodologies. Through internal resources and external collaborations, we have a total of 11 highly experienced quants and traders supporting both the existing strategies and the development of new ones.

I would like to add that a big-picture factor that led us to focus on this area over the last couple of years is that the financial industry is a bit primitive technologically compared to other industries. It is the one big industry that has not been disintermediated to anywhere close to the degree it could be - think of publishing or retailing.

Part of the problem is that the cost structure is very cumbersome, there have been very low levels of technology innovation, and many of the strategies themselves have not proved good value for money for investors.

There is going to be heavy dislocation in the industry and we want to be well positioned.