by Giles Nelson

Financial services: How the profit motive drives technology

Feature
Dec 13, 2010
Financial Services Industry | IT Leadership | IT Strategy

Whatever one’s opinion about the utility or threat of the financial services industry, one thing is irrefutable – parts of financial services are highly innovative users of IT.

In particular, the trading of equities and other financial instruments, such as some derivatives, foreign exchange and even bonds, is constantly changing and re-inventing itself. The reason for this? It’s the profit motive.

If the trading operation of an investment bank or hedge fund can trade more effectively than its competitors, it can make more money.

So technology is providing the competitive edge in several key areas. Algorithmic trading, which uses computers to automatically submit and manage orders for all sorts of different financial instruments, is the dominant form of trading in equities.

In developed markets more than 90 per cent of equity trades involve a computer managing the trade in some way. High-frequency trading is another growing trend.

This involves the very rapid and frequent submission of orders into the market. Often many thousands of orders will be sent per second.

The time it takes for these orders to reach the exchange from the trading firm is critical – there are advantages to be had in being faster than the competition. Anything above a few milliseconds usually means you will not be competitive.

High-speed computers and low-latency networks are crucial to modern trading. Co-location is common: computers owned by the trading firm sit in the same physical building as the computers that run the exchange, so that the length of network cable separating the two is as short as possible. Yes, the speed of light and nanoseconds matter here.
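
To put rough numbers on this, here is a back-of-the-envelope sketch in Python. The distances are illustrative, and it assumes light in optical fibre travels at about two-thirds of its vacuum speed (a refractive index of roughly 1.5), a standard approximation:

    # Back-of-the-envelope: one-way propagation delay over optical fibre.
    # Assumes light in fibre travels at ~2/3 of its vacuum speed.

    C_VACUUM_KM_PER_S = 299_792.458              # speed of light in vacuum, km/s
    FIBRE_SPEED_KM_PER_S = C_VACUUM_KM_PER_S * 2 / 3

    def one_way_delay_us(distance_km: float) -> float:
        """One-way propagation delay over fibre, in microseconds."""
        return distance_km / FIBRE_SPEED_KM_PER_S * 1_000_000

    # From a co-located rack (~50 m) up to a London-to-Frankfurt-scale link.
    for distance_km in (0.05, 1.0, 100.0, 300.0):
        print(f"{distance_km:>7} km: {one_way_delay_us(distance_km):9.2f} µs")

    # Roughly 5 µs per km of fibre: a co-located machine 50 m away sees
    # ~0.25 µs, while a firm 300 km from the exchange gives up ~1.5 ms each way.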

Regulation has also played its part. In an effort to create a single European market for equity trading, increase competition and lower costs, regulation (the Markets in Financial Instruments Directive, MiFID) was introduced in 2007 to make it easier for new exchanges to be set up.

This led to a proliferation of exchanges in many different countries trading the same equity instruments. For example, only about 30 per cent of trading in Vodafone now takes place on the London Stock Exchange.

With many exchanges now trading the same instruments, trading has become more complex. A trading firm can’t just go to one exchange – it has to search many to get the prices it wants. Technology is required to make sense of all this.
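
One piece of that technology is a smart order router, which compares the prices quoted on every venue trading an instrument and sends the order to the best one. A minimal sketch follows; the quotes are made up for illustration, and a real router would also weigh fees, displayed size and latency:

    # Minimal smart-order-routing sketch: pick the venue with the best price.
    # Venue quotes here are hypothetical, for illustration only.

    from typing import Dict

    def best_venue_for_buy(asks: Dict[str, float]) -> str:
        """Return the venue offering the lowest ask price for a buy order."""
        return min(asks, key=asks.get)

    # Hypothetical ask prices for the same stock on competing venues.
    asks = {"LSE": 171.25, "Chi-X": 171.20, "BATS": 171.22, "Turquoise": 171.21}

    print(best_venue_for_buy(asks))  # -> Chi-X, the cheapest ask in this snapshot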

In the last year or so there have been calls for more controls and restrictions on the use of algorithmic and high-frequency trading. In the opinion of some, including politicians in the US and Europe, things have gone too far – trading practices are now too far removed from the fundamental role of capital markets, that of raising capital for investment.

In part this is because the monitoring of markets by many regulators is so far behind. The chairman of the US regulator, the SEC, said recently that the technology used to monitor the markets was up to two decades behind the technology used by those trading.

Some regulators even still receive trading reports by fax. This imbalance certainly exists, and it was laid bare in May of this year when US equity markets suffered significant falls within minutes – the so-called flash crash.

An investigation led the SEC to conclude that an algorithmic order had been the trigger of the crash, but that its effects had been multiplied by the complex, multi-exchange US market structure. It took months for the SEC to report, and during this time the causes were the subject of much speculation, with very few facts emerging.

My view is that this is an extreme example of a wider trend — that of business processes generally speeding up. Other examples include faster bank-to-bank payments, telecommunications companies sending marketing offers to end-users based upon their current location and retailers automatically ordering new stock as goods are acquired by shoppers.

There are many others. The lesson here is that if your process, whatever it is, executes much faster than you are able to monitor it, trouble then follows.

I like to think about what information theory tells us — that to reconstruct an analogue signal perfectly, one needs to sample it at more than twice its highest frequency (do a web search for Nyquist if you want to look at this further).
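
For the mathematically inclined, the Nyquist–Shannon sampling theorem can be stated compactly: a signal containing no frequency components above f_max is completely determined by samples taken at a rate f_s, provided

    \[
      f_s > 2 f_{\max}, \qquad
      x(t) = \sum_{n=-\infty}^{\infty} x\!\left(\frac{n}{f_s}\right)
             \operatorname{sinc}\!\left(f_s t - n\right),
      \quad \operatorname{sinc}(u) = \frac{\sin(\pi u)}{\pi u}.
    \]

The second expression is the reconstruction formula: sample fast enough and the original signal can be rebuilt exactly; sample slower and information is irretrievably lost.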

Therefore if your monitoring lags significantly behind the speed at which your processes execute, you’ve got very little hope of knowing how they’re performing. Your efficiency, customer service and observance of regulations all suffer.

The management of these faster processes is now key. The aim is for the organisation to become more responsive — not only to detect aberrant behaviour as soon as it happens, but to try to predict when limits may be breached.

Together with this, the software that controls and monitors these processes needs to be far more flexible itself – allowing the behaviour of the processes to be changed responsively as regulation, the organisation’s business or market conditions change.

A virtuous cycle can then begin, in which processes become incrementally better tuned to the way a business works: more efficient in execution time and use of human resources, and more accurate in spotting current and predicted problems and opportunities.

In financial markets, regulators and other market participants are beginning to catch up with those they monitor. For example, the FSA is using Complex Event Processing (CEP) technology.

As Alexander Justham, Director of Markets at the FSA, said recently in an interview with the Financial Times, CEP will give the FSA “a more proactive, machine-on-machine approach” to surveillance.

Brokers, who provide their clients with access to markets, are using a combination of event processing and Business Process Management (BPM) tools, both to detect suspicious patterns of trading behaviour and to manage the subsequent analysis and investigation.

If a pattern of front-running is detected – where a broker’s own trader may be trying to take advantage of visibility into the broker’s client orders – a BPM-managed process is started that leads compliance officers through a structured set of analysis and investigation steps.
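
To give a flavour of what such pattern detection looks like, here is a toy sketch. The event fields, the five-second window and the size threshold are all assumptions for illustration, not any broker’s or regulator’s actual surveillance rules:

    # Toy event-pattern sketch of front-running detection: flag proprietary
    # orders placed in the same instrument and direction shortly after a
    # large client order arrives. Thresholds and fields are illustrative.

    from dataclasses import dataclass

    WINDOW_SECONDS = 5.0       # how long a client order "taints" prop trading
    LARGE_ORDER_QTY = 10_000   # what counts as a large client order

    @dataclass
    class Order:
        timestamp: float       # seconds since the start of the session
        account: str           # "client" or "prop" (the broker's own book)
        symbol: str
        side: str              # "buy" or "sell"
        quantity: int

    def suspicious_pairs(stream):
        """Yield (client_order, prop_order) pairs matching the pattern."""
        recent_clients = []    # large client orders still inside the window
        for order in stream:
            # Drop client orders whose window has expired.
            recent_clients = [c for c in recent_clients
                              if order.timestamp - c.timestamp <= WINDOW_SECONDS]
            if order.account == "client" and order.quantity >= LARGE_ORDER_QTY:
                recent_clients.append(order)
            elif order.account == "prop":
                for client in recent_clients:
                    if client.symbol == order.symbol and client.side == order.side:
                        yield client, order   # in practice: open a BPM case

    # Illustrative stream: a prop buy lands two seconds after a large client buy.
    events = [
        Order(0.0, "client", "VOD.L", "buy", 50_000),
        Order(2.0, "prop",   "VOD.L", "buy", 5_000),
    ]
    for client, prop in suspicious_pairs(events):
        print(f"alert: prop order at t={prop.timestamp}s follows client order "
              f"at t={client.timestamp}s in {prop.symbol}")

In a real deployment the alert would open a case in the BPM tool, with the compliance workflow taking over from there.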

Calls for financial markets to go slow, with the use of algorithmic and high-frequency trading technology restricted, will continue. Regulators in the US and Europe are likely to make recommendations in the coming weeks.

I believe that regulators should not try to put the genie back in the bottle.

There is an inevitable progression towards doing things faster in financial services and other industries. Being ahead of your competition often gives advantages.

But executing processes quickly doesn’t mean that you have to lose control. Technology can give the kinds of real-time insight required to keep on top of things and ensure your customers are delighted and competition depressed.

Dr Giles Nelson is deputy CTO for Progress Software