As consumers we now take for granted that we can obtain instant access to information. One example is online retailing: we expect to be able to go online, search a store for a particular item (perhaps after using a price comparison site) and find out whether it is currently in stock. That is a remarkable change from the days when an order might be submitted blindly and take "up to 28 days for delivery". As I've written before from my own involvement, think of the way consumers access airline information: passengers can view up-to-the-minute changes in ticket pricing, view and change seats, check in and obtain live information on expected arrival times. Tracking a parcel is another example we now take for granted, and social media status updates are inherently real-time. This ability to see and change such information ourselves, instantly, is now pervasive, and it has become so only quite recently: it was May 2009 when Facebook launched streaming status updates.

As consumers we are used to that kind of information visibility. It can therefore be surprising that so many businesses are behind the adoption curve and struggle to obtain the continuous business visibility they need. To disclose my own interest: my organisation, Progress Software, commissioned research by Vanson Bourne towards the end of 2009, asking 400 business and IT leaders in large US and European companies, across many different sectors, about their attitudes to a whole range of IT-related issues. The results on real-time information were particularly revealing. Ninety-four per cent of respondents said that "access to real-time information was critical to their business", yet when asked whether they could currently access information in this way, only eight per cent said they could. That is an enormous disparity between aspiration and reality.

Nowhere is the disparity more obvious, or more important, than in financial services. Organisations such as investment banks and hedge funds have embraced real-time technology: they use Complex Event Processing (CEP) to respond to events occurring in the market and to trade automatically. This has led to higher trading volumes and faster-moving markets, and the advantages, such as higher productivity for traders and more liquidity in the market, are widely acknowledged. In a fast-moving market, however, there are dangers when the unexpected happens. On 6 May 2010 the Dow Jones index suffered a very sudden and sharp fall. One of the reasons cited was a "fat finger" trade in which the wrong price or amount was entered: a simple case of human error. The wider point is that the key organisations in the trading process, banks, exchanges and regulators, do not have the right real-time information available not only to understand exactly what is going on, but to prevent such situations occurring in the first place. Too often the information reaches them afterwards, when it is too late to do very much about it.

There are analogous situations in other industries. Take mobile telecommunications firms: in a largely commoditised industry there is a continual battle for customer retention and good customer service. Telcos want to interact with their customers more responsively, for example by offering more personalised tariffs. Today a telco typically analyses how someone has used their phone retrospectively and sends a letter with a suggestion, perhaps to switch to a tariff better suited to international calls. What telcos would like to do instead is respond in the moment: if a sequence of international calls is seen within a relatively short period, text the customer there and then with the upgrade offer. A customer contacted at that point is far more likely to take the offer up.
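To make that concrete, here is a minimal sketch of the kind of rule a CEP engine would evaluate continuously over the live stream of call events. It is illustrative only: the event fields, the threshold of three calls in 24 hours and the `send_offer_sms` action are all hypothetical, and in practice the rule would be expressed in the engine's own query language rather than hand-coded like this.

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

# Hypothetical rule parameters: three or more international calls within 24 hours
WINDOW = timedelta(hours=24)
THRESHOLD = 3

# Sliding window of recent international-call timestamps, per subscriber
recent_calls = defaultdict(deque)
already_offered = set()

def send_offer_sms(subscriber_id):
    # Placeholder for the downstream action (SMS gateway, CRM event, etc.)
    print(f"Offer international tariff upgrade to {subscriber_id}")

def on_call_event(subscriber_id, destination_country, home_country, timestamp):
    """Process one call event as it arrives, rather than in a nightly batch."""
    if destination_country == home_country:
        return  # this rule only cares about international calls
    window = recent_calls[subscriber_id]
    window.append(timestamp)
    # Discard calls that have fallen outside the sliding window
    while window and timestamp - window[0] > WINDOW:
        window.popleft()
    # Fire the offer when the pattern is matched, at most once per subscriber
    if len(window) >= THRESHOLD and subscriber_id not in already_offered:
        already_offered.add(subscriber_id)
        send_offer_sms(subscriber_id)

# Example: three international calls in one afternoon trigger the offer immediately
for minutes in (0, 90, 200):
    on_call_event("44700900123", "US", "GB",
                  datetime(2010, 6, 1, 14, 0) + timedelta(minutes=minutes))
```

The point of the sketch is the timing, not the code: the decision is taken while the pattern is still unfolding, which is exactly what the retrospective letter-in-the-post approach cannot do.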
I recently spent a day at one of the top five European banks. Every day it handles mortgage and loan applications through multiple channels, branches, the Internet, the phone and so on, and it wants to ensure that these are prioritised appropriately and processed within a timeframe that meets the customer's expectations. The bank is currently deploying real-time event processing technology to give it a continuous view of this process, so that exceptions can be dealt with immediately and work can be prioritised. Remarkably, before this, management relied on end-of-day faxes to keep them apprised, hardly effective when financial and quality objectives are set on a weekly basis.

A "rear-view mirror" approach to information is becoming unacceptable. It means that customers are poorly served, that faults in processes are not identified quickly and so cost more to fix, and in some cases that products lose competitiveness because firms cannot respond adequately to what is happening around them. This seems to be recognised, at least in part, as those 94 per cent of survey respondents mentioned above illustrate. There is a lot of work to do to bridge the gap between that reality and the vision.