How to Improve Application Performance and Reduce Latency

Web developers can no longer treat network latency and application performance as independent concerns. Fortunately, there are several ways that developers can "hide" data transmission and computation so that user experience doesn't suffer.

By George Lawton
Mon, September 10, 2012

CIO — Developers have traditionally looked at network latency and application performance as two separate phenomena. Modern Web development, however, needs to consider them together as applications and networking infrastructure grow more complex.

"Latency is clearly the biggest factor in network constraints of page loading on the Web," says Guy Podjarny, chief product architect at Akamai. This is clear in measurements of real users as well as in synthetic measurements, he adds, especially when compared to changes in download and upload speeds. "Unless you start with an especially slow connection, even doubling the speed will not make much difference. But with growth in latency, load times increase linearly."

Latency Is a Big Deal for Users

Latency measures the delay between an action and a response. Over the Internet, individual delays at the application or packet level can accumulate, as each element in the chain of communication is linked to others. Asynchronous development approaches, such as Ajax (asynchronous JavaScript and XML), can help to reduce these delays by separating the program logic from the need for constant network connectivity.
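The asynchronous pattern described above can be sketched as follows. This is a minimal illustration, not code from the article: `fetchSearchResults` is a hypothetical stand-in for a real `fetch` or `XMLHttpRequest` call, with the network round trip simulated by a timer.

```javascript
// Sketch of the Ajax idea: fire a request asynchronously and keep the
// rest of the program responsive instead of blocking on the network.

function fetchSearchResults(query) {
  // Simulate a network round trip with some latency.
  return new Promise((resolve) => {
    setTimeout(() => resolve(`results for "${query}"`), 50);
  });
}

async function main() {
  // Kick off the request but do NOT await it yet.
  const pending = fetchSearchResults("latency");

  // Program logic continues immediately -- the user interface (or any
  // other work) is not held hostage by the round trip.
  const uiStillResponsive = true;

  // Block only at the point where the data is actually needed.
  const results = await pending;
  return { uiStillResponsive, results };
}

main().then(({ uiStillResponsive, results }) => {
  console.log(uiStillResponsive); // true
  console.log(results);
});
```

The key design choice is that the `await` happens as late as possible, so the latency of the round trip overlaps with useful work instead of adding to it.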


Latency can often be hidden from users through multi-tasking techniques, which let them continue with their work while transmission and computation take place in the background. The difference that latency-sensitive software design makes can be dramatic, Podjarny says: start times four times as fast, load times twice as fast, plus better resilience due to fewer intermittent failures.
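The latency-hiding idea above can be sketched in a few lines. This is an assumed illustration, not code from the article: `simulateTransfer` and `doForegroundWork` are hypothetical stand-ins for a background data transfer and the user-facing work that proceeds while it runs.

```javascript
// Sketch of hiding latency by overlapping work: start the slow transfer
// first, let the user-facing work proceed, and join the results at the end.

function simulateTransfer(name, ms) {
  // Stand-in for a background network transfer.
  return new Promise((resolve) => setTimeout(() => resolve(name), ms));
}

function doForegroundWork() {
  // Work the user sees immediately while bytes move in the background.
  return "rendered first screen";
}

async function run() {
  // Starting the transfer before the foreground work means its latency
  // overlaps that work instead of being added to it.
  const transfer = simulateTransfer("payload", 30);
  const visible = doForegroundWork();
  const payload = await transfer;
  return [visible, payload];
}

run().then((r) => console.log(r.join(" | ")));
```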

Major companies see significant usage and sales benefits from shaving off even fractions of a second of latency. For example, the Microsoft search engine Bing found that a two-second slowdown in page load performance decreased revenues per user by 4.3 percent, Podjarny notes. (His personal blog points out 16 more Web performance optimization statistics that demonstrate the importance of reducing latency.)

Developers also need to think about the law of unintended consequences of feature creep and address the possibility that new features may in fact subtly push users away. For example, when Google offered to let users increase the number of search results per screen from 10 to 30, the average page load time increased from 400 ms to 900 ms. The number of searches initiated per user dropped by 25 percent as a result, even though these users voluntarily chose to see the more voluminous search results.

How Will the App Affect the WAN?

With latency such an important issue, software vendors are spending much more time today considering how their designs will impact the wide area network (WAN) than they did in the past, says Damon Ennis, VP of product management at Silver Peak. "The most important design pattern a software vendor should consider is to reduce the overall 'chattiness' of the application in order to minimize the number of round trips required for each operation, such as File > Open, Close or Edit."
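The "chattiness" point can be made concrete by comparing one round trip per operation with a single batched request. This is a minimal sketch under assumed names: `sendRequest` is a hypothetical stand-in for a network call, and a counter takes the place of real round-trip latency.

```javascript
// Sketch of reducing application chattiness: batch several operations
// (e.g. open, edit, close) into one request instead of one request each.

let roundTrips = 0;

function sendRequest(ops) {
  // One network round trip per request, no matter how many ops it carries.
  roundTrips += 1;
  return ops.map((op) => `ok:${op}`);
}

// Chatty version: one round trip per operation.
function chatty(ops) {
  return ops.map((op) => sendRequest([op])[0]);
}

// Batched version: all operations share a single round trip.
function batched(ops) {
  return sendRequest(ops);
}

const ops = ["open", "edit", "close"];

roundTrips = 0;
chatty(ops);
const chattyTrips = roundTrips; // 3 round trips

roundTrips = 0;
batched(ops);
const batchedTrips = roundTrips; // 1 round trip

console.log(chattyTrips, batchedTrips); // 3 1
```

On a link with 100 ms of round-trip latency, the chatty version above would pay that cost three times where the batched version pays it once, which is exactly the savings Ennis describes.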
