5 Poisonous Sins That Lead To Poor Performance

If it were possible to develop a web application without performance issues (the Excalibur of development), the Arthurian developer who forged it would be crowned King of the Web Development Round Table.

Unfortunately, the fantasy phrase “web applications without performance issues” sounds... er... like a fantasy.

Are you guilty of any of these poisonous performance sins?

HTTP Requests

Web pages consist of many varying elements that must be downloaded: images, style sheets, scripts, Flash, etc. To download each of these files, we have to send a separate HTTP request, wait for a response, and only then can we use that element to render the web page. Therefore, the more HTTP requests sent, the longer we will have to wait for a comprehensive response (i.e., a fully rendered web page).

The obvious solution?

Reduce the number of requests that are sent to the web server. 

Where do we come across redundant HTTP requests?

  • Overly complex page designs - Yes, it is true: beautiful, exciting page designs can deter new users if they slow down page rendering. The simplest solution, with HTTP requests as in real life, is to reach a compromise between beauty and functionality.

  • Separate CSS Files - If there is a separate style sheet for every page of the site, additional requests will be sent while downloading each page, and this will hurt performance. Good practice for avoiding such a time-consuming issue (in web speak, every second counts!) is to combine the different CSS files into one. After the initial request, the browser will cache the combined CSS file and no additional downloading will be required.
  • Separate Images - Many, many images need to be downloaded to render your page: icons, buttons, logotypes, etc. Each image means a separate HTTP request, which is not good. To avoid this, it's best to implement the CSS Sprites technique, which combines separate images into a single one. After implementing CSS Sprites, simply use the CSS background-image and background-position properties to display the desired image segment. As a result, the overall image size will be about the same, but the number of HTTP requests will drop, which speeds up page load significantly. A minimal sketch follows this list.
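
Here is that minimal sketch of the sprite technique, assuming a hypothetical icons.png that stacks two 32×32 icons vertically (the file name, sizes, and class names are invented for illustration):

```css
/* One HTTP request fetches the whole sprite (hypothetical icons.png). */
.icon {
  background-image: url("icons.png");
  width: 32px;
  height: 32px;
}

/* Shift the background so only the desired segment shows through. */
.icon-home   { background-position: 0 0; }     /* top icon */
.icon-search { background-position: 0 -32px; } /* icon 32px further down */
```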

Compression Issues

Another way to reduce page rendering time is to compress files before sending them from the server to the browser. Any network has a finite amount of bandwidth for its connection to the Internet, so in many cases this connection can be a performance bottleneck.

How can we overcome this limitation?

The simplest solution is often the best one (thank you, Occam's razor!): use compression. Compressing text files (HTML, CSS, JavaScript, etc.) moves the load from the network to the server's CPU.

The Mechanics: The browser sends a header telling the server it accepts compressed content (gzip and deflate are two compression schemes): Accept-Encoding: gzip, deflate

If the server actually compresses the content, it says so in its response: Content-Encoding: gzip. For HTTP compression, gzip is considered the most effective scheme and the most widely supported by browsers and HTTP servers.

It can reduce file size by a whopping 70%!
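
To make the mechanics concrete, here is a minimal sketch of this negotiation in Node.js using the built-in zlib module (illustrative only; the port and HTML body are arbitrary, and production setups usually delegate compression to the web server or a framework):

```javascript
const http = require('http');
const zlib = require('zlib');

http.createServer((req, res) => {
  const body = '<html><body>Hello, compressed world!</body></html>';
  // Did the browser advertise gzip support in Accept-Encoding?
  const acceptsGzip = /\bgzip\b/.test(req.headers['accept-encoding'] || '');

  if (acceptsGzip) {
    // Compress the body and label the response accordingly.
    res.writeHead(200, {
      'Content-Type': 'text/html',
      'Content-Encoding': 'gzip',
    });
    res.end(zlib.gzipSync(body));
  } else {
    // Fall back to an uncompressed response.
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(body);
  }
}).listen(8080);
```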

Cache Issues

HTTP(S) allows local caching of static files by the web client (i.e., the browser). To get the most out of caching across all browsers, configure the web server to explicitly set caching headers and apply them to all cacheable resources. Cacheable resources include JS and CSS files and other binary object files (media files, PDFs, Flash files, etc.).

HTTP/1.1 provides the following caching response headers:

  • Expires and Cache-Control: max-age. - These specify the “freshness lifetime” of a resource, i.e. the time period during which the browser can use the cached resource without checking whether a new version is available from the web server. These are "strong" caching headers that apply unconditionally; that is, once they're set and the resource is downloaded, the browser will not issue any GET requests for the resource until the expiry date or maximum age is reached.
  • Last-Modified and ETag. - These specify some characteristic of the resource that the browser checks to determine whether its cached copy is still the same file. In the Last-Modified header this is always a date; in the ETag header it can be any value that uniquely identifies a resource (file versions or content hashes are typical). Last-Modified is a "weak" caching header in that the browser applies a heuristic to decide whether to fetch the item from the cache (and the heuristics differ between browsers). However, these headers allow the browser to efficiently update its cached resources by issuing conditional GET requests when the user explicitly reloads the page. A conditional GET doesn't return the full response unless the resource has changed on the server, and thus has lower latency than a full GET.

It is important to specify one of Expires or Cache-Control max-age, and one of Last-Modified or ETag, for all cacheable resources. It is redundant to specify both Expires and Cache-Control: max-age, or to specify both Last-Modified and ETag.
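
As an illustration, here is a minimal sketch in Node.js that follows that advice, picking Cache-Control: max-age from the first pair and ETag from the second (the CSS body and one-year lifetime are hypothetical):

```javascript
const http = require('http');
const crypto = require('crypto');

const body = 'body { margin: 0; }'; // stands in for a real styles.css
// Content hash as the ETag value (quoted, as ETags are sent quoted).
const etag = '"' + crypto.createHash('md5').update(body).digest('hex') + '"';

http.createServer((req, res) => {
  res.setHeader('Cache-Control', 'max-age=31536000'); // fresh for one year
  res.setHeader('ETag', etag);                        // validator for conditional GETs

  if (req.headers['if-none-match'] === etag) {
    // Conditional GET, resource unchanged: no body, low latency.
    res.writeHead(304);
    res.end();
  } else {
    res.writeHead(200, { 'Content-Type': 'text/css' });
    res.end(body);
  }
}).listen(8080);
```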

Code Issues

There are some code issues that influence website performance.

  • Using the “try-catch” construct. - Exceptions are EXPENSIVE! A stack trace must be created and special flow-control machinery invoked. So, this construct should be used only for handling genuine exceptions, not for ordinary flow control (a sketch appears at the end of this section).
  • Well-formed code. - This may sound odd, but performance is one area where well-formed code can be a problem. Long, clear variable names, plenty of descriptive comments, whitespace: all of this is good for debugging on the developer's workstation, but not for execution on the client side, because every redundant character is additional Internet traffic.
What we need in this case is obfuscation. Obfuscation was originally used to make code difficult to read and understand, but for scripting languages such as JavaScript it can also be used to reduce file size, and with it traffic. Special programs called obfuscators are used for this purpose; modern ones shorten names, replace constant numbers, optimize array-initialization code, and perform other optimizations that are difficult to near impossible at the source level.

As a result, JavaScript obfuscation reduces the size of script files and thus speeds up the download.
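
A hypothetical before-and-after shows the idea (the function and its names are invented for illustration):

```javascript
// Before (what the developer maintains): clear names, comments, whitespace.
function calculateTotalPrice(itemPrices, taxRate) {
  // Sum the individual prices, then apply the tax rate.
  let total = 0;
  for (const price of itemPrices) {
    total += price;
  }
  return total * (1 + taxRate);
}

// After (what an obfuscator might ship): same behavior, far fewer bytes.
function c(a,b){let t=0;for(const p of a)t+=p;return t*(1+b)}
```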
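
And for the try-catch sin above, here is a minimal sketch contrasting exceptions used as flow control with an ordinary return value (the user-lookup scenario is hypothetical):

```javascript
// Bad: "not found" is ordinary flow, but this version pays the full
// cost of building an exception and unwinding the stack every time.
function findUserSlow(users, id) {
  const user = users.find((u) => u.id === id);
  if (!user) throw new Error('User not found');
  return user;
}

// Better: reserve try-catch for genuinely exceptional situations
// and signal the expected "missing" case with a normal value.
function findUser(users, id) {
  return users.find((u) => u.id === id) || null;
}

const users = [{ id: 1, name: 'Arthur' }];
if (findUser(users, 2) === null) {
  console.log('No such user'); // ordinary flow, no exception needed
}
```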

Minimize Usage of SSL

If two HTTP web servers serve identical HTML and SSL is applied to only one of them, client browsers will experience noticeable performance degradation when browsing the SSL server. The solution? Use encryption sparingly. Large bitmaps behind an SSL site, in particular, should be used with discretion.

Implement the aforementioned solutions and one day, thou too might free Excalibur and be crowned King of the Web Development Round Table.