Speed up API access with caching
Integration with third party sites using their APIs is all the rage. Sites like Twitter, Google, Facebook and Flickr all provide the ability to integrate their content into your website. Using these services provides more interesting and useful content for your visitors, for example an up-to-date Google calendar or an active Twitter feed.
This is great as long as the website doesn't suffer as a result. If the third party has issues that you can't control, your website will slow down with it. Sometimes there are fail-safe mechanisms to minimise the effect of a third party outage, but a much better approach is to use a cache: instead of showing an empty space when no fresh data arrives, the cache always has some content to serve.
Is your site slowed down by third party code?
From our own experience, both back-end and front-end integration with third party data sources can affect performance. One of our clients installed JavaScript widgets on a page built with our bespoke CMS, and page loads became very slow because some of the widgets took an incredibly long time to respond.
This article looks at the solution to this problem: caching. I am not going to talk about caching in general, but specifically about caching data and scripts from third party providers. This approach is not suitable for dynamic content based on visitor submissions, such as search forms, and you should not attempt to cache that type of response unless you really know what you are doing, i.e. you cache on a per-request rather than a per-page basis.
Improving performance for back-end integration
Back-end integration is the most common kind and can have the greatest impact on the speed of your website. Common integration methods such as SOAP and plain HTTP requests ultimately return content in a standard format such as XML, which your script then processes for presentation in the relevant page.
Caching can be implemented quite easily when you connect to the web service: save the returned data to a file in a cache directory on your site. Using this file in place of the API call for future requests is fairly straightforward and requires the fewest changes to your code. You will need to tune the caching period to the service: Twitter is likely to update often and therefore needs a short cache lifetime, whereas a Flickr account that is only updated once a week can be cached for much longer.
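As a rough sketch of the idea (the feed URL and cache path below are placeholders, not a real service), the first request simply writes the raw API response to a local file, and later requests read that file instead of calling the API:

<?php
// Hypothetical example: fetch an XML feed and keep a local copy.
// $feedUrl and $cacheFile are placeholders - substitute your own service.
$feedUrl   = 'https://api.example.com/feed.xml';
$cacheFile = __DIR__ . '/cache/feed.xml';

$xml = file_get_contents($feedUrl);      // one call to the web service
if ($xml !== false) {
    file_put_contents($cacheFile, $xml); // save the response for re-use
}

// Future requests can read the cached copy instead of calling the API:
$xml = file_get_contents($cacheFile);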
Twitter is a good example of the advantage of caching, because of their limit of 2000 API calls a day. If you display a Twitter feed without caching, every time a visitor (or bot) views your page they are generating another API call, and it only takes, say, 200 visitors each viewing 10 pages of your site to use up your daily allowance. If instead you refresh the cache at most once every five minutes, you will use at most 288 API calls a day (twelve refreshes an hour for 24 hours), irrespective of actual visitor numbers.
How to create your own cache
There are two obvious options: running a regular cron job, or caching on demand. Using a cron job to run a refresh script at fixed times during the day means that visitors never experience any delay. There is no need to check cache timestamps at request time, and as long as you build up a swap file rather than directly overwriting the live cache file, an outage at the third party site won't leave you with a broken cache. The downside is that the cron will always run, even when there are no visitors to your site.
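Here is a sketch of the cron approach (the script name, paths and five-minute schedule are assumptions for illustration). The script writes to a swap file first and only replaces the live cache file once the download has succeeded, so a failed fetch never leaves you with an empty or half-written cache:

<?php
// refresh_cache.php - hypothetical cron script.
// Example crontab entry, refreshing every five minutes:
//   */5 * * * * php /path/to/refresh_cache.php

$feedUrl   = 'https://api.example.com/feed.xml';
$cacheFile = '/var/www/site/cache/feed.xml';
$swapFile  = $cacheFile . '.tmp';

$xml = file_get_contents($feedUrl);
if ($xml === false || $xml === '') {
    exit(1); // third party is down: keep serving the old cache file
}

file_put_contents($swapFile, $xml); // build the swap file first
rename($swapFile, $cacheFile);      // then swap it in (atomic on the same filesystem)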
On-demand caching can be used either when you have no access to cron or where a cron job would waste processing. Each time a page containing a cached item is requested, your code simply checks whether a cached version is available and uses it instead of fetching the live version. As part of checking for the cached file, the code can also test whether the cache is too old and, if so, re-create it from the live source. The only downside is that the visitor who happens to hit the site just as the cache expires will experience a slower page load.
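A minimal on-demand version might look like the following (the five-minute lifetime and file names are assumptions; tune them to your service):

<?php
// Hypothetical on-demand cache: serve the cached file while it is fresh,
// otherwise re-fetch from the live source and update the cache.
$feedUrl   = 'https://api.example.com/feed.xml';
$cacheFile = __DIR__ . '/cache/feed.xml';
$maxAge    = 300; // cache lifetime in seconds (five minutes)

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
    $xml = file_get_contents($cacheFile); // cache is recent: no API call needed
} else {
    $xml = file_get_contents($feedUrl);   // this visitor pays for the refresh
    if ($xml !== false) {
        file_put_contents($cacheFile, $xml);
    } elseif (is_file($cacheFile)) {
        $xml = file_get_contents($cacheFile); // live source failed: fall back to the stale copy
    }
}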
I've provided an example PHP script at the bottom of the page.
Load front-end third party code more quickly
Caching can also help with third party JavaScript, images and other front-end content. Using the techniques outlined above, you can copy any content from the remote site into a sub-folder of your site and link directly to the cached copy instead of the third party host. The cron technique is the easier to implement here, as the on-demand method would require serving JavaScript and other content through a PHP script in order to check values like the age of the cache.
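The same swap-file trick works for front-end assets. As a sketch (the widget URL and local path are placeholders), a cron script can mirror a third party script into a sub-folder of your site:

<?php
// mirror_widget.php - hypothetical cron script to cache a third party script.
$remoteUrl = 'https://widgets.example.com/widget.js';
$localFile = '/var/www/site/cached/widget.js';
$swapFile  = $localFile . '.tmp';

$js = file_get_contents($remoteUrl);
if ($js !== false && $js !== '') {
    file_put_contents($swapFile, $js);
    rename($swapFile, $localFile); // swap in the fresh copy
}

Your pages then reference the local copy, for example <script src="/cached/widget.js"></script>, so a slow widget host can no longer hold up your page load.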
Caching can speed up third party code on your website
Targeted caching when integrating with third party sites can provide huge performance gains that your visitors will really notice, and stop them abandoning you for another website. Implemented in the simplest way, the solution is also easy and quick to maintain.
Ultimately, caching of third party data brings your site's performance entirely under your control. You can directly influence the speed of content delivery without having to rely on other people (who maybe don't care quite as much as they should).
Download the following file examples: