Speeding Up Web API Access

Web-based APIs, such as the Twitter and Facebook APIs, are now commonly used in web applications. However, accessing them from a web application can quite easily slow it down, making it unresponsive to users.

This post is an overview of the major ways to make your application faster. It's about application responsiveness, not really about scalability (though nothing here should prevent you from scaling). I'm not going to go into the technical details here, as there are plenty of resources on the web.

Understanding the problem

HTTP-based APIs typically take at least 100ms per request due to network latency. Multiple requests often mean you need several hundred milliseconds, and slow servers can balloon this to 10 seconds or more. Measuring the problems in your application, either with a profiling tool or with simple timings, is a good idea. Without measurements you risk wasting a lot of time optimising something that isn't a problem, whilst leaving the real problems untouched. Notably, profiling can tell you whether your biggest problem really is your API requests or something else, like your own database layer.
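
Before optimising anything, a couple of microtime() calls around a request will give you rough numbers. A minimal sketch, assuming the cURL extension; the URL is just a placeholder:

    $start = microtime(true);

    $ch = curl_init('https://api.example.com/timeline');   // placeholder endpoint
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);

    $elapsed = (microtime(true) - $start) * 1000;
    error_log(sprintf('API request took %.0f ms', $elapsed));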

Cache it

Caching is the most obvious solution to the problem, i.e. store the result in a file somewhere (or use memcache) and retrieve it again when someone else requests it. This can work well if everyone viewing the site needs the same data from the API (i.e. it's not personalised), and in that case you can go one step further and fetch the data with a cron job. However, it's often not that easy! The big problems are that it doesn't work well for personalised data and that you are often presenting stale data to the user.
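
A minimal sketch of that idea using the Memcached extension; the key name, the five-minute expiry and the fetch_from_api() helper are all illustrative assumptions rather than anything specific to your application:

    // Hypothetical helper that performs the actual HTTP request.
    function fetch_from_api($url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $result = curl_exec($ch);
        curl_close($ch);
        return $result;
    }

    $mc = new Memcached();
    $mc->addServer('127.0.0.1', 11211);

    $key  = 'api:public_timeline';   // the same for every visitor, i.e. not personalised
    $data = $mc->get($key);

    if ($data === false) {
        // Cache miss: hit the API once and keep the result for five minutes.
        $data = fetch_from_api('https://api.example.com/public_timeline');
        $mc->set($key, $data, 300);
    }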

Fetch them at the same time

It's quite likely that you will be requesting more than one endpoint from the API in a single page. The problem is that each extra request makes the page slower, and in most web programming languages nothing else can happen whilst these requests are in progress. The solution is to make the requests in parallel using cURL multi; this can be quite messy, so I'd suggest using a library such as Rolling Curl [PHP]. Using cURL multi should mean that your requests only take as long as the longest one.
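
If you'd rather not pull in a library, a bare-bones curl_multi version looks roughly like this; the two URLs are placeholders:

    // Fetch two endpoints in parallel; total time is roughly that of the
    // slowest request rather than the sum of both.
    $urls = array(
        'timeline' => 'https://api.example.com/timeline',
        'friends'  => 'https://api.example.com/friends',
    );

    $mh = curl_multi_init();
    $handles = array();

    foreach ($urls as $name => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$name] = $ch;
    }

    // Run all the handles until every transfer has finished.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);

    $results = array();
    foreach ($handles as $name => $ch) {
        $results[$name] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);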

Be selective

Even though you can request multiple endpoints in one go, it's still best not to request stuff you don't need, as unneeded requests can still slow down the page.

Don’t forget post-processing performance

It's easy to assume that your slow performance is caused entirely by remote requests to API endpoints. However, try loading up a profiling tool (e.g. Xdebug if using PHP) to see where the performance bottlenecks really lie. For example, if you are receiving a lot of results and running regular expressions on them, that could easily be the problem.
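
If a full profiler feels like overkill, even crude timings around the two phases will show where the time goes. A rough sketch, reusing the hypothetical fetch_from_api() helper from earlier (the search URL and regular expression are just for illustration):

    $t0 = microtime(true);
    $body = fetch_from_api('https://api.example.com/search?q=php');   // hypothetical helper from earlier
    $t1 = microtime(true);

    preg_match_all('#https?://\S+#', $body, $links);                  // example post-processing
    $t2 = microtime(true);

    error_log(sprintf('fetch: %.0f ms, processing: %.0f ms',
        ($t1 - $t0) * 1000, ($t2 - $t1) * 1000));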

More than one way to do it

Up until this point I've assumed that you are requesting endpoints when you load the page; however, there are a few other ways to do it.

AJAX

You can fire off an AJAX request which fetches the API data and inserts it into the page later on. Typically this means users see the main page load quickly, with the results from the API appearing a little later. The problem is that, because JavaScript is used to fetch the content, search engines won't index it.
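
The server side of such an AJAX call can be as small as a script that returns JSON for the page's JavaScript to insert; the filename, endpoint and helper here are hypothetical, and the client-side code isn't shown:

    // ajax/timeline.php - hypothetical endpoint the page requests with
    // JavaScript after the initial HTML has loaded.
    header('Content-Type: application/json');

    $data = fetch_from_api('https://api.example.com/timeline');   // hypothetical helper from earlier
    echo json_encode(array('html' => $data));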

Cron

If it suits your application, you can use a cron job to request endpoints at regular intervals and populate a cache, which is later read by the pages.
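
A hypothetical cron script along those lines, writing the response to a cache file that the pages then read (the paths and schedule are made up):

    // cron/refresh_timeline.php - run from crontab, e.g. every five minutes:
    //   */5 * * * * php /var/www/cron/refresh_timeline.php
    $data = fetch_from_api('https://api.example.com/public_timeline');   // hypothetical helper from earlier

    if ($data !== false) {
        // Write atomically so pages never read a half-written cache file.
        $tmp = '/var/cache/myapp/timeline.json.tmp';
        file_put_contents($tmp, $data);
        rename($tmp, '/var/cache/myapp/timeline.json');
    }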

Message queues

Message queues can be useful when a simple AJAX or cron solution isn't suitable, usually because a lot of requests are required. A message queue is populated with the requests needed, and a separate worker fetches these requests from the queue and processes them.
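
As a sketch of that split, a Redis list can act as a basic queue (this assumes the phpredis extension; the key name and payload are made up):

    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);

    // In the web request: enqueue the work instead of doing it inline.
    $redis->lPush('api_requests', json_encode(array(
        'user_id'  => 42,
        'endpoint' => 'https://api.example.com/timeline',
    )));

    // In a separate worker process: block until a job arrives, then do the
    // slow API request and cache the result for the pages to read.
    while (true) {
        $job = $redis->brPop(array('api_requests'), 0);   // returns array(key, payload)
        if (empty($job)) {
            continue;
        }
        $request = json_decode($job[1], true);
        $data = fetch_from_api($request['endpoint']);      // hypothetical helper from earlier
        // ... store $data in a cache keyed by user_id ...
    }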

Push methods

Lastly, some APIs can actually push new data to you in real time, such as the Twitter streaming API. This means you never have to serve stale data again; however, it's not commonly available. It also typically requires subscribing to a certain subset of the data, which means it's best suited to users who visit your site regularly, and it requires caching a lot of data that may never be accessed.
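
As a rough sketch of consuming such a stream with cURL, handling data as the API pushes it (the URL and credentials are placeholders, and a real consumer would also need proper authentication and reconnect handling):

    // Keep a long-lived connection open and process chunks as they arrive.
    $ch = curl_init('https://stream.example.com/statuses/sample.json');   // placeholder stream URL
    curl_setopt($ch, CURLOPT_USERPWD, 'user:pass');                       // placeholder credentials
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
        // Each $chunk is a piece of the stream; buffer until you have a
        // complete JSON object, then cache or store it.
        error_log('received ' . strlen($chunk) . ' bytes');
        return strlen($chunk);   // returning fewer bytes than received aborts the transfer
    });
    curl_exec($ch);   // blocks for as long as the stream stays open
    curl_close($ch);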