What A Load: Load Testing APIs

David McIntosh
Apr 8, 2020

Where I work, we’re pretty new to the idea of an API Economy, micro-services, etc., so we, obviously, don’t know all we need to know yet about APIs and how to best develop and test them. Just last week, something struck me as peculiar. Two members of our Quality Assurance team were running tests to replicate behaviors seen with an endpoint when it receives several concurrent requests. This proved quite difficult, especially because the testing was rather manual. They were literally trying to submit requests at the same time by clicking a button on our front-end application. When I saw it, I immediately wondered if maybe there wasn’t a better way to accomplish what they were trying to do. With a little research, I found out there are actually a whole lot of ways to automate the sending of concurrent requests to an endpoint. Whether a way existed, of course, was never my concern. There had to be a way. It’s 2020. Tigers are contracting COVID-19. Of course there’s a way to send concurrent requests. The question is, what’s the most effective way to do so, especially when dealing with a NodeJS application? This led to the discovery of two very neat npm packages: loadtest and artillery.

loadtest

loadtest runs a load test on the selected HTTP or WebSockets URL. The API allows for easy integration in your own tests.

artillery

Artillery is a modern, powerful & easy-to-use load testing and functional testing toolkit. Use it to ship scalable backends, APIs & services that stay performant & resilient under high load.

As defined on npmjs, both packages essentially do the same thing. They both allow for load testing of APIs of different kinds, whether over HTTP or WebSockets. For both, it’s pretty simple: you install the package (usually globally), and then you can write your specific test commands to run against a backend service or endpoint. Both packages output a report containing useful information after all the requests have been executed.

As they’re pretty similar, it’s mostly a matter of personal preference; however, I chose to use loadtest. It’s easier for beginners to get accustomed to its syntax and start writing test commands, and it provides a report that’s easier to read as well. Let’s see how we would get started with loadtest…

Getting Started

npm install -g loadtest

After running the above command, loadtest will be installed globally on your machine, and you’ll be able to run tests from within any terminal window.

Now, let’s assume you have an API ready to test. In my case, I have a basic endpoint listening for requests on port 3000, which expects a GET request with a number in the query string. So, something like: http://localhost:3000?number=20 (don’t actually click that). This endpoint accepts a number and calculates the Fibonacci sequence for it, though it could, technically, do anything; we just need something to test. You could even use a publicly available online endpoint.
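For context, a minimal sketch of such an endpoint, assuming an Express app (the naive Fibonacci function here is just illustrative, not the exact code I used):

const express = require('express'); // assumes Express is installed: npm install express
const app = express();

// Naive iterative Fibonacci, just to give the server some work to do per request
function fibonacci(n) {
  let a = 0, b = 1;
  for (let i = 0; i < n; i++) {
    [a, b] = [b, a + b];
  }
  return a;
}

// Handles GET http://localhost:3000?number=20
app.get('/', (req, res) => {
  const n = parseInt(req.query.number, 10) || 0;
  res.json({ number: n, fibonacci: fibonacci(n) });
});

app.listen(3000, () => console.log('Listening on port 3000'));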

Now, let’s formulate our loadtest command. We want something like this:

loadtest -n 1000 -c 100 --rps 200 http://localhost:3000?number=20

Let’s break that down…

  • loadtest — this is just to say that we are running a loadtest-specific command and want to utilize that globally-installed package.
  • -n 1000 — this is saying that we want to send a total of 1,000 requests to the specified endpoint.
  • -c 100 — this is saying that we want 100 concurrent clients, i.e. of the 1,000 total requests, up to 100 can be in flight at the same time.
  • --rps 200 — this is saying that we want to send at most 200 requests per second. Theoretically, then, we should expect all 1,000 requests to be finished in approximately 5 seconds.
  • http://localhost:3000?number=20 — this is just the endpoint we want to send the requests to. You can substitute it for any endpoint you wish to test.

Let’s see what happens when we run that command on our machine…

loadtest on-screen results

Notice how loadtest prints a useful report after sending all the requests to the endpoint. This report indicates key items such as how many errors occurred with the requests (if any), the total time all requests took to complete, the requests per second, the mean latency experienced and also the longest time a single request took to complete.

This accomplishes two things: not only do you get a way to send concurrent requests to an endpoint, you also gain useful information about how the endpoint performs and where it can be improved. This eliminates the need for testers to manually attempt to send concurrent requests to endpoints. It’s a tool that can be used by developers during unit testing, and also down the line by the testing team during their regression tests. Overall, I think it’s a useful weapon to have in your team’s arsenal, especially if you’re serious about building robust, efficient APIs in today’s day and age.
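And because loadtest exposes a programmatic API in addition to the CLI, developers can fold it straight into their own test scripts. A rough sketch, roughly equivalent to the command above (double-check the option names against the docs):

const loadtest = require('loadtest');

const options = {
  url: 'http://localhost:3000?number=20',
  maxRequests: 1000,      // same idea as -n 1000
  concurrency: 100,       // same idea as -c 100
  requestsPerSecond: 200, // same idea as --rps 200
};

loadtest.loadTest(options, (error, result) => {
  if (error) {
    return console.error('Load test failed:', error);
  }
  // result contains totals, error counts, requests per second and latency figures
  console.log('Load test finished:', result);
});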

While we only tested a GET request, loadtest offers multiple ways to test both PATCH and POST requests, the latter of which is more common. You can send a JSON payload as a string in the command, or even have loadtest read the payload from a file. For more information, check out the official loadtest docs on npmjs. If loadtest proves useful, but you still want to try out artillery, here are the docs.
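For instance, a POST run with an inline JSON body might look something like this (the -T and -P flags are from memory, so verify them against those docs), with a similar flag available for reading the payload from a file instead:

loadtest -n 1000 -c 100 --rps 200 -T application/json -P '{"number": 20}' http://localhost:3000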

Happy Testing!
