Locust - Code construct based Load Testing
Locust is an open-source load testing tool driven by Python code rather than XML or a UI.
In this blog post, we will see how to use Locust to perform load testing on our services, websites, etc.
Locust lets us define user behavior in plain Python code rather than getting tangled in messy UIs and bloated XML configurations.
What we will cover in this blog post
- Set up Locust.
- Simulate a basic load test on a simple HTTP service.
Pre-requisites
- Python (+ pip for installing locust package)
- Any HTTP service to run the load test against. (We will create a small Spring Boot HTTP service as a demo in this blog post.)
Demo Spring Boot HTTP Service
We have created a simple REST API with Spring Boot, shown below. It exposes a single endpoint, /greet, which does no CPU- or I/O-intensive work and simply returns a string. The service runs locally on port 8080, and this is what we will load test.
Sample Spring Boot HTTP Code
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class SpringBootLoadServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBootLoadServiceApplication.class, args);
    }
}

@RestController
class RandomResource {

    @GetMapping(value = "/greet")
    public String greet() {
        return "Locust Load Test.";
    }
}
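To bring the service up locally, a quick way (assuming a standard Maven-based Spring Boot project; adjust for Gradle or an IDE run) is the Spring Boot Maven plugin, followed by a curl sanity check of the endpoint:

# Start the demo service on port 8080 (assumes a Maven project with the Spring Boot plugin)
mvn spring-boot:run

# Sanity-check the endpoint before load testing
curl http://localhost:8080/greet
# Expected response: Locust Load Test.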
Locust Setup
As usual, we will create and activate a virtual environment to avoid package conflicts:
python -m venv virtualenv
Activate the virtual environment (the path differs by platform):
virtualenv\Scripts\activate          # Windows
source virtualenv/bin/activate       # Linux/macOS
Once the virtual environment is set up, we will install the only package required, locust, using pip:
pip install locust
Once Locust is installed, verify the installation by checking that the command below prints the version:
locust -V
If it shows the version as below, we are good to start load testing with Locust.
Set Up the Locust Script File
Now we will write a Python script (springboot_locust.py) to simulate the load test, which will hit our local endpoint as shown below.
Code Snippet
# Import the required classes and functions from the locust package.
from locust import HttpUser, task, between


class SpringBootUser(HttpUser):
    """Simulates a single real-world user. We extend HttpUser and define the tasks this user performs."""

    # Optional: each simulated user waits a random 1-3 seconds between task executions.
    # This has little effect here since we define only one @task.
    wait_time = between(1, 3)

    # @task is the core of Locust: each task here maps to a single HTTP request.
    # We can define multiple @task methods; below we hit the /greet endpoint on our localhost service.
    @task
    def hello_world(self):
        self.client.get("/greet")
I have put comments in the code to make it self-explanatory. We are defining just one task; a task represents a single action executed by a simulated user, which Locust runs as a lightweight micro-thread. We can also define multiple tasks to simulate a flow, with weights controlling how often each task is picked; otherwise, tasks are picked at random, as sketched below.
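As a hedged illustration of multiple weighted tasks (this is not part of the demo above; the second endpoint is made up for the example), a locustfile could look like this:

from locust import HttpUser, task, between


class MultiTaskUser(HttpUser):
    wait_time = between(1, 3)

    # Picked roughly three times as often as the task with weight 1.
    @task(3)
    def greet(self):
        self.client.get("/greet")

    # Hypothetical second endpoint, included only to illustrate weighting.
    @task(1)
    def health(self):
        self.client.get("/actuator/health")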
Now we will start the Locust test by executing the command below, which launches the Locust web UI:
locust -f springboot_locust.py
It will start the Locust web UI on port 8089 by default and show the screen below:
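Since our SpringBootUser class does not set a host, we enter http://localhost:8080 in the Host field of the web UI, along with the number of users and the spawn rate. Optionally (a small tweak, not part of the script above), the host can be baked into the locustfile itself, or passed on the command line via the --host flag:

from locust import HttpUser, task, between


class SpringBootUser(HttpUser):
    # Optional: pre-fill the target host so it doesn't have to be typed into the web UI.
    host = "http://localhost:8080"
    wait_time = between(1, 3)

    @task
    def hello_world(self):
        self.client.get("/greet")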
NOTE: We won't see much of a spike, as this service doesn't do any I/O- or CPU-intensive work. The idea is just to see a small spike to verify load testing against our service. However, we could add an I/O component, such as an HTTP or database call, depending on the response to this article :)
Perform Load Test
We will perform two tests: one with 100 users and the other with 1000 users, spawning 5 and 20 users per second respectively until the target of 100 or 1000 users is reached.
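As a side note, the same runs can also be reproduced without the web UI using Locust's headless mode; the commands below are just a sketch with standard Locust flags (the run times are arbitrary), while the rest of this post uses the web UI flow:

# 100 users, spawning 5 per second, running for 2 minutes without the web UI
locust -f springboot_locust.py --headless -u 100 -r 5 --run-time 2m --host http://localhost:8080

# 1000 users, spawning 20 per second
locust -f springboot_locust.py --headless -u 1000 -r 20 --run-time 5m --host http://localhost:8080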
Load of 100 users with 5 users spawned per Second
We will click Start Swarming after filling in the values below. Locust scales up to 100 users, with each user sending requests to the API. From the charts below, we can see that there is no real difference as load increases up to 100 users: the median response time stayed at 2ms while the 95th percentile response time was around 3ms. So the service did not show any spike while serving requests from 100 users.
There are different tabs per test, shown below, that display real-time results; most are self-explanatory. Statistics, Charts (real-time results), and Download Data (for sharing results after the test) are the most used.
Load of 1000 users with 20 users spawned per Second
We will now simulate the 1000-user test, spawning 20 users per second to get proper charts for the blog and see how the service reacts:
We can now see 1000 users hitting our API:
We will let the service stay under stress for a couple of minutes, until it scales up to 1000 users, before looking at the charts below.
From the charts below, we can see how Locust captures response times as the load on the service increases, with users (and the requests they fire) ramping up. We see an increase in both the median and the 95th percentile response times as the load on the API grows.
The bullet points below show how response times increased as the users scaled up:
- Up to about 400 users, the median response time was 2ms and the 95th percentile was around 5ms.
- Between roughly 450 and 675 users, the median response time was 3ms and the 95th percentile was around 7ms.
- Between roughly 750 and 1000 users, the median response time was 4-5ms and the 95th percentile was around 10-12ms.
So we can see how Locust provides real-time charts capturing response times for services, which behave differently depending on the load generated. Obviously, the above service is a dummy one, but in a real-world scenario this helps identify services that behave abnormally under load, which can then be handed to profiling tools for further analysis based on the Locust reports.
The report for the same load test can be found here.
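As a side note, and assuming a reasonably recent Locust version, the Download Data tab is not the only way to produce such a report; Locust can also write an HTML report and CSV stats files directly from the command line:

# Write an HTML report plus CSV stats files for sharing after the test
locust -f springboot_locust.py --headless -u 1000 -r 20 --run-time 5m \
       --host http://localhost:8080 --html report.html --csv springboot_stats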
Obviously, this is just a basic use case, but Locust can do much more than this. The above can help you start with basic local load testing before doing it in an integrated environment.
Apart from this, a lot of customization is available, as covered in the documentation linked below.
Resources
Thank you for reading! If you have made it this far, please like the article; it will encourage me to write more such articles. Do share your valuable suggestions; I appreciate your honest feedback!