citizenM Case Study
Launched — 2022
Performance testing the complete mobile customer journey of a leading hotel group
spriteCloud was contacted to conduct performance, load and stress tests for the complete mobile customer journey of citizenM. Since its inception, citizenM’s goal has been to streamline the hotel experience by enabling a high level of self-service through technology. The hotel chain offers a mobile application that handles most of the actions customers undertake during a stay at citizenM, including booking a room, checking in, using room controls (such as the lights and media) and checking out.
As the mobile app’s functionality evolved with new features, such as lighting and media controls, its performance and stability became increasingly important. citizenM asked spriteCloud to determine the performance and the limits of the complete production landscape for its mobile app.
“…To ensure the smooth operation of all systems, we decided to stress and load test our digital and on-premise infrastructure. We took relevant guest journeys and tried to simulate 100% and 200% occupancy. To do this properly you need a good engineering partner like spriteCloud to be able to model this….”
Director Technology & Digital
The challenging part of this project was that the final test execution happened in real time: real hotel rooms were used, leaving no room for error. The team spent several weeks preparing the scripts on an acceptance environment before running them on the live production environment.
In this case study, we explain how spriteCloud supported citizenM’s test goals by exploring the importance of performance for the client, the strategy used on this project and the approach to the testing scenarios; finally, we discuss the results.
Identifying the impact of performance
The citizenM mobile application is one of the core control components for hotel guests. Since guests manage bookings and room controls directly through the app, good response times and performance are crucial. When guests change the lights, for example, the response should be immediate: there should be no noticeable or annoying lag, whether 10 or 100 guests are using the lights.
With this in mind, how does the system respond when all guests actively use the light controls at the same time? And what response time do we then need to account for? Those are the types of questions we set out to answer for citizenM. Another key metric for identifying the impact of performance was the response of the app under the highest possible usage of the booking flow. In short, could the system handle a situation in which all available rooms were booked and every guest used the app to create a booking? At what ‘booking rate’ would the response still be acceptable?
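The occupancy-based load targets behind these questions come down to simple arithmetic: a room count and a target occupancy imply a number of bookings, and the test window implies a sustained booking rate. The sketch below illustrates that model; the room counts and window are hypothetical figures, not citizenM’s actual numbers.

```python
# Hypothetical load model: translate hotel occupancy into load-test targets.
# All concrete numbers below are illustrative assumptions.

def booking_load(rooms: int, occupancy: float, window_minutes: int) -> dict:
    """Given a room count and a target occupancy (1.0 = 100%, 2.0 = 200%),
    compute how many bookings the test must complete within the window
    and the sustained booking rate that implies."""
    bookings = round(rooms * occupancy)
    return {
        "bookings": bookings,
        "rate_per_minute": bookings / window_minutes,
    }

# Example: 200 rooms at 200% occupancy (every room booked twice over),
# spread across a 60-minute test window.
targets = booking_load(rooms=200, occupancy=2.0, window_minutes=60)
print(targets)
```

Varying the `occupancy` parameter between 1.0 and 2.0 reproduces the 100% and 200% scenarios mentioned in the quote above; the "acceptable booking rate" question then becomes: at which `rate_per_minute` do response times cross the agreed threshold?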
spriteCloud put a lot of thought and effort into the process before the start of the engagement to help citizenM create a clear plan. During this initial phase the collaboration with all the third parties involved was smooth, giving citizenM full confidence that the performance test of their application was in safe hands.
Images of some of the room control options in the citizenM app.
Test strategy and tooling
spriteCloud worked together with the party developing the mobile application to determine the required load and the load models; that party monitored the server statistics during load execution. The middleware party was also involved, which proved to be a great collaboration: they were able to provide the required details, grant access to the API endpoints and monitor performance in real time during the execution.
The load scenarios were highly complex: creating a booking, for example, required that the user did not already have an open booking, and checking into a room required that the room be available and cleaned. This resulted in elaborate scenarios that used various admin-level API rights to perform these checks and set rooms and statuses to the desired state when needed.
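A minimal sketch of the kind of precondition checks those scenarios required is shown below. The state names, flow and the admin-level reset are assumptions for illustration, not citizenM’s actual API.

```python
# Hypothetical guest/room state checks mirroring the scenario preconditions:
# a booking requires no open booking; check-in requires a cleaned, free room.

class ScenarioError(Exception):
    pass

class Room:
    def __init__(self, number: int):
        self.number = number
        self.occupied = False
        self.cleaned = True

class Guest:
    def __init__(self, name: str):
        self.name = name
        self.open_booking = None  # room number of the current booking, if any

def create_booking(guest: Guest, room: Room) -> None:
    if guest.open_booking is not None:
        raise ScenarioError(f"{guest.name} already has an open booking")
    guest.open_booking = room.number

def check_in(guest: Guest, room: Room) -> None:
    if guest.open_booking != room.number:
        raise ScenarioError("no booking for this room")
    if room.occupied or not room.cleaned:
        raise ScenarioError(f"room {room.number} is not ready")
    room.occupied = True
    room.cleaned = False

def check_out(guest: Guest, room: Room) -> None:
    room.occupied = False  # room must be cleaned before the next check-in
    guest.open_booking = None

def reset_room(room: Room) -> None:
    """Admin-level reset used between test iterations."""
    room.occupied = False
    room.cleaned = True
```

In the real scenarios these checks and resets ran against API endpoints rather than in-memory objects, but the shape of the logic, verify preconditions, run the flow, reset state, is the same.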
In communication with citizenM, spriteCloud proposed several test rounds. The first group of tests ran on the acceptance environment. After tuning and improving the scripts, light, isolated tests were executed on specific rooms in the production environment. Finally, over two days, the tests were executed at one of the hotels, where multiple floors were left completely empty and dedicated to this purpose.
In a nutshell, spriteCloud’s testing strategy was as follows:
Create the script and the user flows on the acceptance environment.
Start testing on the acceptance environment (including booking, check-in, room control, check-out, and cleanup).
Tune and clean the scripts where needed until we were confident that the tests could run on a live environment.
Execute light, isolated tests in a handful of rooms in the live environment. During these tests, spriteCloud staff were at the hotel to verify that all the controls worked as expected.
Schedule the days to run the full official tests on production in one of the hotels.
Run the tests using all rooms on multiple floors of the hotel. These tests included:
all the relevant control elements of the mobile application,
a ‘disco’ test: running a stress test on the light controls of all the rooms,
and a stress test related to the booking, check-in, and check-out flow.
Analyse and deliver a final test report.
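A stress test like the ‘disco’ test is typically driven by a staged ramp profile: build up load, hold it at peak, then ramp down. The sketch below illustrates such a profile; the stage durations and room counts are hypothetical, not the actual test configuration.

```python
# Hypothetical staged ramp for a light-control ("disco") stress test:
# ramp up the number of rooms toggling lights, hold at peak, ramp down.

STAGES = [
    {"duration_s": 120, "target_rooms": 50},   # warm-up
    {"duration_s": 300, "target_rooms": 200},  # ramp to all rooms
    {"duration_s": 600, "target_rooms": 200},  # sustained peak
    {"duration_s": 120, "target_rooms": 0},    # ramp down
]

def rooms_active(elapsed_s: int) -> int:
    """Linearly interpolate how many rooms are active at a given time."""
    stage_start, prev_target = 0, 0
    for stage in STAGES:
        if elapsed_s < stage_start + stage["duration_s"]:
            frac = (elapsed_s - stage_start) / stage["duration_s"]
            return round(prev_target + frac * (stage["target_rooms"] - prev_target))
        stage_start += stage["duration_s"]
        prev_target = stage["target_rooms"]
    return prev_target  # after the last stage

# Example: halfway through the warm-up stage, 25 rooms are active.
print(rooms_active(60))
```

Load tools such as k6 or Locust express ramp profiles in a similar stages-based form; the point of the model is that server metrics can be correlated with a known, reproducible load level at every moment of the test.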