What we learned from testing hotel websites on real guests

Written by Lily McIlwain | Aug 10, 2018 9:00:00 AM

At Triptease, one of our core values centres on continuous learning. Our business exists to help hotels compete in a world of OTAs that have access to first-class data analysis and enormous data scale. To compete in that world, you have to put testing at the heart of everything you do.

Our advantage is that we can test conversion optimization hypotheses across millions of guests quickly. While we can't share the results of all of these tests, this article details one recent example of how we combined qualitative and quantitative testing to significantly improve the performance of our industry-leading Price Check.

 


Why perform user testing?

In the words of the great Mark Twain: “supposing is good, but finding out is better.” It’s easy to forget sometimes, but we are not our users. We can make assumptions about how they think, how they feel, and how they react to our products, but unless we take steps to verify those assumptions, it would be foolhardy at best to base strategic decisions about the direction of our product on them.

Furthermore, while it’s no secret that we’re a data-driven company, quantitative data can only tell us so much. It’s crucial to combine what we learn from our raw product usage data with qualitative feedback from the people we’re actually designing for. It’s only through this approach that you can achieve the level of empathy with your end user that will enable you to design a product that truly fits their needs.

 

How we tested

Our aim with this specific user testing session was to collect feedback on the existing design of Price Check and to test out a new design. The session was scenario-based: we asked participants to imagine that they had booked a holiday and were now at the stage where they wanted to book a room at a particular hotel. The website for the hotel in question featured Price Check on the booking engine, but we did not mention this in advance. Our subjects did not know which area of the website was under observation or what they were supposed to be testing; we simply asked them to perform the task of booking a room as they would normally.

The 'old' and 'new' versions of Price Check being tested

 

What we found

A resounding learning for us was that we should make clear on the Price Check pop-up itself that the prices displayed are based on the guest’s exact search parameters. On the existing Price Check design, the search parameters were not replicated - leading some users to question whether they were seeing live prices or pre-filled ones. Given that our 99.5% pricing accuracy is such a core element of our proposition, we knew that addressing this misunderstanding was a priority.

When we tested our updated Price Check design featuring clear search parameters, 100% of our test subjects said that they understood the comparison was based on their search.

We also found that some users were confused by the ‘No availability’ text if there was no corresponding rate on an OTA. After trialling several options, our current iteration is now ‘No rooms’ with grayed-out text to indicate that the user won’t find results for their search on that particular OTA. When testing the new design, all those taking part indicated that they understood the new ‘No rooms’ copy.

 

The impact of the change

Having listened to real-life users and implemented their feedback, it was time to see if Price Check’s performance would improve as a result.

In the six weeks prior to making changes to Price Check’s design, the average conversion rate for sessions in which Price Check was displayed was around 3.35%. In the six weeks since the changes were implemented, the session conversion rate has averaged out at 3.9%. That’s a conversion rate increase of 16%.
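For reference, the 16% figure is the relative change between the two measured session conversion rates. A quick check of the arithmetic (variable names here are illustrative):

```python
# Relative uplift between the two measured conversion rates
before, after = 0.0335, 0.039
uplift = (after - before) / before
print(f"Relative uplift: {uplift:.0%}")  # → Relative uplift: 16%
```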

This increase was monitored across around 53,000 converted sessions. The change was not rolled out to our entire client base: a control version running on a proportion of our client websites over the same period saw no significant uplift.
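As a rough sketch of how one might gauge whether an uplift of this size could plausibly be down to chance, a two-proportion z-test can be run on the before/after session counts. The counts below are purely illustrative assumptions - the article reports rates and roughly 53,000 converted sessions, not the underlying session totals:

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical session totals chosen so the rates match the article
# (~3.35% before, ~3.9% after); the real denominators are not published.
z, p = two_proportion_ztest(conv_a=23_450, n_a=700_000,
                            conv_b=27_300, n_b=700_000)
print(f"z = {z:.1f}, p = {p:.2g}")
```

At sample sizes in the hundreds of thousands of sessions, even a half-percentage-point difference in rate produces a very large z-statistic - though, as noted below, a sequential before/after comparison is still not a substitute for a true randomized A/B test.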

Now, we know better than to claim this as a victory and leave it at that. We’re the first to admit that this doesn’t count as a true A/B test: comparing the performance of two designs over different time periods is not the same as comparing them simultaneously, under the same conditions, with the same audience. However, this result is a strong indicator of the new design’s success - and it backs up the qualitative data gathered in our user testing.

Like our entire platform, Price Check is a constantly evolving entity - and the learning and improvement doesn't stop here. Find out today how Triptease can work with your hotel to improve website conversion rates.

 

This article was edited to rectify a misprint at 3pm on Wednesday 8th August.

 

About The Author

Lily is Lead Product Marketing Manager at Triptease. When she's not investigating the industry or spreading the word that #DirectIsBest, she enjoys music, cycling, and obscure radio quiz shows.