A new name won’t fix your product

Why product names matter but user experience and understanding should always come first.

Jade Goldsmith
Booking.com — UX Writing
8 min read · May 27, 2020


Last year my team and I began renaming a product called Booking.basic. As the UX Writer, I took charge. We wanted to rename the product because we felt the name no longer represented the offer. Within a month or so I came to the conclusion that a new name would not fix our product and that we had more work ahead of us.

What is Booking.basic?

As part of our commitment to deliver the best possible experience and prices to customers, Booking.com sometimes independently sources rates with stricter booking conditions from third-party providers.

We display these rates as Booking.basic. While the name seemed appropriate when we first launched, it didn’t consider new features or improvements that we’d develop over the next two years.

In early 2019, the designer, researcher and I started looking into ways to scale. We wanted to offer rates at a variety of price points, rates at more luxurious properties and multiple rates at the same property. We quickly found that the name was confusing. People couldn’t understand what was basic about Booking.basic.

First steps

We spent early 2019 collecting data and understanding what was causing the confusion. I ran workshops to create a messaging architecture, develop a value proposition and brainstorm new names.

Next, we turned to user testing to see how people would respond. We found that a new name wouldn’t be enough to eliminate the confusion. We’d need to do more to improve understanding.

A year’s worth of research, user testing and experimentation

Round 1 | Testing a new name and tagline

From the workshops and user insights, the designer and I decided to change the name Booking.basic to Smart Price and add the tagline "The budget-friendly choice". We thought the tagline would be the smallest step towards improving understanding.

We found that people preferred Smart Price to Booking.basic, but they didn’t understand why the price was smart. Many users thought it was a recommendation from Booking.com, and no one recognised that the inventory came from a third party or that the conditions were stricter.

We changed the badge to Smart price and added The budget-friendly choice tagline

Round 2 | Scaling to more rates

I knew that Smart Price was an attention grabber, but I wondered if it was grabbing attention for the right reasons. Would it make sense if we displayed prices at more luxurious properties or for multiple rooms? I suggested re-running the test with two Smart Price rooms at the same property.

As I suspected, people got confused when we showed two rooms with the Smart Price badge. They couldn’t understand why the rooms were cheaper than the others when most of the facilities were similar.

Two third-party rates at the same property with the Smart Price badge

Round 3 | Defining the value proposition

Knowing that the Smart Price badge was a good attention grabber, I decided to zoom in on the taglines.

I chose the following options based on the earlier survey results. We knew from the surveys that budget was most important to users, so I tried to focus on budget in three different ways.

  • Budget-friendly: I focused the copy around the business value. I indicated in the tooltip that these were always the most affordable options but did not explain why.
  • Explore more for less: I tried inspirational copy to see if I could pique curiosity. In the tooltip, I focused on the value for the customer and talked about what they could do with the money they’d save.
  • Always a great price: I used reassurance to help customers understand that these rooms were always the cheapest. In the tooltip, I explained why.

The third option was the most effective: people understood why the rooms were cheaper and sought more details about the booking conditions than with the other two options.

I used the budget theme to craft three different taglines and descriptions in the tooltips

Round 4 | Drawing our first conclusions

The designer, researcher and I consolidated our findings and launched another test. We explained that the rooms were from partner providers and that they were budget-friendly. To further clarify, we added a Learn more call-to-action and a post-selection summary of the booking conditions. Finally, we started to see changes in behaviour and drew two conclusions:

1) People want more information but don’t want to work hard to find it.

2) People make assumptions if they cannot immediately find what they need.

One of the options we tested after compiling our findings from rounds 1 through 3

These findings led us to shift our focus from a new name to the booking conditions. We concluded that if we used an attention-grabbing name without being explicit about the booking conditions, we’d risk the wrong customers making bookings.

Customers would book these rates without understanding that they couldn’t get an invoice, had to wait for up to 24 hours for confirmation and would not receive full customer support. They’d be disappointed, attempt to cancel and be unlikely to book with us again or recommend us to others.

Round 5 | Making the conditions easy to find

Badge placement

We saw that if we put the badge in the price column, people found it more easily and associated it with the price rather than the room or property facilities.

Booking conditions entry points

The booking conditions entry points weren’t as clear-cut. People had all sorts of different preferences. To reach as many people as possible, we showed the booking conditions in four spots. They’d see a summary in the Your choices column, in the final column after selection, and the full conditions if they clicked on Learn more or the Smart Price badge.

We tested a few different placements for the badge and booking conditions

Rewriting the booking conditions

In round 5, we made it easier for customers to find the booking conditions. When we did this, we noticed that people did not always understand the conditions completely. We needed to make the copy as clear as possible before we continued.

To do this, I took the copy out of the prototype and tested it on its own.

The user researcher and I created a survey and sent it to 300 English speakers. For each booking condition, I wrote a question describing it and offered four different phrases. Respondents had to identify which of the four phrases best described the condition in question. We identified the clearest option for each condition and used the winning phrases in round 6.
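Tallying a comprehension test like this comes down to counting which phrase wins for each condition. A minimal sketch of that tally (the condition names and responses below are hypothetical, not real survey data):

```python
from collections import Counter

# Hypothetical responses: for each booking condition, the phrase
# (A-D) that each respondent picked as the best description.
responses = {
    "no_invoice": ["A", "A", "C", "A", "B", "A"],
    "confirmation_delay": ["B", "B", "B", "D", "B", "A"],
}

def winning_phrase(picks):
    """Return the phrase chosen most often and its share of all picks."""
    counts = Counter(picks)
    phrase, votes = counts.most_common(1)[0]
    return phrase, votes / len(picks)

for condition, picks in responses.items():
    phrase, share = winning_phrase(picks)
    print(f"{condition}: option {phrase} ({share:.0%} of respondents)")
```

In practice you’d also want to check that the winning share is comfortably above chance (25% with four options) before declaring a phrase the clearest.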

Round 6 | Back to the name

Our original goal was to rename Booking.basic. We still needed to do this to make the product easy to understand and attract the right customers.

During round 6, we took our findings from round 5 and tested the new booking conditions copy and placements with six different names from our brainstorm.

In the table below, you’ll find there was no clear winning name. In all options, people could more easily find and understand the conditions compared with the original, but we still weren’t sure which option was best. Each had its perks.

A summary of the major findings

A/B test

To complement all the qualitative findings so far, we decided we were ready to A/B test with real customers. We chose three names, all with the same display: Booking.basic, Smart Price and Marketplace.

Why these names?

First, we needed to see how our original name, Booking.basic, would perform with the new booking conditions copy and design changes. Perhaps this would be enough to help our customers understand the product?

Second, Smart Price and similar names that implied a discount often led people to ignore the booking conditions in user testing. We needed to test this hypothesis quantitatively.

Third, whenever we mentioned a third-party we knew people had a clearer understanding of the booking conditions. After testing a few different ideas in round 6, we settled on Marketplace.

A/B test results

We ran the A/B test for four weeks alongside a qualitative survey on the confirmation page.

To our surprise, Marketplace was the clearest. Customers understood why the rates were cheaper, where the rates came from and why the booking conditions were stricter.

In all three versions, customers understood the booking conditions significantly better than with the original. We also saw a drop in customer service tickets, supporting this improvement.

The three variants compared with Booking.basic base
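Differences like these between a variant and the base can be checked with a standard two-proportion z-test. A minimal sketch using only the standard library, with hypothetical counts of survey respondents who correctly described the booking conditions (the numbers are illustrative, not the real results):

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 312 of 400 variant respondents vs. 248 of 400
# base respondents correctly described the booking conditions.
z, p = two_proportion_z(312, 400, 248, 400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value here would mean the improvement in understanding is unlikely to be noise, which is what lets a team call a result "significant" rather than just larger.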


After carefully analysing the results and looking at the overall user experience, we decided to keep the Booking.basic name for the time being. We changed the booking conditions copy, entry points and badge placement as a first step. You can see the changes we made in the last image above.

Next, we updated the booking conditions copy, entry points and badge placement on apps, mobile platforms and in the post-booking experience to make sure our customers would have a cohesive experience no matter where they make a booking.

Now, we are looking into new ways to make the third-party inventory product even better for our customers. Our research has shown that a great price is very important, but the strict booking conditions make the product less appealing. We are exploring ways to remove some of these conditions. That could mean the product name is no longer relevant, so we are also exploring a no-name approach.

Key takeaways

1. A new name won’t fix your product

Always make sure you understand your user problems before coming up with a new name. Start by asking people what’s wrong and observing where things go off course. Only then can you address potential product improvements and name options.

2. Creating a good user experience requires a team

I worked closely with the designer and researcher during every step of development. We met daily to talk things over, watch user testing videos and plan our next steps. We shared our suggestions and observations regularly with our product manager to make sure we were all aligned. It’s important to work as a team to craft good user experiences.

3. Data is your friend

We ran brainstorms, sent surveys, used UserTesting.com, conducted copy research, ran A/B tests, did a post-booking survey and did a post-analysis of customer service tickets to confirm our results. Having both qualitative and quantitative data was critical in strengthening our reasoning and making sure our findings were valid.

This process may sound time-consuming, and it was, but in the end, we were able to make relatively small copy and design changes that had a large impact. Now, we are more transparent with our customers. Customers are better able to find and understand this product when they are comparing their options. Not only do they get a better experience, but they can be more confident in their selection and we can be more confident that customers are making bookings that suit their needs.

We’re always on the hunt for new writing talent. Wanna join us? Apply here.

Special thanks to UX designers Hadar Yonna and Guilherme Cerqueira, user researcher Andrew Briffa and product manager Christos Tsifopoulos.


