Choosing a price for your webapp or startup using multivariate testing
It seems that one of the more difficult questions facing startup founders is how to choose a price. Without a physical product with a fixed unit cost, it can be hard to decide how much to charge, especially at the beginning. There has been a fair amount of discussion in various blog posts (and a free eBook) on the topic, but it often ends with an educated guess – what you would pay for it yourself.
When we first launched our server monitoring service, Server Density, we were in the same position and so decided to base our pricing on FogBugz On Demand. This is a hosted version of Fog Creek’s well known bug tracking service, charged at $25 USD per user per month. That’s about £16 GBP, so we went for a nice round number: £10 per server per month. During our beta, users indicated this would be a fair price when we asked them about it. And that’s what it’s been ever since…until now.
Competing with free
We built Server Density out of our own frustration with existing tools. Sure, they are very flexible and free (open source) to use, but the real cost comes from the complexity of setting things up and the ongoing maintenance. For a small company (or even a medium-sized one), that is overkill. So we positioned Server Density as a very easy way to quickly get server performance monitoring and alerting set up.
The problem is that you’re competing with free products, and regardless of the time/cost savings involved, there’s always the perception of “well, I could just download it for free”. This isn’t the place for a discussion of free (open source) vs commercial, but we had to try to differentiate the product in ways other than “ease of use”. Our customers get that once they’ve signed up; the hurdle is getting that signup in the first place.
The existence of free alternatives is important because it makes a difference to what you can charge. When we surveyed our customers, one of the biggest complaints was the price. People loved the service but thought it was too expensive. Since then we’ve made improvements to try to reach “must have” status, but we also thought we’d try a pricing experiment.
A/B testing pricing
Unlike at the beginning, we now know how much each customer costs us and how much revenue they generate per month on average. Using this data, we were able to pick two new (lower) price points to test, based on the assumption that £10 per server per month was too expensive. So from 19th Dec 2009 we ran a test on our website that randomly assigned a price to each user as they landed on the site, and measured both signups and subsequent conversions.
This was a very simple multivariate test executed using Google Website Optimizer, picking from three price options:
- £10 GBP per server per month (original)
- £7 GBP per server per month (combination 1)
- £5 GBP per server per month (combination 2)
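Google Website Optimizer handled the variant assignment for us, but the underlying idea can be sketched as deterministic bucketing. This is a hypothetical illustration, not GWO’s actual mechanism; the visitor id is assumed to come from something like a first-party cookie:

```python
import hashlib

# Hypothetical variant table mirroring the three price points tested.
PRICES = {"original": 10, "combination1": 7, "combination2": 5}


def assign_price(visitor_id: str) -> tuple[str, int]:
    """Deterministically bucket a visitor into one price variant.

    Hashing the visitor id means the same visitor always sees the
    same price on return visits, while distinct visitors spread
    roughly evenly across the variants.
    """
    variants = sorted(PRICES)  # stable ordering of variant names
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    variant = variants[bucket]
    return variant, PRICES[variant]
```

Because the bucket is derived from the id rather than stored server-side, repeat visits are consistent without any per-visitor state.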
The results – signups
We needed to run the test for at least a month so we could go through an entire signup/free trial cycle. Our free trials last 30 days, but we ended up running it for over two months – 19th Dec to 3rd Mar. We had a pretty good idea what the right choice was near the end, but it was interesting to watch the data come in and run through several trial cycles.
Based on signups, the best price was £7, showing 13.1% more signups than £10 and performing better than £5. This is interesting because £5 is obviously cheaper, yet it showed only an 11.7% improvement – so it was very close. The chart below shows the number of signups and the variations over the experiment.
The results – conversions
Signups are important because as mentioned above, we have found that once users get into using the app they find it extremely useful and are more likely to put their pricing objections aside. However, even more important are the conversion figures because they translate directly to revenue.
As expected, the lower pricing points showed more conversions but that does not necessarily mean higher revenues. Our average customer monitors more than 1 server so the real test was whether the lower price point resulted in an increase in the number of servers monitored.
It did. Rather than an increase in conversions but a decrease in revenue because of the lower price, we in fact saw an increase in conversions, an increase in the number of servers monitored, and an increase in revenue.
- At the £7 price point compared to the £10 price point we had 20% more conversions and a 78% increase in the number of servers. This translated to a 25% increase in revenue.
- At the £5 price point compared to the £10 price point we had a 70% increase in conversions and 161% increase in the number of servers. This translated to a 30% increase in revenue.
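Since monthly revenue is just (servers monitored) × (price per server), the revenue changes above can be sanity-checked directly from the server-count changes. A minimal sketch of that arithmetic, using the figures from the bullets (small differences from the quoted 25%/30% are rounding):

```python
def relative_revenue(price: float, server_increase: float,
                     baseline_price: float = 10.0) -> float:
    """Revenue relative to the £10 baseline.

    Revenue scales as servers * price, so the relative change is the
    server-count ratio multiplied by the price ratio.
    """
    return (1 + server_increase) * (price / baseline_price)


# Figures from the experiment: +78% servers at £7, +161% servers at £5.
for price, server_increase in [(7.0, 0.78), (5.0, 1.61)]:
    change = relative_revenue(price, server_increase) - 1
    print(f"£{price:g}/server/month: revenue {change:+.1%} vs £10")
```

At £7 that gives 1.78 × 0.7 ≈ 1.25 (a ~25% revenue increase), and at £5 it gives 2.61 × 0.5 ≈ 1.30 (~30%), matching the reported results.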
So the clear winner is £5, right? Not necessarily. It is clear that lowering our pricing is the right move, but the revenue difference between £7 and £5 is only 5%. Given the tiny difference in signups (even with more conversions), it makes little difference choosing £5 over £7. Further, setting a lower list price means we would have to adjust our reseller pricing, which is where our largest revenues come from. Decreasing the price from £10 to £7 makes the most sense right now because it gives us both the revenue increase from direct signups and flexibility when negotiating reseller deals.
Impact on existing customers
We have, from today, reduced our pricing to £7 GBP (~$10 USD) for all customers. The figures make sense for new signups, but if you do this you have to be careful with existing revenues because they will be affected too (unless you’re evil and don’t make the change for existing customers). I ran our numbers through our forecasting spreadsheet to ensure this wouldn’t cause any cashflow problems, and the change made almost no real difference. I’m hoping to see some of our existing customers add a few more servers now that their monthly bills will be lower.
What about complaints during the experiment?
I thought existing customers might complain if they went to our website and saw a different price from the one they signed up at. That was not the case – none of our existing customers noticed! We had a couple of e-mails from new visitors whose colleagues were shown different prices while they were evaluating, but that did not affect sales in any way we can measure. Google Website Optimizer is pretty clever about how it remembers the pricing – it’s not just cookie based.
Choosing a price at the beginning is hard but once you have data you can quickly use it to make decisions about what the best price will be. In our case we were able to reduce the price to achieve a revenue increase. We no longer have an arbitrary price – it is based on our costs and clear statistics showing the effectiveness of each price point. Having real data also allows you to show you’re not reducing prices as a panicked reaction to slow signups. We are now confident we have the right pricing for direct signups and room to work with resellers.
Enjoy this post? You may also like Growing an ops team from 1 founder