VPlus+: An Enhanced Model for Price Optimization
by Paul M. Gurwitz, Ph.D., Managing Director, Renaissance Research & Consulting, Inc.
One of the most popular ways of finding the correct price point is the Van Westendorp model. This article explains this method and proposes a new approach to price optimization developed by Renaissance Research & Consulting: VPlus+.
What is the best price for a product or service? Two things are obvious about the answer to this question: 1) It’s very important, and 2) It’s not easy.
Correct pricing is a basic requirement for marketing success – if a product is too expensive (or, in some cases, too cheap), it doesn’t matter how good it may be in other respects. Its perceived quality, brand name, features, and everything else may be top-notch, but if prospective customers think the price isn’t right, they aren’t likely to buy. Another way of looking at the same issue is that the “correct” price is actually an evaluation of the item’s features – the better customers think it is, the more likely they are to pay for it.
But finding that “correct” price is not a trivial matter. A number of different approaches have evolved over the years, and each has its proponents. These include tradeoff or key-driver techniques, which treat price as just another attribute of the product whose relative importance is to be determined; techniques, such as Brand Equity, that focus on the relationship between price and one other variable and try to find the price point at which switching occurs; and direct questioning to measure key price points.
One of the most popular forms of the latter approach is the Van Westendorp model, named after the Dutch economist Peter Van Westendorp. It is based on establishing “limit” price points for a given item, plotting the cumulative frequency distributions of those limits, and so establishing an acceptable price range and an optimum price for the item.
Briefly, the Van Westendorp model works as follows:
- A prospect for a given product is presented with a series of discrete price levels for it (e.g., $10, $20, $30, etc.)
- The prospect is then asked to use these prices to identify four critical price points for the item:
- The highest price at which the product is too cheap, where quality is in question
- The highest price at which the product is a bargain, and the prospect would buy it without hesitation
- The price at which the item is getting expensive; here, the prospect would have to think about it, but would still consider it
- The price at which the item is too expensive, and is beyond consideration.
- The cumulative frequency distributions of the answers to each of these questions (and their inverses) are plotted on a graph whose x-axis is the price points, and whose y-axis is the cumulative percent identifying that price point.
- The intersections of these lines produce the critical price points for the item:
- The Indifference Price is the price at the intersection of the “Bargain” and “Expensive” lines. (This is often interpreted as the “normal” price.)
- The Optimal Price is the price at the intersection of the “Too Cheap” and “Too Expensive” lines – in other words, the price that minimizes rejection for either reason.
- The Lower Bound is where “Too Cheap” and “Not Cheap” cross; the Upper Bound is where “Too Expensive” and “Not Expensive” cross. They define the Acceptable Price Range.
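The mechanics of the plotting step can be sketched in a few lines of code. The figures below are invented for illustration (five hypothetical respondents on a $10–$60 price grid), not survey results; the crossings are found by linear interpolation between tested price levels.

```python
# Hypothetical answers: each row is one respondent's four price points ($):
# (too cheap, bargain, expensive, too expensive)
answers = [
    (10, 20, 30, 40),
    (10, 20, 40, 50),
    (20, 30, 40, 60),
    (10, 30, 50, 60),
    (10, 20, 40, 50),
]
prices = [10, 20, 30, 40, 50, 60]  # price levels shown to respondents
n = len(answers)

# Cumulative share of the sample naming each limit at each price level
too_cheap     = [sum(r[0] >= p for r in answers) / n for p in prices]  # falls as price rises
bargain       = [sum(r[1] >= p for r in answers) / n for p in prices]  # falls as price rises
expensive     = [sum(r[2] <= p for r in answers) / n for p in prices]  # rises with price
too_expensive = [sum(r[3] <= p for r in answers) / n for p in prices]  # rises with price

def crossing(prices, a, b):
    """Price at which curve a crosses curve b, linearly interpolated."""
    d = [ai - bi for ai, bi in zip(a, b)]
    for i in range(len(prices) - 1):
        if d[i] == 0:
            return float(prices[i])
        if d[i] * d[i + 1] < 0:
            t = d[i] / (d[i] - d[i + 1])
            return prices[i] + t * (prices[i + 1] - prices[i])
    return None

optimal_price      = crossing(prices, too_cheap, too_expensive)  # "Too Cheap" x "Too Expensive"
indifference_price = crossing(prices, bargain, expensive)        # "Bargain" x "Expensive"
```

With these made-up answers, the Optimal Price comes out at $30 and the Indifference Price at $32; the bound points are found the same way, using the inverse ("Not Cheap", "Not Expensive") curves.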
In principle, the Van Westendorp model is attractive: it is easy to administer, easy to analyze, and (usually) yields common-sense results; that’s why it’s so popular. Nonetheless, it has certain problems.
- The form of the “Expensive” question is ambiguous, and therefore hard for a respondent to answer. As posed, “expensive” is not a price point but a region defined by the other price points. The result is often a price tied with the other points, or even (if allowed) a nonsensical answer (e.g., an “expensive” price below the “bargain” price).
- The price points derived by the Van Westendorp model are optimized in terms of the number of customers responding at a given price point, but do not take the prices themselves into account. In other words, it assumes the optimal price is the one that attracts the most customers (or at least drives the fewest away), without taking into account that, at a given cost, one makes more money from the same number of customers at a higher price – and that therefore the optimal price may not necessarily be the one that yields the highest market share.
VPLUS+sm: A New Approach to Price Optimization
In order to deal with the problems the Van Westendorp approach poses, we propose the following methodology, which we call VPLUS+sm. This approach starts by assuming that each consumer’s decision space about a product or service (with respect to price) is divided into ranges: too cheap, a bargain, worth considering, and too expensive.
VPLUS+sm measures these ranges by asking three questions:
- What is the highest price at which the product/service is too cheap: you would question its quality?
- What is the highest price at which the product/service is a bargain: you would buy it without thinking about it?
- What is the lowest price at which the product/service is too expensive: you would not even consider buying it?
The questions are asked in order; a respondent is instructed not to use a price equal to or lower than one used to answer a previous question. (This approach is, therefore, ideal for Internet surveys, where logically inappropriate answers can be “greyed out” in real time during the interview.)
Using the answers to these questions, four cumulative distributions are plotted: one for each of the questions, plus the inverse cumulative distribution of Question 2. This yields the four Van Westendorp price points, with one difference: the Indifference Price is the intersection of “Bargain” and “Not Bargain” – in other words, the price at which half the sample thinks the item is a bargain.
In this way, VPLUS+sm determines, at each price point, the number rejecting the item (as too cheap or too expensive), the number accepting the item (as a bargain), and, by subtraction, the number willing to consider the item. From these, the model calculates, in Van Westendorp fashion, the optimal price points and ranges with respect to maximizing reach.
Calculation of Expected Value
The model then goes further, measuring the expected value at each price point:
- Minimum Expected Value is the product of the price and the percent accepting the item at that price
- Maximum Expected Value is the product of the price and the percent accepting or considering the item at that price
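The two definitions above reduce to simple products. As a sketch, with invented acceptance and consideration shares (the percentages below are hypothetical, not survey data):

```python
# Hypothetical shares of the sample at each tested price level
prices   = [10, 20, 30, 40, 50, 60]
accept   = [0.90, 0.70, 0.40, 0.15, 0.05, 0.00]  # rate the price a bargain
consider = [0.05, 0.20, 0.35, 0.40, 0.25, 0.10]  # would still consider it

# Minimum Expected Value counts only acceptors; Maximum adds considerers
min_ev = [p * a for p, a in zip(prices, accept)]
max_ev = [p * (a + c) for p, a, c in zip(prices, accept, consider)]
```

Note that the two curves can peak at different prices: here Minimum Expected Value peaks at $20 (0.70 × $20 = $14 per prospect), while Maximum Expected Value peaks at $30 (0.75 × $30 = $22.50).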
The two price curves can also be used to determine the price region that maximizes Expected Value. The “Value Limits” of price are the prices that maximize the two estimates of Expected Value:
- The Lower Limit is the price that maximizes Expected Value considering only respondents who accept the product or service outright at that price. It is the more conservative estimate.
- The Upper Limit is the price that maximizes Expected Value considering all respondents who either accept or would consider the product at that price.
- Because the full Expected Value at that price would only be realized if all “considerers” converted to acceptance, the real Expected Value at that price will be less than the Upper Limit; in fact, it might even be less than that of the Lower Limit price, since the Upper Limit price does not maximize Expected Value from acceptors alone.
- In order to assess how realistic the Upper Limit price is, we calculate another statistic, the Minimum No-Risk Conversion Rate. This is the percentage of considerers that would have to convert to acceptance at the Upper Limit price for it to yield an Expected Value equal to that produced by the Lower Limit price. A small value indicates that the Upper Limit is “feasible”: enough considerers will probably accept the product at that price that the Expected Value of the Upper Limit will be greater than that of the Lower. Conversely, a high percentage would suggest that the Upper Limit price is unrealistic: it is based so heavily on considerers that its “real” Expected Value is likely to be lower than that produced by the Lower Limit.
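The conversion rate is the solution of a single break-even equation. With hypothetical figures (a Lower Limit of $20 where 70% accept, and an Upper Limit of $30 where 40% accept and 35% consider), we solve price_U × (accept_U + r × consider_U) = EV_Lower for r:

```python
# Hypothetical figures, for illustration only
ev_lower = 20 * 0.70            # Expected Value at the Lower Limit price ($20, 70% accept)
p_u, a_u, c_u = 30, 0.40, 0.35  # Upper Limit price, its accept and consider shares

# Minimum No-Risk Conversion Rate: share of considerers who must convert
# for the Upper Limit price to match the Lower Limit's Expected Value
r = (ev_lower / p_u - a_u) / c_u
```

Here r is about 0.19: only about a fifth of considerers need to convert, so the Upper Limit price would look feasible by this criterion.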
The VPlus+sm model is calculated in two versions, each of which has advantages and disadvantages:
- In the Discrete Model, the price curves are developed directly from the discrete price points; intermediate responses are linearly interpolated between them.
- In the Continuous Model, the price curves are calculated mathematically using a probit function to fit a curve to the price points.
While the Continuous Model provides a more valid method of interpolating between data points than simple linear interpolation, it also tends to smooth out bumps in the price curve. The Discrete Model, because it allows discontinuities and thresholds, may in some cases be a more faithful representation of consumer behavior.
Because neither version is necessarily best in all circumstances, both are generally calculated and the results compared. If the Discrete Model shows few irregularities, and the Continuous Model fits the points well, the latter is used.
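A probit fit of this kind can be sketched using only Python's standard library. The cumulative shares below are invented; the trick is that the probit transform (inverse normal CDF) of the shares turns the curve fit into a simple linear regression on price:

```python
from statistics import NormalDist

# Hypothetical cumulative "too expensive" shares at the tested price levels
prices = [10, 20, 30, 40, 50, 60]
shares = [0.02, 0.08, 0.25, 0.55, 0.82, 0.96]

nd = NormalDist()
# Probit transform: z-score corresponding to each cumulative share
z = [nd.inv_cdf(s) for s in shares]

# Least-squares line z = b0 + b1 * price, so mu = -b0 / b1 and sigma = 1 / b1
n = len(prices)
mp, mz = sum(prices) / n, sum(z) / n
b1 = sum((p - mp) * (zi - mz) for p, zi in zip(prices, z)) / \
     sum((p - mp) ** 2 for p in prices)
b0 = mz - b1 * mp
mu, sigma = -b0 / b1, 1 / b1

def fitted_share(price):
    """Smoothed cumulative share at any price, not just the tested levels."""
    return nd.cdf((price - mu) / sigma)
```

The fitted curve can then be evaluated at any intermediate price when computing crossings and Expected Values, which is exactly the advantage the Continuous Model offers over linear interpolation.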
Because its less ambiguous question sequence makes it easier to administer and interpret, and because of the additional price response information it provides, we feel that VPlus+sm is truly the “next step” in price optimization research.