How do you design a survey to support product development?

August 18, 2020
The Trig Team

What insights can you get out of survey research?

How complex does the survey need to be? Where do you even start?

In this article, we give you the lay of the land from our experience designing custom-fit surveys and analyzing the data to address a range of product development needs.

  1. START HERE → What are the most important questions you want to answer? Without a clear focus, survey research may not yield the results you need. We often start our research process by working with the client to develop a brief written research plan describing the key questions we aim to answer, the survey design methods to be employed, and the target survey respondent population. This starting step ensures that the team is on the same page and understands what information the survey will (and will not) yield.
  2. Remember, your respondents are humans. Survey fatigue is an important reality - your respondents have limits on their time and attention. Optimizing your survey to be efficient and focused on your most important questions will yield useful data. Using your research plan as a checkpoint can prevent survey scope creep - the temptation to add a few more lower-priority questions of interest.
  3. Choose your research methods. Again, it all loops back to "what are your key questions?" We think of research methods as tools in a toolbox - you want the right tool for the job at hand. Some frequently used methods in our projects are described below; the methods chosen for any particular survey connect directly to the research questions being asked. For all of these examples, we use refrigerators as our imaginary research study product to see how these options could play out.
  • Likert scaling - Named for its inventor, social psychologist Rensis Likert, this scale is what you commonly see in surveys asking respondents to answer a question on a five-level rating scale (Likert, 1932). We commonly use Likert scaling when asking respondents how much they are interested in, like, or need a particular new product feature. For example, "How interested are you in having food expiration sensors in a new refrigerator?"
  • MaxDiff - MaxDiff, short for "maximum-difference scaling", is a valuable approach for diving deeply into customer preferences and priorities (e.g., Marley and Louviere, 2005). MaxDiff questions typically give respondents a set of product attributes from which they must select the "best" and "worst". For example, a survey about refrigerator features would ask the survey taker to select the "best" and "worst" among a set of potential feature options (e.g., feature 1: greater icemaker capacity; feature 2: transformable interior compartments; feature 3: integrated battery back-up for power failures...and so on). Ultimately, this approach reveals which attributes customers find most valuable relative to the other options. The downsides of this approach are that survey fatigue must be considered in its design and that attributes are typically ranked individually rather than as sets.
  • Conjoint Analysis - Conjoint analysis typically involves showing the survey respondent different combinations of features and asking them to choose their preferred combination (e.g., Green and Srinivasan, 1978). For example, conjoint analysis for refrigerators would present a number of candidate refrigerators, each with a distinct set of features, and ask the respondent to choose their favorite. In a selection process that mimics real-world purchasing decisions, the respondent is forced to make a decision that incorporates trade-offs among the attribute sets. In our refrigerator example, the respondent may now have to weigh three or four refrigerator design features plus price in their selection. This survey design approach reveals which attributes may be most important to the customer and drive their purchasing behavior. There are several flavors of conjoint analysis (e.g., choice-based conjoint, adaptive choice-based conjoint) that affect how the survey is designed.
  • Price Sensitivity Meter (van Westendorp) - The van Westendorp (1976) approach aims to reveal what price range the market would likely bear for a candidate product. Surveys incorporating this technique typically ask a series of questions to determine what respondents consider a reasonable purchase price for the candidate product. For our refrigerator example, the key take-home point from this analysis would be a range of price points that most respondents feel is reasonable for a specific new refrigerator design.
  • Exploratory research to reveal personas and customer segments - Statistical techniques such as cluster analysis can identify which survey takers seem to group most closely together in their responses to the series of questions asked. Combining this information with other data collected on demographics, geographic area, job types, beliefs, hobbies, and so on, may reveal unique customer subgroups and what seem to be key common characteristics.
  • Text analysis - Open-text questions put more burden on the survey respondent to provide written comments, but can yield answers you may have missed otherwise. For example, a question like, "What do you feel are the weaknesses of this particular refrigerator design?" motivates the respondent to think critically about the product design put in front of them and share their perspective. Text responses can be valuable as illustrative quotes or analyzed more quantitatively: data analytics code can be developed to recognize specific terms and categorize responses by the overall sentiment of the reply.
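To make the MaxDiff idea concrete, here is a minimal sketch of simple "best minus worst" count scoring in Python. The feature names and responses are invented for the refrigerator example, and real MaxDiff analysis typically fits a statistical choice model rather than raw counts.

```python
from collections import Counter

# Hypothetical MaxDiff responses: each entry records which feature a
# respondent picked as "best" and "worst" from the set they were shown.
responses = [
    {"best": "expiration sensors", "worst": "larger icemaker"},
    {"best": "battery back-up",    "worst": "larger icemaker"},
    {"best": "expiration sensors", "worst": "transformable compartments"},
]

best = Counter(r["best"] for r in responses)
worst = Counter(r["worst"] for r in responses)
features = set(best) | set(worst)

# Simple count-based score: times chosen "best" minus times chosen "worst".
scores = {f: best[f] - worst[f] for f in features}
for feature, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(feature, score)
```

With these invented responses, "expiration sensors" scores highest and "larger icemaker" lowest, giving a quick read on relative preference.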
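The Price Sensitivity Meter idea can also be sketched in a few lines. The version below is a deliberately crude simplification: it takes the median "too cheap" and "too expensive" answers as rough bounds, whereas a full van Westendorp analysis intersects the four cumulative price curves. All prices are invented for the refrigerator example.

```python
# Hypothetical answers (in dollars) to the four standard van Westendorp
# questions, one value per respondent; numbers are invented.
answers = {
    "too_cheap":     [600, 700, 800, 750, 650],
    "bargain":       [900, 1000, 950, 1100, 1000],
    "expensive":     [1400, 1500, 1450, 1600, 1500],
    "too_expensive": [1800, 2000, 1900, 2100, 2000],
}

def median(xs):
    """Crude median: the middle (or upper-middle) sorted value."""
    s = sorted(xs)
    return s[len(s) // 2]

# Rough acceptable range: below `low` the product looks suspiciously cheap
# to a typical respondent; above `high` it looks too expensive.
low = median(answers["too_cheap"])
high = median(answers["too_expensive"])
print(f"Rough acceptable price range: ${low}-${high}")
```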
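For the open-text analysis described above, a minimal keyword-tagging sketch looks like the following. The responses, category names, and keyword lists are invented; production text analysis would typically use a proper NLP or sentiment library rather than substring matching.

```python
# Hypothetical open-text responses about a refrigerator design.
responses = [
    "The icemaker is too noisy and the shelves feel flimsy.",
    "I love the compartments but the door is heavy.",
    "Great capacity, nothing to complain about.",
]

# Invented keyword lists mapping terms to response categories.
categories = {
    "noise":    ["noisy", "loud"],
    "build":    ["flimsy", "heavy", "cheap"],
    "capacity": ["capacity", "space", "room"],
}

def tag(text):
    """Return the set of categories whose keywords appear in the text."""
    lowered = text.lower()
    return {cat for cat, words in categories.items()
            if any(w in lowered for w in words)}

for r in responses:
    print(tag(r) or {"uncategorized"}, "-", r)
```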

  4. Once you begin to collect data, assessing data quality is an important step. How do you ensure your survey respondents provided thoughtful replies and your survey did not get foiled by bots? We typically develop custom code to auto-flag questionable responses. Examples of flags include unusually fast completion times, nonsensical answers, and respondents who give identical values to every question (straight-lining). After flagging any suspect respondents, we usually continue the survey until we have the desired number of qualified responses.
  5. Finally, the exciting last step is to crunch the data, visualize it, and interpret the results. Combining the data analysis and design skills of our team, we aim for reports that are comprehensive, visually pleasing, and distill the key insights that support the next steps in the design and development process.
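The kind of auto-flagging described in the data-quality step above can be sketched as follows. The record fields, the completion-time threshold, and the flag names are all invented for illustration; real quality checks are tailored to the survey platform and design.

```python
# Hypothetical survey records: respondent id, completion time in seconds,
# and their answers to a set of rating questions.
records = [
    {"id": 1, "seconds": 45,  "ratings": [5, 5, 5, 5, 5]},
    {"id": 2, "seconds": 420, "ratings": [4, 2, 5, 3, 4]},
    {"id": 3, "seconds": 600, "ratings": [3, 3, 4, 2, 5]},
]

MIN_SECONDS = 120  # completion faster than this looks like speeding

def flags(rec):
    """Return a list of quality flags for one respondent."""
    out = []
    if rec["seconds"] < MIN_SECONDS:
        out.append("too_fast")
    if len(set(rec["ratings"])) == 1:
        out.append("straight_lining")  # identical answer to every question
    return out

for rec in records:
    print(rec["id"], flags(rec))
```

Flagged respondents can then be reviewed or excluded, and fielding continues until the desired number of clean responses is reached.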

We have found that combining product design and research methods capabilities within a team can yield powerful insights to guide the product development process. For example, when a design team generates a realistic drawing or an intuitive storyboard to showcase a potential new product, survey respondents are able to provide high-quality responses about their reactions. For more complex research methods, such as conjoint analysis, a multi-disciplinary team can develop realistic attribute sets that incorporate trade-offs, including cost of goods sold (COGS). Incorporating survey research into a design process can ultimately save costs, serving as a warning light for product features unlikely to find commercial success and revealing what resonates most with your target customers.


Green, P. E., & Srinivasan, V. (1978). Conjoint analysis in consumer research: Issues and outlook. Journal of Consumer Research, 5, 103–123.

Likert, R. (1932). A Technique for the Measurement of Attitudes. Archives of Psychology, 140, 1–55.

Marley, A. A. J., & Louviere, J. J. (2005). Some probabilistic models of best, worst, and best–worst choices. Journal of Mathematical Psychology, 49(6), 464–480. doi:10.1016/j.jmp.2005.05.003

van Westendorp, P. H. (1976). NSS Price Sensitivity Meter (PSM): A new approach to study consumer perception of prices. Proceedings of the 29th ESOMAR Congress, Venice, 139–167.


