In this blog I outline ten good-practice techniques for creating survey questions, including shortcuts for conducting surveys and questionnaires. These simple steps apply whether you are gauging customer satisfaction, conducting consultations, or gathering employee feedback through exit interviews and staff surveys.

“To ask the right question is already half the solution of a problem.”

Carl Jung

I vividly recall many of the surveying best practices from my degree in Psychology; maybe that’s why I open this blog with a Carl Jung quote! I’ve carried that knowledge throughout my career, for example in marketing, customer service, business intelligence and statistical analysis. What follows are some of the easy, non-technical points to bear in mind when creating surveys. They address the common mistakes people seem to make, summarise years of CPD on the matter, and share tips I’ve learned along the way…

How to make a satisfaction survey

1. Avoid boring people with preamble.

Before I get into how to create survey questions, I’ll cover an important bugbear first: introduce succinctly why you are asking for feedback. As a reminder of what succinct means in practice (many people struggle with this skill!): no more than a few sentences in a single paragraph. Nobody reads anything more. As part of this, let people know why their response matters. Respondents simply want to know:

  • What’s the point of this survey?
  • Why should I bother?
  • What will you do with the results?

2. Ask for only what you’ll use…

It may seem obvious, but only ask what you will actually use and analyse. Otherwise, you will be swamped by data, adding proverbial haystacks to your needles, and of course you waste people’s time requesting information which you then don’t use. There are also GDPR considerations to bear in mind.

To do this, keep your questions limited to the most important variables you really want to understand. Then limit demographic, location, and other supplementary categories to a minimum. Such practices will also make the presentation to your audience much easier!

For example, I recently encountered an exit interview survey asking over 40 questions. That’s an immediate put-off for the respondent! The questions were probably compiled with the good intention of helping the organisation understand various aspects of working life there. The survey covered numerous (often overlapping) questions about what the employee thought of the organisation, with lots of multiple choice and free-text options, and then a range of demographics for later slicing and dicing of the data (e.g. job role, seniority, department, location, age group, sex, and other personal demographics). The end result was that the HR department couldn’t analyse it all or produce meaningful insights given the complexity. Everyone’s time was wasted. So for the sake of simplicity and headache-avoidance, only ask what you will actually use!

3. Shorter is better.

Whether it’s question or survey length, accept that people simply don’t want to spend the time doing surveys. This will also help avoid response biases, whereby only the most motivated (positive or negative) will complete longer surveys. Pick questions that are relevant to the key variables you want to understand.

Remember, you’re generally acting upon the goodwill generated in point 1. People are also more impatient with online questionnaires than with other formats. You don’t want people quitting because they’re bored or confused, or worse: continuing the survey disinterested, which will likely fill your results with nonsense that could then inform decision-making!

4. KISS: Keep it simple, stupid!

Use plain English and short sentences. Avoid using jargon/technical terms and acronyms that your respondents may be unfamiliar with.

Doing this helps avoid what I see as the six critical points of communication breakdown in surveying (blog coming soon!). For now and in summary, bear in mind that misunderstandings can occur between what you ask and what the respondent interprets. This can have the effect of making the entire findings meaningless or your interpretation flawed.

5. Ask just one thing per survey question.

It may sound intuitive, but surveyors often inadvertently break this rule by asking double-barrelled questions. Such questions can only lead to fuzzy responses and/or interpretations, and so can be written off as a waste of time. Avoiding the word ‘and’ in your questions will give you a head start here!

For example, consider: “Please rate the politeness and professionalism of the member of staff who dealt with your enquiry”. Instead, choose either ‘politeness’ or ‘professionalism’, or ask each concept separately.

6. Minimise and organise your multiple choice questions

Multiple choice questions are helpful for standardising, categorising, and avoiding typos in your response data. For example, asking people to choose their age group instead of typing their age.

When creating multiple choice options, limit them to no more than nine (ideally five or fewer). Any more gets confusing for the person completing your survey, who has to wade through the options to find the one relevant to them. It also helps to order them alphabetically.

7. For ratings, use Likert scales.

Likert scales are a useful structured rating scale for assessing things, such as your product or service.

A 1–5 scale is a good rule of thumb, but ensure the response options are balanced either side of the neutral midpoint. Below is an example of a Likert scale for satisfaction ratings, but you can do the same with levels of agreement with statements or other such ratings:

  • 1 = Very Dissatisfied
  • 2 = Dissatisfied
  • 3 = Neutral
  • 4 = Satisfied
  • 5 = Very Satisfied
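If you capture responses as the numbers 1–5, summarising them takes only a few lines. Here’s a minimal Python sketch (the function name and sample responses are my own illustration, not from any particular survey tool):

```python
from collections import Counter

# The 1-5 satisfaction scale described above.
LABELS = {
    1: "Very Dissatisfied",
    2: "Dissatisfied",
    3: "Neutral",
    4: "Satisfied",
    5: "Very Satisfied",
}

def summarise_likert(responses):
    """Return the mean score and the count of responses per scale point."""
    counts = Counter(responses)
    mean = round(sum(responses) / len(responses), 2)
    distribution = {LABELS[point]: counts.get(point, 0) for point in LABELS}
    return mean, distribution

# Seven hypothetical responses:
mean, distribution = summarise_likert([5, 4, 4, 3, 5, 2, 4])
print(mean)                       # → 3.86
print(distribution["Satisfied"])  # → 3
```

It’s worth reporting the full distribution alongside the mean: an average of 3 could come from everyone answering ‘Neutral’, or from a polarised split of 1s and 5s that tells a very different story.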

8. Allow free text responses…

Don’t always limit people to closed questions and categories. Allow people to expand on their answer, but focus their mind. E.g. “What would be the most important thing we could do better?”.

You need not dive into detailed discursive analysis right away; ‘word clouds’ are a visually appealing and convenient way of summarising the gist of the content. There are handy word cloud generators available online, some of which also include smart ‘sentiment analysis’ and highlight key phrases. Try one for yourself by copying and pasting the text of this survey blog, for example!
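Under the hood, most word cloud tools start from a simple word-frequency count. Here’s a minimal Python sketch of that idea (the stop-word list and sample comments are my own; real tools use far richer stop-word lists and word stemming):

```python
import re
from collections import Counter

# A tiny illustrative stop-word list; real tools use far longer ones.
STOP_WORDS = {"the", "a", "an", "and", "or", "to", "of", "is", "it", "was", "were"}

def top_words(text, n=5):
    """Return the n most frequent words in the text, ignoring case,
    punctuation, and common stop words."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(word for word in words if word not in STOP_WORDS)
    return counts.most_common(n)

comments = "The staff were polite and helpful. Helpful staff, quick response."
print(top_words(comments, 3))  # 'staff' and 'helpful' top the list
```

A word cloud simply draws each of these words at a size proportional to its count, so the frequency table above is the real analysis; the cloud is the presentation layer.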

Such questions also aid your understanding of what to do in response to the results. Your customers, employees and others you’re surveying often have brilliant ideas and insights to share, so ask them and see what you get back.

To save everyone’s time, I suggest limiting the free text to just one or two questions placed at the end, where people can expand on answers given elsewhere. It makes analysis easier and the questionnaire quicker to fill out. For example, you might ask something open-ended that still focuses the mind on the important things, like, “What’s the best thing about working here?” or “What’s the most important thing we can do to improve?”

It may also help to add a word limit to any free-text boxes asking for comment; 250 words or 1,000 characters should suffice. Note: ensure your respondent knows there is a word limit on your form.

9. Figure out how many survey responses you need.

There are many tools available online to help you work out your required sample size. I’ll soon share my own sample size calculator as a helpful template and explainer. Put simply, there are just two things you need to determine:

  • What is the population I’m interested in? Is it the general public, your employees, your customers, or something else? This is the group that, if you had all the time in the world, you’d survey in its entirety.
  • How confident do you want to be in the results? The more representative you want your sample to be of the population, and the more precise you want the results, the larger the sample required. Consider for example what decisions you will be informing with the results.

To get you started on this, I used a handy online sample size calculator provided by Survey Monkey. As an example scenario, I used a population size representing 100,000 customers. I want to be 90% confident my results will be representative of what my customer base would say. And I’m comfortable with a 5% margin of error; that means if my survey found 75% of customers were satisfied, I could be confident that, had I asked all 100,000, the ‘true’ result would be somewhere between 70% and 80%.

Survey sample size calculator

Broadly speaking, a sample size of between 100 and 1,000 respondents should give you the gist of what’s going on, and the confidence to make decisions. Doing more is generally a waste of time. For example, if half of those people are dissatisfied with the service provided, you can be pretty confident that half of your wider customer base would probably feel the same way. Whether you ask 100 people to give you a broad 50–60% dissatisfaction rate, or 1,000 people to nail this down more precisely to 53%, you probably would want to do something about it! For context, YouGov and other polls of the UK population tend to sample around 1,000 people as a standard way of gauging the opinions of nearly 70 million people!
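If you’d rather compute a required sample size yourself, the standard formula behind most online calculators (Cochran’s formula with a finite-population correction) takes only a few lines. A minimal Python sketch, assuming the conservative 50/50 split for the true proportion; the z-score table and rounding choices are mine, so expect results within a respondent or two of any given online calculator:

```python
import math

# z-scores for common confidence levels
Z_SCORES = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}

def sample_size(population, confidence=0.95, margin=0.05, p=0.5):
    """Required sample size via Cochran's formula with a
    finite-population correction. p=0.5 is the most conservative
    assumption about the true proportion."""
    z = Z_SCORES[confidence]
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

# The scenario above: 100,000 customers, 90% confidence, 5% margin of error.
print(sample_size(100_000, confidence=0.90, margin=0.05))  # → 270
```

Note how weakly the answer depends on the population size: the same settings for a population of one million need only a respondent or so more, which is why national polls get by on samples of around 1,000.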

Also consider: do you even need a survey? The whole point of conducting online surveys is to gather the views of a large volume of people for minimal effort, i.e. for when you don’t have the time to interview people individually. If you only expect a handful of responses or need the views of a small cohort (say, fewer than 25 people), consider instead using the best (and most time-consuming) approach to gathering such management information: ask people in person!

10. Do it the easy way!

There are a number of great surveying tools out there to automate the survey response and data gathering, saving you masses of time and presenting a professional image to your respondent. Survey Monkey and Snap Surveys are two popular platforms used in organisations large and small. Such platforms have the bonus of creating a responsive and mobile-optimised format for you. They also provide more in-depth guidance on how to create survey questions along the way.

Some plans allow limited surveying for free, but even the price tags of the basic paid plans (tens of pounds per month) are well worth incorporating into any marketing budget. The return on investment, in terms of better understanding your staff, customers, and product offering, will be astronomical.

I hope you find these tips helpful. Comment below if you did or indeed if you think I missed something! If you would like expert support and advice building your own survey and question set, by all means please get in touch.

In due course and as mentioned, I’ll expand my free guidance to include the best employee exit survey template, a sample size calculator, and guidance on the six hazards of communication breakdown in conducting questionnaires. Until then, I wish you the very best with your endeavours!

Kind Regards, Adrian