Ask a stupid question…

When was the last time you asked a question? I bet there’s a good chance you’ve asked at least one today, whether at home or at work. And if you haven’t asked one then no doubt you’ve been on the receiving end. Quite often there’s more to a question than meets the eye.

“Which bin is it today?” also translates as “You haven’t forgotten to put the bin out, have you?”

“What time will you be home from the pub?” has many translations, often directly influenced by previous pub visits.

“Did you hear about [insert name and unfortunate event]?” also translates as “I’ve heard about [insert name and unfortunate event], can I tell you all about it?”

Questions are also a fundamental part of market research and consultation. The process often involves developing a survey or questionnaire. Some people feel very passionately about distinguishing the two; to me it’s semantics, and I’d like to think we have more important things to get worked up about. I tend to go with survey, as it’s fewer letters to type.

The truth is, there’s much more to coming up with research and consultation questions than you might think.

The design process is vital in virtually everything we do, whether it’s building a house, bringing a new product to market or developing a website. Without time and consideration at the design stage, the end result is unlikely to hit the mark. Question design and surveys are no exception.

What’s the question?

This is going to sound counter-intuitive, but the question you want an answer to isn’t necessarily the question you should be asking.

For example, say you’re thinking of bringing a new product or service to market. Instinctively you want to ask ‘is it a good idea?’ and ‘would you use it?’. But really you should be thinking about whether people would ever need to use it, whether they already use similar products or services, what matters to them in the offer and what they would be willing to pay (let’s not get started on willingness-to-pay methodologies).

There are a whole host of pitfalls and considerations to keep in mind when developing questions.

  • Open questions which should be closed.

Q: Please tell us if you agree with our proposal.

A: Weird and wonderful responses owing to a humongous open text box.

When it comes to opening or closing a question, often the biggest impact is on you (assuming you’re the one who will be analysing and reporting the findings). Open-ended questions can often be great for capturing rich insight and a depth of response. But they can also be a swine to code and analyse (there’s a rough sketch of what that coding can involve after these examples).

  • Leading questions and response options which lack balance.

Q: Our brand is just so unbelievably great, isn’t it? ISN’T IT?

A: YES / No

Q: How satisfied are you with our service?

A: Very satisfied / satisfied / pretty satisfied / quite satisfied / not so satisfied

I’m long enough in the tooth to get why bias and leading question design creep in. There can be pressure from above to get a good response, for whatever reason. Or you believe in something so much that you want the ‘right’ answer. But you’ve got to ask yourself: why ask people for their views and conduct research if you want to influence their response like that? You might get a more positive answer (although often people will see past attempts to influence them anyway), but you will never have confidence in your results.
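Back to that earlier point about open-ended questions being a swine to code: ‘coding’ here just means grouping free-text answers into themes you can count. As a purely illustrative sketch, the snippet below tallies invented responses against an invented set of theme keywords; real coding frames are built from the responses themselves and usually involve a good deal of manual judgement.

# A toy illustration of coding open-text survey responses into themes.
# The themes, keywords and responses are made up for the example.
from collections import Counter

theme_keywords = {
    "parking": ["parking", "car park"],
    "opening hours": ["hours", "open late", "weekend"],
    "staff": ["staff", "helpful", "rude"],
}

responses = [
    "The car park is always full",
    "Staff were really helpful",
    "Please open late on weekends",
    "Parking is a nightmare and staff seemed rude",
]

theme_counts = Counter()
for response in responses:
    text = response.lower()
    for theme, keywords in theme_keywords.items():
        # Count each theme at most once per response.
        if any(keyword in text for keyword in keywords):
            theme_counts[theme] += 1

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} response(s)")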

To grid or not to grid, is that the question?

Stick with me on this one. You know those big grid questions you might have come across? Very popular in council resident surveys back in the days when postal was the go-to method, or maybe at a training event asking about attendee satisfaction across a range of factors.

Yeah, they work on paper. They’re a neat way to throw multiple questions into one format and speed things up for the respondent. They were also fine when online surveys were only ever completed on desktops and laptops with a good-sized screen.

But guess what: the year is 2018, we’re trying to do things online wherever we can, and 76% of us use our smartphones to get online (check out Dan Slee’s helpful summary of Ofcom’s 2018 market report).

Now, have you ever seen a grid question on a phone? Go on, try it. I would send you off to a link but I doubt you’d ever return.

The point is not about the grid question itself. In fact, there are times when it really works. It still has a place when you need a paper survey.

The point is about working with the platform. The method. Survey software can make your surveys mobile adaptive, but it can’t make them mobile friendly. That’s about the content. That’s on you.

A question isn’t just about the question

A survey isn’t simply a set of questions. Often there will be context and narrative critical to the response. Never is this truer than in consultations. For years the Gunning Principles have been etched on my brain. One of these principles focuses on providing sufficient information ‘to permit intelligent consideration’. Basically, how do you expect someone to tell you whether they agree with something if you don’t give them enough information to weigh it up?

You can have the greatest set of questions ever seen. But if you don’t get a couple of things right, your efforts may well be in vain. After all, the whole point of developing a survey is for people to answer your questions.

  • Audience

Identify your target audience and come up with a plan to reach them. Some market research ahead of launching a product aimed at people under 25 isn’t going to be much use if you don’t have anyone that age in your sample.

  • Timing

Give your survey period some thought. Allow enough time for you to reach your audience. And think about those times when you could be chasing your tail. Pushing a survey out during the Christmas and summer holidays is usually tough.

Timing is also critical if you’re evaluating something. You need to strike a balance between allowing time for impact and not leaving it so long that the ‘something’ becomes a distant memory.

Get ahead of yourself

This is not a typo.

There are times when you absolutely should not get ahead of yourself. Like the 8-year-old me in the closing stages of the 95/96 Premier League season, when I thought my beloved Newcastle United were destined to be champions under Keegan. I would’ve loved it too. I’m still not over it.

But there are times when you should. And survey design is one of those times. Whenever I’m developing a survey, I always think ahead to the survey closing and ask myself: what will the responses to these questions tell me? And how will I analyse them and make them mean something?

This isn’t because I’m psychic or plan to fudge the results. But I like to imagine what the analysis and report might look like to understand if I’m asking the right questions.

Take it right back to the core objectives of the survey or research. Will it actually address these? If not, it needs more work. And there’s no shame in redrafting. I once got to v19 of a survey. It was miles better than v1.

Also ask yourself: are there any other relevant data sources? This is a question to ask at the very start; you might be asking questions you already have the answers to. But it’s also one to keep in mind throughout the process. Is there a national dataset, with no local equivalent, that you would like to compare against? Is there historic local data to compare against?

So, what’s the point?

The point is, if you’re designing a survey, or any research for that matter, don’t rush. Question design is important; once your questions go out, you’re stuck with them. Think about your audience, how to reach them and the information they need. Think about the platform, what works and what doesn’t. Think ahead to the analysis and how your questions link to your objectives.

The analysis of any research gets the most attention. It’s the end goal. Powerful findings, new insight.

The fieldwork and promotion get the most resource and effort. After all, you need some data to work with.

The value of planning and design can be overlooked or compromised in an effort to ‘get it out’. But these are your foundations. If they’re not right then, like a house, your research will fall down.

 

This article was written by Institute member, Adam Pearson.

Adam is a freelance consultant, providing research and consultation services to the public and not-for-profit sectors. Find out more at www.pearsoninsight.co.uk
