Insights, metrics and reporting
Guidance to help you maximise the value of the data you collect, and how to use it. This guidance is a work in progress and we will add to it over time. We welcome your feedback and ideas.
Your department may have its own dedicated data analytics and customer insights teams you can tap into.
Survey design for effective analytics and insights
Only ask questions that are vital to your objectives, not nice-to-haves.
This can be challenging, especially with multiple stakeholders feeding into the survey design.
Be clear about what you are going to do with the information from each question as you design the survey.
This helps to crystallise your needs and creates a survey with a clear focus.
Respecting respondents' time will usually result in a higher response rate.
Knowing the purpose of each question will also help you develop an analysis and reporting plan, for example:
- Each question gives me [insert] information.
- For questions x, y and z, we need to understand the results cut by audience characteristics a, b and c.
Thinking about your questions in this way will help you to design your survey and plan your analysis and reporting approach at the same time.
Takeaway: Apply discipline and clear methodology to your question choices.
Be aware of the difference in dataset size when using multiple choice vs single choice questions.
Let’s say we have 8 questions in a survey, and each question has 5 options (a, b, c, d, e).
Single choice (also known as radio button):
- Respondent can select only one answer to the question.
- The data will be captured in 8 columns.
Multiple choice:
- Respondent can select many answers to the question.
- Each option of each question becomes its own column. In this example, 8 questions × 5 options = 40 columns to analyse (compared with 8 columns if every question were single choice).
Takeaway: There is no issue with asking multiple choice questions, but if you use them, do a quick calculation (number of multiple choice questions × number of options) to see just how big your dataset will get.
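As a rough illustration, here is that calculation as a quick Python sketch (the figures are the example's above, not a recommendation):

```python
# Dataset-size arithmetic from the example above.
questions = 8
options_per_question = 5

# Single choice: one column per question.
single_choice_columns = questions                        # 8 columns

# Multiple choice: one column per option, per question.
multi_choice_columns = questions * options_per_question  # 40 columns

print(f"Single choice: {single_choice_columns} columns")
print(f"Multiple choice: {multi_choice_columns} columns")
```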
Our data suggests that respondent satisfaction rate declines when there are too many free text questions.
Consider a case where you unexpectedly receive 10,000 responses to, say, 4 free text questions. Do you have the resources to read and process all 40,000 pieces of text? If not, limit the text questions.
Takeaway: Limit the number of free text questions you include in your survey. This may vary depending on the project, but as a rule of thumb, use no more than two free text questions per survey.
Capture sentiment in a quantitative question first.
Sentiment analysis, while powerful, can sometimes struggle to accurately interpret the nuances of language. Things like sarcasm, double negatives and subjectivity can cause the automated sentiment analysis to get it wrong.
Want sentiment you can rely on? Simply ask a quantitative question before your free text question.
Example:
- Q: How do you feel about [insert]?
- A: Positive, neutral, negative, etc.
- Then add a text box asking them to explain why.
Takeaway: If you want to gauge sentiment, ask it directly through a radio/single-select question.
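Once exported, tallying a quantitative sentiment question is trivial. A minimal pandas sketch, using made-up answer values:

```python
import pandas as pd

# Hypothetical answers to the radio-button sentiment question above.
answers = pd.Series(["positive", "neutral", "negative", "positive", "positive"])

# Exact sentiment shares, straight from respondents' own answers -
# no automated interpretation of free text required.
print(answers.value_counts(normalize=True))
```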
Free text responses to multi-barrelled questions are hard to analyse.
If your question is double (or triple) barrelled, such as "What is your favourite colour and why?", it is difficult to tease out the separate answers.
Therefore, only ask one question per free text box. But again, remember to limit the number of text questions in your survey.
Takeaway: Do not ask more than one question per text box.
Keep demographic questions consistent across projects.
It is difficult to compare demographic datasets if they aren't consistent.
With consistent and standardised demographic questions, it is possible to link with other data sources if appropriate (e.g. other research on the topic, Australian Bureau of Statistics (ABS) data, other consultation data).
It also ensures a consistent user experience and that appropriate language is used for these questions.
Access our library of demographic questions for consistent and best practice questions across NSW Government consultations.
Always remember your privacy obligations - only ask demographic questions that are important to the outcome of the project.
Takeaway: Use standardised and consistent demographic questions. Only ask what's necessary for the intended purpose and no more.
There is often a need to create sub-groups of respondents, for example by local government area (LGA) or Regional vs Metro, to investigate whether responses differ.
To limit the need to clean data and speed up your analysis, use a list of pre-determined geographic answer options where possible (instead of a free text box). You can add an 'other' option to the question if required, though be aware you'll need to clean this for spelling and grammar errors.
If you have access to a data analytics team, or are familiar with joining external data, it's best to ask respondents for their postcode.
A postcode is simple for the respondent to enter (4 digits) and quite granular. This means a data analytics team can easily convert it (or 'zoom out') to your desired geography.
You can refer to the library of demographic questions for examples of geographic questions.
Takeaway: Use pre-determined geographic answer options where possible (e.g. radio button, dropdown). Ask for a postcode if you can access the services of a data analytics team.
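If you do handle the postcode data yourself, the 'zoom out' step is typically a lookup join against a postcode-to-geography table. A minimal pandas sketch; the file and column names here are hypothetical:

```python
import pandas as pd

# Hypothetical file and column names for illustration only.
responses = pd.read_csv("survey_responses.csv")   # includes a 'postcode' column
lookup = pd.read_csv("postcode_to_region.csv")    # columns: postcode, region

# Normalise postcodes to 4-digit strings before joining (e.g. 800 -> '0800').
responses["postcode"] = responses["postcode"].astype(str).str.zfill(4)
lookup["postcode"] = lookup["postcode"].astype(str).str.zfill(4)

# Join each response to its region so results can be cut by geography.
merged = responses.merge(lookup, on="postcode", how="left")

# Unmatched postcodes (typos, out-of-range values) will need cleaning.
print(merged["region"].isna().sum(), "responses with unmatched postcodes")
```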
This may vary depending on the project, but as a rule of thumb, surveys should take no longer than 5 to 10 minutes to complete.
It is hard to create good surveys; testing will hone your craft and identify errors.
Ask someone with no knowledge of the project to test the draft survey.
A 'fresh pair of eyes' provides great feedback and helps sense-check any jargon that might have snuck through.
Ask someone from the project team to test the draft survey and try every pathway to ensure any survey logic you've applied is working.
Use Social Pinpoint's preview function for testing, not the live consultation link: test data collected via the live link cannot be removed and may impact your results.
Takeaway: Ask different people to test your survey. If your survey has logic applied, test all pathways. If you make changes to your survey, test it again.
Your response rate will influence your results.
Anything with fewer than 20 responses (n<20) should carry a 'caution: low base size' warning.
These results are still important; just make clear that they come from a small group of people and should be treated as indicative only.
Takeaway: Be mindful of response rates and provide context for response rates in your reporting.
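One way to make the warning routine is to flag low base sizes automatically when you cut the data. A sketch with made-up sub-groups and counts:

```python
import pandas as pd

# Hypothetical sub-group results with response counts.
results = pd.DataFrame({
    "group": ["Metro", "Regional", "Remote"],
    "responses": [250, 48, 12],
})

# Flag any sub-group below the n=20 threshold noted above.
results["caution_low_base"] = results["responses"] < 20
print(results)
```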
A simple question at the end of your survey will help you learn from participants and improve future surveys.
Connecting this data across different survey types and lengths can provide insights into how and why some surveys are easier and more engaging than others.
An example of a survey satisfaction question:
- Q: Before you go, was it easy to provide feedback today? [Radio button: Yes/No]
If the respondent selects 'No', a text box appears:
- Q: How can we improve your experience? [text box]
Takeaway: A simple satisfaction question can provide powerful and actionable insights for continuous improvement.
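For tracking across surveys, the Yes/No question reduces to a single satisfaction rate. A sketch with made-up responses:

```python
import pandas as pd

# Hypothetical answers to the Yes/No question above.
answers = pd.Series(["Yes", "Yes", "No", "Yes", "No", "Yes"])

# Share of respondents who found it easy to provide feedback.
easy_rate = (answers == "Yes").mean()
print(f"{easy_rate:.0%} found it easy to provide feedback")
```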
Insights, reporting, and closing the loop
Data vs Insights
Data is all the information you collect as part of a consultation. Insights tell the story behind that data.
Insights help decision makers understand the data and take action.
Using data to craft insights is a specialist skill. If your team doesn’t have this skillset, check if your agency or department has access to a team with research, insights and data analytics skills to support you.
Planning and delivery of your report/s
- Understand how you will be reporting from the very beginning. Create a plan for your report/s as you plan the rest of your consultation activities.
- Understand your reporting audience's needs regarding timing, data detail and insights.
- Ensure you understand the approvals chain and what level of sign-off you need for your report. This will impact your turnaround time.
Who is the audience for your report?
- If the audience includes decision-makers, does it have the information they need to make an informed decision?
- If the audience includes the general public, is the report written in plain English with no jargon?
- Ensure your reports meet all digital accessibility requirements.
- Do you need to create different versions of your report (e.g. Easy Read, translations, summary report)?
Social Pinpoint reporting capabilities
There are three key Social Pinpoint reports:
- Overview - key analytics about your site, a single project, or a group of projects. These analytics help you understand high-level information such as how many visitors you attracted, the 'depth' of their engagement, the types of tools and activities people engaged with, and other information about visitor behaviour.
- Results - reporting insights for individual engagement activities (e.g. surveys, social maps, discussion forums, etc.) including access to the underlying data, analysis of your results, meta information about the activity, and demographic information about your participants.
- People - demographic information about the participants of your site, a single project, or a group of projects.
Most reports and data can be exported from the platform in PDF, XLSX or CSV format.
Get familiar with Social Pinpoint's reporting capabilities, including the text analysis tool, so you know what information you will get and whether it suits your needs.
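If you plan to analyse exports outside the platform, check the file structure early so it matches your analysis plan. A minimal sketch, assuming a hypothetical export file name:

```python
import pandas as pd

# Hypothetical export file name; reading XLSX requires the openpyxl package.
df = pd.read_excel("results_export.xlsx")  # or pd.read_csv("results_export.csv")

# Confirm the columns match your analysis and reporting plan.
print(df.shape)
print(df.columns.tolist())
```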
Make reports and outcomes available
- Develop reports summarising the feedback received during the consultation.
- For transparency, make the reports available on your consultation page.
- Clearly outline how the input has influenced decision-making or policy development.
Engage through multiple channels
- Utilise a variety of online channels to disseminate information, including social media, email newsletters, and official government websites.
- Ensure that information is presented in formats suitable for diverse audiences such as multicultural audiences.
- If you used targeted social media advertising to promote your Have Your Say consultation, consider advertising the findings the same way to connect again to that audience.
Feedback loop integration
- Establish a continuous feedback loop by encouraging stakeholders to provide input on the implementation of decisions and policies.
- Use online surveys, forums, or dedicated feedback mechanisms in your Social Pinpoint consultations to collect ongoing input.
Regular updates
- Provide regular updates through online channels, keeping stakeholders informed about the status of decisions and actions taken.
- Use newsletters, blog posts, or video updates to engage a diverse audience.
Email communication
- Allow stakeholders to opt in to email notifications to stay informed about your consultation (the 'Project Follow' button).
- Speak to the DCS Have Your Say team about using the email functionality in Social Pinpoint to contact your stakeholders, ensuring all privacy obligations are met.
Social Pinpoint guidance
Some tips on closing the loop from Social Pinpoint: