Canadian model

The groundwork for the Kiwis Count survey was laid when the Institute for Citizen-Centred Service (ICCS) was created in 1998, building on work to develop measurement approaches to service quality in the Canadian Public Service[1]. This movement focused on service quality and client experience in citizen-centred service delivery, and it has since grown and changed, introducing new elements and sharpening the focus on how citizens experience government services[2].

The Public Service Commission collaborated with the ICCS, and licensing the Common Measurements Tool (CMT) methodology brought this work into focus in New Zealand. The CMT is a methodology for measuring customer satisfaction and customer experience that allows performance to be measured and compared across organisations and jurisdictions. It was introduced to the New Zealand public sector as part of the wider New Zealanders Experience Research programme in 2007-2008.

In 2011, the Commission produced a revised and updated guide to using the CMT[3]. This preceded the growth of interest in trust and confidence within public sector management, but it clearly underpins the ongoing effort to understand government-citizen/client interaction in depth and the role this interaction plays in driving trust and confidence.

Contextualised to New Zealand through drivers research

The Commission purchased licences to use the ICCS approach to measuring service quality (which later developed into the Kiwis Count survey) and the Common Measurements Tool programme. The general aim of purchasing the licences was to build capability across the Public Service to measure and understand customer experience and service quality. This aim was met, with many agencies using the approach and going on to further develop their customer satisfaction and customer experience measurement.

Kiwis Count

Building on the ICCS base, an initial iteration of the Kiwis Count survey began in 2007. There were two ‘point in time’ surveys, in 2007 and 2009, each using a randomised sample of 6,000 drawn from the electoral roll. The questionnaire covered trust in the Public Service in general, trust based on personal experience with services, mode of contact, quality of services, ease of access, and satisfaction with service experience.

A further point-in-time survey was planned for 2011, but this was modified to become a continuous survey. It used a sampling frame drawn from the electoral roll and sent approximately 2,600 survey invitations each month. Response rates varied between 45% and 50%, with some oversampling to account for low response rates among identified populations, particularly Māori and younger respondents.

This pattern of fieldwork continued until 2019, albeit with a smaller sampling frame and fewer survey completions, and with a growing proportion of respondents completing the survey online rather than on paper.

In 2019, a broader review of the Kiwis Count programme identified several issues that needed to be addressed. These included the increasing cost of the survey, concerns over the utility of the data collected (particularly in relation to service satisfaction and quality), and declining response rates. To address these, the survey was shortened considerably and narrowed to focus more closely on trust and confidence. Data collection moved to an online panel system, in line with the same shift being made by the ICCS.

This method allowed the sample size to increase from 2,000 to 4,000 responses per year, collected quarterly. The online panel is an opt-in approach, holding a database of more than 400,000 people who have indicated they are happy to take part in surveys for a nominal reward. It also meant that the sample was closer to the general New Zealand population and therefore required little or no post-survey weighting.
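To illustrate what post-survey weighting involves, the sketch below shows simple post-stratification on one demographic variable: each respondent is weighted by the ratio of the population share to the sample share of their group. The age groups, population shares, and trust ratings are hypothetical values chosen for illustration, not the actual Kiwis Count weighting scheme or benchmarks.

```python
import pandas as pd

# Hypothetical respondent data: each row is one completed survey.
responses = pd.DataFrame({
    "age_group": ["18-34", "18-34", "35-54", "55+", "55+", "55+"],
    "trust_rating": [3, 4, 4, 5, 2, 4],
})

# Assumed population shares for each age group (illustrative figures only,
# not official New Zealand benchmarks).
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Post-stratification weight = population share / sample share, so
# under-represented groups count for more and over-represented groups for less.
sample_share = responses["age_group"].value_counts(normalize=True)
responses["weight"] = responses["age_group"].map(
    lambda g: population_share[g] / sample_share[g]
)

# Compare unweighted and weighted mean trust ratings.
unweighted = responses["trust_rating"].mean()
weighted = (responses["trust_rating"] * responses["weight"]).sum() / responses["weight"].sum()
print(f"Unweighted mean: {unweighted:.2f}, weighted mean: {weighted:.2f}")
```

The closer the panel sample is to the population on characteristics like these, the closer the weights sit to 1 and the smaller the adjustment needed.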

Throughout the changes to Kiwis Count, maintaining the time series for trust in the Public Service and trust based on personal experience has been an important factor in each redesign. The survey has always included a list of services provided by government that is broader in scope than the departments, departmental agencies, and Crown agents identified in the Public Service Act. Although the list at one time included some services provided by local councils, the most recent iterations have specifically excluded local government, judges, and politicians from the definition of the Public Service.

Reporting on the headline Kiwis Count measures for the past ten years is available here.

OECD New Zealand country study

The OECD approached New Zealand following its response to the COVID-19 pandemic, seeking to understand more about what was driving trust. The resulting OECD country study, Drivers of Trust in Public Institutions in New Zealand, included a 2022 survey of New Zealanders that measured aspects of the OECD’s trust framework. Results from New Zealand and the other countries that participated in the survey are reported here.

The New Zealand survey identified demographic and socio-economic characteristics (e.g. ethnicity, education, income, age) as influences on trust, consistent with previous multinational OECD research.[4]

The strongest predictors of trust in the Public Service in New Zealand were perceptions of responsiveness of services, satisfaction with administrative services, and reliability, followed by integrity and fairness.
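Rankings of predictors like this are typically derived by regressing overall trust on standardised driver ratings and comparing the size of the coefficients. The sketch below illustrates that general technique on simulated data; the variable names, coefficients, and scales are assumptions for illustration, not the OECD’s model, data, or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated survey data: overall trust plus ratings of candidate drivers,
# all on a roughly 0-10 scale. Values are generated, not real survey responses.
rng = np.random.default_rng(0)
n = 500
drivers = pd.DataFrame({
    "responsiveness": rng.normal(6, 2, n),
    "satisfaction":   rng.normal(6, 2, n),
    "reliability":    rng.normal(6, 2, n),
    "integrity":      rng.normal(6, 2, n),
    "fairness":       rng.normal(6, 2, n),
})
trust = (0.35 * drivers["responsiveness"] + 0.25 * drivers["satisfaction"]
         + 0.20 * drivers["reliability"] + 0.10 * drivers["integrity"]
         + 0.10 * drivers["fairness"] + rng.normal(0, 1, n))

# Standardise the drivers so coefficient sizes are comparable, then fit an
# ordinary least squares model; larger standardised coefficients indicate
# stronger predictors of trust.
X = sm.add_constant((drivers - drivers.mean()) / drivers.std())
model = sm.OLS(trust, X).fit()
print(model.params.drop("const").sort_values(ascending=False))
```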

The OECD study found that New Zealand had higher trust than countries with similar levels of cultural diversity, and higher cultural diversity than other countries with similarly high trust levels. This apparent paradox highlights that each country’s unique context produces its own pattern of trust.

The OECD country study made a range of recommendations, including that the Commission should measure the drivers of trust. The present study was an exploration of how drivers of trust could be added to the quarterly Kiwis Count survey.


[1] More details of how this came about can be found here: https://citizenfirst.ca/our-story/who-we-are/history-of-the-icc

[2] For more details see https://citizenfirst.ca/our-work/research-and-publications/citizens-first

[3] For more details see https://publicservice.govt.nz/assets/DirectoryFile/Common-Measurements-Tool-updated-March2011.pdf

[4] Congruence between political affiliation and the current government in office is also predictive of trust in previous OECD research and would normally be controlled for in analysis, but due to political neutrality requirements the Public Service Commission (which funded the OECD research) is unable to measure political affiliation.