Overview and Rationale
1. Background and Purpose of the PSQS
1.1 Background
The problem:
Unlike universities' academic operations, the activities of non-academic university units are covered by few robust, readily available or common metrics, and there are no public data sources that might fill the gap. To help improve performance and drive up standards, the sector needs a straightforward set of measures that is sufficiently flexible to accommodate the contexts and structures of universities as a whole as well as of their individual administrative and service functions.
A solution:
The PSQS mission has been to design this set of measures so as to test service quality from a client/customer’s perspective. A core requirement of the survey design is that it can be readily adopted by other universities, adapted to their requirements and used to benchmark service quality within and across institutions.
The survey uses seven short, clear questions to poll all staff about each service they have had personal experience of in the preceding year. The resulting report provides insight into areas of strength as well as opportunities for improvement, and a means to promote and pursue high standards across the full range of professional services.
Timeline:
In 2013 The University of Nottingham launched the first survey of its entire staff to gather their opinions of, and experiences with, the professional services within the institution.
In 2014 Bristol and Cardiff participated in the exercise, and in 2015 the group was joined by Oxford and York. In subsequent years the participants varied, but there were usually five participating universities. By design the PSQS accommodates the structure of services in each university; the common question set allows for benchmarking and comparison of services across the institutions.
In 2016 Nottingham retained the services and support of SDA/GIDE as the primary partner for project management, technical design, web hosting, and analysis and reporting. SDA/GIDE is a member of, and abides by the standards of, the Market Research Society (MRS) and ESOMAR, and is registered with the Information Commissioner's Office as a Data Controller.
The survey will run again in 2020 with ten possible participants.
Logistical benefits:
The consortium approach allows for a fairly low-cost exercise, with the shared cost potentially reducing as numbers grow. The core costs (design, hosting, administration, basic reporting) for participants in 2018 were in the region of £4,000 (ex VAT), with an additional 'menu' of analysis and reporting options available on an institution-by-institution basis.
The straightforward purpose, survey design and process design, together with the readiness of current participants to share supporting materials (eg internal communications plans, lessons learnt, etc.), mean there is a relatively light burden on participating institutions.
1.2 Purpose
The purpose of the PSQS for its participants is to understand the quality of their support services, as used by academic units and other functions, and thereby drive continuous improvement.
Generating empirical evidence about service quality is fundamental. Perceptions of quality can (all too readily) be driven by anecdote, assumptions, political considerations, a vocal minority, misunderstanding or misinformed expectations. The PSQS provides a platform for collecting empirical data which contributes to a consistent and cumulative evidence-base of service quality over time.
It also provides participants with the means to benchmark their service quality data against the average of analogous services at a number of broadly similar institutions. In 2020 the PSQS will trial an anonymous 'Benchmark Beacon' (best in class) metric illustrating the 'art of the possible', intended to introduce the option of considering and exchanging best practice.
A final purpose of the PSQS design is to raise visibility and dialogue about service quality across an institution, the intention being to foster a better understanding and appreciation of such services while also creating an incentive and means to improve them.
1.3 Caveat
What the PSQS cannot do, and what is beyond its immediate purpose, is determine the causes of differences in perceived quality of services within an institution or between one institution and another, whether attributable to remit, resourcing, leadership, management, efficiency, or other such factors. Within an institution, it may be possible for leaders and senior managers to know or determine such causes and act on them. Looking across multiple institutions (or at averages of analogous services) this is not the case: for example, a particular service in one university may be differently funded, occupy a more advantageous position in the management structure, or be better led than in another. However, with the introduction of the Benchmark Beacon, outstanding results will be signposted and participating institutions are free to contact each other to better understand options for improving service provision.
2. Overall Approach and Methodology
The overall approach is straightforward:
- Each university works with SDA/GIDE to agree timings, content, customisations and any such matters prior to running the survey. Commercial terms (based on a common proposal and a menu of options) are between each university and SDA/GIDE.
- Although the question wording is identical across all institutions, as are the essential instructions, the survey is customised to each institution in terms of the set of units, their names or titles, their purposes, any appropriate distinctive terminology (for example whether such services are called 'professional services', 'support services' or 'central services'), the university's logo and brand colours, the introductory and thank-you pages, etc. The survey is therefore experienced as an entirely internal management tool.
- Over a period of about four to six weeks, typically between late April and mid-June, all staff at a university are surveyed across all levels and functions (academic and administrative). The exact timing and duration are up to each university.
- Staff members are asked to assess the services they received only on a direct, personal basis (ie not as a representative or on behalf of a team or unit they may be responsible for). The focus is on services received from a central Professional Service, though there is sufficient flexibility to accommodate some variations from that model.
- Upon opening the survey, respondents tick the box(es) in a list of services for those they have had direct contact with. Alongside each service unit they see a brief summary (about 10-15 words) of the capabilities and services provided, which both confirms the selection and educates. They are then asked exactly the same set of seven questions for each unit they selected (an illustrative sketch of the resulting data record follows this list).
- Each university is also fully responsible for its own internal communications, promotion and utilisation of the survey and results.
- Reporting of the results follows quite shortly after the survey closes, depending on the analysis and reporting options chosen by each participating university. Data is gathered about respondents’ roles (academic vs administrative, their unit, etc) so that analyses also reveal perceived variations in provision to other units.
- Benchmarked results follow once all universities have completed their run, so may be provided somewhat after internal institution level results are available. Nottingham can provide advice and guidance to ensure the correct alignment of units for the purposes of benchmarking analyses.
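As a purely illustrative aid to the points above on the per-unit question set and the respondent data gathered, the sketch below shows one way a single evaluation record might be represented. The field names, the Python representation and the rating values are assumptions for illustration only, not the actual PSQS or SDA/GIDE data specification.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical sketch only: field names and example values are assumptions,
# not the actual PSQS or SDA/GIDE data specification.
@dataclass
class ServiceEvaluation:
    respondent_id: str        # anonymised respondent identifier
    respondent_unit: str      # unit the respondent works in
    respondent_role: str      # eg "academic" or "administrative"
    service_unit: str         # the Professional Service being evaluated
    answers: Dict[str, int]   # the seven common questions, eg {"Q1": 4, ..., "Q7": 5}
    comments: str = ""        # optional open-text comment

# Example: one respondent's evaluation of a single service they selected
example = ServiceEvaluation(
    respondent_id="R0042",
    respondent_unit="School of Chemistry",
    respondent_role="academic",
    service_unit="Library Services",
    answers={f"Q{i}": 4 for i in range(1, 8)},
)
```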
3. Design Rationales
- Target population. During the original design of the PSQS we considered a range of options for defining the survey target population, from narrow (heads of large departments) to global, encompassing all university staff at all sites. Looking at the narrowest target population, Heads of Units are arguably more likely to have political agendas or to deal with Professional Services only when service issues are escalated or in response to crises. They may not reflect the experience of staff who use Professional Services on an ongoing basis and who may, for example, experience consistently high levels of service quality. Looking at the widest target population increases cost and complexity and may mean some staff have limited involvement with or awareness of the full range of services. To avoid the potentially skewed results of the narrow approach, the survey was designed for use across all staff in a university, while also incorporating means to reduce complexity and educate respondents over time.
- Respondent organisational knowledge. Respondents may have unclear, incorrect or incomplete knowledge about their own university. There may also be uncertainty or misunderstanding about which unit is responsible for a given function, and staff may be unaware of the full range of services. The PSQS was designed to avoid the effects of such factors and serves an educative function by providing, for each named unit, a concise explanation of the service it is meant to provide.
- Reporting and taking actions. Some universities provide results only to heads of services and the Senior Management Team while others make results fully available to all staff. The approach that is right or best for any university will depend on its culture and practice regarding such management performance tools and staff surveys. Participating universities are therefore in full control of how their results are shared within their institution and benchmarking does not allow for individual universities to be identified. Participants can prompt responses to results and actions as they choose.
- Benchmarking. Most universities maintain a similar range of capabilities and functions (from accommodation to research support to timetabling) but have different ways of organising, managing and delivering those functions. The PSQS was designed to allow benchmarking of functions regardless of local management structures.
4. Experiences to Date
The experience of each university varies as lessons are learnt each year, improvement plans are implemented and the PSQS (which may go by a different name at each university) becomes embedded in organisational culture and practice.
We can briefly summarise - at a high level - the Nottingham experience, where the survey has run since 2013:
- An invitation to participate from the Vice-Chancellor is sent to all University staff in late May, highlighting the value and importance of individual views in seeking the highest service quality levels.
- The response rate for the last four years has been about 15%, with about 1,000 responses received. Each respondent evaluates several units, and the total number of evaluations has been 5,000-6,000 per year - a substantial data resource.
- In October the results and reports are made available to all staff through a variety of means, including a Tableau viewer where staff members can see all results for any or all units (including open text comments), and compare those to results for all services and the benchmarked average for the analogous service at other participating universities.
- Heads of services are required to reflect on their own unit’s performance and to develop action or improvement plans as appropriate. The University Executive Board receives a report of outcomes, as well as summaries of the action plans agreed with Heads of Units.
- The PSQS results are also incorporated into other processes, for example Professional Service reviews, and have high visibility internally.
- The PSQS has increasingly become built into the organisational culture. It keeps Heads of Services mindful of the importance of a focus on service quality (and of how readily a lapse in that focus can be identified). For the Board (which receives results and action plans), it allows issues and problem areas - and indeed high-quality provision - to be readily identified, and gives a better sense of the 'overall health' of services and the organisation.
5. About the PSQS in 2020
The design of the survey will remain the same as in previous years so as to retain consistency and to allow for year on year comparability.
Design, hosting, project management, analysis and results reporting will continue to be provided by SDA/GIDE. Their own support services are consistently regarded by all participating universities as excellent, and the product as very good value for money.
Surveys on the SDA/GIDE platform are designed to be responsive to the device being used so will work on mobile devices such as tablets and phones. As the survey has some long response lists, completion on very small screens is not recommended.
SDA/GIDE surveys are hosted on servers in a secure data centre with regular backups and recovery procedures.
Once the survey is closed each institution’s data file will be downloaded by SDA/GIDE and quality checked (eg removal of blank submissions, duplicates, incompletes etc). The data will be transferred to SPSS or similar software for analysis. Participating universities may choose to have a copy of their data file in order to undertake analyses themselves.
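As an illustration of the kind of quality checking described above, the minimal sketch below removes blank, duplicate and incomplete submissions from a raw export. It assumes a CSV file with hypothetical column names (a respondent identifier plus seven answer columns); the actual SDA/GIDE file layout and checking rules may differ.

```python
import pandas as pd

# Hypothetical column names; the actual SDA/GIDE export layout may differ.
ANSWER_COLS = [f"Q{i}" for i in range(1, 8)]

def clean_survey_file(path: str) -> pd.DataFrame:
    """Remove blank, duplicate and incomplete submissions from a raw survey export."""
    df = pd.read_csv(path)

    # Blank submissions: no answers given at all
    df = df.dropna(subset=ANSWER_COLS, how="all")

    # Duplicates: exact repeat rows (eg accidental double submissions)
    df = df.drop_duplicates()

    # Incompletes: fewer than the full set of seven answers
    df = df.dropna(subset=ANSWER_COLS, how="any")

    return df.reset_index(drop=True)

# cleaned = clean_survey_file("psqs_export.csv")
# cleaned.to_csv("psqs_cleaned.csv", index=False)  # ready to load into SPSS or similar
```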
Potential Enhancements for 2020
To be discussed at the PSQS 2020 Kick-off meeting (see item 6, below) scheduled for mid-March 2020. Topics to include:
- The active marketing of the PSQS with a view to increasing the aggregate data set and thereby enhancing the value of the benchmarking process. This website forms part of that strategy.
- The inclusion of an anonymous 'best in class' indicator to provide potential beacons of excellence for participants. The intention is that these will have the potential to drive improvement through the exchange of best practice.
- A related development will be to include a means for universities wishing to engage in the exchange of best practice to contact the anonymous 'Benchmark Beacon' with a view to examining 'what best looks like'. It would be useful to discuss how this can be implemented.
Costs for 2020
The basic cost for running the survey in 2020 is £2,850 + VAT and includes:
- Online survey design and hosting
- Preparation and delivery of the survey data file for analysis
- Benchmark tables and charts in Excel/pdf format
A range of further analysis and reporting options is also available at additional cost, including, for example, pdf reports with an executive summary.
6. March 2020 PSQS Kick-off meeting
All those interested in participating in 2020 will be invited to a kick-off meeting to compare prior experiences and to share issues and ideas. Some points from previous meetings include:
- Survey timing, duration and response rates. The longest survey duration to date has been about one month. Most surveys start in early to mid-May, while the latest ran into early July. Response rates have varied from 14% to about 20%.
- Response rate management. Monitoring response rates actively, using the online facility provided by SDA/GIDE, is important. It shows when numbers begin to trail off, indicating the benefit of a reminder to prompt completion. It also shows which units may not be responding, suggesting (where communications were cascaded rather than sent globally) that the Head of the unit may not have forwarded the invitation or may not have given a sufficient level of encouragement or rationale for participating. Some institutions sent weekly reports to Heads of Units giving the number and percentage of their staff who had completed the survey. It may also be effective to send Heads a single report showing response rates across all units, so that units with lower response rates are additionally motivated (an illustrative sketch of such a per-unit report follows this list).
- Order of units evaluated. Following a discussion of possible order biases, it was agreed to change the survey so that, regardless of the order in which units appear in the selection list, the selected units are evaluated in random order.
- Respondent behaviours. SDA/GIDE captured data about how respondents proceed through the survey, which showed that respondents take 2-3 minutes to complete the opening questions (ie introductory text, indicating which unit they work in, the type of role they hold, etc.) and then about one minute for each service evaluated. The time taken per service decreases as the number of services evaluated increases, presumably as people become familiar with the questions. Unsurprisingly, the data showed much variation around these averages: people can be interrupted in the middle of completion, some will spend more time thinking about their answers and some will speed through the survey. The data also showed that only 13% of respondents completed more than 10 evaluations, with most institutions averaging 4-6 per person.
Using this, it is possible to include a statement in the survey along these lines: 'How long the questionnaire will take to complete depends on the number of services evaluated. Evidence from previous years suggests that on average it will take 4-15 minutes, but possibly longer if more than 10 services are evaluated.'
- Analysis, Reporting and Action Plans. There is wide variation in approaches to reporting, covering both institutional results and benchmarking. Most participants provide a tailored summary report to their Executive Board. Some, including Nottingham, also produce their own results explorer tools using Tableau to provide wider access to results; Nottingham gives all University staff access to all results for all units, including open comments. Most require some form of 'action plan' from service units, either by exception (eg for units below their relevant benchmark or the University average, or whose results are poor or declining) or universally.
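Finally, as an illustration of the per-unit response rate reporting discussed under 'Response rate management' above, the minimal sketch below combines completed responses with unit headcounts to give the number and percentage of staff in each unit who have completed the survey. The column names and figures are illustrative assumptions, not part of the SDA/GIDE monitoring facility.

```python
import pandas as pd

# Hypothetical inputs: column names and figures are illustrative only.
# responses: one row per completed survey, recording the respondent's unit.
responses = pd.DataFrame({"respondent_unit": ["Chemistry", "Chemistry", "History", "Law"]})

# headcounts: total staff per unit, eg from HR records.
headcounts = pd.Series({"Chemistry": 60, "History": 45, "Law": 30}, name="staff")

def response_rate_report(responses: pd.DataFrame, headcounts: pd.Series) -> pd.DataFrame:
    """Number and percentage of staff in each unit who have completed the survey."""
    completed = responses["respondent_unit"].value_counts().rename("completed")
    report = pd.concat([headcounts, completed], axis=1).fillna(0)
    report["completed"] = report["completed"].astype(int)
    report["response_rate_%"] = (100 * report["completed"] / report["staff"]).round(1)
    return report.sort_values("response_rate_%")  # lowest response rates first

# print(response_rate_report(responses, headcounts))
```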