Customer Scrutiny Panel's (CSP's) Review of the Customer Satisfaction Surveys





Date the report was presented to the housing management board: 18 June 2015

Review lead: Fiona Plumridge

Report author: Fiona Plumridge

CSP members directly involved in the review: Fiona Plumridge, Patrick Sichali, Doreen Howell, Ted Jones (part), Les Isaacs (part)

Executive Summary

The Customer Scrutiny Panel decided to complete a mini service review of customer satisfaction surveys at its annual planning day in May 2014. The 2014 STAR survey highlighted that satisfaction had decreased since the 2012 survey.


The panel recommends:

1. That the board agree the report and recommendations.

2. That the board task housing managers with completing an action plan to address the recommendations.

3. That housing managers report this improvement plan and their progress to the Customer Scrutiny Panel in six months and again in 12 months.







This report covers the mini review of customer satisfaction surveys. The review was carried out by five members of the Customer Scrutiny Panel (CSP).

The panel discussed and agreed what we wanted to investigate at an initial planning meeting and put together the service review notice that we served to Stevenage Borough Council in January 2015 to advise them of the review. The notice (see appendix 1) gave details of:

- The reason for the selected review

- Further information and evidence required

- Agreed team roles of the panel on this review

Customer satisfaction surveys are used to measure satisfaction with the Council's housing services. The results collected provide quarterly performance statistics for these services.



To establish whether current satisfaction survey results are used effectively to improve housing services and make a difference. From our findings, we suggest actions and changes to processes that may lead to an increase in customer satisfaction. Our objectives were as follows.

• Find out which satisfaction surveys are currently used and what questions are asked

• Establish who the surveys are aimed at

• Understand how surveys are distributed, the return rates, and how the council encourages customers to complete and return them

• Look into how results are recorded, what they are used for, and what is learned from them

• Investigate whether poor results are followed up by staff

• Determine if satisfaction survey results make a difference to services and are worthwhile



We used a range of methods to carry out the research for this service review.

Documents reviewed:

• Performance information - the recent STAR survey (the return rate was 1,000 of 4,000 sent)

• All satisfaction survey forms used by housing services

• Knowledge sharing with East Durham Homes, North Lincolnshire Homes, St Leger Homes and Sutton Housing Partnership

Other methods of investigation:

• Asked service managers for information on how their services use, record, and learn from their satisfaction surveys


3.3 After full discussion at meetings 2 and 3, it was agreed that the information already provided by staff was adequate and that nothing more could be achieved by interviewing staff.

The group met five times, from the kickstart meeting on 06/01/15 to the final meeting on 10/05/15. Berni O'Regan (Customer Focus Manager at SBC) was introduced at the 18/02/15 meeting as a critical friend.


4.1 There have been some general positive findings overall:

The panel feel that customer satisfaction forms, as a feedback mechanism, are a good idea.

However, a weakness is that some survey forms are currently not being used. Some services hold a monthly/quarterly prize draw for returned forms, as an incentive.

Some service area staff do follow up if residents comment on returned forms that they are not satisfied.

Again, a weakness is that this does not happen across all services, and it is unclear how it contributes to improving services.






The panel agreed and noted that housing services should encourage customers to return completed satisfaction forms. Customers' observations and remarks are very important to service delivery, and the council should emphasise that their comments really can make a difference to how services are provided.

Ensure that staff and customers understand why we do satisfaction surveys. It is not just a tick-box exercise.

The answers the panel received to the questions they asked staff were not always substantiated. It was not always clear what some services do with the satisfaction survey results and how staff felt the results could help them improve their service.

Feedback to customers is very important. The panel like the straightforward “You said, We did” response.


6.1 All survey forms to be up to date. The panel suggests that a new survey form be designed for all housing services to use, divided into two sections: section one asking 4/5 standard questions (the same questions to be used across all services) and section two asking service-specific questions.






Benefits: Section one provides a standard set of questions that would allow an even measure for monitoring performance across all services. Section two provides service-specific questions that would give a detailed measure for an individual service.

Risks: Effective questions would need to be clear and not leading.

Resources: There would be an impact on staff time to review and agree set questions. There could be a cost attached to reproducing survey forms.
Timescales: 6-9 months

6.2 All survey forms to use a five-point scale

Benefits: Preferred by SBC and used by the STAR survey.
Risks: The mid-point score does not really tell us anything.

Resources: There would be an impact on staff time to review and agree set questions. There could be a cost attached to reproducing survey forms.
Timescales: 6-9 months

1. Develop a standard procedure for the collection and recording of survey results.

2. Centralise a team within the council (i.e. an administration team) to which all survey forms/data are returned, where all responses are collated and recorded on the computer system.

3. The performance management team could access the system to monitor trends and results, or

4. Consider using an external company or social enterprise to collect satisfaction data, or

5. Consider using Resident Inspectors to carry out mystery shopping and report on their findings.

Benefits: A more robust and transparent system.
Risks: Set-up time.

Resources: Staff and resident inspectors would require training in the new procedures

Timescales: 9-12 months

Hold a monthly/quarterly/annual prize draw for customers who return satisfaction surveys as an incentive for them to return their forms.

Benefits: Increase in returns. Simple to introduce as a standard.

Risks: Would those who answer telephone surveys be included? If it is agreed to hold prize draws, how and where will this be publicised?

Resources: There is a cost attached to this: from whose budget would it come, and how much would the prize be?

Timescales: 6-9 months

The panel queried whether SBC have the IT capability for customers to complete online surveys (if they have an online account with SBC) or to automatically send a satisfaction survey link via email. Could a pop-up box appear on the website if they report something online?





Benefits: Returns could automatically be directed to a centralised team. This could reduce costs on postage and staff time

Risks: Not everyone has online accounts or computer access. Returns may decrease if customers do not complete them.

Resources: Staff time, and a cost attached to setting this up.
Timescales: 12 months

6.6 Create an app that can be downloaded to smartphones for customers to use.

Benefits: Could be a convenient and easy way for customers to interact.
Risks: May or may not increase returns.

Resources: Staff time and possible cost involved.
Timescales: 9-12 months

Invest in some iPads/tablets. These could be pre-loaded with satisfaction surveys and could be used by both staff and resident inspectors to survey customers.

Benefits: Cost savings on printing/postage; all satisfaction surveys could be pre-loaded and used by all services or resident inspectors.

Resources: Initial set-up costs and staff time.
Timescales: 9-12 months

Ensure that the satisfaction results are published in the Tenants' Annual Report and on the council website. Show satisfaction results on screen in the CSC. Consider how to disseminate satisfaction results throughout the year.

Benefits: Customers would have a range of ways to see how a service is performing and what has been done to improve the delivery of a service. It would also send a clear message that the council values its customers' views and opinions.


Resources: Staff time to implement.
Timescales: 6-12 months


7.1 Notice of Service Review

If you have any questions about this report please contact Fiona Plumridge on 01xxxxx or email



Appendix 1

Service Review Notice

Title: Review of how the housing service measures, reports, and utilises customer satisfaction survey results


• To establish what surveys take place

• To find out who the surveys are aimed at

• To verify what we learn from them

• To find out what staff think of them and what customers think of them

• To establish how the surveys are distributed, the return rates, how we encourage customers to return them, and how we can increase returns

• To investigate what we do with the results: where they are recorded, what they are used for, and what the key results were in the past six months

• To find out if surveys make a difference and improve services

• To investigate what could be done differently and make recommendations

Evidence used:

• Performance information reviewed at the Away Day in May 2014

• Information from the STAR survey results that showed a decrease in satisfaction

Further evidence required:

1. Copies of satisfaction survey forms

2. Survey questions and results, details of their survey returns, and what is done with the results from: Repairs; Gas; Caretaking/Grounds Maintenance; Investment; Supported Housing; Tenancy; ASB; Aids and Adaptations; Lettings

Project team and roles:

• Collate and evaluate information – All

• Interview staff – Doreen Howell and Ted Jones

• Interview tenants – tbc if needed (asked at meetings where residents were present)
• Benchmarking – All

• Draft report – Fiona Plumridge and Patrick Sichali

• Present report – All

• Scrutiny Panel Champion – Maureen Herdman

• Scrutiny Panel Champion for facilitating – Gill Laurence

• CSP critical friend – Berni O’Regan

Review start date: January 2015
Review finish date: May 2015


Signed by:

Chair of Scrutiny Panel: ……… (print name & sign)

Scrutiny Champions: ……… (print name & sign)