In October 1963, a group of young professionals came together to create a housing association and L&Q was born. Back then, the organisation was named Quadrant Housing Association to reflect the naval history of the borough of Greenwich, where it was based.

The initial investment saw 32 people each buy shares worth £2, and that £64 investment has now become a £12 billion social business that combines a dream to end homelessness with the entrepreneurial flair of city professionals.

Recent years have seen two significant mergers. In 2016, we merged with East Thames to create the largest provider in the country, and three years later with Trafford Housing Trust to address the need for affordable housing in the North West.


We now house around 250,000 people in more than 105,000 homes, primarily across London, the South East and North West of England. However, as a charitable organisation, our role goes beyond providing homes and housing services.

We are a long-term partner in the neighbourhoods where we work. The ongoing involvement in our communities is as important to us as it is to our residents. This year, our work to transform communities has continued, with another £8 million invested through the L&Q Foundation. This investment is helping our residents to lead independent lives, secure employment and transform the fabric of our neighbourhoods.

Getting feedback from customers

Choosing the right survey methodology and getting the question wording right are vital to gaining valuable feedback from customers. We have a page on the L&Q website that tells customers about the surveys we run, so if they want more information, or just want to check that a survey is genuine, they can find it there. Call centre staff can also refer residents to the page if they phone up with queries or concerns. Each methodology has pros and cons, which we've outlined below:

Automated SMS survey (back and forth style)

We send out an automated survey via SMS when the repair is closed. Response rates can be high and speedy, but the methodology is limiting as you can only really ask a couple of short questions. We use 'satisfied' or 'dissatisfied' responses, which leaves no room for nuance: the boiler may have been fixed, but if the resident is unhappy that the operative turned up late and left a mess, there is no way to capture that degree of unhappiness. We report on these results monthly.

Email surveys

We have an online survey distributed by email which captures more detail on the level of satisfaction via scaled questions, plus more detailed additional questions to help us diagnose any underlying issues with the service. We manually send ours on a weekly basis and review results monthly. This method is more detailed but less timely, as there is a lag between repair and response. It is an easy-to-manage, low-cost option, but it does give lower response rates.

There is also the option to run this automatically based on a repair trigger, to avoid the time lag between the appointment and the survey.
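To illustrate the event-triggered pattern, here is a minimal sketch of how a repair-closure trigger might queue surveys. All function and field names here are assumptions for illustration only, not L&Q's actual systems; the short delay after closure is one common design choice, avoiding surveying a resident whose job is reopened shortly after being marked closed.

```python
# Illustrative sketch only: a hypothetical event-triggered survey dispatch.
from datetime import datetime, timedelta

def surveys_to_send(closed_repairs, already_surveyed, delay_hours=24):
    """Return IDs of repairs whose survey should go out now.

    A repair qualifies once it has been closed for at least
    `delay_hours` and has not already been surveyed.
    """
    now = datetime.now()
    due = []
    for repair in closed_repairs:
        if repair["id"] in already_surveyed:
            continue  # never survey the same repair twice
        if now - repair["closed_at"] >= timedelta(hours=delay_hours):
            due.append(repair["id"])
    return due

# Usage with made-up data: repair 101 closed 30 hours ago, 102 only 2 hours ago.
repairs = [
    {"id": 101, "closed_at": datetime.now() - timedelta(hours=30)},
    {"id": 102, "closed_at": datetime.now() - timedelta(hours=2)},
]
print(surveys_to_send(repairs, already_surveyed={100}))  # [101]
```

A scheduled job running this check hourly or daily would remove the weekly batching lag while still giving the repair a short settling period.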

Telephone surveys

We originally ran a monthly event-triggered telephone survey via a market research agency. We received a good response rate and detailed answers, but the cost was high and, because we were running it monthly, there could be several weeks' delay between the repair date and the reporting date, which did not meet the needs of the business.

Similarly, you could run this in-house to achieve a higher response rate, but you would need the resource and time to make the calls to customers directly.

Variety

We currently run our annual ‘Tenant Satisfaction Measure’ (TSM) survey via a research agency using a variety of methods – mostly phone, but also email, SMS and postal. This is not event-triggered. There is an initial qualifying question that asks if the resident has had a repair in the last 12 months and only those that answer ‘yes’ go on to give a satisfaction score for the repairs service.
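The qualifying-question routing described above can be sketched as simple skip logic. This is an illustration of the pattern only; the question wording below is paraphrased, not the actual TSM questionnaire text.

```python
# Illustrative sketch of skip logic: only residents who report a repair
# in the last 12 months are routed to the repairs satisfaction question.

def repairs_questions(had_repair_last_12_months: bool) -> list[str]:
    """Return the repairs questions this respondent should be asked."""
    if not had_repair_last_12_months:
        return []  # skip the repairs section entirely
    return ["How satisfied are you with the repairs service? (1-5)"]

print(repairs_questions(True))   # one satisfaction question
print(repairs_questions(False))  # no questions asked
```

Keeping the routing rule explicit in one place makes it easy to report the repairs score against the correct base (only residents who had a repair).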

What did we learn from this feedback?

Customers tell us that, in general, they are satisfied with the repair service they receive from L&Q. This is based on feedback gathered both immediately after the repair and a few weeks after it was completed.

Comparing immediate feedback with feedback gathered a few weeks later, it is evident that satisfaction drops during those subsequent weeks. This is likely to be due to issues developing over time, e.g. a leak appears to be fixed initially but recurs after a week.
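Quantifying that drop is a simple comparison of the two satisfaction rates. The figures below are invented purely for the example; L&Q's actual results are not published here.

```python
# Illustrative calculation of the satisfaction drop between immediate
# and follow-up feedback, using made-up response data.

def satisfaction_rate(responses):
    """Share of respondents reporting 'satisfied'."""
    return sum(1 for r in responses if r == "satisfied") / len(responses)

immediate = ["satisfied"] * 85 + ["dissatisfied"] * 15   # day of repair
follow_up = ["satisfied"] * 78 + ["dissatisfied"] * 22   # 2-3 weeks later

drop = satisfaction_rate(immediate) - satisfaction_rate(follow_up)
print(f"Satisfaction drop: {drop:.0%}")  # Satisfaction drop: 7%
```

Tracking this gap over time, rather than either score alone, is what reveals problems such as repairs that fail after a week.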

The key issues relating to the repair service are:

  • Repair not fully completed - the learnings relate to better diagnosis up front, better-stocked vans, building with standardised parts, and allowing time slots of an appropriate length, so that the correct type of operative can attend with the relevant parts and tools and has sufficient time to complete the repair.
  • Poor quality of repair - as above, we need to ensure that the correct parts and materials are available and that sufficient time is allocated for the repair. In addition, it is vital that operatives have the appropriate skill level for the job and that monitoring is in place to check the quality of the work carried out.
  • Having to contact us multiple times about a repair - the learnings here are the need for better communications so the resident is aware of the status of their repair request; an online self-service tool giving the resident access to the details of their request; and fail-safe mechanisms so that, when a repair cannot be completed on the first visit, the relevant follow-up visit is booked and communicated to the resident.

What do we do with the feedback?

The customer satisfaction survey results are fed back to the repairs team for review.

Where the resident raises an issue via the immediate SMS survey, the repairs team pick up that query and follow up with the resident as appropriate to address any dissatisfaction.

The results from the surveys are used by the repairs team to determine what improvement actions will be included in their annual ‘Run The Business’ (RTB) plans.


What did we learn?

  • Customer satisfaction scores and feedback will differ greatly depending on whether it is gathered as part of an event-triggered survey or a perception survey
  • Methodology and approach are important – as well as the issues mentioned above, the channel itself can affect scores: respondents tend to give higher scores when speaking to someone on the phone than when typing a response into an online survey
  • As mentioned earlier, question wording and response options are important to ensure you obtain the level of detail and depth of understanding needed
  • It is also important to determine up front what you want from a survey, e.g. immediate resident-specific feedback, vs. reflective but still resident-specific feedback after 2-3 weeks, vs. high-level strategic feedback for planning and business improvement purposes
  • It is useful to view customer satisfaction survey results alongside other internal business data such as call centre or complaints data to provide context and a deeper understanding of the issues.
