ARTICLE

Vol. 138 No. 1616

DOI: 10.26635/6965.6721

Clinician feedback for bi-annual quality improvement reports generated by the Prostate Cancer Outcomes Registry Australia and New Zealand


The Prostate Cancer Outcomes Registry (PCOR) is an Australian and New Zealand (ANZ) registry that collects health and quality of life information from people with newly diagnosed prostate cancer. The New Zealand arm of the registry (PCOR-NZ) has been operational since 2016.1 Data from PCOR-NZ and the six Australian jurisdiction-based PCORs feed into PCOR-ANZ and are used to create standardised benchmarked measures, which are collated into quality improvement (QI) reports. QI reports from PCOR-ANZ are generated by Monash University every 6 months and distributed by the local jurisdictions to participating hospitals and clinicians as confidential reports. In New Zealand, “hospital reports” are distributed to the relevant clinical director, and “clinician reports” are distributed to all clinicians who have signed an agreement to participate in the registry.1,2 While urology QI reports have been distributed in New Zealand since PCOR-NZ’s inception, reports specific to radiation oncology departments and radiation oncologists were distributed only from the end of 2021. The urology QI reports consist of 12 quality indicators that were developed by an international multidisciplinary team using a modified Delphi process and approved by the PCOR-ANZ steering committee.2 Currently, in New Zealand, QI reports are sent as PDF files via a secure file transfer protocol for download.

Since 2019, all New Zealand public hospitals have participated in PCOR-NZ.1 Data from these hospitals, in conjunction with around 60% of private clinics, now provide a national repository of information about the health and wellbeing of men diagnosed with and treated for prostate cancer. QI reports combining 3 years of data and comparisons across the full public hospital system have been available since 2022.

The PCOR aims to improve clinician performance and prostate cancer outcomes for patients in Australia and New Zealand.1 The confidential reports from PCOR-NZ provide clinicians with personalised, risk-adjusted performance data based on the patients that clinician has either diagnosed or treated. This allows them to track their patients’ quality of life outcomes over time, identify areas of weakness and compare their performance with the overall performance of clinicians across Australia and New Zealand. Additionally, reports can be used by clinicians and departments to implement changes in clinical practice, assess the impact these changes have made or act as external audits to ensure clinicians are providing the best possible care. Clinicians are not permitted to use the report information for advertising or marketing.

This study aims to: 1) assess clinician perspectives on methods of report distribution, 2) assess the clinical value and utility of the PCOR reports from the perspective of New Zealand urologists, and 3) identify any barriers that may prevent clinicians from engaging with the reports. This is the first study of its kind in Australia and New Zealand.

Methods

All consultant urologists practising in New Zealand who were receiving QI reports from the PCOR-NZ registry in December 2022 were invited to complete a survey. Semi-structured interviews were also conducted with clinicians who volunteered to give additional feedback. As only a small number of radiation oncology reports had been distributed by the time of this study, the survey and interviews focussed on urologists and urology reports only.

The authors identified 54 consultant urologists practising in New Zealand, 49 of whom were receiving scheduled QI reports. The five urologists excluded from this study either did not contribute to the registry, rarely diagnosed or treated prostate cancer or had not consented to participate in the QI reporting programme.

The survey consisted of seven sections, each assessing the clinicians’ opinion on different aspects of the QI reports.

Section one gathered clinical practice information from participating urologists.

Section two investigated the effectiveness of QI report distribution, frequency, accessibility and notification, as well as clinicians’ preferred methods of receiving reports, and the devices they used to access reports.

Section three focussed on identifying any concerns related to data privacy and security.

Section four inquired about the length, layout and presentation of data in the reports.

Section five asked clinicians to rank the clinical value of items within the QI reports.

Section six evaluated clinicians’ report utilisation, including discussion with colleagues, auditing practice, identifying areas for improvement, monitoring patient outcomes, performance tracking and peer comparison.

Section seven gauged the value of reports in auditing and improving clinical practice.

In December 2022, emails were sent to eligible urologists, outlining the aims of the project and encouraging them to participate. Each email contained a personalised URL link to the survey and an invitation to participate in an interview to further express their thoughts on the QI reports.

Between December 2022 and March 2023, reminder emails were sent fortnightly to clinicians who had not responded.

Semi-structured interviews with volunteer clinicians were conducted over Zoom, with an average duration of 25 minutes. A list of questions was used to guide the interviews; however, participants were encouraged to freely express their thoughts if they were relevant to the project aims. With clinician consent, Zoom audio was recorded and used to generate transcripts that were then reviewed by the lead author.

PCOR-NZ provided information on report access (i.e., whether each participating clinician had opened the PDF report sent via email) for all participating clinicians. Clinicians signed an agreement to participate in PCOR-NZ, and this study was undertaken as part of quality improvement activities to ensure that the quality and method of distribution of the reports were acceptable, appropriate and of value to participating clinicians. Additionally, clinicians consented to participate in this study when they completed the survey.

Data analysis

Survey data were analysed using frequency distribution tables and percentage calculations. Interview data were assessed by thematic analysis using interview transcripts to identify patterns and group responses into related themes.
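To make the tabulation step concrete, the sketch below shows in Python how categorical survey answers can be turned into a frequency distribution with percentages. It is purely illustrative: the function name and the example responses are invented for demonstration and are not study data.

```python
from collections import Counter

def frequency_table(responses):
    """Tally categorical survey responses into counts with whole-number percentages."""
    counts = Counter(responses)
    total = len(responses)
    return {option: (n, round(100 * n / total)) for option, n in counts.items()}

# Invented example responses to a single survey item (not study data).
answers = ["PDF", "PDF", "Website", "PDF", "App", "Website"]
print(frequency_table(answers))
```

Each survey item in this study was summarised in essentially this way: a count per response option alongside the percentage of respondents selecting it.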

Results

Response rate

Thirty-three of the 49 (67%) urologists who participate in PCOR-NZ completed the survey, and four volunteered to be interviewed to provide additional feedback. Among the 33 responding urologists, 27 (82%) had opened their most recent QI report (Table 1). Eight (50%) of the 16 clinicians who did not complete the survey had also accessed their most recent QI report. Overall, 77% (27/35) of urologists who had viewed their last report completed the survey.

View Table 1–2, Figure 1–2.

Survey responses

Most respondents (76%) had over 10 years of experience as a consultant urologist, 73% had a sub-specialty interest in prostate cancer and 88% performed radical prostatectomy as a treatment option for prostate cancer.

All clinicians reported that PCOR-NZ appropriately notifies them whenever a new QI report is released and agreed that sending the QI reports every 6 months is appropriate. Most (91%) could access reports easily. The most popular devices used for viewing reports were laptop (70%), phone (45%) and work desktop (36%).

Most clinicians (91%) expressed no concerns about the privacy and security of their data.

Some clinicians (42%) found the reports too long, but most (91%) indicated that the data were presented clearly and appropriately. Sixty-one percent of clinicians reported reading the reports in their entirety; 25% took less than 15 minutes to read the reports, 55% took 15–30 minutes and 20% took more than 30 minutes.

Half (52%) of clinicians discussed their reports with colleagues, 12% used them in formal meetings and 73% read them as individuals. It should be noted that clinicians could choose more than one response for this question.

Clinicians primarily used the reports to audit their practice (73%), identify areas that need improvement (88%), track their performance over time (73%) and compare their performance with colleagues (73%).

Overall, most clinicians found the QI reports valuable to improve their practice (70%).

Methods of QI report distribution

Sending the reports as PDFs was the most popular medium among clinicians, followed by the option of an interactive website, then a mobile app and finally hard copy printouts (Figure 1).

Clinical value of items in the QI reports

Figure 2 shows that the items clinicians considered most valuable in the reports were: Treatment: positive margins in pathological T2 stage (pT2) disease (97% found this valuable); Patient outcomes: sexual bother and function (97%); Treatment: positive margins in high-risk disease (94%); Patient outcomes: urinary bother vs incontinence (94%); Treatment: positive margins in intermediate disease (88%). Additionally, Patient outcomes: urinary bother vs obstruction (70%); and Patterns of your treatment (70%) ranked well among clinicians.

Conversely, the items clinicians found least valuable were: Clinical: prostate-specific antigen (PSA) recorded, Clinical T stage recorded, PSA post radical prostatectomy (RP) (42% found this valuable); and Patient outcomes: bowel bother and function (36% found this valuable).

Discussion

QI reporting is widely used in healthcare to improve quality of care.3,4 The results from systematic reviews over the past 2 decades consistently suggest that QI reporting “generally leads to small but potentially important improvements in professional practice.”5,6 Lingard et al. conclude that surgeon-level QI reporting fosters accountability, enhancing both patient safety and surgical performance.7

This is the first study from PCOR-ANZ to assess urologists’ perspectives on QI reports. It identifies areas that can be improved and offers insight into whether QI reports are useful for monitoring and improving clinical practices.

Aim 1) Assess clinician perspectives on methods of report distribution

Our data suggest that most urologists engage with these reports, with all participating clinicians receiving appropriate notification of new report generation, and over 80% reading their most recent report. Given most clinicians could easily access their reports, we can conclude that distribution via downloadable PDFs is effective—this is reflected in Figure 1, where clinician preference is for PDFs over other forms of report distribution. However, there is interest in more interactive forms of data distribution, such as a website or app. Upon discussing this in interviews, clinicians made it clear they do not think these should replace PDFs but rather supplement them, giving interested clinicians the option to engage further with their data. Interviewed clinicians universally valued the standardisation and benchmarking of current reports as they knew that all clinicians were compared using the same metrics. They hypothesised that most clinicians would not feel the need to engage with their data beyond what was provided in the reports (due to their comprehensive nature) but acknowledged that for those interested, the ability to do so would be valuable.

Additionally, Figure 1 highlights the clear preference for digital over physical methods of report distribution, with hard copy reports being ranked least preferred by most clinicians. When discussing this in our interviews, clinicians stated that they prefer digital reports, citing ease of access and the ability to keep multiple years of QI reports stored in one location. Some Australian jurisdictions within PCOR-ANZ distribute their reports to clinicians via hard copy mail. We can hypothesise that the results of this study may be generalisable to the Australian jurisdictions due to the likely similarities in populations (consultant urologists participating with PCOR); therefore, we suspect a switch to electronic report distribution would be positively received by Australian clinicians.

Aim 2) Assess the clinical value and utility of the PCOR reports

Clinicians find the reports useful for improving their clinical practice. Over 80% of clinicians use the reports to identify potential areas of improvement, and over 70% use them to audit and compare their performance to colleagues. With regards to the contents of reports, there are items that clinicians clearly found useful, and there are items that are more contentious. The top eight items, as displayed in Figure 2, are clearly valued, while the bottom two items are more uncertain, being valued by only 30–40% of clinicians. This tells us that, overall, clinicians find the items included in reports valuable; however, there is still room for improvement. One interviewed clinician stated there would be benefit in conducting a revised Delphi panel to update report items based on this study data. In particular, they would like to see the use of magnetic resonance imaging (MRI) and prostate-specific membrane antigen (PSMA) scanning for staging included in the reports, as these have become a larger part of the clinical decision-making process since the last Delphi panel in 2015.1

One clinician stated that they would like to see data for radical prostatectomies separated into open, laparoscopic and robotic subcategories. They highlighted that clinicians in larger urban centres likely perform more robotic surgeries than clinicians in smaller centres. Having the data show performance based on operation method, rather than performance overall, would be useful to eliminate any ambiguity as to whether the method of surgery affects overall performance. This would allow clinicians to assess whether they are better/worse at one type of surgery compared to the total pool of clinicians—highlighting in more detail areas of strength, and areas that require improvement. It is important to recognise that robotic vs open prostatectomies may have different patient outcomes, and that an individual’s QI data would reflect this.8 Clinicians could compare their own robotic vs their own open prostatectomy data to see discrepancies in performance based on method of surgery, and could compare their robotic and open data to the overall clinician data to assess their open/robotic performance compared to their peers. In other words, this would allow clinicians to answer two questions: “How does my open/robotic performance compare to my peers?” and “How much of a discrepancy is there between my own open and robotic performance?”.

Another point raised by clinicians was what to do if the reports identified that they were underperforming in any areas. Clinicians stated they would first scrutinise the data to see if there was a statistical explanation for the result (e.g., a low sample size making results less precise). If the result was likely to be accurate, clinicians expressed interest in seeking advice from others who performed well in this area. However, because the data are anonymised, clinicians would not know whom to contact beyond colleagues within their own departments who were willing to discuss results. One clinician indicated that being able to contact PCOR-NZ to discuss their data was helpful. An opportunity to be linked with top-performing clinicians who might be willing to offer advice to those who sought it was suggested as a tool to support improvement.

Aim 3) Investigate barriers that prevent engagement with reports and the registry

Clinicians commonly cited length of reports as a barrier to engagement—with reports being approximately 50 pages long. One interviewed clinician stated that the reports had too much unnecessary information, making the important data difficult to find: “The way the information is written and displayed is quite confusing … well, it’s not confusing, but it takes a lot to get through the information.” Another clinician stated: “I think the reports are too long for everyone to look at. But I think if you’re engaged enough to look at it, they’ve got a good amount of data that’s there.” Clinicians felt that the reports would benefit from having a summary of key information available at the beginning. It is apparent that 80% of clinicians spend less than 30 minutes reading their reports, so it is important to ensure that the reports readily provide key information to readers.

Conclusion

QI reports are effectively distributed to urologists, with good clinician engagement. PDFs are preferred; however, there is interest in having additional interactive forms of data distribution. Reports are useful for auditing and improving practice, with most items deemed valuable. Suggested feedback to improve reports includes differentiating between open, laparoscopic and robotic subcategories; updating the report items via a revised Delphi panel; and having a pathway for clinicians to contact others for peer review and advice. Report length appears to be a barrier to clinician engagement; most clinicians found that the data were presented clearly and appropriately, but there was a need for more succinct presentation of key results. The findings of this study may be generalisable to other types of reports generated by PCOR-NZ, and there may be value in a similar review being undertaken by the PCOR jurisdictions across Australia to ascertain the views of clinicians there in relation to their reports.

Strengths and limitations

A high proportion of urologists who participate in PCOR-NZ responded to the survey. However, the study has several limitations. First, it excluded New Zealand radiation oncologists and Australian clinicians, who also receive PCOR QI reports and may have differing views. Second, clinicians who volunteered to be interviewed are likely to be more closely engaged with PCOR and may be more likely to give positive feedback. Finally, interviews were conducted with only four clinicians, meaning their feedback may not reflect the views of all clinicians.

Aim

This study aims to 1) assess clinician perspectives on methods of report distribution, 2) assess the clinical value and utility of the Prostate Cancer Outcomes Registry (PCOR) Quality Indicator (QI) reports for New Zealand urologists, and 3) identify barriers impacting engagement with these reports.

Methods

PCOR-ANZ provides 6-monthly QI reports to participating clinicians and hospitals. New Zealand urologists receiving scheduled reports were surveyed digitally. Interviews were conducted for qualitative feedback.

Results

Thirty-three of 49 (67%) eligible urologists participated in this study. All clinicians (n=33) reported being notified when new QI reports were released; 42% (n=14) found the reports too lengthy. Seventy-six percent (n=25) and 70% (n=23) found the reports valuable for auditing and improving their practice, respectively.

Conclusion

Report distribution and data presentation are effective. PDFs are preferred by clinicians, but proposed interactive mediums were received positively. Reports are valued for auditing and improving practice. Report length and clinician time constraints are key barriers affecting engagement. A revision of the items included in QI reports would be beneficial to reflect modern practice. There is demand for a pathway to allow clinicians to contact others for peer review and advice.

Authors

Dr Andreas S Nicolaou: House Officer, University of Otago, Christchurch, New Zealand.

Dr Eng Ann Toh, MB ChB, PhD, DPH, MPH: Urology Clinical Research Registrar/Public Health Medicine Registrar, Department of Urology, Christchurch Hospital, Health New Zealand – Te Whatu Ora Waitaha Canterbury, Christchurch, New Zealand.

Judith Clarke, BA, DPH, MPH: Prostate Cancer Outcomes Registry New Zealand (PCOR-NZ) – National Manager; Centre for Health Outcomes Measures New Zealand (CHOMNZ), Christchurch, New Zealand.

Mr Stephen Mark, MB ChB, FRACS, FMGEMS: Prostate Cancer Outcomes Registry New Zealand (PCOR-NZ) Clinical Lead; Consultant Urologist, Department of Urology, Centre for Health Outcomes Measures New Zealand (CHOMNZ), Health New Zealand – Te Whatu Ora Waitaha Canterbury, Christchurch, New Zealand.

Associate Professor Phil Hider, MB ChB, MRNZCGP, FNZCPHM, FAFPHM, PhD: Professor of Population Health/Public Health Physician, Department of Population Health, University of Otago, Christchurch, Christchurch, New Zealand.

Acknowledgements

We would like to thank Movember (funders of PCOR-ANZ), and in particular Emma Todd (Manager, Clinical Quality Improvement at Movember) for her assistance in setting up the study. We would also like to thank our sponsor, the Canterbury Urology Research Trust (CURT) for funding this study, and the Centre of Health Outcomes Measures New Zealand (CHOMNZ) for their assistance throughout the study. Finally, we would like to thank all the urologists who gave their time to complete our survey and participate in interviews; their feedback is greatly appreciated.

Correspondence

Andreas Nicolaou: Department of Urology, Christchurch Hospital, Private Bag 4710, Christchurch 8140.  

Correspondence email

anicolaou2001@gmail.com

Competing interests

SM is the clinical lead for the New Zealand Prostate Cancer Outcomes Registry (PCOR-NZ) and Chair of CHOMNZ, the charitable trust that runs the registry.

PH is the PCOR-NZ Steering Committee Chair.

All sources of funding for this project come from the University of Otago Summer Research Scholarship Programme (Christchurch), and the Canterbury Urology Research Trust (CURT).

1)       Mark S, Clarke J, Shand B, et al. Setting up the Prostate Cancer Outcomes Registry of New Zealand: reflecting and influencing clinical practice. N Z Med J. 2021;134(1546):79-88.

2)       Nag N, Millar J, Davis ID, et al. Development of Indicators to Assess Quality of Care for Prostate Cancer. Eur Urol Focus. 2018;4(1):57-63. doi: 10.1016/j.euf.2016.01.016.

3)       Navathe AS, Emanuel EJ. Physician Peer Comparisons as a Nonfinancial Strategy to Improve the Value of Care. JAMA. 2016;316(17):1759-60. doi: 10.1001/jama.2016.13739.

4)       van der Veer SN, de Keizer NF, Ravelli AC, et al. Improving quality of care. A systematic review on how medical registries provide information feedback to health care providers. Int J Med Inform. 2010;79(5):305-23. doi: 10.1016/j.ijmedinf.2010.01.011.

5)       Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;2012(6):CD000259. doi: 10.1002/14651858.CD000259.pub3.

6)       Ivers NM, Grimshaw JM, Jamtvedt G, et al. Growing literature, stagnant science? Systematic review, meta-regression and cumulative analysis of audit and feedback interventions in health care. J Gen Intern Med. 2014;29(11):1534-41. doi: 10.1007/s11606-014-2913-y.

7)       Lingard MCH, Teo Y, Frampton CMA, Hooper GJ. Effect of surgeon-specific feedback on surgical outcomes: a systematic review of the literature. ANZ J Surg. 2024;94(1-2):47-56. doi: 10.1111/ans.18772.

8)       Moretti TBC, Magna LA, Reis LO. Surgical Results and Complications for Open, Laparoscopic, and Robot-assisted Radical Prostatectomy: A Reverse Systematic Review. Eur Urol Open Sci. 2022;44:150-61. doi: 10.1016/j.euros.2022.08.015.