Table 3 Challenges and factors for improvement to consider when implementing visual individual PROMs feedback in clinical practice

From: Visualization formats of patient-reported outcome measures in clinical practice: a systematic review about preferences and interpretation accuracy

| Challenges that may hinder interpretation of the graphic visualization format | Possible factors to improve interpretation of the graphic visualization format |
|---|---|
| **Patients and clinicians** | |
| Directional inconsistency in longitudinal data (i.e., a higher score can sometimes mean a better and sometimes a worse outcome) | Use standard descriptive labels (consider using 'better'* instead of 'normed'** or 'more'*** to describe the directionality of scores) [34, 37] |
| | Such labels were preferred by 79% of patients and 90% of clinicians for individual-level PROMs data, and by 100% of clinicians for group-level PROMs data |
| | Use ratings consistently, so that a higher score is always a better result (in some frequently used PROMs, higher scores are better when scores describe functioning, but lower scores are better when symptom burden is described, which causes interpretation challenges) [37] |
| | Indicate with an arrow on the y-axis which direction means a better score [16] |
| | Describe directionality in plain text that is understandable regardless of literacy or education level [5] |
| | Provide detailed information on the meaning of high and low scores [17] |
| Interpretation accuracy of which PROMs information is represented in the graphic visualization format | Provide an instructive aid for patients and clinicians [2] |
| | Use simple iconography to demonstrate single PROMs values [25] |
| | Use brief definitions of the different PROMs domains/values [25] |
| | Limit the number of symptoms presented per graphic visualization format [35] |
| No 'one-size-fits-all' solution | Use a dynamic dashboard that can display multiple types of visualization strategies, giving users the ability to select their preferred format and to add or remove dashboard elements such as error bars and shading [3, 6, 28] |
| | Develop a clinic-based video tutorial that explains what the dashboard shows and how patients and clinicians can customize it to their needs [28] |
| **Patients** | |
| Interpretation accuracy of which PROMs information is represented in the graphic visualization format | Ask patients to prioritize their symptoms, to avoid information overload [35] |
| Timing of feedback on the PROMs visualization | Provide feedback immediately after assessment and before the consultation; combined graphical and tailored text-based feedback then significantly improves the assessment experience [14, 26] |
| Patients' opposition to PROMs use in clinical practice | Ask patients for permission before providing their own results and/or the results of the general population [19] |
| | Provide information so patients know what PROMs data might show and how clinical practice might change [36] |
| | Tell patients that the data are trustworthy and handled confidentially [19, 36] |
| | Do not provide feedback anonymously [19] |
| | Visualize as transparently as possible what type of care is delivered [19] |
| **Clinicians** | |
| Interpretation accuracy of which PROMs information is represented in the graphic visualization format | Either eliminate comparison groups or present comparison group scores with confidence intervals or error bars, which makes scores easier to understand and helps clinicians counsel patients about their score [3] |
| | Link the PROMs outcome scores (the scale in the graphic visualization format) to their narrative meaning (i.e., tell the patient that a higher score on the scale means better functioning) [35] |

  1. PROMs: patient-reported outcome measures
  2. *'Better' is defined as higher scores indicating "better" outcomes
  3. **'Normed' is defined as normed to the general U.S. population
  4. ***'More' is defined as higher scores indicating "more" of what was being measured