
Table 2 Example quotes for the top five most prevalent categories mapped to each CFIR domain (including ties)

From: Facilitators and barriers to implementing electronic patient-reported outcome and experience measures in a health care setting: a systematic review

Each entry gives the category (facilitator (F) or barrier (B)), the category prevalence (no. of studies), and example quotes.

Domain 1: Intervention characteristics

Key attributes of interventions that influence the success of implementation

 ePROMs/ePREMs monitor changes in patients (F) (12 studies)

PROMs also allowed physicians to “track any symptom over time from visit to visit”. [30]

 Graphical visualisations of ePROM results to see trends (F) (10 studies)

"Graphical overview over time: clear" […] "Graphical aspect is useful" […] "Nice trend of the wellbeing over time". [8]

 ePROMs time burden (length, repetition, or timing) (B) (10 studies)

"Sometimes patients seem a bit overwhelmed by having to answer all of the questions and the broad scope of it. Some of these folks might be better off just skipping it". [24]

 ePROMs/ePREMs facilitate extracting information that might be overlooked or not be uncovered in consultation (F) (10 studies)

"During follow-up setting very welcome especially for adolescent who then often struggle with return to society and quality of life problems emerge more often" … "Information that becomes accessible with children that have difficulties talking" […] "With select group of patients where contact does not run smoothly". [8]

 Lack of reliable and robust software and hardware (B) (8 studies)

Technical aspects: "It takes effort to log in" […] "I do not receive an automatic message when patients have completed PROMs" […] "I have to print the ePROfile, because we do not have computers in the consultation room". [35]

 User-friendly software and technology (F) (8 studies)

"I found the care monitor platform user-friendly". [31]

 Real-time access to ePROMs/ePREMs completion status and results data prior to, and during, consult (F) (8 studies)

Clinicians have real-time access to the results, which are graphically displayed inside the hospital information system. [33]

Domain 2: Outer setting

Includes the features of the external context or environment that might influence implementation

 ePROMs/ePREMs amplify the patient's voice, facilitate patient-centred care and shared decision-making (F) (13 studies)

"For those who want another way to voice their experiences, it’s fantastic because a lot of people... you know, by the time they kind of come to us within their journey of health care and transitioning through the disease process, a lot of people don’t feel like they’ve been listened to". [25]

 ePROMs allow patients to better communicate and prioritise in clinic visits (F) (12 studies)

Focuses the session. The use of the measures provided a focus for short-term work: “I found if when I asked a question about ‘which question stood out for you’, not regarding the score so much, then we can talk about it in that way and bring focus to that and that was helpful”. [29]

 Completing ePROMs/ePREMs can be difficult for patients with low language or computer literacy (B) (9 studies)

The PRO assessment was only available in English, which excluded patients who were non-English speakers. [30]

 Completing ePROMs/ePREMs can be difficult for patients with physical or cognitive impairments (B) (8 studies)

In addition, patients' disease characteristics and cognitive, intellectual, and visual impairments also influenced patients' interest in using the ePSRM tool. “So, people that [have] developmental disabilities of course … they are not fully comprehending what the questions are trying to ask … so patients with dementia, patients with developmental disabilities …”. [28]

 Patients not aware of the purpose of ePROMs; needing to have the rationale explained to them (B) (7 studies)

Patients cited multiple reasons as to why PROs had not been completed, with most citing a lack of understanding regarding their purpose. [41]

Domain 3: Inner setting

Includes the features of the implementing organization that might influence implementation

 Regular training and education to build staff capacity and confidence with the ePROM system (F) (13 studies)

Many clinicians were not sufficiently familiar with Skindex-16 and its scoring and recognized that “training would help [to] understand it better.” Clinicians who were familiar with the tool were unsure how to use individual patient scores clinically. [43]

 Integrating ePROMs/ePREMs into existing workflow routine or reconfiguring workflow to ensure integration of ePROMs (F) (14 studies)

Reconfigure workflow to ensure integration and access to PRO reports at clinical encounter. [36]

 Staff or volunteers to facilitate ePROM collection (F) (10 studies)

Trained volunteers can assist patients with filling out the questionnaires before the doctor's visit. [33]

 Buy-in of leadership (F) (9 studies)

Engaged leadership and a willing champion within each individual practice (e.g., quality improvement leader or office manager) helped to maintain momentum, to demonstrate the value of the data for improving quality of care, and to provide audit and feedback to providers and staff. [7]

 Burden on staff facilitating ePROM collection (B) (8 studies)

The greatest burden was placed upon the registration staff, having to explain the purpose of the survey instruments and the importance of collection at each (and every) office visit. [9]

 Time-consuming and too many ePROMs/ePREMs (B) (8 studies)

Reasons why clinicians were not satisfied with the PROMs included too many PROMs. [35]

 ePROMs increase time efficiency for clinicians in the interview process and documentation (F) (8 studies)

Saving time and efficiency was also commonly mentioned. The implementation of the ePSRM helped reduce clerical time for both clinicians and nurses. [28]

Domain 4: Characteristics of individuals

Characteristics of the individuals involved that might influence implementation

 Improved prioritisation and targeting of patient-clinician communication (F) (13 studies)

Using PROs enabled psychologists to “know patient concerns upfront” and have “more targeted conversations with patients”. [30]

 Not sure how ePROMs/ePREMs can inform clinical decisions (B) (10 studies)

The new users to CORE-Net only used the scores to inform them how to proceed in a limited way, for example, in terms of whether to continue work with the clients or to discharge them. They felt quite strongly that the ‘relationship with the client’ and their ‘own subjective measure’ based on their own experience is what informs and ‘influences’ the therapy process and their work, rather than the scores. [29]

 Believing ePROMs/ePREMs not suitable/relevant/valuable (B) (9 studies)

Implementation was least successful when physicians did not find PROs valuable. [38]

 Belief that ePROMs/ePREMs are not clinically valid or lack accuracy (B) (5 studies)

Physicians were sceptical about the validity of PRO scores due to “little research around it [PRO] to help frame decision making or thoughts around it”. [30]

 Belief that ePROMs/ePREMs duplicate the clinical interview so are redundant (B) (5 studies)

Clinicians also expressed ambivalences in meso-level purposes of PROMs and PREMs use, asserting that their palliative expertise already encompassed routine conversational and observational quality of life assessments, and engendered robust interdisciplinary communication. [25]

 Buy-in of clinical staff (F) (5 studies)

Individual interviews with nurses and physicians suggested that […] physician buy-in was key to successful PRO implementation. [38]

 Clinicians' lack of knowledge of the content of ePROMs/ePREMs (B) (5 studies)

Most Medical Assistants had little knowledge of the content of the assessment and thus had a hard time explaining how patients would reap direct benefits from completing it. “We [administer PROs] because we are required to do so”. [30]

Domain 5: Process

Includes strategies or tactics that might influence implementation

 Presence of local staff champions to support/motivate peers and advocate for PROM usage (F) (9 studies)

A willing champion within each individual practice (e.g., quality improvement leader or office manager) helped to maintain momentum, to demonstrate the value of the data for improving quality of care, and to provide audit and feedback to providers and staff. [7]

 Engagement/involvement of stakeholders (F) (9 studies)

For successful implementation, it takes effort to motivate all health care providers, administrative employees, and technology providers. [33]

 Ongoing monitoring of implementation through regular audits, and regular feedback to users (F) (4 studies)

Audit and feedback: Performance reports designed with stakeholders and feedback to disease site teams for population-based QI; Systems to track progress and identify targets for improvement. [36]

 Pre-implementation testing, especially of usability (F) (3 studies)

We found conducting a pilot phase to be very helpful in reducing any 'teething problems'. [33]

 Project managers/coordinators skilled in knowledge translation and facilitating practice change (F) (3 studies)

Site coordinators skilled in knowledge translation and facilitating practice change were considered key to successful implementation. [36]