
Considerations for conducting and reporting digitally supported cognitive interviews with children and adults

Abstract

Background

Cognitive interviewing is a well-established qualitative method used to develop and refine patient-reported outcome (PRO) measures. A range of digital technologies, including telephone, web conferencing, and electronic survey platforms, can be leveraged to support the conduct of cognitive interviewing with both children and adults. These technologies offer a potential solution to enrolling underrepresented populations, including those with rare conditions, functional limitations, and geographic or socioeconomic barriers. In the aftermath of the COVID-19 pandemic, the use of digital technologies for qualitative interviewing will remain essential. However, there is limited guidance about adapting cognitive interviewing procedures to allow for remote data capture, especially with children.

Methods

Synthesizing the literature and our research experiences during the COVID-19 pandemic, we examine considerations for implementing digitally supported cognitive interviews with children, adolescents, and adults. We offer recommendations to optimize data quality and empirical rigor and illustrate the application of these recommendations in an ongoing cognitive interviewing study to develop and refine a new pediatric PRO measure.

Results

Good research practices must address participant and researcher preparation for study-related procedures and should anticipate and pre-emptively manage technological barriers. Field notes should detail interview context, audio/video cues, and any impact of technological difficulties on data quality. The approaches we recommend have been tested in an ongoing cognitive interviewing study that is enrolling children/adolescents ages 5–17 with chronic graft-versus-host disease (cGVHD) and their caregivers [NCT04044365]. The combined use of telephone and videoconferencing to conduct cognitive interviews remotely is feasible and acceptable and yields meaningful data to improve the content validity of our new PRO measure of cGVHD symptom bother.

Conclusion

Digitally supported cognitive interviewing procedures will be increasingly employed. Remote data collection can accelerate accrual, particularly in multi-site studies, and may allow for interviewer personnel and data management to be centralized within a coordinating center, thus conserving resources. Research is needed to further test and refine techniques for remote cognitive interviewing, particularly in traditionally underrepresented populations, including children and non-English speakers. Expansion of international standards to address digitally supported remote qualitative data capture appears warranted.

Background

Patient-reported outcome (PRO) measures are essential tools to capture patient-centered endpoints in both observational research and clinical trials. Cognitive interviewing is a well-established qualitative method used to develop and refine PRO instruments. However, it can be challenging to sample geographically and socioeconomically diverse individuals, children, and those with rare conditions or physical/functional limitations for qualitative research, including cognitive interviews [1]. Remote methods (which include telephone, web conferencing, and other social media and messaging platforms) offer a potential solution to enrolling these underrepresented populations [2]. Remote methods also allow interviewer personnel and data management to be centralized within the study coordinating center. That centralization is particularly useful for multi-site research, as it avoids the need to recruit, train, and supervise interviewers at multiple study sites; it may thus enhance efficiency, conserve resources, and improve methodologic quality. Innovative methods that adapt technology to address these challenges and facilitate the conduct of rigorous qualitative research are warranted.

The use of remote methods to collect interviewer-administered survey data has been well described [3]. Similarly, there is a growing literature on leveraging social media, texts, blogs, chats, and instant messages to capture qualitative data online [4]. A small methods literature supports the feasibility, acceptability, and meaningfulness of remote qualitative interviews as an alternative to focus groups and individual interviews conducted in-person [5, 6]. However, the adaptation of traditional face-to-face cognitive interviewing principles to the remote environment, particularly with children and adolescents, has not been well described. To address this knowledge gap, this paper summarizes considerations and strategies for designing, performing, and reporting cognitive interviews conducted remotely using digital technologies. We illustrate the application of good research practices in an ongoing study to develop a new pediatric PRO measure. Lessons learned and key considerations to strengthen the empirical rigor of remote cognitive interviews are discussed.

Methods

Cognitive interviewing aims to evaluate and iteratively refine a PRO measure by gathering direct input from respondents about item content, comprehension, ease of response and format [7]. Cognitive interviewing addresses several areas that often need improvement during PRO instrument development, including clarity and comprehension, cognitive recall burden, response choices, ease of judgement, and questionnaire formatting and layout [8, 9]. The overall purpose of cognitive interviewing is to minimize measurement error by determining that research participants interpret question concepts as intended and can provide accurate responses.

While the structure and sequence of cognitive interviewing techniques can vary, the basic structure involves two parts. In the first part, survey questions are administered to the respondent. This is followed by a semi-structured debriefing interview in which the respondent is encouraged to reflect on and provide feedback about their comprehension of each survey question, the clarity of its interpretation, and their ease in selecting a response.
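To make this two-part structure concrete, the following minimal sketch (illustrative only; the probe wording and domain labels are our assumptions rather than a published interview guide) organizes semi-structured debriefing probes by the domains a cognitive interview typically evaluates.

```python
# Illustrative sketch of the two-part cognitive interview structure:
# Part 1 administers each survey item; Part 2 revisits each item with
# semi-structured debriefing probes grouped by evaluation domain.
# Probe wording and domain labels are assumptions for illustration.

DEBRIEFING_PROBES = {
    "comprehension": "What does this question mean to you, in your own words?",
    "clarity": "Was anything in this question confusing or hard to understand?",
    "ease_of_response": "How easy or hard was it to choose an answer? Why?",
    "recall": "What were you thinking about when you picked your answer?",
}

def conduct_cognitive_interview(items):
    """Outline the interview flow; responses would be recorded by the interviewer."""
    # Part 1: administer every survey item before any debriefing begins.
    for item in items:
        print(f"ADMINISTER: {item}")
    # Part 2: semi-structured debriefing on each item, domain by domain.
    for item in items:
        print(f"DEBRIEF: {item}")
        for domain, probe in DEBRIEFING_PROBES.items():
            print(f"  [{domain}] {probe}")

conduct_cognitive_interview(
    ["In the past 7 days, how much did itchy skin bother you?"]
)
```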

Traditional cognitive interviewing methods require adaptation to the remote environment because of the active, structured, and highly reciprocal process that occurs between study participant and researcher. The need to administer survey questions prior to debriefing makes remote cognitive interviews less amenable to being exclusively telephone-based. As such, remote cognitive interviews are greatly aided by the visual component that videoconferencing technologies offer.

Results

We illustrate the principles underlying the implementation of digitally supported cognitive interviewing in a multi-site study to develop, refine, and test a new symptom scale for children and adolescents. The study is enrolling a sample of pediatric transplant survivors ages 5–17 with chronic graft-versus-host disease (cGVHD) and their parents/caregivers (NCT04044365). The rare patient population, geographic dispersion, and the need to centralize methodologic expertise in conducting cognitive interviews with children motivated our use of remote interviewing methods. In designing our approach, we also had to accommodate several contextual challenges. These included: (1) the intricacies of interviewing children at different developmental stages, (2) inclusion of a child-parent dyadic interview component, (3) the requirement to balance participant burden with the need to debrief on a large number of PRO items, and (4) the prominent illness severity of study participants.

Implementation of remote cognitive interviewing methods

In this study, we employ both synchronous and asynchronous digital approaches to support remote data collection. To fulfill the first part of the cognitive interview, the child completes the symptom scale facilitated by a combination of screensharing on the videoconferencing platform and telephone for audio. The child views each PRO item on their computer screen and provides a verbal response, while the interviewer notes any difficulties such as hesitancy or indicators of confusion (e.g., changing answers). During the second part of the cognitive interview, to facilitate recall and engagement, screensharing is used to revisit the items that the child experienced as problematic. Child-parent dyadic debriefing is also incorporated to identify and explore areas of miscomprehension that may be indicated by discordant ratings between child and parent. To accomplish this, the parent completes the caregiver proxy survey in advance of the interview. Completing this asynchronously both conserves time during the child interview and ensures that any discordant ratings are available so that the interviewer can return to these items when the child and parent are jointly debriefed.
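As a concrete illustration of how the asynchronously collected proxy ratings can be used, the sketch below (a minimal example, not the study's actual data pipeline; item identifiers, rating scales, and the discordance threshold are assumptions) flags items where child and parent ratings diverge so the interviewer can return to them during the joint debriefing.

```python
# Illustrative sketch: compare the child's in-interview ratings with the
# parent's proxy ratings (completed asynchronously before the interview) and
# flag discordant items for the child-parent dyadic debriefing.
# Item IDs, rating scales, and the threshold are assumptions for illustration.

def flag_discordant_items(child_ratings, parent_ratings, threshold=2):
    """Return item IDs whose child and parent ratings differ by at least `threshold`."""
    discordant = []
    for item_id, child_score in child_ratings.items():
        parent_score = parent_ratings.get(item_id)
        if parent_score is not None and abs(child_score - parent_score) >= threshold:
            discordant.append(item_id)
    return discordant

# Hypothetical 0-4 "bother" ratings for three symptom items.
parent = {"mouth_pain": 1, "skin_itch": 4, "tired": 2}
child = {"mouth_pain": 3, "skin_itch": 4, "tired": 0}
print(flag_discordant_items(child, parent))  # ['mouth_pain', 'tired']
```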

Figure 1 depicts the flow of data collection and the integration of technological approaches. The three technologies are complementary and synergistic and were chosen intentionally to address the study aims. There are a number of videoconferencing software systems from which to choose. Features that were important in our study included compliance with Health Insurance Portability and Accountability Act (HIPAA) guidelines, ease of screensharing, and the simplicity of single-click access without the participant requiring an account or a password-protected log-in. We also considered the user experiences of both interviewers and study participants, as gathered during pretesting. Inclusion of the telephone component reduces some of the technical complexity for both child and parent and facilitates participation by respondents with limited broadband access.

Because video may not be employed consistently throughout the interview, verbal cues such as silence, which may indicate that the child is becoming frustrated, fatigued, distressed, or disengaged, are closely monitored. Screensharing offers a visual component that encourages child engagement. The material presented during screensharing incorporates features such as embedded animation and markers to track progress. These features promote rapport and allow the child to feel a sense of control over the interview process. As with all cognitive interviewing, to mitigate social desirability biases, our interview guide reinforces that there are no right or wrong answers. The interviewer avoids evaluative language (e.g., “good answer”), using encouraging language instead (e.g., “this is very helpful information”). We have found that this combination of synchronous and asynchronous remote strategies ensures that both the child report and the parent perspective are captured in a fully independent manner.
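The sketch below (a minimal illustration under stated assumptions, not the study's actual tooling; the feature names are ours) shows how the platform features described above might be expressed as a simple checklist when pretesting candidate videoconferencing systems.

```python
# Illustrative checklist of videoconferencing features considered in this study.
# Feature names and the candidate platform data are assumptions for illustration.

REQUIRED_FEATURES = {
    "hipaa_compliant": True,    # compliance with HIPAA guidelines
    "screensharing": True,      # easy sharing of PRO items on screen
    "single_click_join": True,  # no participant account or password required
    "dial_in_audio": True,      # telephone audio for limited broadband access
}

def meets_requirements(platform):
    """Return True if a candidate platform offers every required feature."""
    return all(platform.get(name) == required
               for name, required in REQUIRED_FEATURES.items())

candidate = {"hipaa_compliant": True, "screensharing": True,
             "single_click_join": False, "dial_in_audio": True}
print(meets_requirements(candidate))  # False: participants would need an account
```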

Fig. 1 Flow of data collection

Our remote cognitive interviewing procedures have enabled participant recruitment at more than ten centers while allowing for interviewers with unique expertise in interviewing children to be centralized at the coordinating center. Digital technologies have successfully facilitated child engagement during the cognitive interview, even among children as young as 5–7 years. Our experiences support the feasibility and acceptability of conducting digitally supported cognitive interviews, and our findings have offered meaningful insights about the comprehension, clarity, and ease of response of this new pediatric symptom scale.

Considerations to strengthen empirical rigor

Migrating cognitive debriefing interviews to a fully remote methodology requires that several considerations be addressed (see Table 1). Potential limitations to be anticipated and mitigated include participant access to a computer and broadband internet (mediated by geographic factors and socioeconomic status), data security on online platforms, rapport-building, participant fatigue and engagement (mediated by age and time spent looking at screens), digital literacy, and digital failures with resultant data loss [10]. Having a second researcher present during the remote interview offers several advantages. These include sharing of technical tasks, providing the primary interviewer with suggestions for additional probing, and helping to manage data collection [10, 11].

Table 1 Considerations for conducting digitally supported cognitive interviews

A semi-structured interview guide is an essential component of all rigorously conducted cognitive interviews; it helps to ensure that the process is systematic and well-documented [7, 8]. This standardization is also critically important because technological challenges and procedural interruptions may occur more frequently with remote cognitive interviewing and can be distracting for both interviewer and participant. Procedural interruptions include environmental distractions, interviewee reluctance to speak freely due to the presence of family members, intrusions resulting from day-to-day activities in the home, and difficulties with phone or internet connectivity and audio/video quality. The interview guide, which may be electronic or paper-based, serves to prompt the interviewer to document the various forms of cognitive difficulty (e.g., clarity, comprehension, ease of response) that occur during the interview, along with relevant visual or auditory indicators of these difficulties (e.g., hesitation) [2, 11, 12]. The interview guide should also offer structured fields to record environmental conditions, participant engagement, technological aspects (e.g., the type of device(s) utilized by the participant, use of video versus audio only), and any problems or difficulties encountered during the interview. To facilitate interpretation of results, the published report should summarize the technological and contextual features of the interviews and detail any associated limitations in sampling, such as participant exclusion or withdrawal. Interviewer proficiencies that strengthen the empirical rigor of digitally supported cognitive interviews include strong knowledge of cognitive interviewing principles, a capacity for agile navigation within and between digital platforms, and responsiveness to unique participant challenges, including technical difficulties [10].
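One way to operationalize these structured fields is sketched below (an illustrative schema only; the field names and coded values are assumptions, not part of any published reporting standard).

```python
# Illustrative schema for a remote cognitive interview field-note record.
# Field names and example values are assumptions chosen for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ItemDifficulty:
    item_id: str
    difficulty_type: str   # e.g., "clarity", "comprehension", "ease_of_response"
    indicator: str         # e.g., "hesitation", "changed answer", "asked to repeat"

@dataclass
class RemoteInterviewFieldNote:
    participant_id: str
    device: str                  # e.g., "laptop", "tablet", "smartphone"
    modality: str                # e.g., "video + phone audio", "phone only"
    environment: str             # e.g., "quiet room", "family member present"
    engagement: str              # e.g., "engaged", "fatigued", "distracted"
    technical_problems: List[str] = field(default_factory=list)
    item_difficulties: List[ItemDifficulty] = field(default_factory=list)
    impact_on_data_quality: str = ""   # narrative note to inform the published report

note = RemoteInterviewFieldNote(
    participant_id="P-012",
    device="tablet",
    modality="video + phone audio",
    environment="sibling present intermittently",
    engagement="engaged",
    technical_problems=["brief video freeze during item 7"],
    item_difficulties=[ItemDifficulty("skin_itch", "comprehension", "hesitation")],
)
print(note.participant_id, len(note.item_difficulties))
```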

Discussion

The COVID-19 pandemic has stimulated opportunities to maximally leverage digital technologies to facilitate research participation and support data collection, and it is anticipated that these approaches will continue to be relevant [12]. Synthesizing participant experiences with digitally supported cognitive interviews across studies could produce new insights into how best to adapt our methods for specific study populations and topic areas. Methodologic questions for future research and policy development include: Who participates in this research, who declines, and for what reasons? Do study participation, respondent engagement, and data quality vary by age, disease type, digital literacy, educational attainment, language literacy/acculturation, or other participant characteristics? What are the best practices for obtaining electronic consent? To what extent might digitally supported cognitive interview methods introduce bias, and what strategies are effective in limiting potential sources of bias? Can cognitive interview data captured remotely be pooled for analysis with data collected during in-person interviews? Methodologic standards and best practices for cognitive interviewing [13, 14] should, in future iterations, address considerations for conducting digitally supported cognitive interviews. Lastly, there is a need to test, refine, and scale recently described technological innovations that support inclusion of study participants who do not have access to computer hardware or internet connectivity [15].

Conclusion

This paper has highlighted considerations and illustrated strategies for adapting cognitive interviewing methods to a remote environment. As technology evolves, opportunities exist to extend the application of these approaches and refine their use in diverse research contexts. Remote cognitive interviewing methods have broad applicability for PRO researchers, particularly those studying rare conditions and recruiting populations who have traditionally been underrepresented in research. This approach also allows for interviewers and data management to be centralized; this may be particularly useful in enhancing efficiency in multi-site studies. Our experiences demonstrate that digital technologies can be successfully implemented to support remote conduct of cognitive interviews, including with children and adolescents, while preserving the methodologic principles that ensure optimal data quality and empirical rigor.

Availability of data and materials

Not applicable.

Abbreviations

PRO: Patient-reported outcome

cGVHD: Chronic graft-versus-host disease

References

  1. Ellard-Gray A, Jeffrey NK, Choubak M, Crann SE (2015) Finding the hidden participant: solutions for recruiting hidden, hard-to-reach, and vulnerable populations. Int J Qual Methods 14(5):1609406915621420. https://doi.org/10.1177/1609406915621420

  2. Thunberg S, Arnell L (2021) Pioneering the use of technologies in qualitative research—a research review of the use of digital interviews. Int J Soc Res Methodol. https://doi.org/10.1080/13645579.2021.1935565

  3. Zeleke AA, Naziyok T, Fritz F, Christianson L, Röhrig R (2021) Data quality and cost-effectiveness analyses of electronic and paper-based interviewer-administered public health surveys: systematic review. J Med Internet Res 23(1):e21382. https://doi.org/10.2196/21382

  4. Wilkerson JM, Iantaffi A, Grey JA, Bockting WO, Rosser BR (2014) Recommendations for internet-based qualitative health research with hard-to-reach populations. Qual Health Res 24(4):561–574. https://doi.org/10.1177/1049732314524635

  5. Tuttas CA (2015) Lessons learned using web conference technology for online focus group interviews. Qual Health Res 25(1):122–133. https://doi.org/10.1177/1049732314549602

  6. Gill P, Baillie J (2018) Interviews and focus groups in qualitative research: an update for the digital age. Br Dent J. https://doi.org/10.1038/sj.bdj.2018.815

  7. Jang MK, Kim S, Collins EG, Quinn LT, Park CG, Ferrans CE (2020) Enriching the quality of cross-cultural instrument development through cognitive interviewing: implications for nursing research. Jpn J Nurs Sci 17(2):e12301. https://doi.org/10.1111/jjns.12301

  8. Beatty PC, Willis GB (2007) Research synthesis: the practice of cognitive interviewing. Public Opin Q 71(2):287–311. https://doi.org/10.1093/poq/nfm006

  9. Kamp K, Wyatt G, Dudley-Brown S, Brittain K, Given B (2018) Using cognitive interviewing to improve questionnaires: an exemplar study focusing on individual and condition-specific factors. Appl Nurs Res 43:121–125. https://doi.org/10.1016/j.apnr.2018.06.007

  10. Roberts JK, Pavlakis AE, Richards MP (2021) It’s more complicated than it seems: virtual qualitative research in the COVID-19 era. Int J Qual Methods 20:16094069211002960. https://doi.org/10.1177/16094069211002959

  11. Glassmeyer DM, Dibbs R-A (2012) Researching from a distance: using live web conferencing to mediate data collection. Int J Qual Methods 11(3):292–302. https://doi.org/10.1177/160940691201100308

  12. Howlett M (2021) Looking at the ‘field’ through a Zoom lens: methodological reflections on conducting online research during a global pandemic. Qual Res. https://doi.org/10.1177/1468794120985691

  13. Matza LS, Patrick DL, Riley AW, Alexander JJ, Rajmil L, Pleil AM et al (2013) Pediatric patient-reported outcome instruments for research to support medical product labeling: report of the ISPOR PRO good research practices for the assessment of children and adolescents task force. Value Health 16(4):461–479. https://doi.org/10.1016/j.jval.2013.04.004

  14. Patrick DL, Burke LB, Gwaltney CJ, Leidy NK, Martin ML, Molsen E et al (2011) Content validity—establishing and reporting the evidence in newly developed patient-reported outcomes (PRO) instruments for medical product evaluation: ISPOR PRO good research practices task force report: part 1–eliciting concepts for a new PRO instrument. Value Health 14(8):967–977. https://doi.org/10.1016/j.jval.2011.06.014

  15. Shepperd JA, Pogge G, Hunleth JM, Ruiz S, Waters EA (2021) Guidelines for conducting virtual cognitive interviews during a pandemic. J Med Internet Res 23(3):e25173. https://doi.org/10.2196/25173

  16. Thayer EK, Pam M, Al Achkar M, Mentch L, Brown G, Kazmerski TM et al (2021) Best practices for virtual engagement of patient-centered outcomes research teams during and after the COVID-19 pandemic: qualitative study. J Particip Med 13(1):e24966. https://doi.org/10.2196/24966

  17. Archibald MM, Ambagtsheer RC, Casey MG, Lawless M (2019) Using Zoom videoconferencing for qualitative data collection: perceptions and experiences of researchers and participants. Int J Qual Methods 18:1609406919874596. https://doi.org/10.1177/1609406919874596

  18. Lobe B, Morgan D, Hoffman KA (2020) Qualitative data collection in an era of social distancing. Int J Qual Methods 19:1609406920937875. https://doi.org/10.1177/1609406920937875

  19. Brothers KB, Clayton EW, Goldenberg AJ (2020) Online pediatric research: addressing consent, assent, and parental permission. J Law Med Ethics 48(1):129–137

  20. DeMuro CJ, Lewis SA, DiBenedetti DB, Price MA, Fehnel SE (2012) Successful implementation of cognitive interviews in special populations. Expert Rev Pharmacoecon Outcomes Res 12(2):181–187. https://doi.org/10.1586/erp.11.103

  21. Upadhyay UD, Lipkovich H (2020) Using online technologies to improve diversity and inclusion in cognitive interviews with young people. BMC Med Res Methodol 20(1):159. https://doi.org/10.1186/s12874-020-01024-9

  22. Chiumento A, Machin L, Rahman A, Frith L (2018) Online interviewing with interpreters in humanitarian contexts. Int J Qual Stud Health Well-Being 13(1):1444887. https://doi.org/10.1080/17482631.2018.1444887

  23. Rosser BRS, Capistrant B (2016) Online versus telephone methods to recruit and interview older gay and bisexual men treated for prostate cancer: findings from the RESTORE study. JMIR Cancer 2(2):e9. https://doi.org/10.2196/cancer.5578

  24. Han J, Torok M, Gale N, Wong QJ, Werner-Seidler A, Hetrick SE et al (2019) Use of web conferencing technology for conducting online focus groups among young people with lived experience of suicidal thoughts: mixed methods research. JMIR Ment Health 6(10):e14191. https://doi.org/10.2196/14191

  25. Mealer M, Jones J (2014) Methodological and ethical issues related to qualitative telephone interviews on sensitive topics. Nurse Res 21(4):32–37. https://doi.org/10.7748/nr2014.03.21.4.32.e1229


Acknowledgements

Not applicable.

Funding

This research was supported by the Cancer Moonshot℠ in the Intramural Research Program of the NIH, National Cancer Institute, Center for Cancer Research.

Author information


Contributions

All authors participated in the concept and design of this work, and the drafting and revising of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Sandra A. Mitchell.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed and approved by the National Institutes of Health Institutional Review Board.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Fry, A., Mitchell, S.A. & Wiener, L. Considerations for conducting and reporting digitally supported cognitive interviews with children and adults. J Patient Rep Outcomes 5, 131 (2021). https://doi.org/10.1186/s41687-021-00371-5

