
Cognitive interviewing methodology in the development of a pediatric item bank: a patient reported outcomes measurement information system (PROMIS) study

Abstract

Background

Patient-reported outcomes (PROs) have seen greater use in health care in recent years, and methods to improve the reliability and validity of PRO instruments are advancing. This paper discusses the cognitive interviewing procedures employed by the Patient Reported Outcomes Measurement Information System (PROMIS) pediatrics group for the purpose of developing a dynamic, electronic item bank for field testing with children and adolescents using novel computer technology. The primary objective of this study was to conduct cognitive interviews with children and adolescents to gain feedback on items measuring physical functioning, emotional health, social health, fatigue, pain, and asthma-specific symptoms.

Methods

A total of 88 cognitive interviews covering 318 items were conducted with 77 children and adolescents across two sites. From this initial item bank, 25 items were deleted and 35 were revised and underwent a second round of cognitive interviews. A total of 293 items were retained for field testing.

Results

Children as young as 8 years of age were able to comprehend the majority of items, response options, directions, and the recall period, and were able to identify language that was difficult for them to understand. Cognitive interviews indicated comprehension problems with several items, which led to alternative wording for these items.

Conclusion

Children ages 8–17 years were able to comprehend most item stems and response options in the present study. Field testing with the resulting items and response options is presently being conducted as part of the PROMIS Pediatric Item Bank development process.

Background

The Patient Reported Outcomes Measurement Information System (PROMIS) project, a National Institutes of Health Roadmap for Medical Research initiative, was developed to advance the science and application of patient-reported outcomes (PROs) in chronic diseases [1]. The process of developing item banks for PROMIS includes literature review, focus groups, and individual cognitive interviews [2–4]. Among the qualitative methods, cognitive interviewing allows direct input from respondents on item content, format, and understandability. This method has emerged as an essential component in the development of a number of standardized measures [5–7].

The cognitive interviewing methodology for PROMIS was designed to elicit input from respondents on all items under consideration for the PROMIS item bank [3]. The pediatric cognitive interviewing methodology followed the general principles of the PROMIS Network [3], with the adaptations required for children as young as 8 years of age, relying in part on the cognitive interviewing methodology utilized in the development of the PedsQL™ instruments [8] and the work of Willis [9].

The cognitive interviewing methodology is designed to assess the cognitive processes underlying respondents' comprehension and generation of answers to questionnaire items within an information processing conceptual model [10]. The intent of cognitive interviewing is to determine what the respondent thinks a particular item is asking (i.e., what specific words and phrases in the item stem mean to the respondent); the processes used by the respondent to retrieve relevant information from autobiographical memory; the decision or judgment processes used to conceive an answer; and the process of formulating a response to the item stem [10–13].

Although there are two major types of cognitive interviewing methods (think-aloud and respondent debriefing), the PROMIS cognitive interviews employed the respondent debriefing technique [7]. In this technique, after a participant completes the questionnaire, an interviewer probes for specific information on the types of difficulties the respondent experienced while completing the items and the basis for the response to each item [9]. Cognitive probes elicit information regarding the clarity and rationale of the directions, the meaning of the items, the appropriateness of the response choices, and overall comments on the relevance and complexity of the questionnaire [12, 13].

The primary objective of this study was to conduct cognitive interviews with children and adolescents to gain feedback on items measuring physical functioning, emotional health, social health, fatigue, pain, and asthma-specific symptoms.

Methods

Item development

The PROMIS Pediatrics project focused on the development of PRO item banks across several health domains for youth ages 8–17 years. Initially, PROMIS focused on the measurement of generic health domains that are important across a variety of illnesses, including physical function, pain, fatigue, emotional distress, and social function [2]. Since asthma is the most common chronic disease of childhood, and PRO measurement is an essential component of the evaluation of outcomes for children with asthma [14–16], asthma was well suited as the chronic condition for the initial development of the PROMIS pediatric disease-specific item bank.

The PROMIS item bank was developed using a strategic item generation methodology. A series of focus groups was conducted to generate themes and domains [4]; a literature review was conducted to identify existing pediatric health questionnaires; and discussions with health care and research personnel (including physicians, psychologists, social workers, epidemiologists, and nurses) were used to identify an initial item pool of over 3,345 items. These items were "binned" (i.e., classified into domains according to their content) and "winnowed" (i.e., eliminated if they either lacked face validity for the domain or were very similar to a more ideally worded item) [2, 3] by the PROMIS pediatric project team. Items were rewritten or modified to adhere to a set of formatting requirements accepted by the PROMIS development team (e.g., use of past tense, a 7-day recall period, and standard response options; see Table 1 for the response options utilized). Cognitive interviews were conducted on the resulting 318 items across 6 domains, after which 35 items were revised and underwent a second round of cognitive interviews. The final item set contained 293 items across 6 domains (Physical Function = 70 items; Emotional Health = 49 items; Social Health = 74 items; Fatigue = 39 items; Pain = 27 items; Asthma = 34 items).

Table 1 Item response options
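To make the bookkeeping behind the binning and winnowing step concrete, the sketch below is a hypothetical Python illustration; the item texts, domain labels, and winnowing reason are invented and are not drawn from the actual PROMIS item pool, and the actual process was a qualitative expert review rather than anything automated.

```python
from dataclasses import dataclass


@dataclass
class CandidateItem:
    text: str
    domain: str | None = None       # assigned during "binning"
    winnowed: bool = False          # True if dropped during "winnowing"
    winnow_reason: str | None = None


def bin_item(item: CandidateItem, domain: str) -> None:
    """Classify an item into a content domain ("binning")."""
    item.domain = domain


def winnow_item(item: CandidateItem, reason: str) -> None:
    """Drop an item that lacks face validity for its domain or
    duplicates a more ideally worded item ("winnowing")."""
    item.winnowed = True
    item.winnow_reason = reason


def retained_counts(items: list[CandidateItem]) -> dict[str, int]:
    """Tally retained (binned, not winnowed) items per domain."""
    counts: dict[str, int] = {}
    for item in items:
        if item.domain is not None and not item.winnowed:
            counts[item.domain] = counts.get(item.domain, 0) + 1
    return counts


# Hypothetical usage with placeholder items (not actual PROMIS item text)
pool = [CandidateItem("I could walk up stairs."),
        CandidateItem("I felt tired."),
        CandidateItem("I was tired.")]
bin_item(pool[0], "Physical Function")
bin_item(pool[1], "Fatigue")
bin_item(pool[2], "Fatigue")
winnow_item(pool[2], "near-duplicate of a more ideally worded item")
print(retained_counts(pool))   # {'Physical Function': 1, 'Fatigue': 1}
```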

Participants

To participate in the cognitive interviews at The Children's Hospital at Scott and White (S&W) and the University of North Carolina (UNC), participants needed to meet the following criteria: be between the ages of 8 and 17 years, inclusive; speak and read English; provide informed assent prior to study entry; and have parent or guardian informed consent. We also recruited children with asthma to review all domain items as well as the asthma-specific items. Participants were not eligible for the study if they had any concurrent medical, psychiatric, or cognitive condition that, in the investigator's opinion, would interfere with participation in this study.

Purposive sampling was used to recruit a total of 28 children and adolescents from UNC hospital and community clinics (6 with asthma; 22 without asthma) and 37 children and adolescents from the general pediatric clinic at S&W (16 with asthma; 21 without asthma), who participated in the first round of cognitive interviews. For the second round of cognitive interviews, 18 children and adolescents from S&W and 5 children from UNC participated (11 of these 23 had also participated in first round interviews). Table 2 lists the demographic characteristics of the first round cognitive interview participants from each site. For each domain questionnaire, the cognitive interview sample included at least 2 children 8 or 9 years of age, 1 adolescent between 13 and 18 years, 2 children of non-white race/ethnicity, and 1 child of white/Caucasian race/ethnicity. These categories were not exclusive; for example, an 8-year-old Latina girl would fulfill both the racial/ethnic requirement and the age requirement.

Table 2 Participant demographics and clinical characteristics for first round cognitive interviews
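The sketch below is a hypothetical Python illustration of the non-exclusive recruitment targets described above (at least 2 children aged 8 or 9, 1 adolescent, 2 non-white children, and 1 white child per domain questionnaire); the participant records are invented, and the actual recruitment relied on purposive sampling rather than any automated check.

```python
from dataclasses import dataclass


@dataclass
class Participant:
    age: int
    race_ethnicity: str   # e.g. "White", "Black", "Latina"


def quota_met(participants: list[Participant]) -> bool:
    """Check the non-exclusive per-domain recruitment targets described in the text."""
    young = sum(1 for p in participants if p.age in (8, 9))
    adolescent = sum(1 for p in participants if 13 <= p.age <= 18)
    non_white = sum(1 for p in participants if p.race_ethnicity.lower() != "white")
    white = sum(1 for p in participants if p.race_ethnicity.lower() == "white")
    return young >= 2 and adolescent >= 1 and non_white >= 2 and white >= 1


# A single participant may satisfy more than one target,
# e.g. an 8-year-old Latina girl counts toward both the age and race/ethnicity targets.
sample = [Participant(8, "Latina"), Participant(9, "Black"),
          Participant(15, "White"), Participant(12, "White")]
print(quota_met(sample))  # True
```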

Recruitment procedures

At both UNC and S&W, potential participants were identified through review of clinic appointment books. A research assistant then mailed an informational letter to the child's parent to inform them about the study. Families who were interested in participating contacted the study coordinator to schedule an interview time. If the child was deemed eligible to participate in the cognitive interview and the parents agreed to allow their child to participate, an interview date was scheduled. At the time of the interview, a trained research assistant obtained parental informed consent and the children signed an assent document. All child participants received a $25 gift card in return for their time and effort. Children were allowed to take a break or end the interview at any time, although no children ended the interview prematurely. The study protocols were approved by the institutional review boards at UNC (#05-1431) and at S&W (#05-0077).

Cognitive interviewing process

The interviewers for this study underwent an extensive 16-hour training session that included general information on cognitive interview theory and procedures, as well as pediatric-specific procedures. Interviewers were graduate students in social work or research nurses, all of whom had experience working with children in pediatric research settings. All interviewers were trained by a pediatric psychologist with extensive experience in children's therapy and qualitative questionnaire development. Interviews were conducted in a comfortable environment, and breaks were offered to the children.

We applied a sampling scheme that allowed each participant to be interviewed on approximately 30 items rather than all 318 items. Each child evaluated items from only one or two domains and only one response scale. By this method, all items in the bank were reviewed by at least 5 participants (59% of items were reviewed by 5 participants; 34% by 6 participants; 7% by 7 participants) meeting the target demographic characteristics outlined above (see Participants section). During the cognitive interviews, participants were asked to provide verbal, open-ended feedback on each item regarding response categories, time frame, item interpretation, and overall impression of domain content and coverage.
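As a simplified illustration of this sampling scheme, the hypothetical Python sketch below allocates roughly 30-item sets so that every item is reviewed by at least a target number of participants; it ignores the one-or-two-domain constraint and the demographic targets, both of which shaped the actual assignments.

```python
def assign_items(item_ids: list[str], n_participants: int,
                 items_per_participant: int = 30,
                 min_reviews: int = 5) -> dict[int, list[str]]:
    """Round-robin assignment: cycle through the item list handing out
    fixed-size sets until each item has at least `min_reviews` reviewers."""
    assignments: dict[int, list[str]] = {p: [] for p in range(n_participants)}
    reviews = {item: 0 for item in item_ids}
    cursor = 0
    for p in range(n_participants):
        for _ in range(items_per_participant):
            if all(count >= min_reviews for count in reviews.values()):
                return assignments
            item = item_ids[cursor % len(item_ids)]
            assignments[p].append(item)
            reviews[item] += 1
            cursor += 1
    return assignments


# Hypothetical usage: 318 items, 65 first-round participants, ~30 items each.
# 318 items x 5 reviews = 1,590 review slots, which 65 x 30 = 1,950 slots can cover.
plan = assign_items([f"item_{k}" for k in range(318)], n_participants=65)
```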

Parents were asked to complete a sociodemographic form, which contained information regarding the child's age, gender, ethnicity, living situation, and chronic health condition(s), as well as the parent/guardian's employment and education. Parents of children with asthma also completed an asthma form, which contained information about the number of days and nights in the previous week the child had coughing, wheezing, or shortness of breath, the number of times in the previous week the child used rescue medication, and the types of medications the child was taking. These demographic characteristics are described in Table 2.

Other than the children with asthma, who underwent the cognitive interview on the asthma-specific item set, participants were randomly assigned an item set (approximately 30 items) selected from one of the domains. Prior to the cognitive interview, participants completed the item set through paper-and-pencil administration. A research assistant trained in cognitive interviewing techniques then reviewed each item stem and item response with the child and began the interview using standardized questions (see Table 3) for each item. A subset of participants was asked about their preference of item tense (past vs. present). Each participant's comprehension or interpretation of the item was elicited, along with their preferences regarding recall options and the recall time period. All participant answers were recorded in a computerized spreadsheet. At the end of the interview, participants completed the Wide Range Achievement Test-3 Reading Subtest (WRAT) as a gross measure of reading ability [17]. Interviews were also audio-taped to ensure the accuracy of interviewer notes.

Table 3 Cognitive interview questions

Data analysis and item revision

After each interview, project personnel completed a summary statement for each item and the child's comments. Once all initial cognitive interviews for an item were complete, project personnel compiled a report that included all comments on that item. The item development team then reviewed all of the comments to determine issues with formatting, item comprehension, instructions, tense, and response options (see Table 4). Items deemed problematic by two or more children of any age were revised for clarity. Other items similar to those revised after the initial interview process were also changed by project personnel to maintain consistency across item stems and wording. In all, 35 items were revised as a result of the first round of cognitive interviews.

Table 4 Common issues identified by participants in first round of interviews
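The decision rule described above can be summarized in a short hypothetical Python sketch (the item identifiers and comment records are invented): an item is flagged for revision when two or more children, of any age, reported a problem with it.

```python
def items_to_revise(problem_reports: list[tuple[str, int]],
                    threshold: int = 2) -> set[str]:
    """problem_reports: (item_id, child_id) pairs, one per child who reported a
    problem with the item. An item is flagged for revision when the number of
    distinct children reporting a problem reaches the threshold."""
    children_per_item: dict[str, set[int]] = {}
    for item_id, child_id in problem_reports:
        children_per_item.setdefault(item_id, set()).add(child_id)
    return {item for item, kids in children_per_item.items()
            if len(kids) >= threshold}


# Hypothetical usage: two children flagged "fatigue_07", only one flagged "pain_02".
reports = [("fatigue_07", 3), ("fatigue_07", 11), ("pain_02", 5)]
print(items_to_revise(reports))  # {'fatigue_07'}
```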

To ensure comprehension of the 35 revised items, a second set of cognitive interviews was conducted. Project personnel then reviewed the revised items and participants' responses from the second review. Items that remained problematic for participants after the second round were eliminated from the item bank. Table 5 shows the 22 items that were revised after the second round of cognitive interviews and retained in the final item bank, along with the reasons for revising the items.

Table 5 PROMIS pediatric revised items and reasons for revision

Results

Children who participated in the cognitive interviews spent approximately 1 hour with the interviewer, with some children (for example, younger children who took breaks) requiring additional time. In general, even children as young as 8 could understand the majority of the items (293/318 = 92%) and response options, indicating that they could think about and discuss their own health. Although younger children had more difficulty with specific words, they understood the purpose of the items and response options and were able to suggest alternatives using their own vocabulary. They also had no difficulty understanding that they needed to answer questions while thinking about specific recall periods. Older children appeared to clearly understand the majority of items and response options, and had fewer comprehension difficulties than younger children.

Tables 4 and 5 outline common issues identified by participants. Specific words (e.g., irritable, stressed) were difficult for some children to comprehend, and items were sometimes too vague or ambiguous to be clearly understood. The majority of items (92%) were retained in the item banks for further large-scale testing.

There was no indication that children had difficulty with the response options, except that younger children seemed to misunderstand the word "difficulty". Because children were otherwise able to distinguish between the different response options when questioned, indicating that they could clearly identify variable levels of functioning, the word "difficulty" was changed to "trouble" in subsequent cognitive interviews. Additionally, 48 of 53 children (91%) reported that the 7-day recall period meant the previous 7 days, and they responded to items accordingly. A subset of children was probed on present- versus past-tense preferences for the item stems; 8 preferred the present tense, 8 preferred the past tense, and 9 had no stated preference when referring to the past 7 days. Participants had an overall positive opinion of the items and did not suggest any additional content beyond what was included in the current item banks.

Discussion

These results confirm that children ages 8–17 can talk about and respond to items asking them about their health and well-being. They can also offer unique insight into the understandability of the items. These findings are consistent with other studies [5, 6]. The majority of the items were well comprehended by all age groups, but we also identified several terms that were not well understood by younger children. Items containing difficult words or vague concepts were readily identified by the children and led to important questionnaire changes.

We also received valuable feedback on the format of the questionnaire, including increasing the font size for ease of readability, shortening the instructions, and putting the recall period in bold type. Some items were not applicable to certain children; for example, one child did not have a computer at home and therefore could not answer items related to computer use. Similarly, items asking about walker or wheelchair use were not applicable to the majority of children interviewed, so feedback on these items was limited.

The sample included a nearly equal distribution of children across age groups and represented a diverse population. One benefit of the sample is that it included a number of children with asthma, ensuring that comments from children with the most common chronic disease of childhood in the United States were represented. The sample was also well balanced for socioeconomic status and race/ethnicity, which is a strength of this study.

Our study is similar to other cognitive interview studies conducted for children's PRO instrument development. For example, we found that younger children had more difficulty than older children understanding specific words, particularly "irritable", "nervous", and "worried". Children in our study also had difficulty understanding ambiguous terms or phrases such as "did things" and "activities". These findings are consistent with other studies of child-reported health outcomes [5, 18, 19]. Additionally, as in other studies, the children in our study reported few issues with response formats using up to 5 response options and were able to respond to items within the recall period [5]. On occasion, the PROMIS pediatric item development team had to decide what to do when a word was not well understood by some children but no suitable synonym or content description was available as a substitute. For example, the concept of "worry" is important content for the anxiety domain, so it was kept in the item bank even though some children noted problems. These items will be reviewed again after large-scale testing is completed, and final decisions about them will be made at that time.

Our study has several limitations. First, each item received a minimum of 5 cognitive interviews. Although we felt this was sufficient, some authors suggest that 10–15 interviews are preferable [9]. Because of experience on previous scale development projects with very similar items [5, 18, 19], we felt comfortable performing fewer interviews per item. In addition, because only a minimal number of children ages 8 or 9 were required to review each item set, some important findings for this age group could have been missed. Second, as with any qualitative study, the item development team had to make judgments about the importance of an item problem and whether revisions were necessary. We tried to adhere to the operationalization of two negative comments leading to revision, but all such judgments are inherently qualitative. Our team, however, was focused on identifying the clearest and most important items for inclusion and carefully responded to all of the feedback from the children. Finally, the interview questions about content validity were phrased very broadly and did not add information beyond our previous studies utilizing focus groups [4].

Conclusion

Overall, the findings of the cognitive interviews suggest that children as young as 8 years can respond to items and talk about all aspects of their health and well-being in meaningful ways. They are able to comprehend varying response options on a categorical scale and can accurately respond to items using a 7-day recall period. Feedback from the children who participated was valuable in creating a set of items that can be administered across a wide age range of children. The final item set generated through the cognitive interview process is currently undergoing large-scale testing as part of the PROMIS Pediatric Item Bank development process.

Abbreviations

PROMIS: Patient Reported Outcomes Measurement Information System

PROs: Patient-reported outcomes

S&W: Scott and White

UNC: University of North Carolina

WRAT: Wide Range Achievement Test-3 Reading Subtest

PedsQL™: Pediatric Quality of Life Inventory™

References

1. Ader DN: Developing the Patient-Reported Outcomes Measurement Information System (PROMIS). Medical Care 2007, 45(Suppl 1):S1–S2. 10.1097/01.mlr.0000260537.45076.74

2. Cella D, Yount S, Rothrock N, Gershon R, Cook K, Reeve B, Ader DN, Fries JF, Bruce B, Rose M: The Patient-Reported Outcomes Measurement Information System (PROMIS): Progress of an NIH Roadmap Cooperative Group during its first two years. Medical Care 2007, 45(Suppl 1):S3–S11. 10.1097/01.mlr.0000258615.42478.55

3. DeWalt D, Rothrock N, Yount S, Stone AA, PROMIS Cooperative Group: Evaluation of Item Candidates: The PROMIS qualitative item review. Medical Care 2007, 45(Suppl 1):S12–S21. 10.1097/01.mlr.0000254567.79743.e2

4. Walsh TR, Irwin DE, Meier A, Varni JW, DeWalt D: The use of focus groups in the development of the PROMIS Pediatric Item Bank. Quality of Life Research 2008, 17:725–735. 10.1007/s11136-008-9338-1

5. Varni JW, Seid M, Rode CA: The PedsQL™: Measurement model for the Pediatric Quality of Life Inventory. Medical Care 1999, 37:126–139. 10.1097/00005650-199902000-00003

6. Rebok G, Riley A, Forrest CB, Starfield B, Green BF, Robertson J, Tambor E: Elementary school-aged children's reports of their health: A cognitive interviewing study. Quality of Life Research 2001, 10:59–70. 10.1023/A:1016693417166

7. Woolley ME, Bowen GL, Bowen NK: The development and evaluation of procedures to assess child self-report item validity. Educational and Psychological Measurement 2006, 66:687–700. 10.1177/0013164405282467

8. Varni JW, Seid M, Kurtin PS: PedsQL™ 4.0: Reliability and validity of the Pediatric Quality of Life Inventory version 4.0 Generic Core Scales in healthy and patient populations. Medical Care 2001, 39(8):800–812. 10.1097/00005650-200108000-00006

9. Willis GB: Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage Publications; 2005.

10. Jobe JB: Cognitive psychology and self-reports: Models and methods. Quality of Life Research 2003, 12:219–227. 10.1023/A:1023279029852

11. Tourangeau R, Rips LJ, Rasinski K: The Psychology of Survey Response. Cambridge: Cambridge University Press; 2000.

12. Sudman S, Bradburn NM, Schwarz N: Thinking About Answers: The Application of Cognitive Processes to Survey Methodology. San Francisco: Jossey-Bass; 1996.

13. Schwarz N, Sudman S (eds): Answering Questions: Methodology for Determining Cognitive and Communicative Processes in Survey Research. San Francisco: Jossey-Bass; 1996.

14. Chan KS, Mangione-Smith R, Burwinkle TM, Rosen M, Varni JW: The PedsQL™: Reliability and validity of the short-form generic core scales and asthma module. Medical Care 2005, 43:256–265. 10.1097/00005650-200503000-00008

15. Guyatt GH, Juniper EF, Griffith LE, Feeny DH, Ferrie PJ: Children and adult perceptions of childhood asthma. Pediatrics 1997, 99(2):165–168. 10.1542/peds.99.2.165

16. Juniper EF, Guyatt GH, Feeny DH, Ferrie PJ, Griffith LE, Townsend M: Measuring quality of life in children with asthma. Quality of Life Research 1996, 5(1):35–46. 10.1007/BF00435967

17. Wilkinson GS: WRAT3 Wide Range Achievement Test Administration Manual. Wilmington: Wide Range, Inc; 1993.

18. Woolley ME, Bowen GL, Bowen NK: Cognitive Pretesting and the Developmental Validity of Child Self-Report Instruments: Theory and Applications. Research on Social Work Practice 2004, 14:191–200. 10.1177/1049731503257882

19. Ravens-Sieberer U, Gosch A, Rajmil L, Erhart M, Bruil J, Duer W, Auquier P, KIDSCREEN Group: KIDSCREEN-52 Quality of Life Measure for Children and Adolescents. Expert Review of Pharmacoeconomics and Outcomes Research 2005, 5:353–364. 10.1586/14737167.5.3.353


Acknowledgements

We would like to acknowledge Jin-Shei Lai PhD, Esi DeWitt MD, Kelli Scanlon, Kelly Williams, and Tasha Burwinkle PhD for their contributions to reviewing items and cognitive interview data. We would also like to acknowledge the contribution of Harry A. Guess, MD, PhD, to the conceptualization and operationalization of this research prior to his death.

This work was funded by the National Institutes of Health through the NIH Roadmap for Medical Research, Grant 1U01AR052181-01. Information on the Patient-Reported Outcomes Measurement Information System (PROMIS) can be found at http://nihroadmap.nih.gov/ and http://www.nihpromis.org.

Author information

Corresponding author

Correspondence to Debra E Irwin.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

All authors have made substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data; have been involved in drafting the manuscript or revising it critically for important intellectual content; and have given final approval of the version to be published.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Irwin, D.E., Varni, J.W., Yeatts, K. et al. Cognitive interviewing methodology in the development of a pediatric item bank: a patient reported outcomes measurement information system (PROMIS) study. Health Qual Life Outcomes 7, 3 (2009). https://doi.org/10.1186/1477-7525-7-3
