Does the Structure of Clinical Questions Affect the Outcome of Curbside Consultations With Specialty Colleagues?
George R. Bergus, MD;
Christina S. Randall, PhD;
Suzanne D. Sinift, MA;
David M. Rosenthal, PhD
Arch Fam Med. 2000;9:541-547.
ABSTRACT
Background Clinical questions frequently arise during the practice of medicine, and primary care physicians often use curbside consultations with specialty physicians to answer them. It is hypothesized that well-formulated clinical questions are more likely to be answered and less likely to receive a recommendation for formal consultation.
Objective To assess the relationship between the structure of clinical questions asked by family physicians and the response of specialty physicians engaged in curbside consultations.
Design and Participants A case series of clinical questions asked during informal consultations between 60 primary care and 33 specialty physicians using an e-mail service. Curbside consultation questions were sent, using e-mail, to academic specialty physicians by primary care physicians (faculty, residents, and community practitioners) in eastern Iowa.
Main Outcome Measures Questions were analyzed to determine the clinical task and to identify 3 components: an intervention, a comparison, and an outcome. Consultants' responses were analyzed to identify whether questions were answered and whether consultants recommended formal consultation.
Results There were 708 questions in this analysis: 278 (39.3%) were diagnosis questions, 334 (47.2%) were management questions, 57 (8.0%) were prognosis questions, and 39 (5.5%) were requests for direction. Clinical questions were less likely to go unanswered or receive a recommendation for formal consultation when the question identified the proposed intervention (odds ratio, 0.54; 95% confidence interval, 0.34-0.86; P=.006) and desired outcome (odds ratio, 0.46; 95% confidence interval, 0.29-0.69; P<.001). Only 271 (38.3%) of 708 curbside consult questions identified both of these components.
Conclusion Medical specialists' responses to curbside consultation questions seem to be affected by the structure of these clinical questions.
INTRODUCTION
CLINICAL QUESTIONS frequently arise during the practice of medicine.1 Depending on the research methods used, primary care physicians generate between 0.7 and 18.5 questions for every 10 patients cared for in the office setting.2-3 When unanswered, these questions represent knowledge gaps that potentially affect the quality of medical care.4-5
Recently, the structure of clinical questions has gained attention, and health professionals are advised to formulate clinical questions using a standardized approach.6-7 Well-formulated questions are thought to be those identifying an intervention of concern (eg, a treatment or a diagnostic test), the hoped-for outcome of the intervention, and, if applicable, a comparison intervention. Questions containing these components are hypothesized to be more "answerable."
The importance of well-formulated questions has largely been discussed in the context of using the medical literature to fill knowledge gaps.6-8 However, primary care physicians infrequently use literature searches9-12 and are more likely to use informal consultations to answer their clinical questions.3, 9, 12 Thus, the curbside consult is an important means of answering clinical questions.3, 9-11, 13-15
We thought it possible that the structure of a clinical question would affect the response of a specialty physician involved in a curbside consultation. We hypothesized that well-formulated clinical questions would be more likely to obtain definite answers. By better understanding the relationship between the structure of clinical questions and the responses of specialty colleagues engaged in curbside consults, it might be possible for primary care physicians to make better use of this information resource.
MATERIALS AND METHODS
The clinical questions used for this study were those asked by primary care physicians using an e-mail-based informal consultation service. The E-mail Consult Service (ECS) links primary care clinicians across Iowa with 33 specialty physicians and other health professionals (including a family therapist, a nutritionist, and a microbiologist) at the University of Iowa, Iowa City. The details of this service have been described previously.16 The ECS allows primary care physicians to send clinical questions to consultants specifically recruited for this service. Neither the primary care clinicians nor the e-mail consultants were advised on the structure of their e-mail communications. Users of the ECS were aware that their questions and answers were freely available to other primary care physicians using this service, but users were not aware of our specific study or hypotheses. All questions posed by 60 family physicians, practicing within the 1612-km² county where the medical school is located, sent via the ECS between May 1996 and May 1999 were analyzed for this project. The physicians were faculty members or residents in a family practice training program or full-time community practitioners.
PROCEDURES
Questions asked during e-mail consultations were identified and analyzed using a taxonomy based on that proposed by Sackett.7 Each question was parsed to identify the 3 components of the taxonomy: an intervention of interest, a comparison, and a clinical outcome. Questions were also placed into 1 of 8 task categories previously identified by Sackett et al: clinical findings, diagnostic tests, etiology, differential diagnosis, prognosis, treatment, prevention, and self-improvement. Questions about whether a specific patient required a formal consultation or to whom to refer the patient were placed into the category of self-improvement. A ninth task category, request for direction, was also created. Questions in this category contained a description of a clinical situation but did not identify 1 of the 8 clinical tasks. Instead, the primary care practitioner typically asked a general question, such as "What do you think?" "Any ideas?" "What would you suggest?"
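To make the coding scheme concrete, the following minimal sketch shows one way a coded question could be represented; the field and category names are our own illustrative choices, not part of the ECS or of the taxonomy beyond what Table 1 describes.

```python
# Illustrative only: a simple record for one coded curbside question.
from dataclasses import dataclass

TASK_CATEGORIES = [
    "clinical findings", "diagnostic tests", "etiology",
    "differential diagnosis", "prognosis", "treatment",
    "prevention", "self-improvement", "request for direction",
]

@dataclass
class CodedQuestion:
    task: str               # one of TASK_CATEGORIES
    has_intervention: bool  # question names an intervention of interest
    has_comparison: bool    # question names a comparison intervention
    has_outcome: bool       # question names the outcome being sought

# A treatment question that specifies an intervention, a comparison, and an outcome.
example = CodedQuestion("treatment", has_intervention=True,
                        has_comparison=True, has_outcome=True)
```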
Two of us (G.R.B. and C.S.R.) independently analyzed each question using the taxonomy shown in Table 1. All discrepancies were reviewed and discussed until consensus was reached. The questions were analyzed before the consultants' answers were reviewed. Responses from the consultants were then analyzed to identify whether a question was answered and whether a consultant recommended formal consultation. The κ value for whether the consultant answered a question was 0.73 and for whether the consultant requested a formal consultation was 0.86. Thus, there was substantial to near-perfect agreement between the reviewers on these end points.17 Last, we recorded whether the consultant requested additional information in response to a clinical question. The thoroughness and accuracy of the answers and the appropriateness of consultants' requests for formal consultation were not analyzed.
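For readers unfamiliar with the κ statistic, the sketch below computes Cohen's κ from two reviewers' categorical judgments using the standard formula; the ratings shown are invented for illustration and are not the study codings.

```python
# A minimal Cohen's kappa calculation for two raters (illustrative data only).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two equal-length lists of labels."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Two raters coding 10 consultant responses as answered (1) or not (0).
print(cohens_kappa([1, 1, 1, 0, 1, 1, 0, 1, 1, 0],
                   [1, 1, 1, 0, 1, 0, 0, 1, 1, 1]))  # about 0.47
```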
Table 1. Taxonomy of Medical Questions for the E-mail Consult Service Structure of Questions Project*
Eighty-nine questions that were not about specific patients were excluded from this study because they could not result in the recommendation that a specific patient be sent for consultation. The 62 questions in the domain of self-improvement were also excluded because almost all pertained to referral. In addition, because we hypothesized that the training status of the physician (board certified vs postgraduate resident) might impact the response of the consultant, we excluded 31 questions that could not be attributed to a specific individual.
END POINTS
Outcomes of interest included whether a question was answered or whether the consultant requested a formal consultation. A priori, we also decided to combine these end points into a single outcome for a third analysis. We reasoned that when consultants had difficulty understanding a question they might not answer the question or might handle the situation by recommending formal consultation. In addition, we concluded, based on interviews with the primary care physicians who used the ECS, that both outcomes were considered "nondefinitive" from the primary care physician's perspective and that these were outcomes they wanted to avoid.
STATISTICAL ANALYSIS
Univariate associations between the individual characteristics of the questions and the end points were analyzed using χ² and t tests. Multivariate analyses to investigate the associations between the components of a question (intervention, comparison, and outcome) and the consultants' responses were undertaken using logistic regression. A logistic model was also created that included the components of the questions, the training status of the questioner, whether the consultant believed there was adequate clinical information (assessed by whether the consultant asked for additional information), and the specialty domain of the consultants. Consultants were categorized into the domains of adult medicine, pediatrics, obstetrics and gynecology, surgery, and other (Table 2).
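As a rough illustration (not the authors' code), such a model could be fit with the statsmodels package as follows; the data file and the indicator column names are hypothetical stand-ins for the coded questions.

```python
# Sketch of the logistic model for a nondefinitive response; hypothetical data set.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per question: 0/1 indicators for question components, trainee status,
# a request for more information, plus the consultant's specialty domain.
df = pd.read_csv("questions.csv")

model = smf.logit(
    "nondefinitive ~ has_intervention + has_comparison + has_outcome"
    " + is_trainee + asked_for_more_info + C(specialty_domain)",
    data=df,
).fit()

print(model.summary())        # coefficients on the log-odds scale
print(np.exp(model.params))   # odds ratios for each predictor
```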
Table 2. Frequency Table of the Questions Posed Using the E-mail Consult Service by Specialty Domains and Consultants' Responses
A quality-of-question score was generated based on the components associated with the consultants' responses using logistic regression. Individual components were given scores weighted by their odds ratios (ORs) and summed for each question. The relationship between these question scores and the outcomes was further assessed using the Armitage test for trend in proportions. A t test was used to compare the quality-of-question scores of board-certified physicians and trainees. Analyses were performed using statistical software (NCSS 2000; NCSS Statistical Software, Kaysville, Utah).
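The quality score and the trend test can be sketched as follows; this uses the standard Cochran-Armitage trend statistic (without continuity correction), weights the two significant components equally as described in the "Results" section, and the counts in the example are invented for illustration only.

```python
# Quality-of-question score and a test for trend in proportions (illustrative).
import numpy as np
from scipy.stats import norm

def quality_score(has_intervention, has_outcome):
    # Each component associated with the end points contributes 1 point (range 0-2).
    return int(has_intervention) + int(has_outcome)

def armitage_trend(events, totals, scores):
    """Cochran-Armitage test for a linear trend in proportions across ordered groups."""
    events, totals, scores = map(np.asarray, (events, totals, scores))
    N, Y = totals.sum(), events.sum()
    p_bar = Y / N                                   # overall event proportion
    x_bar = (totals * scores).sum() / N             # weighted mean score
    T = (events * (scores - x_bar)).sum()
    var_T = p_bar * (1 - p_bar) * (totals * (scores - x_bar) ** 2).sum()
    z = T / np.sqrt(var_T)
    return z, 2 * norm.sf(abs(z))                   # two-sided P value

# Invented counts of nondefinitive responses for score groups 0, 1, and 2.
z, p = armitage_trend(events=[30, 40, 20], totals=[100, 300, 200], scores=[0, 1, 2])
print(f"z = {z:.2f}, two-sided P = {p:.3g}")
```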
RESULTS
There were 708 questions in this analysis: 278 (39.3%) in the area of diagnosis, 334 (47.2%) in the area of management, and 57 (8.0%) in the area of prognosis; 39 questions (5.5%) were categorized as requests for direction because they did not identify the clinical area or task of concern and forced the consultant to formulate the question that needed to be answered. Three hundred eighty-one questions (53.8%) were posed by board-certified family physicians, and the remaining 327 (46.2%) were posed by postgraduate physician trainees.
Five hundred nine questions (71.9%) identified an intervention, 200 (28.2%) contained a comparison intervention, and 343 (48.4%) identified the sought after outcome. One hundred twenty-two questions (17.2%) specified none of these elements, 224 (31.6%) specified one, 258 (36.4%) specified 2, and 104 (14.7%) specified all 3.
The consultants answered all but 48 (6.8%) of the curbside consult questions and recommended formal consultation in response to 86 of the questions (12.1%). In total, 121 questions (17.1%) posed by primary care practitioners resulted in a nondefinitive outcome, meaning that the question went unanswered or received a recommendation for formal consultation. Board-certified physicians were more likely to receive answers to their questions than were trainees (95.8% vs 90.2%; P<.01), but consultants recommended formal consultations at a similar rate (12.3% vs 11.9%; P=.87) for these 2 groups. Both groups of primary care physicians also received a similar percentage of nondefinitive responses (15.5% vs 19.0%; P=.22). Consultants asked for additional information in response to 72 questions (10.2%) and were more likely to ask a trainee for additional information than a board-certified family physician (14.1% vs 7.1% of questions; P=.002).
Questions in which primary care providers identified the desired outcome were less likely to go unanswered than were those not identifying the desired outcome (OR, 0.51; 95% confidence interval [CI], 0.27-0.94; P=.03). The presence or absence of a proposed intervention or comparison intervention was not related to having the question answered by the consultant (P=.42 and P=.25, respectively). Questions were less likely to result in a recommendation for formal consultation when the question identified the proposed intervention (OR, 0.54; 95% CI, 0.34-0.86; P=.006) and desired outcome (OR, 0.46; 95% CI, 0.29-0.72; P=.004). The presence or absence of a comparison intervention was not related to this outcome (P=.12).
A curbside consult question was less likely to go unanswered or to receive a recommendation for a formal consultation when the question identified the proposed intervention (OR, 0.54; 95% CI, 0.34-0.86; P=.006) and desired outcome (OR, 0.46; 95% CI, 0.29-0.69; P<.001). The presence of a comparison intervention was not related to this outcome (P=.48). The areas of the clinical task, the training status of the physician asking the question, and whether the consultant requested additional information were not related to a nondefinitive outcome (P>.05 for all factors). The specialty domain of the consultant was not associated with nondefinitive outcomes except for surgical consultants. Compared with specialists in adult medicine, surgeons were more likely to respond with nondefinitive answers (OR, 3.0; 95% CI, 1.8-5.2; P<.001). Surgeons were similar to other specialties in their frequency of not answering questions (P=.85) but were much more likely to recommend a formal consultation (P<.001).
Further analysis helped detail the association between specifying an intervention or an outcome in a question and the response of the consultants. Because the ORs for these components and obtaining a nondefinitive outcome were similar, both of these question components were given a value of 1. Thus, questions were given a score of 0 if neither an intervention nor an outcome was specified, a score of 1 if either was specified, and a score of 2 if both were specified. Examples of questions taken from curbside consults, and their associated scores, are shown in Table 3. The presence of a comparison intervention was not included in this model because this component was not significantly associated with any of the end points.
Table 3. Examples of Clinical Questions Asked by Family Physicians and Their Associated Quality Scores*
There was a strong association between the total quality score of a question and whether there was no answer or a recommendation for formal consultation (Armitage test for trend, P<.001). When neither question component was present, 29.4% of the questions went unanswered or received a recommendation for a formal consultation. When both components were present in a question, only 10.0% of the questions resulted in a nondefinitive outcome (Table 4). Questions with a score of 0 and categorized as a request for direction were more likely to result in a nondefinitive outcome than were questions with a score of 0 that identified the clinical task of concern (39.5% vs 25.0%). However, this difference did not achieve statistical significance (P=.10).
Table 4. Association Between the Presence of an Identifiable Intervention or Outcome in a Clinical Question and the Consultant's Response
Questions posed to consultants in the 5 specialty domains all had similar quality scores (P=.24). Trainees tended to ask questions with a slightly lower quality score compared with board-certified practitioners (mean score, 1.15 vs 1.25; P=.047). Both groups were equally likely to state an intervention in their questions (P=.74), but board-certified physicians were more likely to state the desired outcome (P=.002). Trainees and board-certified providers were equally likely to ask a request for direction question (4.3% vs 6.1% of their questions, respectively; P=.23).
COMMENT
Consultants' responses to informal or curbside consultation questions from primary care physicians were strongly associated with the structure of the clinical questions. Primary care physicians were more likely to obtain an answer and less likely to get a recommendation for a formal consultation when their questions clearly identified a proposed intervention and the desired outcome. An example of such a question is "Will the addition of a β-blocker lengthen the life of a 58-year-old woman with moderate congestive heart failure who is already taking an angiotensin-converting enzyme inhibitor?" This question can be compared with a less well-formulated one such as "What should I do for a 58-year-old woman with moderate congestive heart failure who is already taking an angiotensin-converting enzyme inhibitor?" Nearly 30% of the questions that did not clearly identify an intervention and desired outcome went unanswered or received a recommendation for a formal consult. In contrast, when both were specified, only 10% of clinical questions resulted in this outcome.
The association between the structure of questions and consultants' responses was independent of the training status of the asking physician and uniform over most consulting domains. Although our findings suggest that primary care physicians can affect the consultants' answers by how they structure their clinical questions, we also found that primary care clinicians did not routinely ask well-formulated clinical questions. Overall, about 40% of the curbside consult questions clearly identified an intervention and desired outcome, and there was little difference in how experienced clinicians and physicians still in training structured their questions. This finding suggests that how questions are structured might not be related to general medical knowledge and that even experienced physicians may benefit from training in structuring their clinical questions. (A tutorial on formulating clinical questions is available on the Internet at http://fpinfo.medicine.uiowa.edu/tutorial/intro_questions.htm.)
The demonstrated associations are consistent with the literature on problem solving. It has long been held that formulating an answerable question is a fundamental problem-solving skill because well-structured problems are more solvable than are ill-structured ones.18-20 Well-formulated clinical questions might require less effort on the consultant's part to answer and thus are more likely to be answered. The association between the structure of a question and the recommendation for a formal consultation might be linked to perception of expertise. An attribute of expertise, including expertise in medicine, is the ability to formulate well-defined expressions of ill-defined problems.21-22 Consultants might interpret a primary care physician's ability to structure a well-defined question about a clinical problem as evidence that the physician has sufficient expertise to manage the problem. This explanation deserves further study.
Limitations of our research need to be noted. The first is that the clinical questions studied in this analysis were posed using an e-mail-based service. It is possible that consultants respond differently to e-mail questions than they do in person or on the telephone.23 When a clinical question is posed using e-mail, consultants have less immediate access to additional clinical information but more time to ponder the question and compose an answer. Whether the structure of clinical questions affects the responses of consultants engaged in face-to-face exchanges with primary care physicians deserves study.
Second, this is an observational study. Although we documented an association between the structure of a question and a consultant's response, we cannot determine whether the association is one of cause and effect. However, the association between the quality of question and the responses of consultants is strong and hierarchical in nature. In addition, the association between the quality of question and consultants' responses is independent of the training status of the questioner or the clinical task.
Third, we did not assess the satisfaction of the consultants with how individual questions were formulated by family physicians or whether their answers satisfied individual questioners. We also do not have information on whether the consultant's recommendations were followed and, therefore, cannot assess the association between the structure of clinical questions and clinical outcomes of patients.
In conclusion, the structure of questions asked during curbside consultations was associated with whether consultants answered a question or requested a formal consultation. Only 38% of questions contained the 2 key components of well-structured questions. Although experienced family physicians asked slightly higher-quality questions than did trainees, our findings suggest that many physicians might benefit from additional training on how to ask clinical questions.
AUTHOR INFORMATION
Accepted for publication February 14, 2000.
This project was supported in part by Grant for Graduate Training 2 5D15PE10299 from the Health Resources and Services Administration, Department of Health and Human Services, Rockville, Md.
We thank the many faculty and staff members of the University of Iowa, Iowa City, who expertly served as e-mail consultants. Their enthusiastic willingness to engage in this new form of curbside consultation made this project possible. We also thank the Information Systems staff members at the University of Iowa College of Medicine for their many hours of technical support and problem solving in support of the E-mail Consult Service.
Corresponding author and reprints: George R. Bergus, MD, Department of Family Medicine, the University of Iowa College of Medicine, 200 Hawkins Dr, 01105 PFP, Iowa City, IA 52242 (e-mail: george-bergus@uiowa.edu).
From the Department of Family Medicine, University of Iowa College of Medicine, Iowa City.
REFERENCES
1. Smith R. What clinical information do doctors need? BMJ. 1996;313:1062-1068.
2. Timpka T, Arborelius E. The GP's dilemmas: a study of knowledge need and use during health care consultations. Methods Inf Med. 1990;29:23-29.
3. Ely JW, Burch RJ, Vinson DC. The information needs of family physicians: case-specific clinical questions. J Fam Pract. 1992;35:265-269.
4. Chambliss ML, Conley J. Answering clinical questions. J Fam Pract. 1996;43:140-144.
5. Gorman PN, Ash J, Wykoff L. Can primary care physicians' questions be answered using the medical journal literature? Bull Med Libr Assoc. 1994;82:140-146.
6. Richardson WS, Wilson MC, Nishikawa J, Hayward RS. The well-built clinical question: a key to evidence-based decisions [editorial]. ACP J Club. 1995;123:A12-A13.
7. Sackett DL. Evidence-Based Medicine: How to Practice and Teach EBM. New York, NY: Churchill Livingstone Inc; 1997.
8. Ebell M. Information at the point of care: answering clinical questions. J Am Board Fam Pract. 1999;12:225-235.
9. Ely JW, Osheroff JA, Ebell MH, et al. Analysis of questions asked by family doctors regarding patient care. BMJ. 1999;319:358-361.
10. Haug JD. Physicians' preferences for information sources: a meta-analytic study. Bull Med Libr Assoc. 1997;85:223-232.
11. Cullen R. The medical specialist: information gateway or gatekeeper for the family practitioner. Bull Med Libr Assoc. 1997;85:348-355.
12. Connelly DP, Rich EC, Curley SP, Kelly JT. Knowledge resource preferences of family physicians. J Fam Pract. 1990;30:353-359.
13. Dee C, Blazek R. Information needs of the rural physician: a descriptive study. Bull Med Libr Assoc. 1993;81:259-264.
14. Keating NL, Zaslavsky AM, Ayanian JZ. Physicians' experiences and beliefs regarding informal consultation. JAMA. 1998;280:900-904.
15. Kuo D, Gifford DR, Stein MD. Curbside consultation practices and attitudes among primary care physicians and medical subspecialists. JAMA. 1998;280:905-909.
16. Bergus GR, Sinift SD, Randall CS, Rosenthal DM. Use of an e-mail curbside consultation service by family physicians. J Fam Pract. 1998;47:357-360.
17. Sackett DL. Clinical Epidemiology: A Basic Science for Clinical Medicine. 2nd ed. Boston, Mass: Little Brown & Co Inc; 1991.
18. Schoenfeld AH. Learning to think mathematically: problem solving, metacognition, and sense-making in mathematics. In: Grouws D, ed. Handbook of Research on Mathematics Teaching and Learning. New York, NY: Macmillan Publishing Co Inc; 1992:334-370.
19. Pólya G. How to Solve It: A New Aspect of Mathematical Method. 2nd ed. Garden City, NY: Doubleday & Co Inc; 1957.
20. Heylighen F. Formulating the problem of problem-formulation. In: Trappl R, ed. Cybernetics and Systems '88. Dordrecht, the Netherlands: Kluwer Academic Publishers; 1988:949-957.
21. Bedard J, Chi MT. Expertise. Curr Directions Psychol Sci. 1992;1:135-139.
22. Elstein AS, Shulman LS, Sprafka SA. Medical Problem Solving: An Analysis of Clinical Reasoning. Cambridge, Mass: Harvard University Press; 1978.
23. Golub RM. Curbside consultations and the viaduct effect [editorial; comment]. JAMA. 1998;280:929-930.