Science Question

Description

Part 1, Discussion 2
Please refer to Slide 7 of the Week 3 slide deck for this reply. If you had to make an informed decision, as a health professional, in moving forward on a treatment, intervention, or policy, which 1 or 2 of these considerations would be important for you, and why?


Unformatted Attachment Preview

1
Thinking Scientifically:
Evidence-Based Practices
Evidence-Based Public Health (MEDS 4053)
Kelley A. Carameli, DrPH
Week 3
2
What is Evidence?
Defining Evidence
• Evidence: The available facts or information on whether a belief or
concept is true or valid.
▪ Law: expert/witness testimony, forensics
▪ Medicine: epidemiologic (quantitative), experts
▪ Public Health: epidemiologic (quantitative), program or policy
evaluations, qualitative (key informants, document studies, clinical
or public health practice)
• Evidence is used in decision-making and priority setting.
• Evidence is most useful and effective when:
▪ Based on systematic assessment or evaluation
▪ Data are timely and relevant to the issue at hand
▪ Real-world application
▪ Triangulated: quantitative, qualitative, and cultural/geographic
3
Evidence: Informed by Science
How do we obtain public health evidence?
• By using science, or scientific methods.
What is science, or the scientific method?

A way of systematic thought and investigation to obtain reliable information.
Obtaining knowledge through evidence. Communicating knowledge to
others. Using methods that can be replicated (checked, valid, objective).

A process of thinking through and answering questions: formulating questions, then applying a set of rules of inquiry to answer them.
Posing questions to understand relationships. Testing the proposed
relationship against reality. Determining if something happened.
• Additional considerations in scientific inquiry:
– Reasoned judgment: using the best available knowledge to inform
decision-making in the absence of complete evidence.
– Opinion: personal view of reality. Custom: social influence on reality.
4
Thinking Scientifically in Health
Thinking Scientifically in Public Health
• A way of systematic thought and investigation to obtain reliable information. (We often start and stop here.)
• A process of thinking through and answering questions: formulating questions, then applying a set of rules of inquiry to answer them. (We also need to progress to here.)
5
Thinking Scientifically in Health
When to Apply Evidence-Based Approaches
• Use evidence-based (scientific) approaches in public health:
– when conducting literature reviews for grant proposals
– when evaluating the effectiveness and cost-benefit of health programs
– when establishing new health programs
– when policies are being implemented
– when scientific evidence is important to support decision-making.
Are there ever disadvantages to using an evidence-based approach?
Case example: A public health department wants to be innovative in establishing new programs. It is particularly interested in quickly developing a program to address the bullying of transgender youth in its county, even though evaluative information on successful interventions for this group is not readily available. Should it proceed without evidence? What are the advantages and/or disadvantages?
6
When to Balance Science + Action
Balancing Evidence/Scientific Approaches + Public Health Actions
• Be realistic… not every decision will have robust evidence. Instead:
– Use and weight all available information
• Magnitude of the problem, known risk determinants?
• Stakeholder opinion, existing practices/traditions?
– Balance fidelity to the program's original design against reinvention for a new setting or local context
• Recognize trade-offs: urgent action may limit the ability to make evidence-based decisions, but poorly informed action may also be hard to change
• Considerations for decision-makers:
• Size/scope of problem
• Intervention effectiveness
• Intervention cost, value, alternatives
• Equitable
• Preventability
• Benefits and Harms
• Acceptability (culture, values)
• Sustainable, Appropriate
7
When to Balance Science + Action
Considerations in evidence-based decision making:
• Size/scope of problem: Is it important? What is the public health burden?
• Intervention effectiveness: Does it work in real-world settings? Is there better evidence for an alternative?
• Intervention cost: Is it affordable?
• Equitable: Does it distribute resources fairly?
• Preventability: What is the efficacy? Can it work in an “ideal” circumstance?
• Benefits and Harms: What are the consequences? Trade-offs?
• Acceptability: Is it consistent with community priorities, culture, values, the political situation?
• Sustainable, Appropriate: Are resources and incentives likely needed to support/maintain the intervention? Is it likely to work in this setting? Others?
Anderson et al., 2005, American Journal of Preventive Medicine
8
Benefits + Limits of Using Evidence
Benefits
• Greater likelihood of implementing more effective policies/programs
– Better use of resources (staff, materials, time, etc.)
– More informed (and productive) public health workforce
• Greater likelihood of impacting change in public health issues
– Change informed by evidence
– Change may be reproducible (systematic)
• Access to higher-quality information. Learn what works!
Limits
• Evidence shown for one setting or time may not translate to another
– Social context/culture shapes behavior, as does new data
– Pap/prostate screening, DARE
• Some outcomes easier to measure
– Standard: tobacco use, vaccines
– Mixed: cultural competency, health literacy
• Socio-political values shape public health and what we measure
– Risk-reduction vs. abstinence (sexual health, substance use)
– Natural experiments vs. randomized control groups
9
Looking at the Scientific Evidence
Scientific Evidence and Public Health Action
• The higher-quality information needed for public health action comes from
research studies or program evaluations to learn what ‘works’:
1. Understand etiologic links between behaviors and health.
2. Develop testable methods (valid, reliable) for measuring behavior.
3. Identify the factors that influence the behavior.
4. Determine whether public health interventions are successful.
5. Translate/Disseminate research findings into practice.
Most public health actions (programs, policies) are based on the
presumption that the behavior-health relationships
(i.e., etiological links) are causal.
Type 1 Evidence
10
Looking at the Scientific Evidence
Levels of Evidence-Based Data
Type 1: Something should be done. (Most common: Medicine/Epi.)
Type 2: Specifically, this should be done. (Public health practitioners have most interest here.)
Type 3: Context for how an intervention is done.
From Brownson RC, et al. Evidence-Based Public Health: A Fundamental Concept for Public Health Practice, 2009.
11
Looking at the Scientific Evidence
Scientific Evidence and Public Health Action
• The higher-quality information needed for public health action comes from
research studies or program evaluations to learn what ‘works’:
1. Understand etiologic links between behaviors and health.
2. Develop testable methods (valid, reliable) for measuring behavior.
3. Identify the factors that influence the behavior.
4. Determine whether public health interventions are successful.
5. Translate/Disseminate research findings into practice.
Most public health actions (programs, policies) are based on the
presumption that the behavior-health relationships
(i.e., etiological links) are causal.
Type 1 Evidence
12
Looking at the Scientific Evidence
Levels of Evidence-Based Data
Type 1 “Something should be done”
• Size, strength, or causal relationship between the behavior (or risk
factor/determinant) and health (or disease)
– Magnitude of issue: number, incidence, prevalence
– Severity: morbidity, mortality, disability
– Preventability: deaths averted, effectiveness, economic impact
• Clinical designs (randomized, experimental) focused on evidence of causality:
– Consistency: association is observed in different settings, populations, methods.
– Strength: size of the relative risk estimate (see the sketch at the end of this slide).
– Temporality: time relationship between risk factor onset and disease onset.
– Dose-response: dose of the exposure and magnitude of relative risk estimate.
– Biologic plausibility: biological mechanism between risk factor and disease
outcome.
– Experimental evidence: findings from a prevention trial; random assignment.
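A brief notational sketch of the “strength” and “dose-response” criteria above (the symbols are illustrative, not part of the slide): for a study comparing exposed and unexposed groups, the relative risk is

RR = [a / (a + b)] / [c / (c + d)]

where a and b are the counts with and without the disease among the exposed, and c and d the corresponding counts among the unexposed. An RR near 1 suggests little or no association, values well above 1 indicate a strong association (as repeatedly reported for smoking and lung cancer), and a dose-response pattern would show RR rising as the level of exposure increases.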
13
Looking at the Scientific Evidence
Levels of Evidence-Based Data
Type 1 “Something should be done”
• Size, strength, or causal relationship between the behavior (or risk
factor/determinant) and health (or disease)
– Magnitude of issue: number, incidence, prevalence
– Severity: morbidity, mortality, disability
– Preventability: deaths averted, effectiveness, economic impact
From Brownson RC, et al. Evidence-Based Public Health. New York: Oxford University Press, 2003.
14
Looking at the Scientific Evidence
Levels of Evidence-Based Data
Type 2 “This should be done” (specific intervention)
• Relative effectiveness of intervention on the risk factors.
• Population designs (non-experimental) to show evidence of intervention
effectiveness (intervention → increases childhood vaccination)
– Evidence-based (peer-reviewed, systematic, external validity)
– Efficacious (peer review, research-tested, external validity)
– Promising (formative or summative evaluation, theory-consistent, lacks peer review)
– Emerging (ongoing or in-progress evaluation, theory-consistent)
• Explores which intervention option (or combination) is more effective and/or cost-effective (see the sketch at the end of this slide)
– Client reminder, Community edu., Health insurance, Vaccines in schools
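A hedged sketch of one common way such comparisons are summarized (the figures below are invented for illustration, not from the slide): the incremental cost-effectiveness ratio of option A over option B is

ICER = (Cost_A − Cost_B) / (Effect_A − Effect_B)

For example, if a client-reminder program cost $60,000 and produced 400 additional vaccinated children while community education cost $40,000 and produced 300, the reminder program's ICER relative to education would be (60,000 − 40,000) / (400 − 300) = $200 per additional child vaccinated.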
15
Looking at the Scientific Evidence
Levels of Evidence-Based Data
Type 3 “How to take specific action”
(context of intervention)
• Adapt/translate evidence into
population-level intervention or policy.
• Program/policy may work in one
context, but not another. Consider
contextual domains (→).
• New programs/policies (i.e.,
innovations) may incur unintended
consequences of action.
– Political, social, or structural domains
– E.g., school vaccine policy → reduces measles rates
16
Sources of Scientific Evidence
Scientific evidence is relative to the time, culture, and context.
– Public health decisions may be based on the ‘best possible’ (reasoned
judgment) and not always the ‘best available’ evidence.
– Important to consider triangulated evidence (e.g., mixed-methods).
– Looking for an intervention’s ‘active ingredients’ – transferability by context.
Quantitative Evidence
• Shows how variables are related; large sample sizes
• Surveys, surveillance: If X, then Y and Z
Qualitative Evidence
• Explores why relationships exist; smaller sample sizes
• Interviews, case studies: Why Y? Why Z? What makes them similar/different?
17
Sources of Scientific Evidence
Analytic Tools for Obtaining Evidence Used in Public Health
Systematic Reviews/Guides
• Synthesis of existing or state-of-the-art research or literature
• Translate evidence to local action
Public Health Surveillance
• Ongoing, systematic data collection and analysis on disease/injury
• Data tracking (e.g., tobacco sales)
Economic Evaluation
• Relative value, cost-benefit of action
Health Impact Assessment
• Probable effect of public health policy/programs in non-health sectors
• Impact = “5 A-Day” on agricultural production
Participatory Approaches
• Soliciting stakeholder (local) input
18
How to Think Scientifically
Processes for “Thinking Scientifically” in Public Health
1. Define and quantify the issue
• What is the size of the public health problem?
2. Gather evidence to inform public health action
• When reviewing evidence, consider:
– What are the results? How precise? Similar across studies?
– Are the results valid? Is the assessment reproducible? Was the methodological quality sound?
– How can the results be applied to public health actions? Are the benefits worth the costs and potential risks?
3. Translate evidence into practice
• Are there effective programs for addressing this problem?
• What information about the local context is needed?
• Is this particular policy or program worth doing?
4. Disseminate evidence-based findings and practices
(Logic-model stages shown alongside the steps: Inputs → Process → Outputs → Outcome → Impact)
19
How to Think Scientifically
Processes for “Thinking Scientifically” in Public Health
Case Example: State Regulation/Firearm Homicide (Irvin et al., 2014)
20
How to Think Scientifically
Processes in “Thinking Scientifically” in Public Health
1. Define and quantify the issue (Inputs)
A. What is the health issue? (problem statement)
• Develop a concise statement of the issue being considered
▪ What is the issue and why care?
▪ Which population(s) are affected?
▪ What is the size/scope of the problem?
▪ What prevention opportunities currently exist?
▪ Who are the key stakeholders?
B. Quantify the issue (counts, incidence, prevalence; see the sketch at the end of this slide)
• Look to existing research for baseline data: descriptive data, vital statistics, surveillance systems, surveys/national studies
• What patterns exist? By person (gender, race/ethnicity), place (geography), or time (seasonal variation)?
C. Use the literature to shape the issue (logic model)
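A brief notational sketch for item B above (the numbers are invented for illustration, not from the slide): over a defined period,

incidence = (new cases during the period) / (population at risk during that period)
prevalence = (existing cases at a point in time) / (total population at that time)

For example, 50 new cases in a year among 10,000 people at risk gives an annual incidence of 0.005 (5 per 1,000), and 200 existing cases among 10,000 people gives a point prevalence of 2%.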
21
How to Think Scientifically
Processes in “Thinking Scientifically” in Public Health
2. Gather evidence to inform health, program, and/or policy change (Inputs)
• Look to existing research literature (peer-reviewed, testable)
• Initiate own research or evaluation
• Does this practice help alleviate the health issue? How? Why?
A. What factors affect the health issue? (hypothesis)
• Descriptive to understand why or how; single concept
• Relational to understand connections; multiple concepts
B. How to measure this relationship? (operationalization)
• Concept/Construct (real phenomena) → Metric/Variable (validity or ‘accuracy’, reliability or ‘consistency’; IV and DV)
• Look to existing research (and theory) for measurement options
C. What is learned from the data? (analysis, interpretation)
• Data relationships or trends/patterns (significance)
• Critical review: method, measures, theory, field comparisons
22
How to Think Scientifically
Processes in “Thinking Scientifically” in Public Health
3. Translate evidence into practice (evidence-based decision-making)
A. How can it be applied? (translational research)
• What are the “real world” applications learned from the literature?
– Prioritize findings.
– Identify barriers: resources, political, cultural.
• Blend what is known from medicine, public health, and other disciplines
• Incorporate input from community-based stakeholders (e.g., expert panels, policy makers, coalitions)
B. Develop an action plan and implement intervention(s)
• Consider short- and long-term goals or changes
C. Evaluate program or policy
• Apply quantitative and qualitative techniques
(Logic-model stages shown alongside: Process → Outputs → Outcome)
Processes for “Thinking Scientifically” in Public Health
Case Example: State Regulation/Firearm Homicide (Irvin et al., 2014)
3. Translate evidence into practice
• Any patterns?
• Does this affect our logic model?
23
24
How to Think Scientifically
Processes in “Thinking Scientifically” in Public Health
4. Disseminate evidence-based findings and practices
• Peer-reviewed journals, conferences/meetings
• Media, local interactions, word-of-mouth
• Policies, programs
• Consider these elements when sharing findings to enhance stakeholder decision-making →
(Logic-model stages shown alongside: Outputs → Outcome → Impact)
Evidence-Based Public Health: A Fundamental Concept for Public Health Practice
Ross C. Brownson,1 Jonathan E. Fielding,2 and Christopher M. Maylahn3
1 Prevention Research Center in St. Louis, George Warren Brown School of Social Work, Department of Surgery and Alvin J. Siteman Cancer Center, Washington University School of Medicine, Washington University in St. Louis, St. Louis, Missouri 63110; email: [email protected]
2 Los Angeles Department of Health Services, Los Angeles, California 90012; School of Public Health, University of California, Los Angeles, California 90095-1772; email: jfi[email protected]
3 Office of Public Health Practice, New York State Department of Health, Albany, New York 12237; email: [email protected]
Annu. Rev. Public Health 2009. 30:175–201
First published online as a Review in Advance on January 14, 2009
This article’s doi: 10.1146/annurev.publhealth.031308.100134
Copyright © 2009 by Annual Reviews. All rights reserved
0163-7525/09/0421-0175$20.00
Key Words: disease prevention, evidence-based medicine, intervention, population-based
Abstract
Despite the many accomplishments of public health, a greater attention to evidence-based approaches is warranted. This article reviews
the concepts of evidence-based public health (EBPH), on which formal
discourse originated about a decade ago. Key components of EBPH
include making decisions on the basis of the best available scientific
evidence, using data and information systems systematically, applying program-planning frameworks, engaging the community in decision making, conducting sound evaluation, and disseminating what is
learned. Three types of evidence have been presented on the causes of
diseases and the magnitude of risk factors, the relative impact of specific interventions, and how and under which contextual conditions interventions were implemented. Analytic tools (e.g., systematic reviews,
economic evaluation) can be useful in accelerating the uptake of EBPH.
Challenges and opportunities (e.g., political issues, training needs) for
disseminating EBPH are reviewed. The concepts of EBPH outlined in
this article hold promise to better bridge evidence and practice.
INTRODUCTION
Public health research and practice are credited
with many notable achievements, including
much of the 30-year gain in life expectancy in
the United States over the twentieth century
(124). A large part of this increase can be
attributed to provision of safe water and
food, sewage treatment and disposal, tobacco
use prevention and cessation, injury prevention, control of infectious diseases through
immunization and other means, and other
population-based interventions (34).
Despite these successes, many additional
opportunities to improve the public’s health
remain. To achieve state and national objectives for improved population health, more
widespread adoption of evidence-based strategies has been recommended (19, 57, 64, 109,
119). Increased focus on evidence-based public health (EBPH) has numerous direct and indirect benefits, including access to more and
higher-quality information on what works, a
higher likelihood of successful programs and
policies being implemented, greater workforce
productivity, and more efficient use of public
and private resources (19, 77, 95).
Ideally, public health practitioners should always incorporate scientific evidence in selecting
and implementing programs, developing policies, and evaluating progress (23, 107). Society pays a high opportunity cost when interventions that yield the highest health return
on an investment are not implemented (55). In
practice, intervention decisions are often based
on perceived short-term opportunities, lacking
systematic planning and review of the best evidence regarding effective approaches. These
concerns were noted two decades ago when
the Institute of Medicine determined that decision making in public health is often driven
by “crises, hot issues, and concerns of organized interest groups” (p. 4) (82). Barriers to
implementing EBPH include the political environment and deficits in relevant and timely
research, information systems, resources, leadership, and the required competencies (4, 7, 23,
78).
It is difficult to estimate how widely
evidence-based approaches are being applied.
In a survey of 107 U.S. public health practitioners, an estimated 58% of programs in
their agencies were deemed evidence-based
(i.e., using the most current evidence from peer-reviewed research) (51). This finding in public health settings appears to mirror the use
of evidence-based approaches in clinical care.
A random study of adults living in selected
metropolitan areas within the United States
found that 55% of overall medical care was
based on what is recommended in the medical literature (108). Thacker and colleagues
(159) found that the preventable fraction (i.e.,
how much of a reduction in the health burden is estimated to occur if an intervention is
carried out) was known for only 4.4% of 702
population-based interventions. Similarly, cost-effectiveness data are reported for a low proportion of public health interventions.
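A hedged notational sketch of the preventable fraction just described (the symbols and numbers are illustrative, not from the article): if I0 is the health burden expected without the intervention and I1 the burden expected if the intervention is carried out, then

preventable fraction = (I0 − I1) / I0

so, for example, I0 = 200 cases and I1 = 150 cases would give a preventable fraction of 50/200 = 0.25, meaning an estimated 25% of the burden could be averted.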
Several concepts are fundamental to achieving a more evidence-based approach to public
health practice. First, we need scientific information on the programs and policies that are
most likely to be effective in promoting health
(i.e., undertake evaluation research to generate sound evidence) (14, 19, 45, 77). An array
of effective interventions is now available from
numerous sources including the Guide to Community Preventive Services (16, 171), the Guide
to Clinical Preventive Services (2), Cancer Control PLANET (29), and the National Registry
of Evidence-Based Programs and Practices (142).
Second, to translate science to practice, we need
to marry information on evidence-based interventions from the peer-reviewed literature with
the realities of a specific real-world environment (19, 69, 96). To do so, we need to better define processes that lead to evidence-based
decision making. Finally, wide-scale dissemination of interventions of proven effectiveness
must occur more consistently at state and local
levels (91). This article focuses particularly on
state and local public health departments because of their responsibilities to assess public
health problems, develop appropriate programs
or policies, and assure that programs and policies are effectively implemented in states and
local communities (81, 82).
We review EBPH in four major sections that
describe (a) relevant background issues, including concepts underlying EBPH and definitions
of evidence; (b) key analytic tools to enhance the
adoption of evidence-based decision making;
(c) challenges and opportunities for implementation in public health practice; and (d) future
issues.


EVOLUTION OF THE TENETS
OF EVIDENCE-BASED
PUBLIC HEALTH
Formal discourse on the nature and scope of
EBPH originated about a decade ago. Several
authors have attempted to define EBPH. In
1997, Jenicek defined EBPH as the “conscientious, explicit, and judicious use of current best
evidence in making decisions about the care
of communities and populations in the domain
of health protection, disease prevention, health
maintenance and improvement (health promotion)” (84). In 1999, scholars and practitioners in Australia (64) and the United States (23)
elaborated further on the concept of EBPH.
Glasziou and colleagues posed a series of questions to enhance uptake of EBPH (e.g., “Does
this intervention help alleviate this problem?”)
and identified 14 sources of high-quality evidence (64). Brownson and colleagues described
a six-stage process by which practitioners can
take a more evidence-based approach to decision making (19, 23). Kohatsu and colleagues
broadened earlier definitions of EBPH to include the perspectives of community members,
fostering a more population-centered approach
(96). In 2004, Rychetnik and colleagues summarized many key concepts in a glossary for EBPH
(141). There appears to be a consensus among
investigators and public health leaders that a
combination of scientific evidence and values,
resources, and context should enter into decision making (Figure 1) (19, 119, 141, 151, 152).
In summarizing these various attributes of EBPH, key characteristics include
• Making decisions using the best available peer-reviewed evidence (both quantitative and qualitative research),
• Using data and information systems systematically,
• Applying program-planning frameworks (that often have a foundation in behavioral science theory),
• Engaging the community in assessment and decision making,
• Conducting sound evaluation, and
• Disseminating what is learned to key stakeholders and decision makers.
Accomplishing these activities in EBPH is
likely to require a synthesis of scientific skills,
enhanced communication, common sense, and
political acumen.
Defining Evidence
At the most basic level, evidence involves “the
available body of facts or information indicating whether a belief or proposition is true or
valid” (85). The idea of evidence often derives
from legal settings in Western societies. In law,
evidence comes in the form of stories, witness accounts, police testimony, expert opinions, and forensic science (112). For a public health professional, evidence is some form
of data—including epidemiologic (quantitative)
data, results of program or policy evaluations,
and qualitative data—for uses in making judgments or decisions (Figure 2). Public health
evidence is usually the result of a complex cycle of observation, theory, and experiment (114,
138). However, the value of evidence is in the
eye of the beholder (e.g., usefulness of evidence
may vary by stakeholder type) (92). Medical evidence includes not only research but characteristics of the patient, a patient’s readiness to
undergo a therapy, and society’s values (122).
Policy makers seek out distributional consequences (i.e., who has to pay, how much, and
who benefits) (154), and in practice settings,
anecdotes sometimes trump empirical data (26).
Evidence is usually imperfect and, as noted by
Muir Gray, “[t]he absence of excellent evidence
does not make evidence-based decision making
Figure 1. Domains that influence evidence-based decision making [from Spring et al. (151, 152)]: best available research evidence; population characteristics, needs, values, and preferences; resources, including practitioner expertise; and the environment and organizational context, all feeding into decision-making.
Figure 2. Different forms of evidence, arranged along an objective-to-subjective continuum. Adapted from Chambers & Kerner (37). Toward the objective end: scientific literature in systematic reviews; scientific literature in one or more journal articles; public health surveillance data; program evaluations; qualitative data (community members, other stakeholders). Toward the subjective end: media/marketing data; word of mouth; personal experience.
Table 1. Comparison of the types of scientific evidence
Type One
• Typical data/relationship: Size and strength of preventable risk–disease relationship (measures of burden, etiologic research)
• Common setting: Clinic or controlled community setting
• Example: Smoking causes lung cancer
• Quantity: More
• Action: Something should be done
Type Two
• Typical data/relationship: Relative effectiveness of public health intervention
• Common setting: Socially intact groups or community wide
• Example: Price increases with a targeted media campaign reduce smoking rates
• Quantity: Less
• Action: This particular intervention should be implemented
Type Three
• Typical data/relationship: Information on the adaptation and translation of an effective intervention
• Common setting: Socially intact groups or community wide
• Example: Understanding the political challenges of price increases or targeting media messages to particular audience segments
• Quantity: Less
• Action: How an intervention should be implemented
impossible; what is required is the best evidence
available not the best evidence possible” (119).
Several authors have defined types of scientific evidence for public health practice
(Table 1) (19, 23, 141). Type 1 evidence defines the causes of diseases and the magnitude, severity, and preventability of risk factors and diseases. It suggests that “something
should be done” about a particular disease or
risk factor. Type 2 evidence describes the relative impact of specific interventions that do
or do not improve health, adding “specifically,
this should be done” (19). There are different
sources of Type 2 evidence (Table 2). These
categories build on work from Canada, the
United Kingdom, Australia, the Netherlands,
and the United States on how to recast the
strength of evidence, emphasizing the weight
of evidence and a wider range of considerations beyond efficacy. We define four categories
within a typology of scientific evidence for
decision making: evidence-based, efficacious,
promising, and emerging interventions. Adherence to a strict hierarchy of study designs may
reinforce an inverse evidence law by which interventions most likely to influence whole populations (e.g., policy change) are least valued
in an evidence matrix emphasizing randomized
designs (125, 127). Type 3 evidence (of which
we have the least) shows how and under which
contextual conditions interventions were implemented and how they were received, thus
informing “how something should be done”
(141). Studies to date have tended to overemphasize internal validity (e.g., well-controlled
efficacy trials) while giving sparse attention to
external validity (e.g., the translation of science to the various circumstances of practice)
(62, 71).
Understanding the context for evidence.
Type 3 evidence derives from the context of
an intervention (141). Although numerous authors have written about the role of context in
informing evidence-based practice (32, 60, 77,
90, 92, 93, 140, 141), there is little consensus
on its definition. When moving from clinical
interventions to population-level and policy interventions, context becomes more uncertain,
variable, and complex (49). One useful definition of context highlights information needed
to adapt and implement an evidence-based intervention in a particular setting or population
(141). The context for Type 3 evidence specifies five overlapping domains (Table 3). First,
characteristics of the target population for an
intervention are defined such as education level
and health history (104). Next, interpersonal
variables provide important context. For example, a person with a family history of cancer
might be more likely to undergo cancer screening. Third, organizational variables should be
considered. For example, whether an agency
is successful in carrying out an evidence-based
Table 2. Typology for classifying interventions by level of scientific evidence
Evidence-based
• How established: Peer review via systematic or narrative review
• Considerations for the level of scientific evidence: Based on study design and execution; external validity; potential side benefits or harms; costs and cost-effectiveness
• Data source examples: Community Guide; Cochrane reviews; narrative reviews based on published literature
Effective
• How established: Peer review
• Considerations for the level of scientific evidence: Based on study design and execution; external validity; potential side benefits or harms; costs and cost-effectiveness
• Data source examples: Articles in the scientific literature; research-tested intervention programs (123); technical reports with peer review
Promising
• How established: Written program evaluation without formal peer review
• Considerations for the level of scientific evidence: Summative evidence of effectiveness; formative evaluation data; theory-consistent, plausible, potentially high-reach, low-cost, replicable
• Data source examples: State or federal government reports (without peer review); conference presentations
Emerging
• How established: Ongoing work, practice-based summaries, or evaluation works in progress
• Considerations for the level of scientific evidence: Formative evaluation data; theory-consistent, plausible, potentially high-reaching, low-cost, replicable; face validity
• Data source examples: Evaluability assessments (a); pilot studies; NIH CRISP database; projects funded by health foundations
(a) A preevaluation activity that involves an assessment prior to commencing an evaluation to establish whether a program or policy can be evaluated and what might be the barriers to its evaluation (145).
program will be influenced by its capacity (e.g.,
a trained workforce, agency leadershi