Description
My subject for this paper and outline is reducing suicide in Baltimore County. Everything you’d need for the outline and the paper is posted below. Thank you for choosing this paper.
Types of Evaluation
Once you’ve determined which program activities in your logic model should be evaluated, you can begin to identify the
types of evaluation you can conduct.
What are the most common types of evaluation?
There are several types of evaluations that can be conducted. Some of them include the following:
• Formative evaluation ensures that a program or program activity is feasible, appropriate, and acceptable before it is fully implemented. It is usually conducted when a new program or activity is being developed or when an existing one is being adapted or modified.
• Process/implementation evaluation determines whether program activities have been implemented as intended.
• Outcome/effectiveness evaluation measures program effects in the target population by assessing the progress in the outcomes or outcome objectives that the program is to achieve.
• Impact evaluation assesses program effectiveness in achieving its ultimate goals.
Process Evaluation determines whether program activities have been implemented as intended and resulted in certain
outputs. You may conduct process evaluation periodically throughout the life of your program and start by reviewing the
activities and output components of the logic model (i.e., the left side).
Results of a process evaluation will strengthen your ability to report on your program and use information to improve
future activities. It allows you to track program information related to Who, What, When and Where questions:
• To whom did you direct program efforts?
• Where did your program activities take place?
• What has your program done?
• What are the barriers/facilitators to implementation of program activities?
• When did your program activities take place?
Outcome Evaluation measures program effects in the target population by assessing the progress in the outcomes that
the program is to address. To design an outcome evaluation, begin with a review of the outcome components of your logic
model (i.e., the right side).
Some questions you may address with an outcome evaluation include:
• Were medical providers who received intensive STD training more likely to effectively counsel, screen, and treat patients than those who did not?
• Did the implementation of STD counseling in community-based organizations result in changes in knowledge, attitudes, and skills among the members of the target population?
• Did the program have any unintended (beneficial or adverse) effects on the target population(s)?
• Do the benefits of the STD activity justify a continued allocation of resources?
For more information and examples, see Step 3.1 in the Practical Use
of Program Evaluation among STD Programs manual.
http://www.cdc.gov/std/program/pupestd.htm
National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention
Division of STD Prevention
CS249668
Types and Uses of Evaluation
To plan an evaluation using the most appropriate method, it is necessary to understand the differences between evaluation types. There are a variety of evaluation designs, and the type of evaluation should match the program's or program activity's level of development. The program's stage and scope will determine the level of effort and the methods to be used.
Evaluation Types

Formative Evaluation (Evaluability Assessment, Needs Assessment)
When to use:
• During the development of a new program.
• When an existing program is being modified or is being used in a new setting or with a new population.
What it shows:
• Whether the proposed program elements are likely to be needed, understood, and accepted by the population you want to reach.
• The extent to which an evaluation is possible, based on the goals and objectives.
Why it is useful:
• It allows for modifications to be made to the plan before full implementation begins.
• Maximizes the likelihood that the program will succeed.

Process Evaluation (Program Monitoring)
When to use:
• As soon as program implementation begins.
• During operation of an existing program.
What it shows:
• How well the program is working.
• The extent to which the program is being implemented as designed.
• Whether the program is accessible and acceptable to its target population.
Why it is useful:
• Provides an early warning for any problems that may occur.
• Allows programs to monitor how well their program plans and activities are working.

Outcome Evaluation (Objectives-Based Evaluation)
When to use:
• After the program has made contact with at least one person or group in the target population.
What it shows:
• The degree to which the program is having an effect on the target population's behaviors.
Why it is useful:
• Tells whether the program is being effective in meeting its objectives.

Economic Evaluation (Cost Analysis, Cost-Effectiveness Evaluation, Cost-Benefit Analysis, Cost-Utility Analysis)
When to use:
• At the beginning of a program.
• During the operation of an existing program.
What it shows:
• What resources are being used in a program and their costs (direct and indirect) compared to outcomes.
Why it is useful:
• Provides program managers and funders a way to assess cost relative to effects. "How much bang for your buck."

Impact Evaluation
When to use:
• During the operation of an existing program at appropriate intervals.
• At the end of a program.
What it shows:
• The degree to which the program meets its ultimate goal on an overall rate of STD transmission (how much has program X decreased the morbidity of an STD beyond the study population).
Why it is useful:
• Provides evidence for use in policy and funding decisions.
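The "bang for your buck" idea in the economic evaluation row comes down to a simple ratio of total program cost to outcomes produced. The sketch below illustrates that calculation; all dollar amounts and outcome counts are hypothetical figures invented for demonstration, not values from any real program.

```python
# Minimal cost-effectiveness sketch with hypothetical figures:
# total program cost (direct + indirect) divided by the outcomes
# produced gives the cost per unit of outcome.

def cost_effectiveness(total_cost, outcomes):
    """Return cost per unit of outcome (e.g., dollars per added screening)."""
    if outcomes <= 0:
        raise ValueError("outcomes must be positive")
    return total_cost / outcomes

direct_costs = 42_000      # hypothetical: staff, materials
indirect_costs = 8_000     # hypothetical: overhead, facilities
added_screenings = 200     # hypothetical outcome count

ratio = cost_effectiveness(direct_costs + indirect_costs, added_screenings)
print(f"${ratio:.2f} per additional screening")  # $250.00 per additional screening
```

Funders can then compare this ratio across competing programs that target the same outcome.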
It is important to note the usefulness of conducting process evaluation while you are implementing outcome
evaluation. If the outcome evaluation shows that the program did not produce the expected results, it may be due
to program implementation issues. Therefore, it is recommended that if you conduct outcome evaluation, you also
implement process evaluation.
TIP: Learn more about types of evaluations in the Program Operations Guidelines for STD Prevention
manual on program evaluation. http://www.cdc.gov/std/program/ProgEvaluation.pdf
Major Sections of the Implementation and Evaluation Paper
• Section 1: Introduction
  o 1-3 paragraph summary of issue (included in theory paper)
    § National and community-level statistics
    § Brief description of problem
  o 1-3 paragraph summary of program (included in theory paper)
• Section 2: Charts – Three-column implementation charts—as many as it takes. Use the chart from the Theory paper and incorporate any suggested changes.
  o One goal per chart
  o One objective per row
  o Evaluation column should also be completed
• Section 3: Summary of evaluation techniques (2-4 pages): describe techniques and what you hope to achieve. You must demonstrate understanding of different evaluation and assessment strategies.
• Section 4: 3-paragraph summary of the entire paper.
• References
Example Work Plan Using Hypothetical Mammography Program
Task – By when/by whom

PLANNING Stage
• Write job description for health educator – Project Coordinator by 10/1/10
• List churches to be involved with training – City Church Organization and Project Coordinator by 10/15/10
• Outline what is to occur at each Sunday Brunch – Project Coordinator on 10/10/10
• Develop Assessment/Evaluation Tools – Project Coordinator and TU on 10/15/10 and 1/1/11
• Grant written – 10/1/11 through 12/1/11 by Project Coordinator; 10/1/11 through 11/1/11 by Grant Team
• Grant submitted – Grant Team designee by 3 pm on 11/5/11
IMPLEMENTATION Stage
• Grant secured – 12/15/11
• Establish advisory board meetings – 1/15/12, Project Coordinator
• Organize advisory board list
• Advisory Board meets with Grant Team – 2012-2014: March, June, Nov, Feb, June, Sept – Grant Team
• Advertise position for health educator – 12/22/12 by Project Coordinator
• Hire health educator – 1/31/12 by Project Coordinator
• Data collection from health sites – 3/1/12, 5/1/12, and 11/12/12 by Health Educators and Project Coordinator
• Develop material to be circulated at churches – Feb-March 2012, Health Educator and Advisory Board
• Materials approved by advisory board – March 2012, Health Educator
• Printed
• Set up speakers for Sunday brunches – March 2012-March 2014 (scheduled weekly), Health Educator and TU intern
• Purchase supplies/rewards for participants – March 2012, TU intern and Health Educator
• Printing of material for churches – March 2012, Project Coordinator
• Distribution of material to churches – March 2012, June 2012, Sept 2012, Dec 2012, March 2013; Baltimore City Church Coalition and Health Educator
• Grant visit – To be determined with granting agency and Project Coordinator

EVALUATION Stage
• Formative evaluations (after each brunch) – March 2012-March 2013 by Health Educator, Project Coordinator, and TU intern
• Summative evaluation (data from health centers and pre/post test) – Project Coordinator and Health Educator in March 2013, March 2014, May 2014, and Nov 2014
• Data analysis – Statistician and Grant Team, Dec 2014
• Reports for Granting Agency – Monthly by Project Coordinator
• Final Report – Grant Team, 1/12/15
Evaluation Narrative Example
The goal of the grant is to increase the effectiveness of collaborative community-based interventions, implemented at the grassroots level, in reducing health disparities among racial and ethnic minority populations (African American and Latino youths), and to demonstrate the effectiveness of the collaborative partnership approach when it comes to HIV/AIDS.
Formative and summative evaluation will occur throughout the program for each presentation and activity. Every participant will complete various assessments throughout the program. To gather input from the population, the resulting data will be scored and examined. Data will be collected from peer educators, health educators, and advisory board members. Formative data on participation frequency will also be collected by tracking the number of participants in attendance at the information sessions. These data will provide the health educator with information on the progress of the program, what changes have occurred, and whether any changes still need to be made.
The comparison of pre- and post-test data is used in the summative evaluation to measure attitudes of minority youth regarding HIV, including safe sex practices, the importance of HIV testing, and HIV testing center locations. Health educators and other staff members will collect data on the number of participants in the program and determine whether test scores have increased.
The pre/post-test data will be analyzed to determine whether attitudes of minority youth regarding HIV knowledge, safe sex practices, the importance of HIV testing, and the location of HIV testing centers changed during and after the program. Analyzing data on the number of HIV tests performed in the participating communities will help determine whether there was a significant change before, during, and after the program.
The data collected will then be used to demonstrate the effectiveness of the program in decreasing the number of new HIV cases among minority youth ages 13-19. The information gathered will be shared with the grant agency, and this will help other organizations develop HIV-related programs in their own communities.
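A summative pre/post comparison like the one described in this narrative is often summarized with a paired t-statistic on each participant's change score. The sketch below is a minimal illustration; the `paired_t` helper and the sample scores are hypothetical values invented for demonstration, not data from the program.

```python
import math
import statistics

def paired_t(pre, post):
    """Paired t-statistic for pre/post scores from the same participants.

    Returns (mean_change, t). A positive mean_change means scores rose
    after the program; compare t against a t-table with n-1 df.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd = statistics.stdev(diffs)          # sample standard deviation of changes
    t = mean_d / (sd / math.sqrt(n))
    return mean_d, t

# Hypothetical knowledge scores for four participants
pre = [50, 55, 60, 45]
post = [70, 65, 80, 60]
mean_change, t = paired_t(pre, post)
print(f"mean change = {mean_change}, t = {t:.2f}")
```

In practice a statistics package (e.g., SciPy's `ttest_rel`) would also report the p-value, but the change-score logic is the same.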
Directions & Rubric for Implementation & Evaluation Paper
1. Readings:
a. CHEP book: Chapter 8 Implementation Processes and pages 229-231 in Chapter 9
b. CTB: https://ctb.ku.edu/en/table-of-contents/evaluate/evaluation/evaluation-plan/main
2. Sections (write sections a and b in full sentences/paragraphs):
a. Program description:
i. Describe your entire program, including recruitment strategies, how many people you plan to serve, how many
events/sessions are part of your program, modality (e.g., in-person, virtual), how long events/sessions will last,
how often they will be held, and over how many weeks/months they will occur
ii. Explain what will happen during your events/sessions and strategies for motivating and retaining participants
(i.e., keeping them in the program) and targeting changes in knowledge, attitudes, behaviors, and/or skills. You
do not need to describe every session.
iii. Make sure your program is evidence- and theory-based. Re-read your papers on previous programs (or find new
resources if needed) and your theory/model. Your program should have strategies that address the main
components of your theory/model and strategies that have worked in other health promotion programs.
b. Description of evaluation plan: include formative, process, and impact/outcome evaluation strategies.
i. For each of the three types of evaluation: name the evaluation type, describe the main purpose of that type of
evaluation, and then describe your strategies. Use your own words to describe the purpose of each type of
evaluation; do not copy from the slides or your readings.
ii. Include at least two formative evaluation strategies and at least two outcome/impact strategies; I expect you will
have more than two process evaluation strategies.
c. Planning charts: include revised goals, objectives, and evaluation strategies
d. Timeline (i.e., Gantt chart)
i. Look at the example in the slides. Work tasks/activities should be listed in chronological order on the left side.
Months of the year are the column headings across the top. Fill cells with a color to indicate the months when
that work task will take place. Work tasks/activities could be one month or many months. You can have more
than one task/activity going on during certain months. Do not leave any month blank. You have 24 months to
create, implement, and evaluate your program, so leave time at the beginning and end to complete critical
tasks/activities (e.g., hiring and recruiting participants in the beginning and analyzing data and writing a final
report at the end).
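The Gantt layout described above (tasks down the left in chronological order, months across the top, filled cells marking active months) can be sketched as a quick text mock-up before building the real chart. The task names and month spans below are hypothetical placeholders, not a prescribed plan:

```python
# Text sketch of a Gantt layout: tasks down the left, 24 months across
# the top, a filled cell (#) for each month the task is active.
# Task names and month spans are hypothetical placeholders.

TOTAL_MONTHS = 24

def gantt_row(task, start, end, width=28):
    """One chart row: task name, then one cell per month (# = active)."""
    cells = "".join("#" if start <= m <= end else "." for m in range(1, TOTAL_MONTHS + 1))
    return task.ljust(width) + cells

plan = [
    ("Hire staff",            1,  2),
    ("Recruit participants",  2,  5),
    ("Run sessions",          5, 20),
    ("Analyze data",         20, 23),
    ("Write final report",   23, 24),
]

header = " " * 28 + "".join(str(m % 10) for m in range(1, TOTAL_MONTHS + 1))
print(header)
for task, start, end in plan:
    print(gantt_row(task, start, end))
```

Overlapping spans (more than one task active in a month) and full coverage of all 24 months are easy to verify at a glance in this form.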
Content Criteria

Narrative
• Program description: clearly explains what the program will include, the nature of the events/sessions, and theory- and evidence-based strategies (/10)
• Evaluation plan: identifies each type of evaluation (3), describes why each type is used, and has relevant strategies that will be used to evaluate the program (/10)

Planning chart (/10)
• A table for each goal (2 total) with SMART objectives and relevant evaluation strategies
• The content aligns with the other parts of the assignment

Timeline/Gantt chart (/10)
• Timeline clearly demonstrates timing of work that will be achieved within the grant period
• Tasks/activities are listed in chronological order
• The timeline aligns with the other parts of the assignment

Subtotal: /40

Format Criteria
• Spelling and grammar (/4)
• APA format is used correctly throughout the paper, including cover page, references, etc. (/3)
• Style: Paper is well organized with a beginning, middle, and end. Paragraphs follow a logical sequence introduced in the first paragraph. (/3)

Subtotal: /10

Total Points: /50