
Writing an Evaluation Plan 2012.pdf - Office of Research and ...


Writing an Evaluation Plan

Guadalupe Corral, PhD

Office of Research & Sponsored Projects

Proposal Development Team


Why Evaluate?

• Funding sources require it… and you also need to know
• To improve your program/project
• To see the impact of your program


Where do you start?


Analyzing the Solicitation

• What is the solicitation asking for?
• Differs within and between agencies
• NSF: Describe how the research and education will be evaluated (internally and/or externally).


Analyzing the Solicitation

• NSF

All proposals must include an appropriate evaluation plan. A number of resources for developing evaluation plans are available at http://caise.insci.org/resources including the 2010 User-Friendly Handbook for Project Evaluation, Framework for Evaluating Impacts of Informal Science Education Projects (Framework), and the Impacts and Indicators Worksheet.

Evaluation design: Evaluation questions, design, data collection methods, analyses, and reporting/dissemination strategies must be detailed in the evaluation plan, including formative and summative evaluation goals and strategies that seek to answer the evaluation questions. The evaluation design must emphasize the coherence between the proposal goals and evidence of meeting such goals, and must be appropriate to the type, scope, and scale of the proposed project. Logic models or theories of action, as an example, can help describe the project inputs, outputs, outcomes and impacts. All project types must include a summative evaluation by an external evaluator. NOTE: details of the evaluation plan may be included as a Supplementary Document.


Analyzing the Solicitation

• NIH

Evaluation Plan (6 pages, total)

Provide a comprehensive Evaluation Plan to be used to monitor the conduct and track the progress of proposed TCC research, implementation and dissemination activities. Describe how the evaluation will be conducted, the principal measures and metrics to be used, and the potential sources of data. Also include a detailed self-evaluation plan to assess achievement of short- and long-term TCC goals. The Administrative Core is responsible for implementing the Evaluation Plan.

Since the major purpose of the evaluation is to provide information to assist with TCC planning and management, the plan should address both administrative and scientific function and accomplishments. The Evaluation Plan should address the following areas of particular importance: translational activities; scope and impact of research; innovation; collaboration and communication; integration and synergy; and funds management. Describe timelines, key milestones and expected outcomes for each area as appropriate.

While evaluation should be a continuous process, a formal evaluation by an outside, independent group selected by TCC leadership and approved by NIMHD staff should be conducted at least every two years. TCCs may also be called upon to gather data and participate in the development of a national TCC Program evaluation.


Types of Evaluations

• Internal
• Independent
• External


Types of Evaluations

• Formative Evaluation, two components:
  • Implementation (or process) evaluation
  • Progress evaluation
‣ Provides information to improve programs


Types of Evaluations

• Summative Evaluation
  • Did the program meet its goals and objectives?
  • What evidence can serve to show this?
  • Baseline information
  • Summative information


Developing the Evaluation Plan

• Program context
  • What is the problem or need for the program?
  • What are the goals and objectives of your program?
  • Who is involved?
  • What activities will take place?
  • How will you measure progress and impact?


Selecting Indicators for Assessment

• What do you expect to see if the program/project is correctly implemented and progresses toward its stated objectives?

Implementation:
• Recruitment
• Selection

Participation:
• Activities
• Assessments

Researchers:
• Program plan
• Meetings


Select Appropriate Indicators


Quantitative Data

• Provide for easy comparisons; can come from existing or created sources
  • Records
  • Surveys
  • Learning assessments
‣ Thought should be given to validity and reliability
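One common reliability check for multi-item surveys is internal consistency, often summarized with Cronbach's alpha. A minimal sketch (the item scores below are hypothetical, not from the slides):

```python
# Cronbach's alpha: internal-consistency reliability of a multi-item survey.
# alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(total scores))
# Hypothetical data: 5 respondents answering 3 Likert-scale items.

def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: one inner list of scores per survey item, same respondent order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    item_var = sum(variance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))

responses = [
    [4, 3, 5, 2, 4],  # item 1 scores for respondents A-E
    [4, 2, 5, 3, 4],  # item 2
    [5, 3, 4, 2, 5],  # item 3
]
print(round(cronbach_alpha(responses), 2))  # -> 0.89
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, though the threshold depends on the instrument's purpose.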


Qualitative Data

• Provide for descriptions about program activities, context, and participants’ behaviors
  • Document review
  • Observations
  • Focus groups
  • Interviews
  • Open-ended questions on surveys
‣ Have guidelines in place


Create a ‘Logic’ Model

• Illustrate the relationship among your program/project elements:
  • Inputs: Resources necessary for program implementation
  • Activities: Interventions that will be implemented to achieve outcomes
  • Outputs: Direct products obtained as a result of program activities


Create a ‘Logic’ Model

• Outcomes: The impacts, changes, or results of the program activities and outputs; link to your objectives and your goals

Short-term:
• knowledge
• skills
• attitudes
• motivation
• awareness

Intermediate-term:
• behaviors
• practices
• policies
• procedures

Long-term:
• environmental conditions
• social conditions
• economic conditions
• political conditions
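The inputs → activities → outputs → outcomes chain above can be sketched as a simple data structure. A minimal illustration with a hypothetical mentoring program (all entries are invented, not from the slides):

```python
# A logic model as a plain dictionary: each key is one element of the chain
# inputs -> activities -> outputs -> outcomes. All entries are hypothetical.

logic_model = {
    "inputs": ["program staff", "funding", "meeting space"],
    "activities": ["weekly mentoring sessions", "skills workshops"],
    "outputs": ["number of sessions held", "number of participants trained"],
    "outcomes": {
        "short_term": ["increased knowledge", "improved attitudes"],
        "intermediate": ["changed study practices"],
        "long_term": ["improved graduation rates"],
    },
}

def missing_elements(model):
    """Return any element of the chain that is absent or empty."""
    required = ["inputs", "activities", "outputs", "outcomes"]
    return [k for k in required if not model.get(k)]

print(missing_elements(logic_model))  # -> [] (every element is present)
```

Writing the model down this way makes the reviewers' question explicit: every outcome should trace back through an output and an activity to a named input.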


Create a ‘Logic’ Model (example diagrams)


Keep it Simple!

• Don’t assume
• Don’t get too complex
• Don’t get too fancy
• Don’t overdo it




Tips for writing your plan

• Know your audience
• Drop the jargon and be straightforward, clearly stating what you plan to do
• Try to incorporate both quantitative and qualitative data collection methods
• Be honest: note any challenges you may face and how you may overcome them


Writing your plan

• Two examples


Budgeting for the Evaluation

• May range from 5–10% (or more) of the grant amount

Evaluation Activities:
• Consultation and analysis
• Development of plan
• Literature review
• Coordination with stakeholders
• Data requests
• Periodic and final reports
• Etc.
• Development of measures
  • Surveys
  • Interview and focus group questions & logistics
  • Tests
  • Etc.
• Data collection
• Data analysis & results
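To make the 5–10% rule of thumb concrete, a quick calculation with a hypothetical $500,000 award (the award amount is illustrative; only the percentage range comes from the slide):

```python
# Evaluation budget as a percentage of the total grant amount.
# The 5-10% range is from the slide; the award amount is hypothetical.

def evaluation_budget(grant_amount, low=0.05, high=0.10):
    """Return the (low, high) dollar range for the evaluation budget."""
    return grant_amount * low, grant_amount * high

lo, hi = evaluation_budget(500_000)
print(f"${lo:,.0f} to ${hi:,.0f}")  # -> $25,000 to $50,000
```

That range would need to cover all of the activities listed above, from developing the plan through final reporting, which is why complex or externally evaluated projects tend toward the high end or beyond.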


Finding an Evaluator

• American Evaluation Association:
  http://www.eval.org/find_an_evaluator/evaluators_found.asp?where=TX
• Institute of Organizational and Program Evaluation Research:
  http://www.cgu.edu/pages/506.asp


Sources

• Bond, Sally L., Boyd, Sally E., and Rapp, Kathleen A. 1997. Taking Stock: A Practical Guide to Evaluating Your Own Programs. Horizon Research, Inc., Chapel Hill, NC.
• Frechtling, Joy (Westat). 2002. The 2002 User Friendly Handbook for Project Evaluation. National Science Foundation.
• McCawley, Paul F. 2001. The Logic Model for Program Planning and Evaluation. University of Idaho Extension.
• National Institute of Environmental Health Sciences & U.S. Department of Health and Human Services. 2012. Partnerships for Environmental Public Health Evaluation Metrics Manual. NIH Publication No. 12-7825.
• Shackman, Gene. 2009. What Is Program Evaluation? A Beginner’s Guide. The Global Social Change Research Project.


Questions?
