Evaluation Plan Guidance


EVALUATION PLAN GUIDANCE
SOCIAL INNOVATION FUND
Implementation Questions
Program implementation questions are clearly stated.
When feasible, the implementation questions should address the level of program-like services received by the comparison
group.
Implementation questions do not include impact questions.
Contribution of the Study
The contribution to understanding that the evaluation will make is clearly stated.
The level of evidence the program is targeting is described.
How the proposed evaluation meets the criteria for this level of evidence is included.
Impact Evaluation Design Selection
The SEP clearly identifies the study design selected.
The description of the design draws upon previous research or literature, where available.
The SEP presents a rationale for the design selected.
The SEP justifies the target level of evidence based on a discussion of internal and external study validity.
Randomized Between-Groups Design (if applicable)
Unit of random assignment is clearly identified (and aligned with the unit of analysis).
Procedures to conduct the random assignment are described, including who implemented the assignments, how the procedures were
implemented, how the assignments were generated by random numbers, and how the probability of assignment to each group was
verified.
Blocking, stratification, or matching procedures used—to improve precision in the estimate of the program effect or to balance
groups on measured characteristic(s)—are described.
The program group and, to the extent possible, the control group conditions are described.
Any concerns that proposed strategies or approaches will lead to nonequivalent groups are discussed.
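To illustrate the procedures the checklist above asks for, here is a minimal sketch of blocked random assignment driven by a seeded random-number generator. The function name, the 1:1 treatment/control split, and the use of a fixed seed are illustrative assumptions, not a prescribed SIF procedure; a real SEP would document whatever ratio, blocking variables, and verification steps the evaluator actually uses.

```python
import random

def assign_randomly(blocks, seed=0):
    """Randomly assign units to treatment or control within each block.

    `blocks` maps a block label (e.g. a site) to the list of unit IDs in
    that block. Assignment is balanced 1:1 within each block, and the
    fixed seed makes the assignment reproducible for later verification.
    (Illustrative sketch; names and split ratio are assumptions.)
    """
    rng = random.Random(seed)
    assignment = {}
    for block, units in blocks.items():
        shuffled = list(units)
        rng.shuffle(shuffled)          # random order within the block
        half = len(shuffled) // 2
        for uid in shuffled[:half]:
            assignment[uid] = "treatment"
        for uid in shuffled[half:]:
            assignment[uid] = "control"
    return assignment
```

Because the seed is recorded, a reviewer can rerun the procedure and confirm that each unit's probability of assignment was as stated, which is the kind of verification the checklist items above call for.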
Between-Groups Design: Formed by Matching (if applicable)
Unit of matching is clearly identified (and aligned with the unit of analysis).
Procedures to carry out the matching to form a comparison group are described.
A precedent in the literature for including the variables used in the matching is included.
Methods used to form the proposed comparison group are described such that the validity of the matching is explained.
Reasons why the comparison group might differ from the treatment group and threaten internal validity, and the ways in which
the proposed methods adjust for those differences, are discussed.
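As a minimal sketch of the matching procedures this section asks evaluators to describe, the function below performs greedy 1:1 nearest-neighbor matching on a single numeric score. This is an illustrative assumption, not the SIF's required method; real evaluations typically match on several covariates or a propensity score, and the variable choices should have the literature precedent the checklist requires.

```python
def match_comparison_group(treated, candidates):
    """Greedy 1:1 nearest-neighbor matching on one numeric score.

    `treated` and `candidates` are lists of (unit_id, score) pairs. Each
    treated unit is matched to the closest unmatched candidate, and each
    candidate is used at most once. (Illustrative sketch only.)
    """
    pool = dict(candidates)            # unmatched candidates: id -> score
    matches = {}
    for tid, tscore in treated:
        if not pool:
            break                      # no candidates left to match
        best = min(pool, key=lambda cid: abs(pool[cid] - tscore))
        matches[tid] = best
        del pool[best]                 # each candidate matched at most once
    return matches
```

Documenting the matching rule this concretely lets reviewers judge the validity of the resulting comparison group, as the checklist items above require.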
Between-Groups Design: Formed by Cut-off Score (RDD) (if applicable)
Measure and cutoff score are clearly identified (and aligned with the unit of analysis).
Cutoff score is clearly delineated and justified.
Methods used to apply the cutoff score are described in detail.
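The "methods used to apply the cutoff score" can be documented as simply as the sketch below, which assigns each unit strictly by its value on the assignment measure. The function name and the choice to treat units below the cutoff are illustrative assumptions; the direction and any handling of ties at the cutoff should be stated explicitly in the SEP.

```python
def apply_cutoff(scores, cutoff, treat_if="below"):
    """Assign units to treatment or comparison strictly by a cutoff score.

    `scores` maps a unit ID to its value on the assignment measure. With
    treat_if="below", units scoring under the cutoff receive the program;
    with treat_if="above", units at or above the cutoff do.
    (Illustrative sketch; direction of treatment is an assumption.)
    """
    groups = {}
    for uid, score in scores.items():
        eligible = score < cutoff if treat_if == "below" else score >= cutoff
        groups[uid] = "treatment" if eligible else "comparison"
    return groups
```

Strict, documented application of the cutoff is what gives a regression discontinuity design its internal validity, so any exceptions or overrides should also be recorded.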