
Attachment V

Reemployment Services and Eligibility Assessments (RESEA) Program Evaluation Report Recommendations to Support National Analyses

State evaluation findings will be useful for program operators and policymakers. For this reason, it is important that your RESEA program evaluation reports are well-documented and contain:

- The interventions being tested;
- The context of the evaluation; and
- Impact and analysis results with well-documented statistical details.

Complete, consistent, and high-quality program evaluation reports can enable various types of evidence syntheses and other secondary data analyses, including meta-analyses that empirically synthesize information from across multiple relevant evaluations. These types of syntheses can support RESEA evidence building by enabling a broader understanding of effective interventions, which will improve CLEAR's ability to rate RESEA interventions for states' use. Such analyses can also identify gaps in knowledge where states could stand up new studies and inform program operators and policymakers interested in continuous improvement of the RESEA program.

The Department's ability to facilitate or conduct successful secondary analyses, such as meta-analyses, to benefit states depends on states' program evaluation reports including consistent information about the key evaluation components described below. At a minimum, each state must include the following information in an evaluation report, as appropriate to the evaluation design. States are not limited to reporting on the information below and should work with their independent evaluator to ensure reports include appropriate information based on the evaluation design used.

STUDY CHARACTERISTICS AND CONTEXT

What elements of the program being studied must, if appropriate to the evaluation design, be included to conduct a meta-analysis?

Intervention and Comparison Conditions
- What intervention(s) (program, policy, practice, etc.) does the study evaluate?
- What specific services or activities did the intervention consist of? Was there any adjustment or adaptation implemented in the study?
- What services, if any, did the comparison group receive?

Setting
- Where did the study take place? What are the key characteristics of the setting (urban, suburban, or rural; state; etc.)?
- In what years did the study take place?

Study Sample
- Who participated in the study? How were they selected and recruited?
- What were the ages of participants?
- What were the criteria for participation in the program or the intervention?
- What are their socio-demographic characteristics?

STUDY DESIGN AND ANALYSIS

The following are the elements that a meta-analysis would need to consider about the program being studied, including the study's setting and sample of study participants.

Study Design
- What was the study's design (e.g., randomized experiment, quasi-experimental design, descriptive)?
- If the study includes an impact evaluation, how were the units (e.g., individuals, groups of individuals) assigned to the program, with a description of the control/comparison condition (e.g., random assignment, matched comparison)?

Measures
- Identify the measurement instrument, if any, and the data source (self-reports, administrative data) for the measures.
- Identify the timing of all measurements in the study, including any pre-tests.

Baseline Equivalence
- Provide information needed to assess baseline equivalence of program and comparison groups.

Evaluators should provide information needed to assess baseline equivalence of program and comparison groups on demographics and on key characteristics that may predict the outcome(s) of interest. For outcomes such as earnings, where pre-intervention measures are available and relevant, equivalence should be shown on those measures. For analyses of employment, evaluators should show equivalence on available measures of employment history and earnings. For analyses of unemployment compensation (UC) duration, equivalence on UC profiling scores is important because that is a measure of expected risk of benefit exhaustion (maximum UC duration). Equivalence on other measures related to pre-claim employment history and prior UC claims may be important as well.
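As an illustration only, and not a required format or method, the sketch below shows one common way an independent evaluator might tabulate baseline equivalence: for each baseline measure, report the treatment and comparison group means together with a standardized difference (the difference in means divided by a pooled standard deviation). All variable names and values in this Python sketch are hypothetical.

    import math

    def pooled_sd(sd1, n1, sd2, n2):
        """Pooled standard deviation of two groups."""
        return math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))

    # Hypothetical baseline summary statistics: (treatment mean, treatment SD,
    # comparison mean, comparison SD), with group sizes given separately.
    treat_n, comp_n = 2100, 2075
    baseline_measures = {
        "Age (years)":             (41.2, 11.8, 40.9, 12.1),
        "Prior-year earnings ($)": (31250.0, 18400.0, 30980.0, 18650.0),
        "UC profiling score":      (0.54, 0.21, 0.55, 0.22),
        "Prior UC claims (count)": (1.3, 1.1, 1.2, 1.0),
    }

    print(f"{'Baseline measure':<28}{'Treat mean':>12}{'Comp mean':>12}{'Std diff':>10}")
    for name, (t_mean, t_sd, c_mean, c_sd) in baseline_measures.items():
        std_diff = (t_mean - c_mean) / pooled_sd(t_sd, treat_n, c_sd, comp_n)
        print(f"{name:<28}{t_mean:>12.2f}{c_mean:>12.2f}{std_diff:>10.3f}")

Reporting the underlying group means, standard deviations, and sample sizes alongside any such summary allows reviewers to recompute standardized differences and judge equivalence consistently across studies.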
Methods of Data Analysis
- Describe the analytical models or methods used to estimate impacts.
- Specify the variables, if any, that were included as controls in the analysis.
- Specify the unit of analysis (e.g., cluster, individual) and, if applicable, how clustering was addressed.

Missing Data
- How did the analysis account for missing data, if any?
- Specify the type of data (baseline, outcome, or both) for which missing data methods were used.

IMPACT ANALYSIS RESULTS

As appropriate to the evaluation design, the evaluation must report the following for each outcome measure and each subgroup, as available:

- Sample size for the treatment group.
- Sample size for the control/comparison group.
- Unadjusted treatment group mean outcome.
- Unadjusted control/comparison group mean outcome.
- Unadjusted treatment group standard deviation.
- Unadjusted control/comparison group standard deviation.
- Impact estimate (with information on how it was computed, if other than a raw difference in means) and associated p-value.
- Standardized difference.

If any of the information on unadjusted sample sizes, group means, or standard deviations is missing, the following should be documented from a study's report:

- Coefficient from the impact estimation model.
- Standard error of the impact (and, if the standard error is unavailable, the specific p-value associated with the impact estimate).

An illustrative sketch showing how these quantities support the computation of a standardized difference appears at the end of this attachment.

For additional information on communicating and reporting study findings, please see the Reemployment Services and Eligibility Assessment (RESEA) Toolkit on the WorkforceGPS site, along with other evaluation technical assistance resources.
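To illustrate why the quantities listed under Impact Analysis Results support secondary analyses such as meta-analysis, the following Python sketch shows how a standardized difference can be computed from unadjusted group means, standard deviations, and sample sizes, and how it can be roughly approximated from a reported model coefficient together with its standard error or p-value when unadjusted statistics are unavailable. This is a simplified sketch under common assumptions (a two-sided test, a normal approximation, and an impact model close to a simple difference in means); all numeric values are hypothetical, and nothing here replaces the documentation requirements above.

    import math
    from statistics import NormalDist

    def standardized_difference(treat_mean, treat_sd, treat_n,
                                comp_mean, comp_sd, comp_n):
        """Unadjusted difference in group means divided by the pooled standard deviation."""
        pooled_sd = math.sqrt(((treat_n - 1) * treat_sd**2 + (comp_n - 1) * comp_sd**2)
                              / (treat_n + comp_n - 2))
        return (treat_mean - comp_mean) / pooled_sd

    def approximate_effect_size(coefficient, treat_n, comp_n,
                                standard_error=None, p_value=None):
        """Rough fallback when unadjusted means/SDs are unreported: convert the impact
        coefficient's t-ratio to a standardized difference. If the standard error is
        also missing, recover it from the two-sided p-value (normal approximation)."""
        if standard_error is None:
            z = NormalDist().inv_cdf(1 - p_value / 2)
            standard_error = abs(coefficient) / z
        t_ratio = coefficient / standard_error
        return t_ratio * math.sqrt(1 / treat_n + 1 / comp_n)

    # Hypothetical reported values for one outcome (quarterly earnings).
    d = standardized_difference(treat_mean=5420.0, treat_sd=4100.0, treat_n=1980,
                                comp_mean=5210.0, comp_sd=4050.0, comp_n=1955)
    print(f"Standardized difference from unadjusted statistics: {d:.3f}")

    # Fallback: only the model coefficient and its p-value were reported.
    d_approx = approximate_effect_size(coefficient=205.0, treat_n=1980, comp_n=1955,
                                       p_value=0.04)
    print(f"Approximate standardized difference from coefficient and p-value: {d_approx:.3f}")

When the impact model includes covariate adjustment or clustered assignment, these approximations become rougher, which is one reason reporting the unadjusted sample sizes, means, and standard deviations directly is preferred.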