The National Evaluation of the Trade Adjustment Assistance Community College Career Training Grants

Sponsoring Agency: U.S. Department of Labor

Project Officer: Erika Liliedahl

Performance Period: October 2012 – September 2016

Project Summary:
In 2013, the U.S. Department of Labor’s (DOL) Chief Evaluation Office and Employment and Training Administration contracted with the Urban Institute (UI) and its partners – NORC at the University of Chicago, Capital Research Corporation (CRC), George Washington University (GWU), and Jobs for the Future (JFF) – to conduct the national evaluation of the Trade Adjustment Assistance Community College Career Training (TAACCCT) grants program. The TAACCCT grants program, one of President Obama’s community college initiatives, was authorized under an amendment to the Trade Act of 1974 and funded at $2 billion over four years in the Health Care and Education Reconciliation Act of 2010. The grants provide funding to community colleges and other higher education institutions to expand and improve their ability to deliver education and career training programs that can be completed in two years or less. These projects build the colleges’ capacity to serve workers who are eligible under the TAA for Workers program, as well as other workers in need of training, and prepare them for high-wage, high-skill occupations.

A total of 128 grants in two rounds have been awarded since September 2011, and a third round of grants will be awarded in the fall of 2013. The period of performance for the national evaluation is 48 months, ending in September 2016. The key research questions to be addressed by the national evaluation are:

  • What service delivery and/or system reform innovations resulted in improved employment outcomes and increased skills for participants? Under what conditions can these innovations most effectively be replicated?
  • What types of emerging ideas for service delivery change and/or system reform seem the most promising for further research? Under what conditions are these ideas most effective?
  • What directions for future research on the country’s public workforce system, and on workforce development in general, emerge from the evaluation?

Each of these questions will be answered using a mix of quantitative and qualitative research methods, drawing on documents and data generated by the grantees (individual-level student data, proposals, grant reports, and evaluation reports) as well as on qualitative implementation data collected through structured fieldwork and surveys. The team will use four evaluative methods to address the research questions for the national TAACCCT evaluation: a formal implementation analysis, a performance assessment, an evaluability assessment, and an outcome/impact analysis.

To document and assess the implementation of the grants, the team will conduct a formal implementation analysis of the service delivery approaches and system reforms (the “inputs” described in the conceptual framework) developed through the grants. The implementation analysis will also highlight how the grants addressed the needs of key stakeholders and the contextual factors that may have affected implementation. The team will use structured fieldwork at selected grantee sites and a web-based survey of all participating colleges, supported by various grantee documents, for this analysis. The evaluation team will also assess the overall performance of the grants, describing the grant “outputs” from the conceptual framework using grantee quarterly and annual reports and the third-party evaluation reports produced throughout the grant period.

The evaluation team will also examine the “evaluability” of the grants to: 1) determine opportunities for cross-site analysis of training impacts; and 2) make recommendations for more rigorous evaluation designs for similar grant initiatives. The evaluability assessment will involve calls with grantees and third-party evaluators to clarify the team’s understanding of the grant activities and evaluation designs. The structured fieldwork will also be used to examine whether grant activities are similar enough to be pooled and to assess how more rigorous evaluation designs, especially experimental designs, could be implemented. The evaluation team will then conduct a descriptive outcome analysis and a non-experimental cross-site impact analysis of selected grantee interventions to estimate their effects on participants’ short-run education outcomes and longer-run employment outcomes.

The team will prepare a detailed interim report and a final report, both written to communicate the evaluation’s findings to a broad audience and to inform future research and demonstrations for DOL investment. The interim report will be based on a summary of grantee information, analysis of the Round 1 survey results, and a summary of the third-party evaluation reports submitted to date. The team will then finalize its analyses and prepare a final report describing the full results of the implementation analysis (including the site visits and the survey of colleges across all rounds), the performance assessment, and the non-experimental cross-site impact analysis.