Sectoral Training: Does it Work?

The idea of “sectoral training” is that everyone needs skills for a well-paid career, but not everyone needs or wants to acquire those skills by getting a four-year undergraduate degree.

Might training focused on becoming employable in a specific high-demand sector of the economy, preferably with an employer standing by and ready to hire, work better for some young adults? The US Department of Education runs the What Works Clearinghouse (WWC), which collects studies of various programs and writes up overviews of the results. In November, the WWC published evaluations of two sectoral training programs, Project Quest and Year Up. Neither set of findings is very encouraging, but there is some controversy over whether the WWC is focusing on the proper outcomes.

For background, Project Quest started in San Antonio, Texas, in 1992, and has since spread to other locations in Texas and Arizona. The program accepts applicants who are at least 18 and have a high school diploma (or equivalent). The US Department of Education looks at three studies of Project Quest and describes the intervention this way:

All three interventions target their efforts on recruiting individuals who are unemployed, underemployed, meet federal poverty guidelines, or are on public assistance. … Project QUEST is a community-based organization that partners with colleges, professional training institutes, and employers. Participants enroll full-time in an occupational training program. They attend weekly group meetings led by a counselor that focus on life skills, time management, study skills, test-taking techniques, critical thinking, conflict resolution, and workforce readiness skills. Participants who need to improve their basic reading and math skills can complete basic skills coursework prior to enrolling in the occupational program. … Participants typically complete their occupational program within one to three years, depending on the length of the program.

What do the results of the three studies show, according to the US Department of Education?

The evidence indicates that implementing Project QUEST:
• is likely to increase industry-recognized credential, certificate, or license completion
• may increase credit accumulation
• may result in little to no change in short-term employment, short-term earnings, medium-term employment, medium-term earnings, and long-term earnings
• may decrease postsecondary degree attainment

Obviously, this is not especially encouraging. What about the other program, Year Up? The US Department of Education describes the program this way:

Year Up is an occupational and technical education intervention that targets high school graduates to provide them with six months of training in the information technology and financial service sectors followed by a six-month internship and supports to ensure that participants have strong connections to employment. … The evidence indicates that implementing Year Up:
• is likely to increase short-term earnings
• may result in little to no change in short-term employment
• may result in little to no change in medium-term earnings
• may result in little to no change in industry-recognized credential, certificate, or license completion
• may result in little to no change in medium-term employment

Of course, reports like these don’t prove that other kinds of sectoral training programs can’t work. But they do suggest that some of the more prominent examples of sectoral training aren’t performing as well as hoped. However, Harry J. Holzer believes that the official reports are too gloomy. He has written “Do sectoral training programs work? What the evidence on Project Quest and Year Up really shows” (Brookings Institution, January 12, 2022). As background, Holzer is an advocate of sectoral training programs (for example, see his “After COVID-19: Building a More Coherent and Effective Workforce Development System in the United States,” Hamilton Project, February 2021). In this essay, he writes:

I argue that the best available evidence still suggests that Project Quest and Year Up, along with other sector-based programs, remain among our most successful education and training efforts for disadvantaged US workers. While major challenges remain in scaling such programs and limiting their cost, the evidence to date of their effectiveness remains strong, and they should continue to be a major pillar of workforce policy going forward.

Holzer points to other reviews of sectoral training programs that reach much more positive conclusions. For example, Lawrence F. Katz, Jonathan Roth, Richard Hendra, and Kelsey Schaberg have written “Why Do Sectoral Employment Programs Work? Lessons from WorkAdvance” (National Bureau of Economic Research Working Paper, December 2020). They discuss four randomized controlled trial (RCT) evaluations covering eight sectoral training programs. They write:

We first reexamine the evidence on the impacts of sector-focused programs on earnings from four RCT-based major evaluations – the SEIS, WorkAdvance, Project Quest, and Year Up – of eight different programs/providers (with one provider Per Scholas appearing in two different evaluations). Programs are geared toward opportunity youth and young adults (Year Up) or broader groups of low-income (or disadvantaged) adults. Participants are disproportionately drawn from minority groups (Blacks and Hispanics), low-income households, and individuals without a college degree. The sector-focused programs evaluated in these four RCTs generate substantial earnings gains from 14 to 39 percent the year or so following training completion. And all three evaluations with available longer-term follow-ups (WorkAdvance for six years after random assignment, Project Quest for nine years, and Year Up for three years) show substantial persistence of the early earnings gains with little evidence of the fade out of treatment impacts found in many evaluations of past employment programs. Sector-focused programs appear to generate persistent earnings gains by moving participants into jobs with higher hourly wages rather than mainly by increasing employment rates.

Why the difference in findings? Holzer suggests several reasons:

1. The US Department of Education review process for publishing its evaluations is sluggish, and so it leaves out at least three recent positive studies of Project Quest and Year Up. These studies also aren’t included in the Katz et al. (2020) review. They are: Roder, Anne, and Mark Elliott. 2021. Eleven Year Gains: Project QUEST’s Investment Continues to Pay Dividends. New York: Economic Mobility Corporation; Rolston, Howard, et al. 2021. Valley Initiative for Development and Advancement: Three-Year Impact Report. OPRE Report No. 2021-96, US Department of Health and Human Services; and Fein, David, et al. 2021. Still Bridging the Opportunity Divide for Low-Income Youth: Year Up’s Longer-Term Impacts. OPRE Report.

2. The recent studies also have longer follow-up periods, and leaving out these studies means that the WWC summaries don’t include positive long-run effects.

3. With Project Quest, the WWC summary includes some spin-off programs that are similar to, but not the same as, the original program.

4. The WWC apparently applies a strict rule in its evaluations: either an effect is statistically significant at the 5% level, or it is treated as worthless. Thus, an effect that is significant at, say, the 6% or 7% level is not viewed as a finding that might be worth further investigation with a larger sample size, but as a purely negative result (see the sketch after this list). There are controversies in economics and statistics over how and when to use a 5% significance level, but both sides of that controversy agree that this kind of black-and-white application of a rigid standard is not sensible.

5. The WWC also has strict rules about what it counts as evidence. For example, say that the WWC wants to evaluate effects after 3, 5, and 7 years, but a study evaluates the evidence after 4, 6, and 8 years. Holzer says that the WWC would then ignore that study, because it does not “align with WWC’s preferred measures.”
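
To see why a rigid cutoff can mislead, here is a minimal sketch in Python. It is not drawn from the WWC reports or any of the studies above: the earnings figures, sample sizes, and the `report` helper are all hypothetical. It simulates the same true $2,000 earnings gain in a smaller and a larger sample, runs a simple two-sample t-test, and shows how a strict 5% rule can count one estimate as a finding while dismissing the other, even though the underlying effect is identical.

```python
# A minimal sketch (made-up numbers, no connection to the actual WWC or study
# data) of how a rigid p < 0.05 cutoff can treat the same underlying effect
# very differently, largely because of sample size.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def report(label, treat, control, alpha=0.05):
    """Estimate the earnings gain and show how a strict cutoff classifies it."""
    gain = treat.mean() - control.mean()
    _, p_value = stats.ttest_ind(treat, control, equal_var=False)
    verdict = "counts as a finding" if p_value < alpha else "dismissed under a strict 5% rule"
    print(f"{label}: estimated gain = ${gain:,.0f}, p = {p_value:.3f} -> {verdict}")

# Hypothetical annual earnings with the same true treatment effect ($2,000 on
# a $20,000 base) in both cases; only the sample size differs.
small_control = rng.normal(20_000, 8_000, 120)
small_treat   = rng.normal(22_000, 8_000, 120)
large_control = rng.normal(20_000, 8_000, 600)
large_treat   = rng.normal(22_000, 8_000, 600)

report("Smaller sample", small_treat, small_control)
report("Larger sample ", large_treat, large_control)
```

The point is not the particular test: it is that a near-threshold p-value from a modest sample is better read as “worth investigating with more data” than as evidence of no effect, which is the distinction Holzer argues the WWC’s rule erases.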

At a minimum, concerns like these suggest that sectoral training should not be dismissed based on the What Works Clearinghouse evaluations alone. These programs differ in their procedures and target populations, and there remains much to learn about best practices. But there is a strong case for continuing to expand and study such programs.
