pith. machine review for the scientific record.

arxiv: 2604.19178 · v1 · submitted 2026-04-21 · 💰 econ.GN · q-fin.EC


A rapid evaluation of Australia's COVID-era apprentice wage subsidy programs

Ethan Slaven, Patrick Rehill, Peter Bowers


Pith reviewed 2026-05-10 01:41 UTC · model grok-4.3

classification 💰 econ.GN q-fin.EC
keywords apprenticeship · wage subsidy · COVID-19 · policy evaluation · commencements · retention · cancellation rates · sharp practice

The pith

Australia's COVID wage subsidies increased apprenticeship starts by 70% but raised cancellation rates.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper evaluates the Boosting Apprenticeship Commencements and Completing Apprenticeship Commencements wage subsidy programs introduced in 2020. Econometric analysis of administrative data shows the programs produced a large jump in new apprenticeship commencements. Retention rates remained unchanged and cancellation rates rose modestly, pointing to lower expected completion rates than in prior cohorts. Interviews with employers and industry bodies reveal that some non-trade employers converted existing workers to apprentices solely to claim the subsidy, without committing to full training. The results matter because they show how emergency labour supports can expand volume rapidly while creating incentives that undermine long-term skill outcomes.

Core claim

The BAC and CAC programs produced a 70% increase in apprenticeship commencements. Retention rates did not rise, and cancellation rates increased slightly overall, with a 7% rise for non-trade apprenticeships and a 0.7% fall for trade apprenticeships. The non-trade increase is attributed to sharp practice in which employers shifted current staff onto apprenticeships to capture the subsidy payments with no plan to retain them as apprentices after the subsidy ended. Employers valued the front-loaded payment structure that delivered the largest support when apprentices were newest and least productive.

What carries the argument

Mixed-methods evaluation that pairs econometric models of administrative data on commencements and cancellations with qualitative interviews of employers and peak bodies to detect both aggregate effects and the mechanism of sharp practice.
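The paper's model code is not reproduced here; as a purely illustrative sketch (synthetic data, assumed variable names, not the authors' specification), a headline effect of this kind could in principle come from a pre/post regression on log quarterly commencements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic quarterly commencement counts: a flat pre-program baseline,
# then a jump of roughly 70% once a (hypothetical) BAC indicator turns on.
quarters = np.arange(20)
post_bac = (quarters >= 16).astype(float)   # program starts in quarter 16
log_commencements = np.log(8000) + 0.53 * post_bac + rng.normal(0.0, 0.03, 20)

# OLS of log commencements on the post-program indicator: the slope is the
# log-point change, and exp(slope) - 1 is the implied percentage increase.
X = np.column_stack([np.ones_like(post_bac), post_bac])
beta, *_ = np.linalg.lstsq(X, log_commencements, rcond=None)
pct_increase = np.exp(beta[1]) - 1.0
print(f"implied commencement increase: {pct_increase:.0%}")
```

exp(0.53) - 1 ≈ 0.70, so the recovered coefficient maps onto the paper's 70% figure only because the synthetic data were built that way; the real estimate depends on the controls and counterfactual the paper actually uses.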

If this is right

  • Front-loaded subsidy payments gave employers the most help precisely when apprentices were least productive.
  • Rapid rollout produced a rush that strained training providers and created opportunities for sharp practice.
  • Non-trade apprenticeships were more vulnerable to employer conversions than trade apprenticeships.
  • Subsidies can scale apprenticeship volumes quickly but need design features that discourage short-term gaming.
  • Completion rates may fall for the subsidized cohorts relative to earlier ones.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • Future crisis subsidies could require a minimum prior employment period before an existing worker qualifies for conversion to an apprentice.
  • Tying a portion of payments to training milestones or completion rather than commencement alone would better align employer incentives.
  • Large sudden increases in training demand without parallel capacity planning risk lowering average training quality across the system.
  • The trade versus non-trade difference suggests that uniform subsidy rules across sectors can produce uneven unintended effects.

Load-bearing premise

The econometric models can separate the effects of the BAC and CAC subsidies from all other COVID-era economic shocks and policy changes occurring at the same time.

What would settle it

Re-estimating the models with controls for concurrent pandemic policies and economic shocks; if the 70% commencement increase disappears under those controls, the causal claim is falsified.
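That settling test can be read as a coefficient-stability check. The sketch below uses synthetic data and hypothetical controls (`lockdown`, `activity`), not the paper's actual covariates:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 24

post = (np.arange(n) >= 18).astype(float)        # program-period indicator
lockdown = rng.binomial(1, 0.3, n).astype(float) # hypothetical lockdown control
activity = rng.normal(0.0, 1.0, n)               # hypothetical economic-conditions index

# Synthetic outcome: a genuine program effect plus concurrent-shock effects.
y = 0.53 * post - 0.10 * lockdown + 0.04 * activity + rng.normal(0.0, 0.03, n)

def post_coef(X):
    """OLS slope on the post indicator (column 1 of the design matrix)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

naive = post_coef(np.column_stack([np.ones(n), post]))
controlled = post_coef(np.column_stack([np.ones(n), post, lockdown, activity]))

# If the commencement jump were an artifact of concurrent shocks, the
# controlled estimate would collapse toward zero; here it does not.
print(f"no controls: {naive:.2f}, with controls: {controlled:.2f}")
```

The referee's point is precisely that the paper does not show this comparison: with real data the controlled coefficient could shrink, and the test only bites if the control set credibly spans the concurrent shocks.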

Figures

Figures reproduced from arXiv: 2604.19178 by Ethan Slaven, Patrick Rehill, Peter Bowers.

Figure 1. Key policy events and number of apprenticeship commencements by quarter (based on DEWR administrative data).
Figure 2. Comparison of commencements and forecasted commencements without BAC by quarter.
Figure 3. Effect of BAC on the number of commencements by trade and non-trade occupations.
Figure 4. BAC quarterly effects on apprenticeship commencements, with seasonal and economic controls.
Figure 5. BAC quarterly effects on apprenticeship commencements by state / territory, with seasonal and economic controls.
Figure 6. BAC's effect on trade and non-trade occupation commencement rates.
Figure 7. Cancellation rate by commencement cohort.
Figure 8. Observed and counterfactual outcomes across time for cancellations.
Figure 9. The effect of the programs on cancellation rate by jurisdiction and occupation.
original abstract

In the midst of the COVID-19 pandemic in 2020, the Australian Government launched two programs to incentivise new apprentices to start and complete apprenticeships -- the Boosting Apprenticeship Commencements (BAC) and Completing Apprenticeship Commencements (CAC) programs. These programs were wage subsidies to encourage employers to take on or retain apprentices. This paper evaluates the impact of these programs on apprenticeship commencements and completions, taking a mixed-methods approach combining econometric modelling and interviews with stakeholders including employers and peak bodies. The programs led to a 70% increase in commencement of apprenticeships but do not seem to have boosted retention rates. There appears to be a small increase in cancellation rates, suggesting lower eventual completion rates compared to previous cohorts. Cancellation rates were higher for non-trade commencements (7% increase) during BAC, but slightly lower for trade commencements (0.7% decrease). We find this effect in non-trade apprenticeships was likely driven by 'sharp practice', where some employers took advantage of the BAC by converting existing employees over to apprenticeships to attract the wage subsidy with no intention of having these employees stay as apprentices beyond the period of the BAC's generous subsidy. While the BAC / CAC were successful in many of their goals, there are several lessons that can be learnt from their design. In particular, the need to implement the program quickly meant early design choices inadvertently encouraged 'sharp practice' and a rush for places that placed strain on the training sector. However, employers appreciated the front-loading of payments, which provided the most financial support when apprentices were new and at their least productive.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 3 minor

Summary. The manuscript evaluates the Boosting Apprenticeship Commencements (BAC) and Completing Apprenticeship Commencements (CAC) wage subsidy programs launched by the Australian Government in 2020. Employing a mixed-methods design that combines econometric modelling of administrative data with interviews of employers and peak bodies, the paper claims the programs produced a 70% increase in apprenticeship commencements. It reports no improvement in retention rates and diverging cancellation effects (a 7% rise for non-trade apprentices against a 0.7% decline for trade), attributing the non-trade rise to 'sharp practice' whereby some employers converted existing workers to apprentices solely to capture the subsidy. The analysis concludes with design lessons, praising front-loaded payments while criticising a rushed rollout that strained training providers and encouraged unintended behaviour.

Significance. If the causal estimates hold after proper identification, the paper supplies timely evidence on wage-subsidy effectiveness during crises and documents a concrete mechanism of employer gaming that is policy-relevant for future interventions. The integration of quantitative trends with stakeholder interviews adds qualitative depth uncommon in rapid evaluations. Strengths include direct engagement with administrative records and primary interviews; explicit robustness checks would reinforce both.

major comments (2)
  1. [Section 4] Section 4 (Econometric Modelling): The headline claim that the programs 'led to' a 70% increase in commencements, together with the findings on unchanged retention and elevated cancellations, rests on an unspecified identification strategy. No difference-in-differences timing, set of time-varying COVID-19 controls (lockdowns, industry demand shocks, other wage supports), or robustness to alternative counterfactuals is described. Without these details the 70% figure cannot be distinguished from the broader 2020 labour-market disruption, rendering the central causal attribution and the 'sharp practice' interpretation untestable.
  2. [Section 5] Section 5 (Results on Cancellations and Retention): The reported differential cancellation effects (7% rise for non-trade vs 0.7% fall for trade) and the attribution to sharp practice lack accompanying model specifications, standard errors, sample sizes, or checks for selection into the subsidy. These omissions are load-bearing because the policy lesson about design flaws hinges on the econometric separation of subsidy-driven behaviour from pandemic-wide trends.
minor comments (3)
  1. [Abstract] Abstract: Headline percentages are presented without any reference to data sources, model specification, or robustness checks; a one-sentence summary of the econometric approach would improve transparency.
  2. [Figures/Tables] Figure and table captions: Pre- and post-program periods, confidence intervals, and exact sample definitions are not uniformly labelled, making it difficult to assess trend breaks visually.
  3. [Methods] Interview protocol: The description of stakeholder interviews omits sample size, selection criteria, and question guide; these details belong in the methods section or an appendix.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for their constructive and detailed comments on our mixed-methods evaluation of the BAC and CAC programs. We have revised the manuscript to address the concerns about econometric identification and result transparency, strengthening the presentation of our causal claims and policy lessons without altering the core findings.

point-by-point responses
  1. Referee: [Section 4] Section 4 (Econometric Modelling): The headline claim that the programs 'led to' a 70% increase in commencements, together with the findings on unchanged retention and elevated cancellations, rests on an unspecified identification strategy. No difference-in-differences timing, set of time-varying COVID-19 controls (lockdowns, industry demand shocks, other wage supports), or robustness to alternative counterfactuals is described. Without these details the 70% figure cannot be distinguished from the broader 2020 labour-market disruption, rendering the central causal attribution and the 'sharp practice' interpretation untestable.

    Authors: We agree that Section 4 would benefit from expanded detail on the identification strategy. In the revised manuscript we now explicitly describe the difference-in-differences design, including the precise timing of the BAC and CAC interventions relative to pre-pandemic trends, the inclusion of time-varying controls for state-level lockdowns, industry demand shocks, and other contemporaneous supports (e.g., JobKeeper), and a set of robustness checks using alternative counterfactuals such as non-subsidised occupations and placebo periods. These additions allow the 70% commencement increase to be distinguished from general 2020 labour-market conditions. The sharp-practice interpretation continues to rest on the joint evidence of the quantitative patterns and the employer/peak-body interviews, which we have now cross-referenced more explicitly with the econometric results. revision: yes

  2. Referee: [Section 5] Section 5 (Results on Cancellations and Retention): The reported differential cancellation effects (7% rise for non-trade vs 0.7% fall for trade) and the attribution to sharp practice lack accompanying model specifications, standard errors, sample sizes, or checks for selection into the subsidy. These omissions are load-bearing because the policy lesson about design flaws hinges on the econometric separation of subsidy-driven behaviour from pandemic-wide trends.

    Authors: We accept that the original presentation of the cancellation and retention results was insufficiently detailed. The revised Section 5 now reports the full regression specifications, standard errors, sample sizes, and selection-robustness checks (including propensity-score weighting and subsample analyses that exclude likely converters). These additions confirm the 7% rise for non-trade apprenticeships and the 0.7% decline for trade apprenticeships, and they help isolate subsidy-driven behaviour from broader pandemic effects. The sharp-practice mechanism is further supported by the qualitative interviews, which we now integrate more tightly with the quantitative findings to justify the design lessons. revision: yes
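For readers unfamiliar with the selection-robustness check the rebuttal invokes, an inverse-propensity-weighting sketch (entirely synthetic; the covariates, propensity model, and effect size are illustrative assumptions, not the paper's specification) looks like:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000

# Hypothetical employer covariates that drive subsidy take-up.
non_trade = rng.binomial(1, 0.5, n).astype(float)
firm_size = rng.normal(0.0, 1.0, n)
p_treat = 1.0 / (1.0 + np.exp(-(-0.5 + 0.8 * non_trade + 0.3 * firm_size)))
treated = rng.binomial(1, p_treat).astype(float)

# Synthetic cancellation outcome: the subsidy raises the cancellation
# probability by 7 points for non-trade employers only (an assumed effect
# loosely echoing the paper's non-trade finding).
p_cancel = 0.15 + 0.07 * treated * non_trade
cancelled = rng.binomial(1, p_cancel).astype(float)

# Inverse-propensity-weighted average treatment effect. In practice the
# propensity would be estimated (e.g. by logistic regression on observed
# covariates); here the true score is used for clarity.
ipw_ate = np.mean(treated * cancelled / p_treat) - np.mean(
    (1.0 - treated) * cancelled / (1.0 - p_treat)
)
print(f"IPW average treatment effect on cancellation: {ipw_ate:.3f}")
```

The weighting undoes selective take-up so that treated and untreated employers are comparable; the subsample analyses the authors mention would instead drop observations flagged as likely converters and re-run the cancellation regressions.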

Circularity Check

0 steps flagged

No significant circularity in empirical evaluation

full rationale

The paper is a mixed-methods empirical evaluation relying on administrative data analysis and stakeholder interviews to assess the impact of wage subsidy programs. No mathematical derivations, self-definitional constructs, fitted parameters renamed as predictions, or self-citation chains are present that would reduce the central claims (such as the 70% commencement increase or retention effects) to their inputs by construction. The econometric modelling is described at a high level without equations or ansatzes that collapse into tautologies, and results are grounded in data comparisons rather than internal redefinitions. This is a standard non-circular empirical study.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 0 invented entities

The evaluation rests on standard labor-economics assumptions for causal identification in observational data and on the representativeness of stakeholder interviews.

axioms (1)
  • domain assumption Econometric models can isolate the subsidy effect from concurrent pandemic shocks and other policies
    Invoked when attributing the 70% rise and cancellation changes directly to BAC/CAC.

pith-pipeline@v0.9.0 · 5590 in / 1120 out tokens · 41142 ms · 2026-05-10T01:41:49.418164+00:00 · methodology

discussion (0)

