Seems utterly pointless though...
With the proof-of-work approach, at least it demands the client consume some resources, though the 'right' amount is a tricky question: either it's so trivial as to hardly matter to the scrapers, or it's hard enough to put a dent in the scrapers' budget, but then humans on low-end devices are royally screwed.
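To make the difficulty trade-off concrete, here's a minimal hashcash-style sketch (illustrative only, not any deployed system's actual scheme; the SHA-256 construction and `difficulty_bits` parameter are my assumptions). Each extra bit of difficulty doubles the solver's expected work while verification stays a single hash, which is exactly why the tuning is so awkward: a level that's negligible for a scraper farm can still be seconds on a low-end phone.

```python
import hashlib
import os
import time

def solve(challenge: bytes, difficulty_bits: int) -> int:
    """Brute-force a nonce so that sha256(challenge || nonce)
    falls below a target with `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge: bytes, nonce: int, difficulty_bits: int) -> bool:
    """Server-side check: one hash, regardless of difficulty."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

if __name__ == "__main__":
    challenge = os.urandom(16)  # server-issued random challenge
    for bits in (8, 12, 16):
        start = time.perf_counter()
        nonce = solve(challenge, bits)
        elapsed = time.perf_counter() - start
        assert verify(challenge, nonce, bits)
        # expected solve cost roughly doubles with each extra bit
        print(f"{bits} bits: nonce={nonce} in {elapsed:.4f}s")
```

The asymmetry (expensive solve, cheap verify) is the whole mechanism; the open question in the comment above is just where to set `difficulty_bits`.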
Here the crawler simply schedules a resumption and moves on to other work. It doesn't need the page right now, and waiting costs it nothing.
While I despise captchas from a human perspective, the fact that an LLM can solve the challenge isn't a deal breaker. It doesn't need to be impossible for a non-human to solve; it just has to be too expensive.
It certainly does shift the equation toward stuff like proof of work: since a computer can solve it anyway, you might as well not annoy the human.