Cheers for the info, Tony. I'm fairly sure I'm not actually hitting a memory limit, as some of the runs that failed with 255 will later succeed processing the same data. Like you, I've managed to get some of them to work by manually restarting them a few times, but some are proving stubborn or failing intermittently.
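One thing that might help narrow it down is logging the exit code and elapsed time of each run, so the near-instant failures stand out from the ones that die later. A minimal sketch (the `sh -c 'exit 128'` line is just a stand-in for whatever command actually runs the scraper):

```shell
# Record how long the run took and what exit code it returned.
# Replace the sh -c line with the real scraper command.
start=$(date +%s)
sh -c 'exit 128'        # placeholder for the actual scraper run
code=$?
end=$(date +%s)
elapsed=$((end - start))
echo "exit=$code elapsed=${elapsed}s"
```

Collecting a few of these lines per scraper would show whether the 128s really all die within a second or two while the 255s run for a while first.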
Not sure if it helps, but the 128 failures seem to happen very quickly. I suspect the process isn't getting as far as actually running any of the scraper code before throwing the error: