Live 2D Classification out of memory, large dataset questions

  1. Is there a way to use the remaining unclassified particles from a Live 2D job that ran out of memory in a new CS 2D Classification job? I would like to classify the remaining particles in a separate job and then combine them with the particles CS Live has already classified.

  2. Is there a way to split datasets in Live so I don't run out of memory next time? Or is there a way to cap the size of a particular job and launch multiple smaller ones instead? The data is already downsampled. (A rough sketch of what I mean for both questions is below this list.)
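
Here is roughly what I had in mind, as a minimal sketch rather than an official CryoSPARC workflow. It assumes the .cs metadata for the full particle set and for the already-classified subset can be exported, that both contain the standard `uid` column, and that the file names and the chunk size below are placeholders I made up:

```python
import numpy as np

# CryoSPARC .cs files are NumPy structured arrays, so np.load reads them
# directly (placeholder paths for my own exports).
all_particles = np.load("all_particles.cs")        # everything fed into Live 2D
classified = np.load("classified_particles.cs")    # what the job finished before OOM

# Complement by unique particle ID: everything not yet classified.
unclassified_mask = ~np.isin(all_particles["uid"], classified["uid"])
unclassified = all_particles[unclassified_mask]
print(f"{unclassified.shape[0]} particles left unclassified")

# Split the remainder into fixed-size chunks so each follow-up 2D job
# stays within memory; 200k per chunk is just an example value.
chunk_size = 200_000
for i, start in enumerate(range(0, unclassified.shape[0], chunk_size)):
    chunk = unclassified[start:start + chunk_size]
    # Write via a file handle so NumPy keeps the .cs extension as-is.
    with open(f"unclassified_chunk_{i}.cs", "wb") as f:
        np.save(f, chunk)
```

I realize each chunk would still have to be brought back into CryoSPARC (presumably with a matching .csg group file for an import job), so this is only a starting point; I'd much prefer a supported way of doing this if one exists.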

As a follow-up, I can't export the particles from Live to CS via the Live Particle Export job. The job runs on the master node, which does not have enough memory, even when I try to queue it on a node with more memory.

Is there a way around this? Can we override the master node and have the job launch on the selected lane/queue? Or can we use scratch space instead of RAM for this type of job?