Loading the wrong jobs into scratch? (modifying particle stacks)

solved

#1

Hi there,

I’m using version 2.8.0 and trying to run iterative 2D classifications before generating an ab initio model. I initially ran into issues with our scratch drive being too small to hold my entire particle stack, so I split the particles into three subsets. After two rounds of 2D classification I am left with 396,563 particles. However, when I try to run the next job I hit the same scratch-space problem: instead of caching only the particles selected in my previous 2D classification, the program loads all of the particles from my local motion correction job. I don’t recall running into scratch-space issues like this with a previous dataset I was working on, even though that particle stack was much larger.
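For anyone hitting the same wall: a quick back-of-the-envelope estimate of the extracted stack size can tell you in advance whether it will fit on scratch. This is a minimal sketch, assuming uncompressed float32 pixels (4 bytes each, as in a typical MRC particle stack) and a hypothetical 256 px box size, not anything specific to this dataset:

```python
def stack_size_gb(n_particles: int, box_px: int, bytes_per_px: int = 4) -> float:
    """Approximate on-disk size of an extracted particle stack.

    Assumes uncompressed square images at float32 (4 bytes/pixel);
    MRC headers add a negligible amount on top of this.
    """
    return n_particles * box_px * box_px * bytes_per_px / 1e9

# The 396,563 particles from this thread at a hypothetical 256 px box:
print(round(stack_size_gb(396563, 256), 1))  # ~104.0 GB
```

Comparing that number against the free space on the scratch volume shows whether a stack needs to be split before caching.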


#2

I seem to have solved my own problem here.

I took the particle outputs from my 2D classification jobs and re-ran local motion correction to re-extract smaller particle stacks that will fit into a single job together. It would definitely be nice if there were a simpler way to make a smaller particle stack from combined jobs.
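For what it’s worth, the splitting step itself is just a batching problem: given a particle count and the largest batch that fits on scratch, divide the particles into the fewest roughly equal batches. A minimal sketch (plain Python, not cryoSPARC code; the 150,000-particle budget is a made-up example):

```python
import math

def split_into_batches(n_particles: int, max_batch: int) -> list[int]:
    """Split n_particles into the fewest roughly equal batches,
    each no larger than max_batch."""
    n_batches = math.ceil(n_particles / max_batch)
    base, extra = divmod(n_particles, n_batches)
    # Spread the remainder over the first `extra` batches.
    return [base + (1 if i < extra else 0) for i in range(n_batches)]

# e.g. the 396,563 particles from this thread, hypothetical 150k-particle budget:
print(split_into_batches(396563, 150_000))  # [132188, 132188, 132187]
```

Each batch can then be classified separately and the selected particles combined afterwards, which is essentially what the manual three-way split above did.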


#3

Hi @seandworkman,

I’m glad you were able to find a solution. Particle stack tools are currently being developed and will hopefully make it to a release soon!