Downsampling problem in combined particle stacks

Hi,

I combined three particle stacks from three datasets. Before combining, I made sure to extract the particles from each dataset's micrographs at the same box size of 540 px. Earlier, for the per-dataset refinements, I had cropped two of the stacks from 540 to 256 px, and the third (my first dataset) from 512 to 256 px. After picking the desired particles from each dataset, I extracted each stack from the micrographs at 540 px and combined them using Exposure Group Utilities (median) to average the CTF values.
I ended up with 67k particles, ran them through homogeneous refinement, and got a 2.58 Å reconstruction. I now want to classify these particles with 3DVA, but that job fails because the stack is too large for my current disk space. I tried downsampling the particles to 128 px, but the Downsample job errors out with "particles must all have same alignment pixel size".
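One way to see exactly which particles carry a different alignment pixel size is to inspect the job's particles `.cs` file with numpy; cryoSPARC `.cs` files are numpy record arrays, and `alignments3D/psize_A` is the field that stores the alignment pixel size. A minimal sketch (the file path is hypothetical, point it at your own job's output):

```python
import numpy as np

def psize_summary(particles, field="alignments3D/psize_A"):
    """Return {pixel size (Å/px): particle count} for the given .cs field."""
    vals, counts = np.unique(np.round(particles[field], 4), return_counts=True)
    return dict(zip(vals.tolist(), counts.tolist()))

# Usage (path is hypothetical -- use your own job's particles .cs file):
# particles = np.load("J123/J123_particles.cs")
# print(psize_summary(particles))
```

If the summary shows more than one pixel size, that tells you which subset of particles is triggering the error.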
I do not understand why the alignment pixel sizes differ when I extracted every stack at the same box size of 540 px. How can I solve this?
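For what it's worth, one common source of mismatched alignment pixel sizes: if the earlier 540→256 and 512→256 crops were Fourier crops, the effective pixel size scales with the box-size ratio, so the two crops yield different pixel sizes even from the same raw pixel size. A quick sketch of the arithmetic, assuming a hypothetical raw pixel size of 1.0 Å/px:

```python
def cropped_psize(raw_psize, raw_box, new_box):
    """Effective pixel size (Å/px) after Fourier-cropping raw_box -> new_box."""
    return raw_psize * raw_box / new_box

raw_psize = 1.0  # Å/px, assumed purely for illustration
print(cropped_psize(raw_psize, 540, 256))  # 540/256 * 1.0 ≈ 2.1094 Å/px
print(cropped_psize(raw_psize, 512, 256))  # 512/256 * 1.0 = 2.0 Å/px
```

So alignments computed on the 512→256 stack would record a different alignment pixel size than those from the 540→256 stacks, even though all stacks were later re-extracted at 540 px.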
My particle stacks were exported from different disk locations via soft links in the current processing disk, so re-extracting the particles at full size after running 3DVA in cluster mode seems too complicated: I won't know which dataset each particle came from. Does this mean I have to create soft links for the micrographs too and use all of them as input to the Extract job? That would be around 30k micrographs! I am not sure how to navigate this. What would be the simplest way to run 3DVA on these combined particles?
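On the "which dataset did this particle come from" worry: each particle's source stack path is recorded in the `.cs` file under `blob/path`, so the provenance is recoverable without tracing the soft links by hand. A sketch of counting particles per source by path prefix (the field name is the standard cryoSPARC one; the prefixes below are hypothetical and should be matched to your own linked stack paths):

```python
import numpy as np
from collections import Counter

def count_by_source(particles, prefixes):
    """Count particles whose blob/path starts with each dataset's prefix."""
    counts = Counter()
    for raw in particles["blob/path"]:
        path = raw.decode() if isinstance(raw, bytes) else str(raw)
        for name, prefix in prefixes.items():
            if path.startswith(prefix):
                counts[name] += 1
    return dict(counts)

# Usage (prefixes are hypothetical examples):
# particles = np.load("J123/J123_particles.cs")
# print(count_by_source(particles, {"ds1": "J10/", "ds2": "J20/", "ds3": "J30/"}))
```

The same idea can be used to split the stack back into per-dataset subsets before any per-dataset re-extraction.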