Hi,
All my ab initio reconstruction and refinement jobs consistently finish with exactly 159,300 particles, even when I start with different numbers of particles selected from 2D selection jobs, such as 2 million or 5 million particles. Am I missing a parameter in the job builder that controls the number of particles used?
Welcome to the forum, @roney.tucker. You may want to have a look at the related discussion in Ab initio reconstruction unused particles.
Just as a follow-up question: what's the logic behind only using a subset of particles? I am thinking the argument is that using the entire stack would have little effect other than increased processing time: if representative low-resolution models cannot be deduced from a 150K stack, using 10x more particles isn't likely to help (and the input stack should already have been filtered by 2D classification cycles). Is that about right?
Hi @pozharski! The rationale behind using only a subset of particles for a single-class Ab initio Reconstruction job is explained here.
To summarize, you're essentially right: if you're requesting only one class from Ab initio, that means you think your particle stack is mostly clean. To get the best final map possible, you'll want to perform Homogeneous Refinement of your particles after Ab initio anyway, so we use a subset to make the Ab initio job faster.
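If it helps to picture what that subset pass looks like, here is a minimal, purely illustrative sketch (not CryoSPARC's actual code; the particle counts and the random-sampling choice are my assumptions) of picking a random subset of particle indices for a fast initial-model pass while keeping the full stack for the later refinement:

```python
import numpy as np

# Hypothetical illustration: choose a random subset of particle indices
# for a faster initial-model pass; the final refinement still uses all particles.
rng = np.random.default_rng(seed=0)

n_total = 2_000_000   # e.g. particles kept after 2D selection (assumed value)
n_subset = 159_300    # size of the subset used for the quick pass (assumed value)

subset_idx = rng.choice(n_total, size=n_subset, replace=False)
subset_idx.sort()     # keep on-disk order, which is friendlier for sequential I/O

# subset_idx would then be used to slice the particle metadata/stack for the
# quick reconstruction, while downstream refinement sees all n_total particles.
print(f"Using {subset_idx.size} of {n_total} particles for the initial model")
```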
Very occasionally, it would be useful to override this and use all particles.
There are some niche cases (very small, featureless membrane proteins) where stochastic gradient descent does better than branch & bound, and local refinement starting from ab initio poses can be useful.
In these cases, it would be useful to be able to force CryoSPARC to use all particles in a single-class ab initio. Granted, this is an uncommon circumstance!