Minimize over per-particle scale during alignment option for Local Refinement

Hi All, I have a question about the following option in Local Refinement:
Minimize over per-particle scale during alignment

The default is off. The comment for this option says “Don’t do this in a local refinement because scales will be wrong with very little signal”.

I do see the following output for per-particle scale during my Local Refinement (histogram attached):

Should I turn on “Minimize over per-particle scale”?

Thanks so much!

Hi @donghuachen,

Generally in a local refinement, because the amount of protein mass inside the mask may be very small, we don’t recommend trying to re-optimize the per-particle scale factors. By default, the local refinement job will use the already-optimized per-particle scale factors from the input particles; those would have come from, e.g., a consensus refinement of the whole map, where there is enough signal to perform per-particle scale optimization.

So the histogram you are seeing is showing that the input particles already have a relatively broad distribution of per-particle scale factors (some having as much as 2x the contrast of others) and these factors are being used during reconstruction in local refinement.
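If you want to sanity-check that distribution outside the job UI, you can inspect the exported particles .cs file directly (CryoSPARC .cs files are NumPy structured arrays on disk). Here is a minimal sketch, assuming a placeholder file name like J123_particles.cs and assuming the scales live in the alignments3D/alpha field; confirm the field name against data.dtype.names on your own file:

```python
# Minimal sketch: plot the distribution of per-particle scale factors from a
# CryoSPARC particles .cs file (a NumPy structured array on disk).
import numpy as np
import matplotlib.pyplot as plt

data = np.load("J123_particles.cs")                    # placeholder path
print([f for f in data.dtype.names if "alpha" in f])   # confirm the scale field name

scales = data["alignments3D/alpha"]                    # assumed scale field
print(f"mean={scales.mean():.3f}  min={scales.min():.3f}  max={scales.max():.3f}")

plt.hist(scales, bins=100)
plt.xlabel("Per-particle scale factor")
plt.ylabel("Particle count")
plt.title("Input per-particle scale distribution")
plt.show()
```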

You can definitely try to optimize the per-particle scales again during the local refinement (by turning on the parameter you mentioned), but you may see the results get worse. It’s hard to predict what will happen since it depends a lot on your data/mask/protein etc!

Hi @apunjani,

I imported the particles from a Relion star file. I wonder where the per-particle scale factors come from if CryoSPARC does not calculate them.

Should the scale factor for each particle be 1 if I don’t want to optimize the per-particle scales? Thanks so much!

Hi @apunjani,

I am using v3.1.0 to run Local Refinement (BETA) with the option “Minimize over per-particle scale” off. I am puzzled about why the mean per-particle scale factor is 1.0 in some cases, while in others the scale factors span a range (see attached images). Thanks.

Hi @donghuachen,

Local refinements will (by default) read the scale factors from the input particles. So, if your input particles came from a refinement where the “Minimize over per-particle scale” parameter was off, the local refinement will keep the default scales of 1.0. If the input particles instead came from a refinement where this parameter was on, then local refinement will keep whatever scales the refinement converged to.

In either case, you can force the scale factors to all be 1.0 by turning on the “Reset input per-particle scale” parameter.
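As a side note, if you want to verify which of those two cases applies to your inputs outside the UI, a quick look at the exported .cs file works too. This is just a sketch under the same assumptions as the one earlier in the thread (placeholder file names, alignments3D/alpha as the scale field); in normal use the “Reset input per-particle scale” parameter does the reset for you inside the job:

```python
# Minimal sketch: check whether input particles carry non-default scales, and
# optionally write a copy with every scale reset to 1.0.
import numpy as np

data = np.load("J123_particles.cs")        # placeholder path
scales = data["alignments3D/alpha"]        # assumed scale field

if np.allclose(scales, 1.0):
    print("All scales are 1.0: the upstream job left them at the default.")
else:
    print(f"Scales span {scales.min():.2f} to {scales.max():.2f}: "
          "they were optimized (or imported) upstream.")

# Optional: reset to 1.0 and save a modified copy of the particle stack
data["alignments3D/alpha"] = 1.0
with open("J123_particles_reset.cs", "wb") as f:
    np.save(f, data)   # save through a file handle to keep the .cs extension
```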

Best,
Michael

Hi @mmclean @apunjani,

When I ran the NU Refinements (Legacy) before the Local Refinements for this project, I never turned on the option “Minimize over per-particle scale”. So how could my input particles have varying per-particle scale values? Maybe the option “Minimize over per-particle scale” was somehow turned on by default during some Local Refinements?

Hi @donghuachen,

That behaviour certainly seems odd. Is there a chance that the particles were imported from elsewhere, and hence already had scales written to them? Alternatively, if you ran your initial NU refinements with the “Reset input per-particle scale” parameter off, then the scales would have been read from the parent job (most likely an ab-initio job), which could also explain why the scales differ.

In any event, at the local refinement stage, you can simply force the scales to all be 1.0 by activating the “Reset input per-particle scale” parameter. This has to be set manually because it is off by default for local refinement.

Best,
Michael