Homogeneous and NU Refinements are worse than Homogeneous Reconstruction?

Hello! I am currently struggling with an issue where for my given particle set, homogeneous reconstruction seems to produce a map with better resolution, a better GSFSC curve, and a better cFAR score than Homogeneous or Non-Uniform Refinement.

Does anyone have an explanation as to why this might occur, and what I could do about it?


You have duplicate particles in the homogeneous reconstruction: look at the FSC curve, which does not fall to zero. Because of that, the estimated resolution and cFAR score for that reconstruction are meaningless (and dangerously misleading).
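As a quick sanity check, you can test numerically whether the curve ever reaches zero (a toy numpy sketch with made-up FSC values, not how the software computes it; in practice you would load the curve exported from the refinement job):

```python
import numpy as np

# Made-up GSFSC values per resolution shell, low to high resolution.
# A healthy curve decays through zero; with duplicates shared between
# half-sets the noise is correlated, so the tail plateaus above zero.
fsc = np.array([1.00, 0.98, 0.92, 0.75, 0.43, 0.21, 0.12, 0.09, 0.08])

if (fsc <= 0.0).any():
    print("FSC reaches zero: half-sets look independent.")
else:
    print("FSC never reaches zero: suspect duplicate particles "
          "shared between the half-maps.")
```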

Look at the maps - the FSC tells only a small part of the tale.

Also check whether you imposed the correct symmetry during refinement.

Welcome to the forum. 🙂 Lots of people happy to help with any questions. 😄


Once you’ve removed the duplicates, try setting the maximum alignment resolution to 6 or 4 angstroms (you can try both) to see how much high-resolution noise is messing things up.

The first homogeneous refinement looks good! But it does have particle duplicates in the half-sets, as the FSC does not go below zero. I would recommend using the Remove Duplicate Particles job and testing different cutoffs until you see the FSC drop below zero in a homogeneous refinement job.

The other homogeneous and NU refinements do look worse; this could be caused by the initial volume not being aligned on a symmetry axis, if symmetry is being applied.

I’m sure it is doable to get a good NU refinement with a good cFAR out of this.

Do homogeneous refinement and NU refinement check for duplicate particles and ignore them (as 2D classification does)?

It depends on what you mean by duplicate particles. If you connect multiple stacks of particles to the input of any job, any duplicate particles with the same UID will be filtered out. Check out this post for more information:

It’s important to note that these jobs only detect duplicate particles by checking for duplicate IDs, and they don’t actually compare the particle locations on the micrograph. For example, if the same particle is picked twice, it will get assigned two unique UIDs, and the duplicates won’t get filtered out. You will have to filter out the duplicates using a Remove Duplicate Particles job and specifying the minimum separation distance that must exist for two particle picks to be considered unique particles. As you mentioned, jobs like 2D classification also give you the option to specify a minimum distance between two particles.
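To illustrate what a minimum-separation filter does conceptually (a minimal sketch with made-up coordinates and cutoff, not CryoSPARC’s actual implementation), here is a greedy first-kept-wins version:

```python
import numpy as np

def remove_duplicate_picks(coords, min_sep):
    """Greedy filter: keep a pick only if it lies at least `min_sep`
    (in pixels) from every pick already kept; first occurrence wins."""
    kept = []
    for xy in coords:
        if all(np.hypot(*(xy - k)) >= min_sep for k in kept):
            kept.append(xy)
    return np.array(kept)

# Made-up picks: the second one re-picks the same particle as the first.
picks = np.array([[100.0, 100.0],
                  [103.0, 101.0],   # ~3 px from the first pick
                  [400.0, 250.0]])

print(remove_duplicate_picks(picks, min_sep=20.0))
# keeps [100, 100] and [400, 250]; the near-duplicate is dropped
```

The same idea scales to real pick counts with a spatial index (e.g. a k-d tree) instead of the quadratic loop shown here.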

If the duplicate particles are the result of a symmetry expansion, these particles cannot be used for any global refinement (heterogeneous, homogeneous, non-uniform). The only jobs that properly handle symmetry-expanded particles are local refinement, 3D classification, 3DVA, and 3D Flex.

You can find some more helpful information in these threads: