Title says most of it. I am trying to do an ab-initio reconstruction with ~500k particles; the job loads all of them, but on my cryosparc instance (v4.2.1) it somehow only uses 288300 of them. The same has happened in other jobs where the particle count is above 288300. For now I am just funneling the unused particles into a second ab-initio reconstruction, but I was curious whether anyone has run into this problem and has any advice on how I could get it to use the full stack? Thank you!
This is expected; see the previous discussion here and the links within: Ab initio reconstruction unused particles - #2 by wtempel
Ab-initio is very time-consuming and quite error-prone, since particles have a lot of freedom to rotate. As long as your particles are randomized in orientation, I see no reason to run a single-class ab-initio with so many of them; if you do need to, please tell me why. Something like 10k particles should be enough for a map at ~12 angstroms, maybe more if you have a large set and want a representative result. Let me know if I'm missing something here.
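As an aside, if you want to pull a small random subset outside the GUI, cryoSPARC's .cs metadata files are plain NumPy structured arrays, so a draw like the one suggested above can be sketched as follows. The array built here is a synthetic stand-in (the field names and counts are illustrative assumptions); with a real file you would load it via `np.load("particles.cs")` instead:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for particle metadata. A real cryoSPARC .cs file is a
# NumPy structured array on disk and would be loaded with np.load(...).
n_total = 500_000
particles = np.zeros(n_total, dtype=[("uid", "u8"), ("blob/idx", "u4")])
particles["uid"] = np.arange(n_total)

# Draw a 10k subset without replacement for a quick single-class ab-initio.
subset_size = 10_000
idx = rng.choice(n_total, size=subset_size, replace=False)
subset = particles[idx]

print(subset.shape[0])                                   # 10000
print(len(np.unique(subset["uid"])) == subset_size)      # True
```

Inside the GUI, the same split is what a random-subset selection gives you; the point is only that ~10k randomly oriented particles are usually plenty for a low-resolution starting map.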
Hi Carlos, thank you for the thorough response. The reason I want to do it with one class is that when I run a multi-class ab-initio, a lot of the particles in one particular orientation end up in a single class, so I've always taken that to mean multi-class ab-initio might not be a good choice here.
Hey @gmperez ,
Well, I still don’t understand why you need to use all of them in the ab-initio reconstruction. If you have a reasonable starting map from a small subset of particles (something that roughly resembles what you expect the protein to be; no high resolution needed, weird blobs allowed), you just need to launch a homogeneous refinement with all particles to get them aligned. From there you can run 3DVA, 3D classification, 3DFlex… and the 2D classes should look nicer.
Hi Carlos,
Thank you very much for your reply – I will give those options a try!