Hi
I recently ran a 3Dflex reconstruction and kept hitting the following problem: my previous job with 120K particles ran fine, but this time, with 500K particles, it always fails with a “numpy.core._exceptions._ArrayMemoryError”. Does this mean I cannot run it with more data?
Please let me know how I can get past this problem. Thank you for any help.
How much memory does the system have? It is trying to allocate 373 GB, as the error message indicates. If you are running on a cluster, see whether you can request a larger memory allocation; if not, upgrade the memory capacity.
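For a rough sense of where a number like 373 GB comes from: the reconstruction appears to hold the full particle stack in system RAM as float32, so the allocation scales with particle count times box size squared. A quick back-of-the-envelope sketch in Python (the 432 px box size is purely a hypothetical value chosen to reproduce the reported figure, not something read from your job):

```python
import numpy as np

# Rough estimate of the RAM needed to hold a full float32 particle stack.
# NOTE: box_px = 432 below is a guess for illustration only.
def stack_gb(n_particles, box_px, dtype=np.float32):
    return n_particles * box_px * box_px * np.dtype(dtype).itemsize / 1e9

print(f"{stack_gb(120_000, 432):.0f} GB")  # ~90 GB  -> fits in 256 GB, so 120K ran
print(f"{stack_gb(500_000, 432):.0f} GB")  # ~373 GB -> exceeds 256 GB, hence the error
```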
If both of those are impossible, you can bin the particles down, but that will obviously limit the maximum achievable resolution.
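If you do go the binning route, CryoSPARC's Downsample Particles job handles it; the snippet below is just a minimal numpy sketch of the underlying idea (Fourier cropping), using a hypothetical 432 px → 288 px crop, i.e. binning by 1.5:

```python
import numpy as np

def fourier_crop(img, new_box):
    """Downsample a square image by cropping the centre of its Fourier transform."""
    box = img.shape[0]
    f = np.fft.fftshift(np.fft.fft2(img))
    lo = (box - new_box) // 2
    cropped = f[lo:lo + new_box, lo:lo + new_box]
    # Rescale so grey values stay comparable after the inverse transform.
    return np.real(np.fft.ifft2(np.fft.ifftshift(cropped))) * (new_box / box) ** 2

img = np.random.rand(432, 432).astype(np.float32)   # stand-in particle image
small = fourier_crop(img, 288).astype(np.float32)    # binned by 1.5
print(img.nbytes / small.nbytes)                     # per-particle memory drops 2.25x
```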
Thanks, this is quite informative and helpful. Our node has 256 GB of RAM, so that is the problem. Does CryoSPARC's 3Dflex reconstruction support running on multiple nodes? If so, I think this problem could be fixed.
Thank you so much for the reply. I appreciate it very much!
In the meantime, I'd try binning by 1.5 and see if that will just squeak through on 256 GB (possibly running another round of 2D classification as well to clean out any lingering lower-quality particles?).
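To put rough numbers on that (taking the reported 373 GB as the full-stack requirement and assuming the allocation scales with the square of the box size):

```python
# Back-of-the-envelope check: does binning by 1.5 squeeze under 256 GB?
full_stack_gb = 373                   # allocation reported in the error message
binned_gb = full_stack_gb / 1.5 ** 2  # image area shrinks by 1.5^2
print(f"{binned_gb:.0f} GB")          # ~166 GB, leaving some headroom below 256 GB
```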