Hi CryoSPARC,
I have been testing an installation on a SLURM cluster by running the T20S Benchmark Workflow job, to make sure CryoSPARC executes correctly before the system is opened up to end users.
On a standalone installation (albeit on different hardware) the workflow runs fine.
On the cluster, however, the whole process fails after the ‘Blob Picker’ step, because the ‘Inspect Picks’ step does not respond within the required 120 s.
To get around this I modified “cryosparc_master/cryosparc_compute/jobs/workflows/buildrun_bench.py”, increasing the asserted timeouts to 360 s inside the functions get_curated_blob_picks() and run_rigid_local_motion_bench() (an illustrative sketch of the change is below).
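Roughly, this is the kind of change I made. The names and structure below are only illustrative and do not match the actual contents of buildrun_bench.py; the point is simply that the asserted wait was raised from 120 s to 360 s:

```python
# Illustrative sketch only -- not the real code from buildrun_bench.py.
# The benchmark polls a job's status and asserts that it responds within a
# timeout; the edit just raises that timeout from 120 s to 360 s.
import time

def wait_for_response(poll_fn, timeout_s=360, interval_s=5):  # previously timeout_s=120
    """Poll poll_fn() until it returns True, asserting the timeout is not exceeded."""
    start = time.time()
    while not poll_fn():
        assert time.time() - start < timeout_s, f"no response within {timeout_s} s"
        time.sleep(interval_s)
```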
Then I restarted CryoSPARC and started the job again.
The above changes worked; however, the process then failed again after the ‘Refinement New’ step. Both ‘3D Class’ steps failed with the same error:
```
[CPU: 916.2 MB] Traceback (most recent call last):
  File "cryosparc_worker/cryosparc_compute/run.py", line 85, in cryosparc_compute.run.main
  File "cryosparc_worker/cryosparc_compute/jobs/class3D/run.py", line 635, in cryosparc_compute.jobs.class3D.run.run_class_3D
  File "/mnt/userdata/jvanschy/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.7/subprocess.py", line 411, in check_output
    **kwargs).stdout
  File "/mnt/userdata/jvanschy/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.7/subprocess.py", line 488, in run
    with Popen(*popenargs, **kwargs) as process:
  File "/mnt/userdata/jvanschy/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.7/subprocess.py", line 800, in __init__
    restore_signals, start_new_session)
  File "/mnt/userdata/jvanschy/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.7/subprocess.py", line 1551, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'zip': 'zip'
```
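From the traceback my assumption is that the 3D Classification job shells out to an external 'zip' binary which is not visible from the SLURM compute nodes (but presumably is on the standalone machine). A quick check along these lines, run inside the worker environment on a compute node, should confirm whether that is the case (sketch only; the suggested remedy in the message is just my guess):

```python
# Sketch: check whether the external 'zip' binary is visible from the worker
# environment on a SLURM compute node. Its absence would explain the
# FileNotFoundError above.
import shutil

zip_path = shutil.which("zip")
if zip_path is None:
    print("'zip' not found on PATH -- it may need to be installed on the compute "
          "nodes, or added to the PATH used by the cluster submission script.")
else:
    print(f"'zip' found at {zip_path}")
```

I would run this via srun on one of the GPU nodes so that it sees the same environment the worker jobs do.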
Any help would be greatly appreciated.
I am running CryoSPARC 3.3.1.
Thanks,
Jay.