Version 5 breaks Slurm scripts

Hi there,

I updated our cluster from v4.7 to v5.0 yesterday and found that we could no longer queue jobs through our Slurm scheduler: jobs would reach the “Launched” status and then hang.

The output from job.log is below:

ERROR: ld.so: object '/home/exx/software/cryosparc/cryosparc_master/.pixi/envs/master/lib/libpython3.12.so' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/home/exx/software/cryosparc/cryosparc_master/.pixi/envs/master/lib/libpython3.12.so' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/home/exx/software/cryosparc/cryosparc_master/.pixi/envs/master/lib/libpython3.12.so' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
/home/exx/software/cryosparc/cryosparc_worker/bin/cryosparcw: line 44: /home/exx/software/cryosparc/cryosparc_master/config.sh: No such file or directory
/home/exx/software/cryosparc/cryosparc_worker/bin/cryosparcw: line 44: install_error: command not found

I was surprised to see the .pixi directory in the errors, since the change notes make no mention of a switch from conda to pixi that I could see. Regardless, after some poking around online (and consulting the LLM oracles), it appears the issue comes down to differences in how environment variables are handled between the old installation and the new one.
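For what it's worth, the first three log lines are diagnostic on their own: glibc's dynamic loader emits that warning whenever LD_PRELOAD names a library it cannot open, then carries on. You can reproduce the same class of message with any nonexistent path (the path below is made up):

```shell
# Point LD_PRELOAD at a path that does not exist and run any program;
# glibc's ld.so warns on stderr that the object cannot be preloaded,
# then runs the program anyway (exit status is unaffected).
LD_PRELOAD=/tmp/does-not-exist.so /bin/true
```

The preload warning is non-fatal by itself; the fatal part is the missing config.sh, presumably a stale CRYOSPARC_CONFIG_DIR from the master leaking into the worker job, which is why unsetting it below helps.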

By adding the lines:

unset CRYOSPARC_CONFIG_DIR
unset LD_PRELOAD
unset PYTHONPATH

to cluster_script.sh, we were able to submit jobs via Slurm as before.
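For anyone else hitting this, here is roughly where those lines sit in our template. This is a sketch, not a drop-in file: the SBATCH options and the {{ ... }} template variables shown are from our site's cluster_script.sh, and your template will differ.

```shell
#!/usr/bin/env bash
#SBATCH --job-name=cryosparc_{{ project_uid }}_{{ job_uid }}
#SBATCH --output={{ job_log_path_abs }}
#SBATCH --error={{ job_log_path_abs }}

# Drop variables inherited from the CryoSPARC master environment so the
# worker builds its own environment from scratch.
unset CRYOSPARC_CONFIG_DIR
unset LD_PRELOAD
unset PYTHONPATH

{{ run_cmd }}
```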


Thanks @seandworkman for the report.
A potential workaround is to include the following in the sbatch options section of the script template:

#SBATCH --chdir={{ job_dir_abs }}
#SBATCH --export=NONE
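For readers unfamiliar with the flag: --export=NONE tells sbatch not to propagate the submitting shell's environment to the batch job, so the master's LD_PRELOAD and PYTHONPATH never reach the worker. A plain-shell analogue (outside Slurm, just to illustrate the effect) is env -i:

```shell
# A variable exported in the parent shell is normally inherited by
# child processes...
export DEMO_VAR=leaks
/bin/sh -c 'echo "inherited: ${DEMO_VAR:-unset}"'   # prints "inherited: leaks"

# ...but env -i launches the child with an empty environment, which is
# analogous to what --export=NONE does for the submitted job.
env -i /bin/sh -c 'echo "scrubbed: ${DEMO_VAR:-unset}"'   # prints "scrubbed: unset"
```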

Thanks, I will give this a try! Seems much cleaner.

CryoSPARC v5.0.2 has been released and includes the clearing of master environment variables. The
#SBATCH --export=NONE Slurm option we proposed earlier is

  • an alternative workaround to this issue
  • also compatible with CryoSPARC v5.0.2.