Local and remote path variables in CryoSPARC cluster integration

Hi all,
our cluster does not allow us to run the CryoSPARC database on the login nodes (very strict runtime restrictions), so I installed the CryoSPARC master instance on a local workstation and used sshfs/FUSE to mount the cluster's storage locally on the master node. Now, of course, the absolute paths differ between the local (master) and remote (worker) computers, which causes problems when a job is submitted:
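For concreteness, the mount on the master is set up roughly like this (the user name, host, remote path and mount point below are placeholders standing in for my real ones):

    sshfs myuser@mycluster:/remote/cluster/storage /Local/app/cryosparc/road_to_cluster -o reconnect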

-------- Submission command:
ssh myuser@mycluster sbatch /Local/app/cryosparc/road_to_cluster/cryosparc_tutorial/P1/J3/queue_sub_script.sh
Failed to launch! 255

I want ssh to run sbatch with a script path outside of /Local/app, since /Local/app corresponds to the local master's directory tree rather than the cluster's.

cluster_info.json uses the variable {{ script_path_abs }}, which is defined in the run.py scripts. Before I start messing up the Python scripts: is there a different path_abs variable I can call instead of script_path_abs to account for the remote filesystem?
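For reference, the relevant entry in my cluster_info.json currently looks roughly like this (the host and both path prefixes are placeholders for my setup). One workaround I was considering is rewriting the local prefix directly in the template, assuming these templates are rendered with Jinja2 and therefore accept filters such as replace, but I don't know whether that is the intended approach:

    "qsub_cmd_tmpl": "ssh myuser@mycluster sbatch {{ script_path_abs | replace('/Local/app/cryosparc/road_to_cluster', '/remote/cluster/storage') }}"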

Thanks in advance!
Dan

Hi @mannda,

Is it possible to solve the path issue using symbolic links? Perhaps you can create a symbolic link so that the path to the mounted drive on the master matches the path used on the remote computers, or vice versa?
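A minimal sketch of that idea, assuming (as a guess based on your submission command) that the cluster keeps the project data under /remote/cluster/storage and that the same storage is sshfs-mounted at /Local/app/cryosparc/road_to_cluster on the master:

    # Placeholder paths; adjust to the real mount point and cluster storage path.
    # Create on the master the same absolute path that exists on the cluster,
    # pointing it at the sshfs mount:
    sudo mkdir -p /remote/cluster
    sudo ln -s /Local/app/cryosparc/road_to_cluster /remote/cluster/storage

If projects are then created or attached under /remote/cluster/storage on the master, {{ script_path_abs }} should expand to the same absolute string that sbatch needs on the worker nodes.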