How to create lanes for cryosparc live running on standalone workstation?

I have a 4x 2080ti workstation with cryosparc installed
When running cryosparc live, I only have 1 lane with all 4 gpus
I would like to split that into 2 lanes of 2 GPUs each for preproc and 2D/3D
How do I do that?

Hi @fjkoh,

To split a workstation into multiple lanes based on the number of GPUs, use the cryosparcw connect command and specify the --gpus argument. You would have to run it twice, once for each lane:
https://guide.cryosparc.com/setup-configuration-and-management/how-to-download-install-and-configure/downloading-and-installing-cryosparc#connect-a-managed-worker-to-cryosparc
For example:

./bin/cryosparcw connect --worker <worker_hostname> \
                         --master <master_hostname> \
                         --port <port_num> \
                         --ssdpath <ssd_path> \
                         --gpus 0,1 \
                         --newlane \
                         --lane lane01
# then:
./bin/cryosparcw connect --worker <worker_hostname> \
                         --master <master_hostname> \
                         --port <port_num> \
                         --ssdpath <ssd_path> \
                         --gpus 2,3 \
                         --newlane \
                         --lane lane02

You shouldn’t need to do this, though, since you can specify the same lane for all three stages. Leaving it as one lane also keeps you more flexible (e.g. you can allocate all 4 GPUs to preprocessing for a short period, then scale back down to 1 GPU for preprocessing and start 2D/3D).


Hi Stephan,

Thanks for the explanation.

@stephan Is this still possible with v3? It complains that the worker hostname is already registered:

“ERROR: This hostname is already registered! Remove it first.”

@nimgs-it As a workaround, you could define additional stanzas in ~/.ssh/config on the master node, e.g.

Host worker_alias1
    HostName <worker_hostname>

Host worker_alias2
    HostName <worker_hostname>

And register the worker lanes using those host aliases rather than the actual hostname.
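As a sketch, re-running the connect command through one of those aliases might look like the following (the alias name, GPU indices, and lane name here are assumptions; the other placeholders match the commands above):

```shell
# Sketch only: register a second lane via an SSH alias so the master
# treats it as a "new" hostname. worker_alias2 and lane02 are assumed names.
./bin/cryosparcw connect --worker worker_alias2 \
                         --master <master_hostname> \
                         --port <port_num> \
                         --ssdpath <ssd_path> \
                         --gpus 2,3 \
                         --newlane \
                         --lane lane02
```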

Additionally, depending on how your nodes are set up, you may want to read up on the StrictHostKeyChecking and UserKnownHostsFile SSH settings, to get SSH login working correctly for cryoSPARC.
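One way to sketch that (alias names are assumptions, and whether relaxing host-key checking is acceptable depends on your environment) is to add those options to the alias stanzas in ~/.ssh/config:

```
Host worker_alias1 worker_alias2
    HostName <worker_hostname>
    StrictHostKeyChecking accept-new
    UserKnownHostsFile ~/.ssh/known_hosts
```

With accept-new (OpenSSH 7.6+), the first connection to each alias records the host key automatically instead of prompting, while still rejecting changed keys afterwards.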

Hi @leetleyang

After doing so, the lanes run well, but every time I launch a job I get a pop-up window from OpenSSH requesting my SSH password. I have played with the StrictHostKeyChecking and UserKnownHostsFile settings, but they don’t seem to change that behaviour. I obviously don’t want to enter my password for each launched job. Do you know what I might be missing?

Thank you,
André

Hi André,

Have you already double-checked that you have the cryosparc account’s SSH key-pair set up appropriately? It may be worth checking by connecting specifically to the hostname aliases. Assuming that’s where things are tripping up, you may want to either specify the relevant identity file to use in .ssh/config, or include it (-i flag) as part of cryosparcw connect --sshstr.
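If SSH isn’t picking up the right key for the aliases, one sketch (alias and key file names are assumptions) is to pin the identity file in the same ~/.ssh/config stanzas:

```
Host worker_alias1 worker_alias2
    HostName <worker_hostname>
    IdentityFile ~/.ssh/id_rsa
    IdentitiesOnly yes
```

You can then test non-interactively with `ssh -o BatchMode=yes worker_alias1 hostname` as the cryosparc user; if that fails rather than printing the hostname, the key setup itself (e.g. the public key missing from the worker’s authorized_keys) is the likely culprit.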

Cheers,
Yang