Our clusters are configured with LSF, which prints the job ID and the queue name for every job submitted, something like the below:
Job <17909169> is submitted to queue <queue_name>.
Now cryoSPARC identifies the last token of that line (the queue name) as the job ID, due to which I am not able to query the job or kill the job.
Could you share your cluster_info.json and cluster_submission.sh with us?
You can run the command cryosparcm cluster dump to get these files written to the current working directory.
Thanks. This is a bug in cryoSPARC that will be fixed in the next update, which will be released soon. Sorry for any inconvenience in the meantime; until then, you will have to run these commands manually.
Thank you Stephan. Is there a timeline estimate for the release of the next update? Also, I want to bring to your attention that cryoSPARC does not support the < (redirect) operator, which is the standard LSF way of submitting jobs (bsub < script.sh).
At the moment we don’t have an exact date, but we should be able to deploy in less than 3 weeks. Also, thanks for bringing this to our attention. Could you explain how you currently get around this?
That’s great! For the redirect operator issue, my co-worker and I reviewed the command_core error, and based on that we made some minor hacks to get it to work.
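For anyone following along, here is a minimal sketch of this kind of workaround (the helper name submit_cluster_job and the template handling are illustrative, not cryoSPARC’s actual code): if the submission command template contains a <, feed the script to the scheduler on stdin instead of passing the path as an argument.

```python
import shlex
import subprocess

def submit_cluster_job(submit_cmd_tpl, script_path):
    """Run a cluster submission command for a given script.

    LSF expects the script on stdin (bsub < script.sh), while
    SLURM/PBS take it as a positional argument (sbatch script.sh).
    """
    if '<' in submit_cmd_tpl:
        # LSF-style: strip the redirect and pipe the script on stdin
        cmd = shlex.split(submit_cmd_tpl.split('<')[0])
        with open(script_path, 'rb') as f:
            out = subprocess.check_output(cmd, stdin=f)
    else:
        # SLURM/PBS-style: append the script path as an argument
        cmd = shlex.split(submit_cmd_tpl) + [script_path]
        out = subprocess.check_output(cmd)
    return out.decode()
```

With a template like "bsub <" this invokes bsub with the script on stdin; with "sbatch" it behaves like the existing argument-passing path.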
That’s exactly what the fix would be. The only difference is that we’d also now create new control blocks for different schedulers, since this part of the code assumed all schedulers supported specifying a bash script as an argument.
As for your original issue, since it seems like you’re comfortable with modifying the code, change the lines in the same file, same function, from:
res = res.strip().split()[-1] # take the last token (to support SLURM)
job_send_streamlog(project_uid, job_uid, "-------- Cluster Job ID: \n%s" % res)
job_send_streamlog(project_uid, job_uid, "-------- Queued on cluster at %s" % str(datetime.datetime.now()))
update_job(project_uid, job_uid, {'cluster_job_id' : res})
to
# find numeric substrings that may represent the submitted job ID
cluster_job_matches = re.findall(r'\d+', res)
if len(cluster_job_matches) == 1:
    cluster_job_id = cluster_job_matches[0] # take the only numeric substring
else:
    cluster_job_id = res.strip().split()[-1] # take the last token (to support SLURM)
job_send_streamlog(project_uid, job_uid, "-------- Cluster Job ID: \n%s" % cluster_job_id)
job_send_streamlog(project_uid, job_uid, "-------- Queued on cluster at %s" % str(datetime.datetime.now()))
update_job(project_uid, job_uid, {'cluster_job_id' : cluster_job_id})
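To sanity-check that logic, here is the ID extraction pulled out as a standalone function (the name parse_cluster_job_id is ours, purely for illustration) run against typical LSF and SLURM submission messages:

```python
import re

def parse_cluster_job_id(res):
    """Extract the cluster job ID from a scheduler's submission message."""
    # find numeric substrings that may represent the submitted job ID
    matches = re.findall(r'\d+', res)
    if len(matches) == 1:
        return matches[0]  # unambiguous: the only number is the job ID
    # ambiguous or no number: fall back to the last token,
    # which works for SLURM ("Submitted batch job 123456")
    return res.strip().split()[-1]

# LSF: the ID is bracketed mid-line, so last-token parsing would fail
print(parse_cluster_job_id('Job <17909169> is submitted to queue <normal>.'))
# SLURM: the ID happens to be both the only number and the last token
print(parse_cluster_job_id('Submitted batch job 123456'))
```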
When we end up releasing an update to cryoSPARC, it will include all of these changes and you won’t have to modify anything in this file again.
Once you install the patch, please let me know whether everything is working as intended; you should be able to add the redirect input operator back to your cluster_info.json.
I am not sure whether I have access to v2.15.2-live_privatebeta. When I try to pull that version, I get gzip: stdin: unexpected end of file. How can I request access to the private beta?