In CS 3.0, getting [Errno 2] No such file or directory

Hi folks,

I was using CS 2.15 and then updated to CS 3.0. To keep my existing jobs, I changed the database path in the config.sh file from the default cryosparc_database to my old cryosparc2_database. That brought back my jobs from CS 2.15, but when I try to run a job, I get this error in the web browser:

[Errno 2] No such file or directory: '/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw': '/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw'

Any thoughts on how I can fix this error?
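For reference, the database change in cryosparc_master/config.sh amounts to this one line (the path below is from my install, as shown in the config dump further down):

# cryosparc_master/config.sh: point the database at the old v2 directory
export CRYOSPARC_DB_PATH="/data2/software/cryosparc/cryosparc2_database"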
Here is some information that may be helpful in troubleshooting:
----------------------------------------------------------------------------
CryoSPARC System master node installed at
/data2/software/cryosparc/cryosparc_master
Current cryoSPARC version: v3.0.0
----------------------------------------------------------------------------

cryosparcm process status:

app                              RUNNING   pid 36021, uptime 0:14:41
app_dev                          STOPPED   Not started
command_core                     RUNNING   pid 35905, uptime 0:14:51
command_rtp                      RUNNING   pid 35940, uptime 0:14:47
command_vis                      RUNNING   pid 35936, uptime 0:14:49
database                         RUNNING   pid 35824, uptime 0:14:54
liveapp                          RUNNING   pid 36046, uptime 0:14:40
liveapp_dev                      STOPPED   Not started
watchdog_dev                     STOPPED   Not started
webapp                           RUNNING   pid 35996, uptime 0:14:43
webapp_dev                       STOPPED   Not started

----------------------------------------------------------------------------

global config variables:

export CRYOSPARC_LICENSE_ID="5c4c2e66-7a61-11ea-aece-47347def02c5"
export CRYOSPARC_MASTER_HOSTNAME="alita"
export CRYOSPARC_DB_PATH="/data2/software/cryosparc/cryosparc2_database"
export CRYOSPARC_BASE_PORT=39000
export CRYOSPARC_DEVELOP=false
export CRYOSPARC_INSECURE=false
export CRYOSPARC_CLICK_WRAP=true

OS: CentOS 7, with software managed via SBGrid.

Here is the job log from the failed run:

  Need fixed :  {'SSD': False}
  Master direct :  False
   Scheduling job to alita
Failed to connect link: HTTP Error 502: Bad Gateway
Not a commercial instance - heartbeat set to 12 hours.
     Launchable! -- Launching.
Changed job P2.J3 status launched
      Running project UID P2 job UID J3 
        Running job on worker type node
        Running job using:  /home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw
[JSONRPC ERROR  2020-12-14 15:29:18.115404  at  run_job ]
-----------------------------------------------------
Traceback (most recent call last):
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 115, in wrapper
    res = func(*args, **kwargs)
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 2120, in run_job
    stdout=joblog, stderr=joblog, close_fds = True)
  File "/data2/software/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/subprocess.py", line 800, in __init__
    restore_signals, start_new_session)
  File "/data2/software/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/subprocess.py", line 1551, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: '/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw': '/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw'
-----------------------------------------------------
[JSONRPC ERROR  2020-12-14 15:29:18.116305  at  scheduler_run ]
-----------------------------------------------------
Traceback (most recent call last):
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 115, in wrapper
    res = func(*args, **kwargs)
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 1728, in scheduler_run
    scheduler_run_core(do_run)
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 1946, in scheduler_run_core
    run_job(job['project_uid'], job['uid']) # takes care of the cluster case and the node case
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 124, in wrapper
    raise e
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 115, in wrapper
    res = func(*args, **kwargs)
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 2120, in run_job
    stdout=joblog, stderr=joblog, close_fds = True)
  File "/data2/software/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/subprocess.py", line 800, in __init__
    restore_signals, start_new_session)
  File "/data2/software/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/subprocess.py", line 1551, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: '/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw': '/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw'
-----------------------------------------------------
[JSONRPC ERROR  2020-12-14 15:29:18.116630  at  enqueue_job ]
-----------------------------------------------------
Traceback (most recent call last):
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 115, in wrapper
    res = func(*args, **kwargs)
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4772, in enqueue_job
    scheduler_run()
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 124, in wrapper
    raise e
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 115, in wrapper
    res = func(*args, **kwargs)
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 1728, in scheduler_run
    scheduler_run_core(do_run)
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 1946, in scheduler_run_core
    run_job(job['project_uid'], job['uid']) # takes care of the cluster case and the node case
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 124, in wrapper
    raise e
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 115, in wrapper
    res = func(*args, **kwargs)
  File "/data2/software/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 2120, in run_job
    stdout=joblog, stderr=joblog, close_fds = True)
  File "/data2/software/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/subprocess.py", line 800, in __init__
    restore_signals, start_new_session)
  File "/data2/software/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/subprocess.py", line 1551, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: '/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw': '/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw'
-----------------------------------------------------

WebApp Log:

(node:35996) DeprecationWarning: current Server Discovery and Monitoring engine is deprecated, and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option { useUnifiedTopology: true } to the MongoClient constructor.
Ready to serve GridFS
==== [projects] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [workspace] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [projects] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [projects] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [workspace] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [projects] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
set_user_viewed_workspace
["5f997ce94ffab3285ea7ae38","P2","W1"]
==== [workspace] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [workspace] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
kill_job
{"project_uid":"P2","job_uid":"J3","killed_by_user_id":"5f997ce94ffab3285ea7ae38"}
clear_job
{"project_uid":"P2","job_uid":"J3"}
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [projects] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [workspace] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [projects] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [projects] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [workspace] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
enqueue_job
{"project_uid":"P2","job_uid":"J3","hostname":"alita","gpus":[0,1,2,3]}
{ code: 500,
  data: null,
  message: 'OtherError: [Errno 2] No such file or directory: \'/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw\': \'/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw\'',
  name: 'OtherError' }
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
enqueue_job
{"project_uid":"P2","job_uid":"J3","lane":"default"}
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [workspace] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
[PUB] job.events.checkpoints: { project_uid: 'P2', job_uid: 'J3', type: 'checkpoint' }
set_user_viewed_job
["5f997ce94ffab3285ea7ae38","P2","W1","J3"]
[PUB] job.events: { project_uid: 'P2', job_uid: 'J3' } 100 0
[PUB] events.countAfterCheckpoint
clear_job
{"project_uid":"P2","job_uid":"J3"}
==== [jobs] project query user  5f997ce94ffab3285ea7ae38 prashant singh true
enqueue_job
{"project_uid":"P2","job_uid":"J3","lane":"default"}
{ code: 500,
  data: null,
  message: 'OtherError: [Errno 2] No such file or directory: \'/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw\': \'/home/singhpk/software/cryosparc/cryosparc2_worker/bin/cryosparcw\'',
  name: 'OtherError' }

Hi @prash, if the new worker installation is now at cryosparc_worker, you just have to re-register its path with the cryosparcw connect command, using the --update flag (general form sketched below).

As per this post:
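In general the command looks something like this sketch, run from the new cryosparc_worker directory (the hostnames and base port are placeholders; use the values from your own config.sh):

# Run from the new cryosparc_worker directory.
# <worker_hostname>, <master_hostname>, and the port are placeholders for your setup.
./bin/cryosparcw connect --worker <worker_hostname> --master <master_hostname> --port 39000 --update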

Hi @nfrasser,

Thank you for the quick response. This solution resolved the issue.

To summarize:

  1. I first had to run the worker install script, pointing it at my license and CUDA installation:

./install.sh --license <license_id> --cudapath /usr/local/cuda

  2. Then I ran the connect command with the --update flag, replacing localhost with the hostname of my machine as listed in the config.sh file in the cryosparc_master folder:

./bin/cryosparcw connect --worker localhost --master localhost --port 39000 --ssdpath /path/to/scratch/space/as/in/the/master/installation/ --update
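As a final sanity check (a general CryoSPARC check, not something from the posts above), listing the scheduler targets on the master should now show the worker registered under the new cryosparc_worker path:

# On the master node: list registered worker targets and
# confirm the worker entry now points at the new cryosparc_worker directory.
cryosparcm cli "get_scheduler_targets()"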
