Fresh installation - "Could not connect to command"

Hi everyone,

I’m trying to get a fresh installation of cryoSPARC 2.15 set up on my cluster, and I’m failing - can anyone suggest where I could look next?

Many thanks!

Andrew

Symptom:

Installation of the cryosparc2_master and cryosparc2_worker packages went OK, but when I log in to the GUI, most actions are greyed out - I presume this is because I haven’t yet accepted the license agreement. When I click on “Accept” I get a notification saying “Cannot connect to command”.

Environment:

  • Both the master and worker nodes are running CentOS 7.6
    • (The firewall is currently turned off)
  • The worker nodes have CUDA 10.1 installed
  • The filesystems containing the installation and the database are NFS mounted on all nodes from the cluster’s file-server.
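Since "Cannot connect to command" usually means the browser/webapp cannot reach the command_core service, a quick reachability check of the cryoSPARC ports from another node may help narrow things down. This is only a sketch (it uses bash's /dev/tcp redirection, so it needs bash, not plain sh); the hostname is taken from my config below and the ports follow from BASE_PORT=39000:

```shell
# Probe the cryoSPARC service ports (BASE_PORT=39000, so the database
# is on 39001 and command_core on 39002). Hostname is an assumption;
# substitute your CRYOSPARC_MASTER_HOSTNAME.
MASTER=lambda.servers.mrc-mbu.cam.ac.uk
for PORT in 39000 39001 39002; do
  if timeout 3 bash -c "exec 3<>/dev/tcp/$MASTER/$PORT" 2>/dev/null; then
    echo "$MASTER:$PORT reachable"
  else
    echo "$MASTER:$PORT NOT reachable"
  fi
done
```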

cryoSPARC config:

$ cryosparcm status

CryoSPARC System master node installed at
/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master
Current cryoSPARC version: v2.15.0

cryosparcm process status:

app STOPPED Not started
app_dev STOPPED Not started
command_core RUNNING pid 40836, uptime 0:13:23
command_proxy RUNNING pid 40866, uptime 0:13:20
command_rtp STOPPED Not started
command_vis RUNNING pid 40859, uptime 0:13:21
database RUNNING pid 40387, uptime 0:13:25
watchdog_dev STOPPED Not started
webapp RUNNING pid 40870, uptime 0:13:18
webapp_dev STOPPED Not started


global config variables:

export CRYOSPARC_LICENSE_ID="***********"
export CRYOSPARC_MASTER_HOSTNAME="lambda.servers.mrc-mbu.cam.ac.uk"
#export CRYOSPARC_MASTER_HOSTNAME="lambda.maas"
#export CRYOSPARC_HOSTNAME_CHECK="lambda.servers.mrc-mbu.cam.ac.uk"
export CRYOSPARC_FORCE_HOSTNAME=true
export CRYOSPARC_DB_PATH="/usr/mbu/cryosparc/database2.15"
export CRYOSPARC_BASE_PORT=39000
export CRYOSPARC_DEVELOP=false
export CRYOSPARC_INSECURE=false
export CRYOSPARC_CLICK_WRAP=true
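Since a malformed config.sh can also produce connection failures, it may be worth scanning the file for typographic ("smart") quotes, which sneak in when values are copied from web pages or word processors and which the shell treats as literal characters rather than as quoting. A sketch using a hypothetical demo file; run the same grep on the real config.sh:

```shell
# Write a demo config line containing curly quotes (U+201C/U+201D) so
# the scan has something to find; point the grep at your real config.sh.
printf 'export CRYOSPARC_MASTER_HOSTNAME=\xe2\x80\x9clambda.maas\xe2\x80\x9d\n' > /tmp/config_demo.sh

# -F matches the quote characters as fixed strings (locale-robust),
# -n prints the offending line numbers.
grep -nF -e '“' -e '”' -e '‘' -e '’' /tmp/config_demo.sh \
  || echo "no smart quotes found"
```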

$ cat cryosparc2_worker/config.sh

export CRYOSPARC_LICENSE_ID="**************"
export CRYOSPARC_USE_GPU=true
export CRYOSPARC_CUDA_PATH="/usr/local/cuda-10.1"
export CRYOSPARC_DEVELOP=false

cryoSPARC log files (I've tried to show only messages after my most recent start, at 15:02:21 today):

Database log:

2020-10-23 15:02:21,260 INFO RPC interface 'supervisor' initialized
2020-10-23 15:02:21,260 CRIT Server 'unix_http_server' running without any HTTP authentication checking
2020-10-23 15:02:21,261 INFO daemonizing the supervisord process
2020-10-23 15:02:21,268 INFO supervisord started with pid 40385
2020-10-23 15:02:21,491 INFO spawned: 'database' with pid 40387
2020-10-23 15:02:22,718 INFO success: database entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:23,654 INFO spawned: 'command_core' with pid 40836
2020-10-23 15:02:25,267 INFO success: command_core entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:25,560 INFO spawned: 'command_vis' with pid 40859
2020-10-23 15:02:26,561 INFO success: command_vis entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:26,812 INFO spawned: 'command_proxy' with pid 40866
2020-10-23 15:02:28,133 INFO success: command_proxy entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:28,371 INFO spawned: 'webapp' with pid 40870
2020-10-23 15:02:29,432 INFO success: webapp entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)

Webapp log:

$ cryosparcm log webapp
"key": "licenseAccepted",
"value": true
},
"id": "3v4ftHJH967RrLSz6"
},
"json": true
}
{
"method": "POST",
"uri": "http://lambda.maas:39002/api",
"headers": {
"CRYOSPARC-USER": "5f91c87ed1c4c8ca207b020c"
},
"body": {
"jsonrpc": "2.0",
"method": "set_user_state_var",
"params": {
"user_id": "5f91c87ed1c4c8ca207b020c",
"key": "licenseAccepted",
"value": true
},
"id": "pWNTNoqNSXtmQKCw2"
},
"json": true
}
[... the same set_user_state_var POST to http://lambda.maas:39002/api repeats four more times, with ids "R3NcZJ79F5pQJyiwi", "MQzwQ2uNoAZEXxhGX", "kgRJLHgZ3yjTGLbBA" and "8LqJdKqGXRn3qhFGi" ...]
cryoSPARC v2
(node:40870) DeprecationWarning: current Server Discovery and Monitoring engine is deprecated, and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option { useUnifiedTopology: true } to the MongoClient constructor.
Ready to serve GridFS
==== [projects] project query user 5f91c87ed1c4c8ca207b020c IT Support true
==== [workspace] project query user 5f91c87ed1c4c8ca207b020c IT Support true
==== [jobs] project query user 5f91c87ed1c4c8ca207b020c IT Support true
==== [projects] project query user 5f91c87ed1c4c8ca207b020c IT Support true
==== [projects] project query user 5f91c87ed1c4c8ca207b020c IT Support true
==== [workspace] project query user 5f91c87ed1c4c8ca207b020c IT Support true
==== [jobs] project query user 5f91c87ed1c4c8ca207b020c IT Support true
==== [projects] project query user 5f91c87ed1c4c8ca207b020c IT Support true
utility.getLatestVersion ERROR: { RequestError: Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
at new RequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/errors.js:14:15)
at Request.plumbing.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:87:29)
at Request.RP$callback [as _callback] (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:46:31)
at self.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:185:22)
at emitOne (events.js:116:13)
at Request.emit (events.js:211:7)
at Request.onRequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:881:8)
at emitOne (events.js:116:13)
at ClientRequest.emit (events.js:211:7)
at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:179:21)
at Object.onceWrapper (events.js:315:30)
at emitOne (events.js:116:13)
at ClientRequest.emit (events.js:211:7)
at Socket.socketErrorListener (_http_client.js:387:9)
at emitOne (events.js:116:13)
at Socket.emit (events.js:211:7)
at emitErrorNT (internal/streams/destroy.js:64:8)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickDomainCallback (internal/process/next_tick.js:218:9)
=> awaited here:
at Function.Promise.await (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)
at Promise.asyncApply (imports/api/Utility/server/methods.js:24:15)
at /nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/fiber_pool.js:43:40
name: 'RequestError',
message: 'Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)',
cause: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
at Object.onceWrapper (events.js:315:30)
at emitOne (events.js:116:13)
at ClientRequest.emit (events.js:211:7)
at Socket.socketErrorListener (_http_client.js:387:9)
at emitOne (events.js:116:13)
at Socket.emit (events.js:211:7)
at emitErrorNT (internal/streams/destroy.js:64:8)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
error: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
at Object.onceWrapper (events.js:315:30)
at emitOne (events.js:116:13)
at ClientRequest.emit (events.js:211:7)
at Socket.socketErrorListener (_http_client.js:387:9)
at emitOne (events.js:116:13)
at Socket.emit (events.js:211:7)
at emitErrorNT (internal/streams/destroy.js:64:8)
at _combinedTickCallback (internal/process/next_tick.js:138:11)
at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
options:
{ uri: 'https://get.cryosparc.com/get_update_tag/xxxxxxxxx/v2.15.0',
callback: [Function: RP$callback],
transform: undefined,
simple: true,
resolveWithFullResponse: false,
transform2xxOnly: false },
response: undefined }
{
"method": "POST",
"uri": "http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api",
"headers": {
"CRYOSPARC-USER": "5f91c87ed1c4c8ca207b020c"
},
"body": {
"jsonrpc": "2.0",
"method": "set_user_state_var",
"params": {
"user_id": "5f91c87ed1c4c8ca207b020c",
"key": "licenseAccepted",
"value": true
},
"id": "hdeD2K4uZXRE6cZaz"
},
"json": true
}
{
"method": "POST",
"uri": "http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api",
"headers": {
"CRYOSPARC-USER": "5f91c87ed1c4c8ca207b020c"
},
"body": {
"jsonrpc": "2.0",
"method": "set_user_state_var",
"params": {
"user_id": "5f91c87ed1c4c8ca207b020c",
"key": "licenseAccepted",
"value": true
},
"id": "5owJ9wwoFtdWiaNeL"
},
"json": true
}
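One aside on the utility.getLatestVersion failure above: "tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80" looks like an outbound proxy problem rather than a cryoSPARC one, since 0.0.12.56 is the kind of nonsense address that results when a proxy environment variable is malformed. A quick, generic check of the environment the master was started from (nothing here is cryoSPARC-specific):

```shell
# List any proxy-related environment variables; a malformed value such
# as "http_proxy=3128" (port with no host) can yield addresses like
# 0.0.12.56 when the proxy URL is parsed.
env | grep -i proxy || echo "no proxy variables set"

# If one is set, confirm it has the host:port form, e.g.:
# export https_proxy=http://proxy.example.org:3128   # hypothetical
```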

command_vis log:

2020-10-23 15:02:29,120 VIS.MAIN INFO === STARTED ===
*** client.py: command (http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api) did not reply within timeout of 300 seconds, attempt 1 of 3
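The command_vis timeout above involves the same endpoint as the "Cannot connect to command" notification: command_core on port 39002. A minimal HTTP probe of that URL (taken from the log line) may distinguish "port closed" from "service answering slowly"; "000" from curl means no connection was made at all, while any other status means the port answered:

```shell
# Probe command_core's HTTP endpoint; curl prints "000" when it could
# not connect at all, and a real HTTP status code otherwise.
URL=http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api
CODE=$(curl -s -o /dev/null -w '%{http_code}' --max-time 5 "$URL" || true)
echo "HTTP status: ${CODE:-000}"
```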

webapp log:

2020-10-23 15:02:21,260 INFO RPC interface 'supervisor' initialized
2020-10-23 15:02:21,260 CRIT Server 'unix_http_server' running without any HTTP authentication checking
2020-10-23 15:02:21,261 INFO daemonizing the supervisord process
2020-10-23 15:02:21,268 INFO supervisord started with pid 40385
2020-10-23 15:02:21,491 INFO spawned: 'database' with pid 40387
2020-10-23 15:02:22,718 INFO success: database entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:23,654 INFO spawned: 'command_core' with pid 40836
2020-10-23 15:02:25,267 INFO success: command_core entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:25,560 INFO spawned: 'command_vis' with pid 40859
2020-10-23 15:02:26,561 INFO success: command_vis entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:26,812 INFO spawned: 'command_proxy' with pid 40866
2020-10-23 15:02:28,133 INFO success: command_proxy entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:28,371 INFO spawned: 'webapp' with pid 40870
2020-10-23 15:02:29,432 INFO success: webapp entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)

command_proxy log:

(34065) wsgi starting up on http://0.0.0.0:39004
(46370) wsgi starting up on http://0.0.0.0:39004
(49466) wsgi starting up on http://0.0.0.0:39004
(51779) wsgi starting up on http://0.0.0.0:39004
(8390) wsgi starting up on http://0.0.0.0:39004
(12250) wsgi starting up on http://0.0.0.0:39004
(13572) wsgi starting up on http://0.0.0.0:39004
(16192) wsgi starting up on http://0.0.0.0:39004
(16192) accepted ('192.168.101.2', 54869)
(1423) wsgi starting up on http://0.0.0.0:39004
(961) wsgi starting up on http://0.0.0.0:39004
(14487) wsgi starting up on http://0.0.0.0:39004
(39108) wsgi starting up on http://0.0.0.0:39004
(40866) wsgi starting up on http://0.0.0.0:39004

command_core log:

COMMAND CORE STARTED === 2020-10-23 15:02:24.265767 ==========================
*** BG WORKER START

supervisord log:

2020-10-23 15:02:21,260 INFO RPC interface 'supervisor' initialized
2020-10-23 15:02:21,260 CRIT Server 'unix_http_server' running without any HTTP authentication checking
2020-10-23 15:02:21,261 INFO daemonizing the supervisord process
2020-10-23 15:02:21,268 INFO supervisord started with pid 40385
2020-10-23 15:02:21,491 INFO spawned: 'database' with pid 40387
2020-10-23 15:02:22,718 INFO success: database entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:23,654 INFO spawned: 'command_core' with pid 40836
2020-10-23 15:02:25,267 INFO success: command_core entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:25,560 INFO spawned: 'command_vis' with pid 40859
2020-10-23 15:02:26,561 INFO success: command_vis entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:26,812 INFO spawned: 'command_proxy' with pid 40866
2020-10-23 15:02:28,133 INFO success: command_proxy entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2020-10-23 15:02:28,371 INFO spawned: 'webapp' with pid 40870
2020-10-23 15:02:29,432 INFO success: webapp entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)

I should perhaps add that this was my attempt to get things working after our existing installation of v2.9 stopped working. Unfortunately my notes are incomplete, so I can’t tell you what the symptoms of that were. Same computers though, and no OS updates.

Andrew

Hi @arcr1,

Can you restart cryoSPARC? (cryosparcm restart)

Then, could you paste the outputs of the following (run these immediately after restarting)?
cryosparcm log command_core
cryosparcm log webapp

Hi Stephan,

Thank you for getting back to me. The output (with licence code redacted) is below. Most of the log messages seem to be from before I turned off the CentOS 7 firewall daemon, and got the configuration right (I think!) for our cluster. I’ve marked where I think this most recent restart occurs in the two logs.

Best regards,

Andrew

$ cryosparcm restart
CryoSPARC is running.
Stopping cryosparc 
command_proxy: stopped
command_vis: stopped
webapp: stopped
command_core: stopped
database: stopped
Shut down
Starting cryoSPARC System master process..
CryoSPARC is not already running.
database: started
command_core: started
  cryosparc command core startup complete.
command_vis: started
command_proxy: started
webapp: started
-----------------------------------------------------

CryoSPARC master started. 
 From this machine, access cryoSPARC at
    http://localhost:39000

 From other machines on the network, access cryoSPARC at
    http://lambda.maas:39000


Startup can take several minutes. Point your browser to the address
and refresh until you see the cryoSPARC web interface.
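One thing that stands out in the output above: the restart banner advertises lambda.maas:39000, while the config exports lambda.servers.mrc-mbu.cam.ac.uk, and the webapp log shows POSTs to both names on port 39002. It may be worth confirming that both names resolve (ideally to the same machine) from the master, the workers, and the machine running the browser. A sketch:

```shell
# Compare what the two hostnames resolve to (names taken from the
# config and logs above); getent consults the same NSS sources the
# services themselves use.
for H in lambda.maas lambda.servers.mrc-mbu.cam.ac.uk; do
  getent hosts "$H" || echo "$H does not resolve from here"
done
```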


$ cryosparcm log command_core

  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1036, in _refresh
    self.__collation))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 928, in __send_message
    helpers._check_command_response(doc['data'][0])
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/helpers.py", line 210, in _check_command_response
    raise OperationFailure(msg % errmsg, code, response)
OperationFailure: node is not in primary or recovering state
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 194, in background_worker
    check_heartbeats()
  File "cryosparc2_command/command_core/__init__.py", line 1889, in check_heartbeats
    'heartbeat_at' : {'$lt' : deadline} }, {'project_uid' : 1, 'uid' : 1}))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1114, in next
    if len(self.__data) or self._refresh():
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1036, in _refresh
    self.__collation))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 928, in __send_message
    helpers._check_command_response(doc['data'][0])
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/helpers.py", line 210, in _check_command_response
    raise OperationFailure(msg % errmsg, code, response)
OperationFailure: node is not in primary or recovering state
****** Scheduler Failed **** 
****** Heartbeat check failed ****
[JSONRPC ERROR  2020-10-22 18:28:53.180492  at  get_num_active_licenses ]
-----------------------------------------------------
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 115, in wrapper
    res = func(*args, **kwargs)
  File "cryosparc2_command/command_core/__init__.py", line 1520, in get_num_active_licenses
    for j in jobs_running:
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1114, in next
    if len(self.__data) or self._refresh():
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1036, in _refresh
    self.__collation))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 928, in __send_message
    helpers._check_command_response(doc['data'][0])
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/helpers.py", line 210, in _check_command_response
    raise OperationFailure(msg % errmsg, code, response)
OperationFailure: node is not in primary or recovering state
-----------------------------------------------------
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 200, in background_worker
    concurrent_job_monitor()
  File "cryosparc2_command/command_core/__init__.py", line 1527, in concurrent_job_monitor
    current_concurrent_licenses_deque.append(get_num_active_licenses())
  File "cryosparc2_command/command_core/__init__.py", line 124, in wrapper
    raise e
OperationFailure: node is not in primary or recovering state
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 205, in background_worker
    heartbeat_manager()
  File "cryosparc2_command/command_core/__init__.py", line 1571, in heartbeat_manager
    active_jobs = get_active_licenses()
  File "cryosparc2_command/command_core/__init__.py", line 1536, in get_active_licenses
    for j in jobs_running:
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1114, in next
    if len(self.__data) or self._refresh():
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1036, in _refresh
    self.__collation))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 928, in __send_message
    helpers._check_command_response(doc['data'][0])
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/helpers.py", line 210, in _check_command_response
    raise OperationFailure(msg % errmsg, code, response)
OperationFailure: node is not in primary or recovering state
****** Concurrent job monitor failed ****
****** Instance heartbeat failed ****
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 189, in background_worker
    scheduler_run_core() # sets last run time
  File "cryosparc2_command/command_core/__init__.py", line 1650, in scheduler_run_core
    prune_scheduler_queued_jobs()
  File "cryosparc2_command/command_core/__init__.py", line 1633, in prune_scheduler_queued_jobs
    all_queued_jobs = list(mongo.db['sched_queued'].find({}, {'queued_job_hash' : 1, 'last_scheduled_at' : 1}))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1114, in next
    if len(self.__data) or self._refresh():
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1036, in _refresh
    self.__collation))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 928, in __send_message
    helpers._check_command_response(doc['data'][0])
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/helpers.py", line 210, in _check_command_response
    raise OperationFailure(msg % errmsg, code, response)
OperationFailure: node is not in primary or recovering state
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 194, in background_worker
    check_heartbeats()
  File "cryosparc2_command/command_core/__init__.py", line 1889, in check_heartbeats
    'heartbeat_at' : {'$lt' : deadline} }, {'project_uid' : 1, 'uid' : 1}))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1114, in next
    if len(self.__data) or self._refresh():
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1036, in _refresh
    self.__collation))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 928, in __send_message
    helpers._check_command_response(doc['data'][0])
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/helpers.py", line 210, in _check_command_response
    raise OperationFailure(msg % errmsg, code, response)
OperationFailure: node is not in primary or recovering state
****** Scheduler Failed **** 
****** Heartbeat check failed ****
[JSONRPC ERROR  2020-10-22 18:28:54.183922  at  get_num_active_licenses ]
-----------------------------------------------------
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 115, in wrapper
    res = func(*args, **kwargs)
  File "cryosparc2_command/command_core/__init__.py", line 1520, in get_num_active_licenses
    for j in jobs_running:
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1114, in next
    if len(self.__data) or self._refresh():
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1036, in _refresh
    self.__collation))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 928, in __send_message
    helpers._check_command_response(doc['data'][0])
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/helpers.py", line 210, in _check_command_response
    raise OperationFailure(msg % errmsg, code, response)
OperationFailure: node is not in primary or recovering state
-----------------------------------------------------
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 200, in background_worker
    concurrent_job_monitor()
  File "cryosparc2_command/command_core/__init__.py", line 1527, in concurrent_job_monitor
    current_concurrent_licenses_deque.append(get_num_active_licenses())
  File "cryosparc2_command/command_core/__init__.py", line 124, in wrapper
    raise e
OperationFailure: node is not in primary or recovering state
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 205, in background_worker
    heartbeat_manager()
  File "cryosparc2_command/command_core/__init__.py", line 1571, in heartbeat_manager
    active_jobs = get_active_licenses()
  File "cryosparc2_command/command_core/__init__.py", line 1536, in get_active_licenses
    for j in jobs_running:
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1114, in next
    if len(self.__data) or self._refresh():
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1036, in _refresh
    self.__collation))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 928, in __send_message
    helpers._check_command_response(doc['data'][0])
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/helpers.py", line 210, in _check_command_response
    raise OperationFailure(msg % errmsg, code, response)
OperationFailure: node is not in primary or recovering state
****** Concurrent job monitor failed ****
****** Instance heartbeat failed ****
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 189, in background_worker
    scheduler_run_core() # sets last run time
  File "cryosparc2_command/command_core/__init__.py", line 1650, in scheduler_run_core
    prune_scheduler_queued_jobs()
  File "cryosparc2_command/command_core/__init__.py", line 1633, in prune_scheduler_queued_jobs
    all_queued_jobs = list(mongo.db['sched_queued'].find({}, {'queued_job_hash' : 1, 'last_scheduled_at' : 1}))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1114, in next
    if len(self.__data) or self._refresh():
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 1036, in _refresh
    self.__collation))
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/cursor.py", line 873, in __send_message
    **kwargs)
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/mongo_client.py", line 905, in _send_message_with_response
    exhaust)
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/mongo_client.py", line 916, in _reset_on_error
    return func(*args, **kwargs)
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/server.py", line 99, in send_message_with_response
    with self.get_socket(all_credentials, exhaust) as sock_info:
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/server.py", line 168, in get_socket
    with self.pool.get_socket(all_credentials, checkout) as sock_info:
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/pool.py", line 790, in get_socket
    sock_info = self._get_socket_no_auth()
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/pool.py", line 836, in _get_socket_no_auth
    sock_info = self._check(sock_info)
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/pool.py", line 890, in _check
    return self.connect()
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/pool.py", line 763, in connect
    _raise_connection_failure(self.address, error)
  File "/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/deps/anaconda/lib/python2.7/site-packages/pymongo/pool.py", line 211, in _raise_connection_failure
    raise AutoReconnect(msg)
AutoReconnect: lambda.servers.mrc-mbu.cam.ac.uk:39001: [Errno 111] Connection refused
COMMAND CORE STARTED ===  2020-10-22 18:30:19.466397  ==========================
*** BG WORKER START
COMMAND CORE STARTED ===  2020-10-23 11:32:09.389530  ==========================
*** BG WORKER START
[JSONRPC ERROR  2020-10-23 13:31:07.841323  at  get_scheduler_target_cluster_info ]
-----------------------------------------------------
Traceback (most recent call last):
  File "cryosparc2_command/command_core/__init__.py", line 115, in wrapper
    res = func(*args, **kwargs)
  File "cryosparc2_command/command_core/__init__.py", line 1226, in get_scheduler_target_cluster_info
    assert clane is not None, "Cluster %s does not exist" % name
AssertionError: Cluster MBU does not exist
-----------------------------------------------------
COMMAND CORE STARTED ===  2020-10-23 13:36:03.331435  ==========================
*** BG WORKER START
COMMAND CORE STARTED ===  2020-10-23 14:05:17.046581  ==========================
*** BG WORKER START
COMMAND CORE STARTED ===  2020-10-23 14:59:29.292069  ==========================
*** BG WORKER START
COMMAND CORE STARTED ===  2020-10-23 15:02:24.265767  ==========================

--------------------------RESTART-HERE????----------------------------

*** BG WORKER START
COMMAND CORE STARTED ===  2020-10-27 21:16:12.187459  ==========================
*** BG WORKER START
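Regarding the repeated "AutoReconnect: lambda.servers.mrc-mbu.cam.ac.uk:39001: [Errno 111] Connection refused" entries earlier in this log: port 39001 (BASE_PORT+1) is cryoSPARC's MongoDB database, so those failures mean command_core could not reach the database at the time. After a restart it may be worth confirming something is actually listening there (a sketch; on older systems ss may need replacing with netstat -ltn):

```shell
# Show TCP listeners on the cryoSPARC database port (BASE_PORT+1).
ss -ltn 2>/dev/null | grep -w 39001 || echo "nothing listening on 39001"

# The database's own log is often more informative about why it is
# down or stuck recovering:
# cryosparcm log database
```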

$ cryosparcm log webapp
     transform: undefined,
     simple: true,
     resolveWithFullResponse: false,
     transform2xxOnly: false },
  response: undefined }
==== [projects] project query user  *** *** *** *** IT Support true
==== [workspace] project query user  *** *** *** *** IT Support true
==== [jobs] project query user  *** *** *** *** IT Support true
==== [projects] project query user  *** *** *** *** IT Support true
utility.getLatestVersion ERROR: { RequestError: Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at new RequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/errors.js:14:15)
    at Request.plumbing.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:87:29)
    at Request.RP$callback [as _callback] (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:46:31)
    at self.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:185:22)
    at emitOne (events.js:116:13)
    at Request.emit (events.js:211:7)
    at Request.onRequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:881:8)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:179:21)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9)
 => awaited here:
    at Function.Promise.await (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)
    at Promise.asyncApply (imports/api/Utility/server/methods.js:24:15)
    at /nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/fiber_pool.js:43:40
  name: 'RequestError',
  message: 'Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)',
  cause: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
  error: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
  options: 
   { uri: 'https://get.cryosparc.com/get_update_tag/xxxxxxxxxxxxxxxx/v2.15.0',
     callback: [Function: RP$callback],
     transform: undefined,
     simple: true,
     resolveWithFullResponse: false,
     transform2xxOnly: false },
  response: undefined }
{
  "method": "POST",
  "uri": "http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api",
  "headers": {
    "CRYOSPARC-USER": "*** *** *** ***"
  },
  "body": {
    "jsonrpc": "2.0",
    "method": "set_user_state_var",
    "params": {
      "user_id": "*** *** *** ***",
      "key": "licenseAccepted",
      "value": true
    },
    "id": "qsoDNoJT3h9d7QNCf"
  },
  "json": true
}
{
  "method": "POST",
  "uri": "http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api",
  "headers": {
    "CRYOSPARC-USER": "*** *** *** ***"
  },
  "body": {
    "jsonrpc": "2.0",
    "method": "set_user_state_var",
    "params": {
      "user_id": "*** *** *** ***",
      "key": "licenseAccepted",
      "value": true
    },
    "id": "ZZXsnBAwraMWhJNZu"
  },
  "json": true
}
==== [projects] project query user  *** *** *** *** IT Support true
==== [workspace] project query user  *** *** *** *** IT Support true
==== [jobs] project query user  *** *** *** *** IT Support true
==== [projects] project query user  *** *** *** *** IT Support true
utility.getLatestVersion ERROR: { RequestError: Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at new RequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/errors.js:14:15)
    at Request.plumbing.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:87:29)
    at Request.RP$callback [as _callback] (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:46:31)
    at self.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:185:22)
    at emitOne (events.js:116:13)
    at Request.emit (events.js:211:7)
    at Request.onRequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:881:8)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:179:21)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9)
 => awaited here:
    at Function.Promise.await (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)
    at Promise.asyncApply (imports/api/Utility/server/methods.js:24:15)
    at /nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/fiber_pool.js:43:40
  name: 'RequestError',
  message: 'Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)',
  cause: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
  error: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
  options: 
   { uri: 'https://get.cryosparc.com/get_update_tag/xxxxxxxxxxxxxx/v2.15.0',
     callback: [Function: RP$callback],
     transform: undefined,
     simple: true,
     resolveWithFullResponse: false,
     transform2xxOnly: false },
  response: undefined }
{
  "method": "POST",
  "uri": "http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api",
  "headers": {
    "CRYOSPARC-USER": "*** *** *** ***"
  },
  "body": {
    "jsonrpc": "2.0",
    "method": "set_user_state_var",
    "params": {
      "user_id": "*** *** *** ***",
      "key": "licenseAccepted",
      "value": true
    },
    "id": "QnTGMCYRaDWG8xqzN"
  },
  "json": true
}
{
  "method": "POST",
  "uri": "http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api",
  "headers": {
    "CRYOSPARC-USER": "*** *** *** ***"
  },
  "body": {
    "jsonrpc": "2.0",
    "method": "set_user_state_var",
    "params": {
      "user_id": "*** *** *** ***",
      "key": "licenseAccepted",
      "value": true
    },
    "id": "bKunk3AzQxtvNy4R4"
  },
  "json": true
}

-------------------------RESTART HERE?-----------------------
cryoSPARC v2
(node:46392) DeprecationWarning: current Server Discovery and Monitoring engine is deprecated, and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option { useUnifiedTopology: true } to the MongoClient constructor.
Ready to serve GridFS

Hi @arcr1,

After restarting, are you still noticing the same issues you experienced before?

Hi Stephan,

Yes, I still see the “Cannot connect to command” message.

I’d left the “cryosparcm log webapp” running, and I saw the following messages as I logged in to the GUI:

utility.getLatestVersion ERROR: { RequestError: Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at new RequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/errors.js:14:15)
    at Request.plumbing.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:87:29)
    at Request.RP$callback [as _callback] (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:46:31)
    at self.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:185:22)
    at emitOne (events.js:116:13)
    at Request.emit (events.js:211:7)
    at Request.onRequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:881:8)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:179:21)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9)
 => awaited here:
    at Function.Promise.await (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)
    at Promise.asyncApply (imports/api/Utility/server/methods.js:24:15)
    at /nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/fiber_pool.js:43:40
  name: 'RequestError',
  message: 'Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)',
  cause: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
  error: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
  options: 
   { uri: 'https://get.cryosparc.com/get_update_tag/xxxxxxxxxx/v2.15.0',
     callback: [Function: RP$callback],
     transform: undefined,
     simple: true,
     resolveWithFullResponse: false,
     transform2xxOnly: false },
  response: undefined }
==== [projects] project query user  5f91c87ed1c4c8ca207b020c IT Support true
==== [workspace] project query user  5f91c87ed1c4c8ca207b020c IT Support true
==== [jobs] project query user  5f91c87ed1c4c8ca207b020c IT Support true
==== [projects] project query user  5f91c87ed1c4c8ca207b020c IT Support true
utility.getLatestVersion ERROR: { RequestError: Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at new RequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/errors.js:14:15)
    at Request.plumbing.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:87:29)
    at Request.RP$callback [as _callback] (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:46:31)
    at self.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:185:22)
    at emitOne (events.js:116:13)
    at Request.emit (events.js:211:7)
    at Request.onRequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:881:8)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:179:21)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9)
 => awaited here:
    at Function.Promise.await (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)
    at Promise.asyncApply (imports/api/Utility/server/methods.js:24:15)
    at /nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/fiber_pool.js:43:40
  name: 'RequestError',
  message: 'Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)',
  cause: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
  error: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
  options: 
   { uri: 'https://get.cryosparc.com/get_update_tag/xxxxxxxxxxxxxxxx/v2.15.0',
     callback: [Function: RP$callback],
     transform: undefined,
     simple: true,
     resolveWithFullResponse: false,
     transform2xxOnly: false },
  response: undefined }

and when I clicked on “Accept” I saw:

{
  "method": "POST",
  "uri": "http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api",
  "headers": {
    "CRYOSPARC-USER": "**** **** **** ****"
  },
  "body": {
    "jsonrpc": "2.0",
    "method": "set_user_state_var",
    "params": {
      "user_id": "5f91c87ed1c4c8ca207b020c",
      "key": "licenseAccepted",
      "value": true
    },
    "id": "aY3kzumoxETqSgNZK"
  },
  "json": true
}

Best regards,

Andrew

Hi @arcr1,

This is a strange error message. I know you said you turned off the firewall, but can you ensure that the ports required for cryoSPARC (a 10-port range starting from the base port, e.g. 39000-39010) are all open and accessible from within the master node itself?

Hi Stephan, sorry to puzzle you!

I believe that the ports are accessible:

lambda # firewall-cmd --zone=trusted --list-all

FirewallD is not running

and “nmap” reports that they are accessible (though nothing is listening on 39005-39010):

% nmap -p39000-39010 lambda
Starting Nmap 7.80 ( https://nmap.org ) at 2020-10-28 15:51 GMT
Nmap scan report for lambda (192.168.30.243)
Host is up (0.043s latency).

PORT      STATE  SERVICE
39000/tcp open   unknown
39001/tcp open   unknown
39002/tcp open   unknown
39003/tcp open   unknown
39004/tcp open   unknown
39005/tcp closed unknown
39006/tcp closed unknown
39007/tcp closed unknown
39008/tcp closed unknown
39009/tcp closed unknown
39010/tcp closed unknown

Nmap done: 1 IP address (1 host up) scanned in 0.19 seconds

I know this is actually checking that the ports are accessible from my laptop, rather than lambda itself, so additionally:

lambda # nmap -p39000-39010 localhost

Starting Nmap 6.40 ( http://nmap.org ) at 2020-10-28 16:29 UTC
Nmap scan report for localhost (127.0.0.1)
Host is up (0.000013s latency).
Other addresses for localhost (not scanned): 127.0.0.1
PORT      STATE  SERVICE
39000/tcp open   unknown
39001/tcp open   unknown
39002/tcp open   unknown
39003/tcp open   unknown
39004/tcp open   unknown
39005/tcp closed unknown
39006/tcp closed unknown
39007/tcp closed unknown
39008/tcp closed unknown
39009/tcp closed unknown
39010/tcp closed unknown
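
As a cross-check that doesn’t depend on nmap, the same port sweep can be run directly on lambda with a few lines of Python. This is only a sketch: it tests whether a TCP connect succeeds, not which process owns the socket, and assumes the default base port of 39000.

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Sweep the cryoSPARC port range (base port 39000, as in config.sh).
for port in range(39000, 39011):
    print(port, "open" if port_open("localhost", port) else "closed")
```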

Best regards,

Andrew

Hi again Stephan,

You may have found this already, but the code that raises the “Could not connect to command” message appears to be line 2786 of the source file:

cryosparc2_master/cryosparc2_webapp/bundle/programs/server/app/app.js

which looks like:

try {
  response = Promise.await(request(options));
} catch (error) {
  throw new Meteor.Error('command-req-error', 'Could not connect to command.');
}
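
Since that catch block swallows the underlying error, it might help to replay the same JSON-RPC POST by hand and see the real failure. Here’s a short Python sketch; the method name and body are copied from the webapp log above, and the host/port would be CRYOSPARC_MASTER_HOSTNAME and base port + 2 (39002 here):

```python
import json
import urllib.request

def jsonrpc_post(uri, method, params):
    """POST a JSON-RPC 2.0 request (as the webapp does) and return the reply."""
    payload = json.dumps(
        {"jsonrpc": "2.0", "method": method, "params": params, "id": "manual-test"}
    ).encode()
    req = urllib.request.Request(
        uri, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read())

# Mirroring the request from the webapp log (would be run on lambda):
# jsonrpc_post("http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api",
#              "set_user_state_var",
#              {"user_id": "<id>", "key": "licenseAccepted", "value": True})
```

If this raises a connection error, the problem is between the webapp and command_vis rather than in the browser.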

Turning on my browser’s JavaScript Console, I see the message:

JavaScript_Error

at the time I click the “Accept” button.

Best regards,

Andrew

I don’t want to overload this thread with information, but I left my browser’s JavaScript console open since I last posted and, without my having clicked on anything in the cryoSPARC GUI, I have seen the following error messages:

JavaScript_Error2

I’ve checked, and I can “curl” all the URLs successfully from both my laptop and lambda.
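
One other oddity that may be worth a look: the “tunneling socket” errors above all target 0.0.12.56:80, which looks less like a real address than a mangled number. Read as a 32-bit integer, that dotted quad is 3128, the default Squid proxy port, so (purely a guess on my part) a proxy setting such as an http_proxy variable may be getting mis-parsed as an IP address somewhere:

```python
import socket
import struct

# Decode the suspicious address from the tunneling errors: the four bytes
# of 0.0.12.56, read as one big-endian integer, come out to 3128.
addr = "0.0.12.56"
value = struct.unpack("!I", socket.inet_aton(addr))[0]
print(addr, "as an integer is", value)  # 3128, the default Squid proxy port
```

If that is the cause, `env | grep -i proxy` on lambda should show which variable is involved.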

Many thanks again for anything you can suggest!

Best regards,

Andrew

Hi @arcr1,

Thanks for posting more information. You mentioned “lambda”; can you explain more about your setup and how you’re running cryoSPARC?
Also, can you make sure the hostname next to CRYOSPARC_MASTER_HOSTNAME in cryosparc2_master/config.sh is accessible from the master node itself?

Hi Stephan,

Yes, no problem…

Also, can you make sure the hostname next to CRYOSPARC_MASTER_HOSTNAME
in cryosparc2_master/config.sh is accessible from the master node itself?

Logged-in to lambda, I can ssh back to lambda. I can also see the 39000 - 39010 range are accessible:

lambda $ nmap -p39000-39010 lambda

Starting Nmap 6.40 ( http://nmap.org ) at 2020-10-29 16:45 UTC
Nmap scan report for lambda (192.168.30.243)
Host is up (0.00025s latency).
rDNS record for 192.168.30.243: lambda.servers.mrc-mbu.cam.ac.uk
PORT      STATE  SERVICE
39000/tcp open   unknown
39001/tcp open   unknown
39002/tcp open   unknown
39003/tcp open   unknown
39004/tcp open   unknown
39005/tcp closed unknown
39006/tcp closed unknown
39007/tcp closed unknown
39008/tcp closed unknown
39009/tcp closed unknown
39010/tcp closed unknown

lambda is one of the nodes in our SLURM cluster - the one that we normally use for compilation and software installation, as it has the relevant development environments. It is a CPU-only box, two sockets with E5-2680 CPUs, running CentOS 7.6. It is also the SLURM primary control node. We run the cryoSPARC master process and web UI on lambda, as a single non-privileged user; it runs outside the SLURM queues itself.

The cryosparc2_master installation has config files telling it about the cluster, and how to submit jobs to a partition that is comprised of six other nodes, each of which has 4 x GPU cards.

This was all working fine with version 2.9, but the users noticed that it had stopped some weeks ago (I’m afraid I can’t now find my record of what the error messages were) and nothing I’ve done since has made it function again. Hence the attempt at a fresh install…

Best regards,

Andrew

Again, apologies for the excess of information, but here is a fresh transcript of the web app log from doing a “cryosparcm start” through to logging in and clicking the “Accept” button. I’ve annotated in-line where each event happened:

cryoSPARC v2
(node:36516) DeprecationWarning: current Server Discovery and Monitoring engine is deprecated, and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option { useUnifiedTopology: true } to the MongoClient constructor.
Ready to serve GridFS
==== [projects] project query user  5f91c87ed1c4c8ca207b020c IT Support true
==== [workspace] project query user  5f91c87ed1c4c8ca207b020c IT Support true
==== [jobs] project query user  5f91c87ed1c4c8ca207b020c IT Support true
==== [projects] project query user  5f91c87ed1c4c8ca207b020c IT Support true


[About to log in]   


==== [projects] project query user  5f91c87ed1c4c8ca207b020c IT Support true
==== [workspace] project query user  5f91c87ed1c4c8ca207b020c IT Support true
==== [jobs] project query user  5f91c87ed1c4c8ca207b020c IT Support true
==== [projects] project query user  5f91c87ed1c4c8ca207b020c IT Support true
utility.getLatestVersion ERROR: { RequestError: Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at new RequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/errors.js:14:15)
    at Request.plumbing.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:87:29)
    at Request.RP$callback [as _callback] (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request-promise-core/lib/plumbing.js:46:31)
    at self.callback (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:185:22)
    at emitOne (events.js:116:13)
    at Request.emit (events.js:211:7)
    at Request.onRequestError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/request/request.js:881:8)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:179:21)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9)
 => awaited here:
    at Function.Promise.await (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)
    at Promise.asyncApply (imports/api/Utility/server/methods.js:24:15)
    at /nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/fiber_pool.js:43:40
  name: 'RequestError',
  message: 'Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)',
  cause: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
  error: { Error: tunneling socket could not be established, cause=connect EINVAL 0.0.12.56:80 - Local (0.0.0.0:0)
    at ClientRequest.onError (/nfs4/suffolk/MBU/software/cryosparc/cryosparc2.15/cryosparc2_master/cryosparc2_webapp/bundle/programs/server/npm/node_modules/tunnel-agent/index.js:177:17)
    at Object.onceWrapper (events.js:315:30)
    at emitOne (events.js:116:13)
    at ClientRequest.emit (events.js:211:7)
    at Socket.socketErrorListener (_http_client.js:387:9)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at emitErrorNT (internal/streams/destroy.js:64:8)
    at _combinedTickCallback (internal/process/next_tick.js:138:11)
    at process._tickDomainCallback (internal/process/next_tick.js:218:9) code: 'ECONNRESET' },
  options: 
   { uri: 'https://get.cryosparc.com/get_update_tag/6c6bcf44-44da-11e9-89dc-7b219d5660c2/v2.15.0',
     callback: [Function: RP$callback],
     transform: undefined,
     simple: true,
     resolveWithFullResponse: false,
     transform2xxOnly: false },
  response: undefined }


[logged in.  About to click on "Accept"]


{
  "method": "POST",
  "uri": "http://lambda.servers.mrc-mbu.cam.ac.uk:39002/api",
  "headers": {
    "CRYOSPARC-USER": "5f91c87ed1c4c8ca207b020c"
  },
  "body": {
    "jsonrpc": "2.0",
    "method": "set_user_state_var",
    "params": {
      "user_id": "5f91c87ed1c4c8ca207b020c",
      "key": "licenseAccepted",
      "value": true
    },
    "id": "oAgD9BnWyQBjJb2Ww"
  },
  "json": true
}

Best regards,

Andrew