App_api FATAL unknown error making dispatchers for 'app_api': EACCES

Can’t get cryosparcm to start app_api.

CryoSPARC process status:

app                              RUNNING   pid 104228, uptime 2:25:31
app_api                          FATAL     unknown error making dispatchers for 'app_api': EACCES
app_api_dev                      STOPPED   Not started
command_core                     RUNNING   pid 104170, uptime 2:25:43
command_rtp                      RUNNING   pid 104204, uptime 2:25:35
command_vis                      RUNNING   pid 104193, uptime 2:25:37
database                         RUNNING   pid 104062, uptime 2:25:48

Worked fine last week, nothing’s changed that I know of. Only log info is:

Exception in setInterval callback: MongoServerSelectionError: connect ECONNREFUSED 127.0.0.1:39001

I rebooted the server, stopped and restarted cryosparcm, and deleted the .sock and tmp files… nothing helped.

Welcome to the forum @wmatthews. Please can you post the outputs of the following commands:

cryosparcm log app_api | tail -n 40
cryosparcm log database | tail -n 40
ps -eo user:12,pid,ppid,start,command | grep -e cryosparc_ -e mongo
ls -l /tmp/cryosparc*.sock /tmp/mongo*.sock
cryosparcm status | grep -e HOSTNAME -e DB_PATH
sudo ss -anp | grep 3900 | sed "s/\s\+/ /g"
<user>@rohpc02 ~ $ cryosparcm log app_api | tail -n 40
},
code: undefined,
[Symbol(errorLabels)]: Set(0) {}
}
Exception in setInterval callback: MongoServerSelectionError: connect ECONNREFUSED 127.0.0.1:39001
at Timeout._onTimeout (/biotools8/biotools/cryosparc/cryosparc_master/cryosparc_app/bundle/programs/server/npm/node_modules/meteor/npm-mongo/node_modules/mongodb/lib/sdam/topology.js:292:38)
at listOnTimeout (internal/timers.js:557:17)
at processTimers (internal/timers.js:500:7)
=> awaited here:
at Function.Promise.await (/biotools8/biotools/cryosparc/cryosparc_master/cryosparc_app/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)
at Cursor.count (packages/mongo/mongo_driver.js:949:18)
at packages/natestrauser_publish-performant-counts.js:38:29
at packages/meteor.js:365:18
at Meteor.EnvironmentVariable.EVp.withValue (packages/meteor.js:1389:31)
at packages/meteor.js:613:25
at runWithEnvironment (packages/meteor.js:1486:24) {
reason: TopologyDescription {
type: 'ReplicaSetNoPrimary',
servers: Map(1) { 'localhost:39001' => [ServerDescription] },
stale: false,
compatible: true,
heartbeatFrequencyMS: 10000,
localThresholdMS: 15,
setName: 'meteor',
maxElectionId: new ObjectId("7fffffff00000000000000a5"),
maxSetVersion: 1,
commonWireVersion: 0,
logicalSessionTimeoutMinutes: null
},
code: undefined,
[Symbol(errorLabels)]: Set(0) {}
}
cryoSPARC Application API server running
cryoSPARC Application API server running
cryoSPARC Application API server running
cryoSPARC Application API server running
cryoSPARC Application API server running
cryoSPARC Application API server running
cryoSPARC Application API server running
cryoSPARC Application API server running
<user>@rohpc02 ~ $
<user>@rohpc02 ~ $ cryosparcm log database | tail -n 40
2025-02-20T12:59:01.156-0600 I NETWORK [conn6] received client metadata from 127.0.0.1:48592 conn6: { driver: { name: "PyMongo", version: "4.6.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.18.0-553.27.1.el8_10.x86_64" }, platform: "CPython 3.10.13.final.0" }
2025-02-20T12:59:01.162-0600 I ACCESS [conn6] Successfully authenticated as principal cryosparc_user on admin from client 127.0.0.1:48592
2025-02-20T12:59:01.285-0600 I COMMAND [conn6] CMD: dropIndexes meteor.jobs
2025-02-20T12:59:01.286-0600 I COMMAND [conn6] CMD: dropIndexes meteor.events
2025-02-20T12:59:01.286-0600 I COMMAND [conn6] CMD: dropIndexes meteor.events
2025-02-20T12:59:01.286-0600 I COMMAND [conn6] CMD: dropIndexes meteor.events
2025-02-20T12:59:01.286-0600 I COMMAND [conn6] CMD: dropIndexes meteor.cache_files
2025-02-20T12:59:01.445-0600 I WRITE [conn6] update meteor.jobs command: { q: { cluster_job_id: { $exists: false } }, u: { $set: { cluster_job_id: null } }, multi: true, upsert: false } planSummary: COLLSCAN keysExamined:0 docsExamined:2762 nMatched:0 nModified:0 numYields:21 locks:{ Global: { acquireCount: { r: 22, w: 22 } }, Database: { acquireCount: { w: 22 } }, Collection: { acquireCount: { w: 22 } } } 147ms
2025-02-20T12:59:01.445-0600 I COMMAND [conn6] command meteor.$cmd command: update { update: "jobs", ordered: true, lsid: { id: UUID("d163f261-0d40-4413-abbe-7ea3e4b91960") }, $clusterTime: { clusterTime: Timestamp(1740077941, 1), signature: { hash: BinData(0, A990744016ABDEF71F44721B9F4840BB39265473), keyId: 7428712650776772610 } }, $db: "meteor" } numYields:0 reslen:229 locks:{ Global: { acquireCount: { r: 22, w: 22 } }, Database: { acquireCount: { w: 22 } }, Collection: { acquireCount: { w: 22 } } } protocol:op_msg 147ms
2025-02-20T12:59:01.505-0600 I NETWORK [listener] connection accepted from 127.0.0.1:48604 #7 (3 connections now open)
2025-02-20T12:59:01.505-0600 I NETWORK [conn7] received client metadata from 127.0.0.1:48604 conn7: { driver: { name: "PyMongo", version: "4.6.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.18.0-553.27.1.el8_10.x86_64" }, platform: "CPython 3.10.13.final.0" }
2025-02-20T12:59:01.505-0600 I ACCESS [conn7] Successfully authenticated as principal cryosparc_user on admin from client 127.0.0.1:48604
2025-02-20T12:59:01.522-0600 I COMMAND [conn6] CMD: drop meteor.benchmark_references
2025-02-20T12:59:01.522-0600 I STORAGE [conn6] dropCollection: meteor.benchmark_references (f5b4f486-faa0-4ef4-8d81-f90898fe0523) - renaming to drop-pending collection: meteor.system.drop.1740077941i8t247.benchmark_references with drop optime { ts: Timestamp(1740077941, 8), t: 247 }
2025-02-20T12:59:01.523-0600 I STORAGE [conn6] createCollection: meteor.benchmark_references with generated UUID: eb084795-8fec-43ad-aab8-615d06404062
2025-02-20T12:59:01.543-0600 I STORAGE [WT RecordStoreThread: local.oplog.rs] WiredTiger record store oplog truncation finished in: 0ms
2025-02-20T12:59:01.566-0600 I REPL [replication-0] Completing collection drop for meteor.system.drop.1740077941i8t247.benchmark_references with drop optime { ts: Timestamp(1740077941, 8), t: 247 } (notification optime: { ts: Timestamp(1740077941, 74), t: 247 })
2025-02-20T12:59:01.566-0600 I STORAGE [replication-0] Finishing collection drop for meteor.system.drop.1740077941i8t247.benchmark_references (f5b4f486-faa0-4ef4-8d81-f90898fe0523).
2025-02-20T12:59:07.719-0600 I NETWORK [listener] connection accepted from 172.25.50.33:34810 #8 (4 connections now open)
2025-02-20T12:59:07.723-0600 I NETWORK [conn8] received client metadata from 172.25.50.33:34810 conn8: { driver: { name: "nodejs", version: "4.9.0" }, os: { type: "Linux", name: "linux", architecture: "x64", version: "4.18.0-553.27.1.el8_10.x86_64" }, platform: "Node.js v16.20.2, LE (unified)|Node.js v16.20.2, LE (unified)" }
2025-02-20T12:59:07.728-0600 I NETWORK [conn8] end connection 172.25.50.33:34810 (3 connections now open)
2025-02-20T12:59:07.729-0600 I NETWORK [listener] connection accepted from 127.0.0.1:46240 #9 (4 connections now open)
2025-02-20T12:59:07.729-0600 I NETWORK [conn9] received client metadata from 127.0.0.1:46240 conn9: { driver: { name: "nodejs", version: "4.9.0" }, os: { type: "Linux", name: "linux", architecture: "x64", version: "4.18.0-553.27.1.el8_10.x86_64" }, platform: "Node.js v16.20.2, LE (unified)|Node.js v16.20.2, LE (unified)" }
2025-02-20T12:59:07.731-0600 I NETWORK [listener] connection accepted from 127.0.0.1:46248 #10 (5 connections now open)
2025-02-20T12:59:07.732-0600 I NETWORK [conn10] received client metadata from 127.0.0.1:46248 conn10: { driver: { name: "nodejs", version: "4.9.0" }, os: { type: "Linux", name: "linux", architecture: "x64", version: "4.18.0-553.27.1.el8_10.x86_64" }, platform: "Node.js v16.20.2, LE (unified)|Node.js v16.20.2, LE (unified)" }
2025-02-20T12:59:07.741-0600 I ACCESS [conn10] Successfully authenticated as principal cryosparc_user on admin from client 127.0.0.1:46248
2025-02-20T12:59:07.768-0600 I NETWORK [listener] connection accepted from 127.0.0.1:46256 #11 (6 connections now open)
2025-02-20T12:59:07.768-0600 I NETWORK [listener] connection accepted from 127.0.0.1:46264 #12 (7 connections now open)
2025-02-20T12:59:07.808-0600 I NETWORK [conn11] received client metadata from 127.0.0.1:46256 conn11: { driver: { name: "nodejs", version: "4.9.0" }, os: { type: "Linux", name: "linux", architecture: "x64", version: "4.18.0-553.27.1.el8_10.x86_64" }, platform: "Node.js v16.20.2, LE (unified)|Node.js v16.20.2, LE (unified)" }
2025-02-20T12:59:07.808-0600 I NETWORK [conn12] received client metadata from 127.0.0.1:46264 conn12: { driver: { name: "nodejs", version: "4.9.0" }, os: { type: "Linux", name: "linux", architecture: "x64", version: "4.18.0-553.27.1.el8_10.x86_64" }, platform: "Node.js v16.20.2, LE (unified)|Node.js v16.20.2, LE (unified)" }
2025-02-20T12:59:07.813-0600 I ACCESS [conn12] Successfully authenticated as principal cryosparc_user on admin from client 127.0.0.1:46264
2025-02-20T12:59:07.813-0600 I ACCESS [conn11] Successfully authenticated as principal cryosparc_user on admin from client 127.0.0.1:46256
2025-02-20T12:59:11.970-0600 I NETWORK [listener] connection accepted from 172.25.50.33:34824 #13 (8 connections now open)
2025-02-20T12:59:11.970-0600 I NETWORK [conn13] received client metadata from 172.25.50.33:34824 conn13: { driver: { name: "PyMongo", version: "4.6.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.18.0-553.27.1.el8_10.x86_64" }, platform: "CPython 3.10.13.final.0" }
2025-02-20T12:59:11.971-0600 I NETWORK [conn13] end connection 172.25.50.33:34824 (7 connections now open)
2025-02-20T12:59:11.971-0600 I NETWORK [listener] connection accepted from 127.0.0.1:46266 #14 (8 connections now open)
2025-02-20T12:59:11.971-0600 I NETWORK [conn14] received client metadata from 127.0.0.1:46266 conn14: { driver: { name: "PyMongo", version: "4.6.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.18.0-553.27.1.el8_10.x86_64" }, platform: "CPython 3.10.13.final.0" }
2025-02-20T12:59:11.972-0600 I NETWORK [listener] connection accepted from 127.0.0.1:46280 #15 (9 connections now open)
2025-02-20T12:59:11.973-0600 I NETWORK [conn15] received client metadata from 127.0.0.1:46280 conn15: { driver: { name: "PyMongo", version: "4.6.2" }, os: { type: "Linux", name: "Linux", architecture: "x86_64", version: "4.18.0-553.27.1.el8_10.x86_64" }, platform: "CPython 3.10.13.final.0" }
2025-02-20T12:59:11.977-0600 I ACCESS [conn15] Successfully authenticated as principal cryosparc_user on admin from client 127.0.0.1:46280
<user>@rohpc02 ~ $
<user>@rohpc02 ~ $ ps -eo user:12,pid,ppid,start,command | grep -e cryosparc_ -e mongo
wa30462 118767 1 12:58:46 python /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/supervisord -c /biotools8/biotools/cryosparc/cryosparc_master/supervisord.conf
wa30462 118940 118767 12:58:51 mongod --auth --dbpath /biotools8/biotools/cryosparc/cryosparc_database --port 39001 --oplogSize 64 --replSet meteor --wiredTigerCacheSizeGB 4 --bind_ip_all
wa30462 119071 118767 12:58:55 /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/python /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn -n command_core -b 0.0.0.0:39002 cryosparc_command.command_core:start() -c /biotools8/biotools/cryosparc/cryosparc_master/gunicorn.conf.py
wa30462 119072 119071 12:58:56 /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/python /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn -n command_core -b 0.0.0.0:39002 cryosparc_command.command_core:start() -c /biotools8/biotools/cryosparc/cryosparc_master/gunicorn.conf.py
wa30462 119103 118767 12:59:01 /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/python /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn cryosparc_command.command_vis:app -n command_vis -b 0.0.0.0:39003 -c /biotools8/biotools/cryosparc/cryosparc_master/gunicorn.conf.py
wa30462 119166 119103 12:59:02 /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/python /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn cryosparc_command.command_vis:app -n command_vis -b 0.0.0.0:39003 -c /biotools8/biotools/cryosparc/cryosparc_master/gunicorn.conf.py
wa30462 119168 118767 12:59:02 /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/python /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn cryosparc_command.command_rtp:start() -n command_rtp -b 0.0.0.0:39005 -c /biotools8/biotools/cryosparc/cryosparc_master/gunicorn.conf.py
wa30462 119186 119168 12:59:03 /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/python /biotools8/biotools/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn cryosparc_command.command_rtp:start() -n command_rtp -b 0.0.0.0:39005 -c /biotools8/biotools/cryosparc/cryosparc_master/gunicorn.conf.py
<user> 123799 10565 14:08:37 grep --color=auto -e cryosparc_ -e mongo
<user>@rohpc02 ~ $
<user>@rohpc02 ~ $ ls -l /tmp/cryosparc*.sock /tmp/mongo*.sock
srwx------ 1 wa30462 hpc_rohpc_wa30462 0 Feb 20 12:58 /tmp/cryosparc-supervisor-c47ea5c11066834e4edaca78e223e479.sock
srwx------ 1 wa30462 hpc_rohpc_wa30462 0 Feb 20 12:58 /tmp/mongodb-39001.sock
<user>@rohpc02 ~ $
<user>@rohpc02 ~ $ cryosparcm status | grep -e HOSTNAME -e DB_PATH
export CRYOSPARC_MASTER_HOSTNAME="rohpc02.mayo.edu"
export CRYOSPARC_DB_PATH="/biotools8/biotools/cryosparc/cryosparc_database"
<user>@rohpc02 ~ $
<user>@rohpc02 ~ $ ss -anp | grep 3900 | sed "s/\s\+/ /g"
u_str LISTEN 0 128 /tmp/mongodb-39001.sock 1301329 * 0
u_str ESTAB 0 0 * 50492 * 39009
u_str ESTAB 0 0 * 30959 * 39007
u_str ESTAB 0 0 /run/systemd/journal/stdout 39001 * 762
u_str ESTAB 0 0 * 762 * 39001
u_str ESTAB 0 0 /run/systemd/journal/stdout 39007 * 30959
u_str ESTAB 0 0 /run/systemd/journal/stdout 39009 * 50492
tcp LISTEN 0 128 0.0.0.0:39001 0.0.0.0:*
tcp LISTEN 0 2048 0.0.0.0:39002 0.0.0.0:*
tcp LISTEN 0 2048 0.0.0.0:39003 0.0.0.0:*
tcp LISTEN 0 2048 0.0.0.0:39005 0.0.0.0:*
tcp ESTAB 0 0 127.0.0.1:46266 127.0.0.1:39001
tcp TIME-WAIT 0 0 172.25.50.33:39002 172.25.50.33:58260
tcp ESTAB 0 0 127.0.0.1:46256 127.0.0.1:39001
tcp ESTAB 0 0 127.0.0.1:46240 127.0.0.1:39001
tcp ESTAB 0 0 127.0.0.1:39001 127.0.0.1:46240
tcp ESTAB 0 0 127.0.0.1:48582 127.0.0.1:39001
tcp ESTAB 0 0 127.0.0.1:46264 127.0.0.1:39001
tcp ESTAB 0 0 127.0.0.1:39001 127.0.0.1:48592
tcp ESTAB 0 0 127.0.0.1:48592 127.0.0.1:39001
tcp ESTAB 0 0 127.0.0.1:39001 127.0.0.1:46266
tcp ESTAB 0 0 127.0.0.1:46248 127.0.0.1:39001
tcp ESTAB 0 0 127.0.0.1:48604 127.0.0.1:39001
tcp TIME-WAIT 0 0 172.25.50.33:39002 172.25.50.33:58258
tcp ESTAB 0 0 127.0.0.1:39001 127.0.0.1:46280
tcp ESTAB 0 0 127.0.0.1:46280 127.0.0.1:39001
tcp ESTAB 0 0 127.0.0.1:39001 127.0.0.1:48582
tcp ESTAB 0 0 127.0.0.1:39001 127.0.0.1:48604
tcp ESTAB 0 0 127.0.0.1:39001 127.0.0.1:46248
tcp ESTAB 0 0 127.0.0.1:39001 127.0.0.1:46264
tcp ESTAB 0 0 127.0.0.1:39001 127.0.0.1:46256
tcp LISTEN 0 511 *:39000 *:*
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:10.200.33.29]:59434
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:172.22.225.109]:50634
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:10.200.33.29]:59393
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:10.200.33.29]:59383
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:10.200.33.29]:59424
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:10.200.33.29]:59396
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:10.200.33.29]:59395
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:10.200.33.29]:59438
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:10.200.33.29]:59415
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:10.200.33.29]:59401
tcp TIME-WAIT 0 0 [::ffff:172.25.50.33]:39000 [::ffff:10.200.33.29]:59379
<user>@rohpc02 ~ $

Thanks @wmatthews. Is CryoSPARC still not working?
Please can you post the output of these commands:

cryosparcm log app_api | grep -B 40 ECONNREFUSED | tail -n 50
cryosparcm log app_api | grep -B 20 -A 20 EACCES | tail -n 50
<user>@rohpc02 ~ $ cryosparcm log app_api | grep -B 40 ECONNREFUSED | tail -n 50
    at packages/natestrauser_publish-performant-counts.js:38:29
    at packages/meteor.js:365:18
    at Meteor.EnvironmentVariable.EVp.withValue (packages/meteor.js:1389:31)
    at packages/meteor.js:613:25
    at runWithEnvironment (packages/meteor.js:1486:24) {
  reason: TopologyDescription {
    type: 'ReplicaSetNoPrimary',
    servers: Map(1) { 'localhost:39001' => [ServerDescription] },
    stale: false,
    compatible: true,
    heartbeatFrequencyMS: 10000,
    localThresholdMS: 15,
    setName: 'meteor',
    maxElectionId: new ObjectId("7fffffff00000000000000a5"),
    maxSetVersion: 1,
    commonWireVersion: 0,
    logicalSessionTimeoutMinutes: null
  },
  code: undefined,
  [Symbol(errorLabels)]: Set(0) {}
}
Exception in setInterval callback: MongoServerSelectionError: connect ECONNREFUSED 127.0.0.1:39001
    at Timeout._onTimeout (/biotools8/biotools/cryosparc/cryosparc_master/cryosparc_app/bundle/programs/server/npm/node_modules/meteor/npm-mongo/node_modules/mongodb/lib/sdam/topology.js:292:38)
    at listOnTimeout (internal/timers.js:557:17)
    at processTimers (internal/timers.js:500:7)
 => awaited here:
    at Function.Promise.await (/biotools8/biotools/cryosparc/cryosparc_master/cryosparc_app/bundle/programs/server/npm/node_modules/meteor/promise/node_modules/meteor-promise/promise_server.js:56:12)
    at Cursor.count (packages/mongo/mongo_driver.js:949:18)
    at packages/natestrauser_publish-performant-counts.js:38:29
    at packages/meteor.js:365:18
    at Meteor.EnvironmentVariable.EVp.withValue (packages/meteor.js:1389:31)
    at packages/meteor.js:613:25
    at runWithEnvironment (packages/meteor.js:1486:24) {
  reason: TopologyDescription {
    type: 'ReplicaSetNoPrimary',
    servers: Map(1) { 'localhost:39001' => [ServerDescription] },
    stale: false,
    compatible: true,
    heartbeatFrequencyMS: 10000,
    localThresholdMS: 15,
    setName: 'meteor',
    maxElectionId: new ObjectId("7fffffff00000000000000a5"),
    maxSetVersion: 1,
    commonWireVersion: 0,
    logicalSessionTimeoutMinutes: null
  },
  code: undefined,
  [Symbol(errorLabels)]: Set(0) {}
}
Exception in setInterval callback: MongoServerSelectionError: connect ECONNREFUSED 127.0.0.1:39001
<user>@rohpc02 ~ $
<user>@rohpc02 ~ $ cryosparcm log app_api | grep -B 20 -A 20 EACCES | tail -n 50
<user>@rohpc02 ~ $

Thanks @wmatthews. What is the output of cryosparcm status now?

<user>@rohpc02 ~ $ cryosparcm status
----------------------------------------------------------------------------
CryoSPARC System master node installed at
/biotools8/biotools/cryosparc/cryosparc_master
Current cryoSPARC version: v4.5.3+240729-gthread
----------------------------------------------------------------------------

CryoSPARC process status:

app                              RUNNING   pid 119199, uptime 2:33:16
app_api                          FATAL     unknown error making dispatchers for 'app_api': EACCES
app_api_dev                      STOPPED   Not started
command_core                     RUNNING   pid 119071, uptime 2:33:28
command_rtp                      RUNNING   pid 119168, uptime 2:33:20
command_vis                      RUNNING   pid 119103, uptime 2:33:22
database                         RUNNING   pid 118940, uptime 2:33:32

----------------------------------------------------------------------------
License is valid
----------------------------------------------------------------------------

global config variables:
export CRYOSPARC_LICENSE_ID="---"
export CRYOSPARC_MASTER_HOSTNAME="rohpc02.mayo.edu"
export CRYOSPARC_DB_PATH="/biotools8/biotools/cryosparc/cryosparc_database"
export CRYOSPARC_BASE_PORT=39000
export CRYOSPARC_DB_CONNECTION_TIMEOUT_MS=20000
export CRYOSPARC_INSECURE=false
export CRYOSPARC_DB_ENABLE_AUTH=true
export CRYOSPARC_CLUSTER_JOB_MONITOR_INTERVAL=10
export CRYOSPARC_CLUSTER_JOB_MONITOR_MAX_RETRIES=1000000
export CRYOSPARC_PROJECT_DIR_PREFIX='CS-'
export CRYOSPARC_DEVELOP=false
export CRYOSPARC_CLICK_WRAP=true
export CRYOSPARC_FORCE_USER=true

You might face a problem similar to Change cryosparc hostname - #11 by BrianCuttler.

… increases the risk that the CryoSPARC instance can be operated by the incorrect Linux user. What prompted the inclusion of CRYOSPARC_FORCE_USER in the configuration?
What is the output of these commands:

ls -l /biotools8/biotools/cryosparc/cryosparc_master/run
ls -al /biotools8/biotools/cryosparc/cryosparc_master/run/

Thank you so much for pointing me to that thread.
I ran chown -R on the CryoSPARC directory to make everything owned by the work account that runs the package, and changed CRYOSPARC_FORCE_USER=true to false.
The next cryosparcm restart went perfectly. There must have been log files not owned by the work account. Thank you again!
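For anyone hitting the same symptom: supervisord raises "unknown error making dispatchers ... EACCES" when the process log files or sockets it needs to reopen (e.g. /tmp/cryosparc-supervisor-*.sock, or files under cryosparc_master/run) are owned by a different Linux user than the one running cryosparcm. A minimal sketch of that ownership check, assuming a Linux system with coreutils; the temp file stands in for the real socket/log paths, which will differ on your instance:

```shell
# Compare a file's owning user against the current user -- the check that
# underlies this EACCES diagnosis. A mktemp file stands in for paths like
# /tmp/cryosparc-supervisor-<hash>.sock (hypothetical on your system).
f=$(mktemp)
owner=$(stat -c %U "$f")   # %U prints the owning user's name (GNU stat)
if [ "$owner" = "$(id -un)" ]; then
    echo "owner matches current user"
else
    echo "owner mismatch: $owner vs $(id -un)"
fi
rm -f "$f"
```

On the affected instance the equivalent fix was a recursive chown of the CryoSPARC directories to the account that runs cryosparcm, followed by cryosparcm restart.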
