Hi Wtempel,
Thank you for the help.
Instead of upgrading to CryoSPARC 4.4.0 from the old version, I made a fresh installation, but the same problem still occurred after CryoSPARC had been running for ~28 hours (I tried twice).
Here is the output from the procedure you suggested.
$ cryosparcm stop
CryoSPARC is running.
Stopping cryoSPARC
unix:///tmp/cryosparc-supervisor-5fccf1c670aab55f9d50ce55f18e4c54.sock refused connection
$ ps -w -U user1 -opid,ppid,start,cmd | grep -e cryosparc -e mongo | grep -v grep
(no output)
Then I deleted the socket file:
$ rm /tmp/cryosparc-supervisor-5fccf1c670aab55f9d50ce55f18e4c54.sock
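For reference, before deleting the socket I now check that no supervisord process is still holding it. This is just my own sketch (the socket path is the one from my installation; adjust as needed):

```shell
#!/bin/sh
# Check for a surviving supervisord process before removing the socket.
# SOCK is the path from my installation; substitute your own.
SOCK=/tmp/cryosparc-supervisor-5fccf1c670aab55f9d50ce55f18e4c54.sock
if pgrep -f supervisord >/dev/null 2>&1; then
    echo "supervisord is still running; do not delete $SOCK"
else
    echo "no supervisord process found; $SOCK looks stale"
fi
```

If `pgrep` reports nothing (as in my `ps` check above), removing the stale socket should be safe.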
$ cryosparcm start
Starting cryoSPARC System master process..
CryoSPARC is not already running.
configuring database
configuration complete
database: started
checkdb success
command_core: started
command_core connection succeeded
command_core startup successful
command_vis: started
command_rtp: started
command_rtp connection succeeded
command_rtp startup successful
app: started
app_api: started
-----------------------------------------------------
CryoSPARC master started.
From this machine, access CryoSPARC and CryoSPARC Live at
http://localhost:61000
From other machines on the network, access CryoSPARC and CryoSPARC Live at
http://cryo:61000
Startup can take several minutes. Point your browser to the address
and refresh until you see the cryoSPARC web interface.
$ ps -weopid,ppid,start,cmd | grep -e cryosparc -e mongo | grep -v grep
82204 2765 12:58:51 python /home/jz/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/supervisord -c /home/jz/cryosparc/cryosparc_master/supervisord.conf
82319 82204 12:58:57 mongod --auth --dbpath /home/jz/cryosparc/cryosparc_database --port 61001 --oplogSize 64 --replSet meteor --wiredTigerCacheSizeGB 4 --bind_ip_all
82430 82204 12:59:01 python -c import cryosparc_command.command_core as serv; serv.start(port=61002)
82468 82204 12:59:08 python -c import cryosparc_command.command_vis as serv; serv.start(port=61003)
82492 82204 12:59:09 python -c import cryosparc_command.command_rtp as serv; serv.start(port=61005)
82556 82204 12:59:14 /home/jz/cryosparc/cryosparc_master/cryosparc_app/nodejs/bin/node ./bundle/main.js
82589 82430 12:59:18 bash /home/jz/cryosparc/cryosparc_worker/bin/cryosparcw run --project P1 --job J33 --master_hostname cryo --master_command_core_port 61002
82604 82589 12:59:18 python -c import cryosparc_compute.run as run; run.run() --project P1 --job J33 --master_hostname cryo --master_command_core_port 61002
82606 82604 12:59:18 python -c import cryosparc_compute.run as run; run.run() --project P1 --job J33 --master_hostname cryo --master_command_core_port 61002
82609 82430 12:59:20 bash /home/jz/cryosparc/cryosparc_worker/bin/cryosparcw run --project P1 --job J34 --master_hostname cryo --master_command_core_port 61002
82624 82609 12:59:20 python -c import cryosparc_compute.run as run; run.run() --project P1 --job J34 --master_hostname cryo --master_command_core_port 61002
82626 82624 12:59:20 python -c import cryosparc_compute.run as run; run.run() --project P1 --job J34 --master_hostname cryo --master_command_core_port 61002
$ ls -l /tmp/cryosparc*.sock /tmp/mongodb-*.sock
srwx------ 1 jz jz 0 Dec  6 12:58 /tmp/cryosparc-supervisor-5fccf1c670aab55f9d50ce55f18e4c54.sock
srwx------ 1 jz jz 0 Dec  6 12:58 /tmp/mongodb-61001.sock
$ free -g
              total        used        free      shared  buff/cache   available
Mem:            503          13          41           0         448         485
Swap:             1           0           1
I only run CryoSPARC on this workstation, so there should be plenty of RAM.
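Since the crash only happens after ~28 hours, I also set up a simple memory logger so I can see whether RAM is actually exhausted right before it dies. This is just my own sketch, meant to be run periodically (e.g. every 10 minutes from cron); the log path is arbitrary:

```shell
#!/bin/sh
# Append a timestamped memory snapshot to a log file.
# Run periodically (e.g. from cron) to capture the memory state
# in the hours leading up to the crash.
LOG="$HOME/cryosparc_mem_watch.log"
{
    date
    free -g
    echo
} >> "$LOG"
```

After the next crash I can check the tail of this log to rule memory pressure in or out.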