Empty JSON files after machine soft reboot

I recently had a problem with one of our team's machines and had to soft-reset it.

When I relaunched my jobs in CryoSPARC (version 4.6.2), they ran as usual until the very end, where they failed with the following error, but only for one of our projects:

I checked the project's JSON files, both the workspaces and job_manifest ones, and they were empty.

I tried to launch jobs in other projects in the same directory, and everything ran as usual with no problems. The JSON files for those projects were normal.

Is there a way to retrieve the information or replace these JSON files without losing any data, so we can continue the data analysis of this project?

Can anyone tell me how to fix this?

Thanks a lot!
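For anyone hitting the same symptom, a quick way to spot affected metadata files is to scan a project tree for zero-byte or unparseable JSON files. This is a generic sketch, not part of any CryoSPARC tooling, and the root path below is only illustrative:

```python
import json
import pathlib

def broken_json_files(root):
    """Return .json files under root that are empty or fail to parse."""
    bad = []
    for p in sorted(pathlib.Path(root).rglob("*.json")):
        try:
            if p.stat().st_size == 0:
                bad.append(p)          # zero-byte file, like the ones described above
                continue
            json.loads(p.read_text())  # raises on truncated or corrupt content
        except (OSError, ValueError):
            bad.append(p)              # unreadable (e.g. I/O error) or invalid JSON
    return bad

# Example: broken_json_files("/media/impmc/14BDC9250A6E035F")
```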

Welcome to the forum @rafve. Please can you post:

  • the Traceback as text
  • outputs of these commands on the server
df -Th
ps -eo user:16,command | grep cryosparc_
ls -la /path/to/CS-cyclo-tubes

Hello,

Thank you very much for the welcome.

Sorry about that; here is the traceback as text:

Traceback (most recent call last):
File "cryosparc_master/cryosparc_compute/run.py", line 139, in cryosparc_master.cryosparc_compute.run.main
File "/home/impmc/cryosparc_worker/cryosparc_tools/cryosparc/command.py", line 122, in func
raise CommandError(
cryosparc_tools.cryosparc.errors.CommandError: *** (http://titane2:39002, code 400) Encountered ServerError from JSONRPC function "dump_job_database" with params {'project_uid': 'P6', 'job_uid': 'J404', 'job_completed': True}:
ServerError: [Errno 5] Input/output error: '/media/impmc/14BDC9250A6E035F/CS-cyclo-tubes/job_manifest.json'
Traceback (most recent call last):
File "/home/impmc/cryosparc_master/cryosparc_command/commandcommon.py", line 196, in wrapper
res = func(*args, **kwargs)
File "/home/impmc/cryosparc_master/cryosparc_command/commandcommon.py", line 265, in wrapper
return func(*args, **kwargs)
File "/home/impmc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4041, in dump_job_database
rc.dump_job_database(project_uid = project_uid, job_uid = job_uid, job_completed = job_completed, migration = migration, abs_export_dir = abs_export_dir, logger = logger)
File "/home/impmc/cryosparc_master/cryosparc_compute/jobs/runcommon.py", line 552, in dump_job_database
with open(abs_job_manifest_job_path, 'w') as openfile:
OSError: [Errno 5] Input/output error: '/media/impmc/14BDC9250A6E035F/CS-cyclo-tubes/job_manifest.json'

Concerning the commands you suggested, these are the results:

  • df -Th
    Filesystem Type    Size  Used Avail Use% Mounted on
    sde2       fuseblk  15T   14T  1.3T  92% /media/impmc/14BDC9250A6E035F

  • ps -eo user:16,command | grep cryosparc_

impmc python /home/impmc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/supervisord -c /home/impmc/cryosparc_master/supervisord.conf

impmc mongod --auth --dbpath /home/impmc/cryosparc_database --port 39001 --oplogSize 64 --replSet meteor --wiredTigerCacheSizeGB 4 --bind_ip_all

impmc python /home/impmc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn -n command_core -b 0.0.0.0:39002 cryosparc_command.command_core:start() -c /home/impmc/cryosparc_master/gunicorn.conf.py

impmc python /home/impmc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn -n command_core -b 0.0.0.0:39002 cryosparc_command.command_core:start() -c /home/impmc/cryosparc_master/gunicorn.conf.py

impmc python /home/impmc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn cryosparc_command.command_vis:app -n command_vis -b 0.0.0.0:39003 -c /home/impmc/cryosparc_master/gunicorn.conf.py

impmc python /home/impmc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn cryosparc_command.command_vis:app -n command_vis -b 0.0.0.0:39003 -c /home/impmc/cryosparc_master/gunicorn.conf.py

impmc python /home/impmc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn cryosparc_command.command_rtp:start() -n command_rtp -b 0.0.0.0:39005 -c /home/impmc/cryosparc_master/gunicorn.conf.py

impmc python /home/impmc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/bin/gunicorn cryosparc_command.command_rtp:start() -n command_rtp -b 0.0.0.0:39005 -c /home/impmc/cryosparc_master/gunicorn.conf.py

impmc /home/impmc/cryosparc_master/cryosparc_app/nodejs/bin/node ./bundle/main.js

impmc grep --color=auto cryosparc_

  • ls -la /path/to/CS-cyclo-tubes

ls: cannot access '/path/to/CS-cyclo-tubes': No such file or directory

The last command gave an abnormal result; in fact, I can access the folders and see all the jobs I have already finished.

I can provide more information if needed.
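For what it's worth, the `[Errno 5]` in the traceback is an input/output error reported by the kernel for the device itself, not a missing file. A small probe like the following (the path in the example is only illustrative) can distinguish the two cases:

```python
import errno

def probe(path):
    """Classify why a path fails to open: device error vs. missing file."""
    try:
        with open(path, "rb") as f:
            f.read(1)  # force an actual read from the device
        return "ok"
    except OSError as e:
        if e.errno == errno.EIO:
            return "EIO"     # kernel-level I/O error (failing device or filesystem)
        if e.errno == errno.ENOENT:
            return "ENOENT"  # file simply does not exist
        return "errno %d" % e.errno

# Example: probe("/media/impmc/14BDC9250A6E035F/CS-cyclo-tubes/job_manifest.json")
```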

Thanks @rafve
I also meant to ask for a file listing of the project directory:

ls -la /media/impmc/14BDC9250A6E035F/CS-cyclo-tubes/

Hi,

This is the result of the command you asked for:

  • ls -la /media/impmc/14BDC9250A6E035F/CS-cyclo-tubes/

ls: reading directory '/media/impmc/14BDC9250A6E035F/CS-cyclo-tubes/': Input/output error
total 0

I have also lost terminal access to all the jobs inside the CS-cyclo-tubes project. However, I can still check and download the project in the CryoSPARC streaming interface.

Additionally, I checked the JSON files of all the other projects in this directory, and they contain all the expected content (project, work_jobs, etc.). I had the idea of copying the JSON files from another project into this one and changing the paths, names and job numbers. Do you think this could help?

Thank you very much for your help.
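The copy-and-edit idea above could, in principle, be sketched as a recursive string substitution over a template project's JSON. Note this is only an illustration of the idea, not an endorsed recovery procedure; real CryoSPARC manifests record per-job state that a string substitution cannot reconstruct, so it would at best be a last resort applied to copies of the files:

```python
import json

def rewrite_strings(obj, old, new):
    """Recursively replace `old` with `new` in every string of a parsed JSON structure."""
    if isinstance(obj, str):
        return obj.replace(old, new)
    if isinstance(obj, list):
        return [rewrite_strings(x, old, new) for x in obj]
    if isinstance(obj, dict):
        return {k: rewrite_strings(v, old, new) for k, v in obj.items()}
    return obj  # numbers, booleans, None pass through unchanged

# Example (hypothetical file names and paths):
# with open("template_project.json") as f:
#     data = json.load(f)
# data = rewrite_strings(data, "/path/to/CS-template", "/path/to/CS-cyclo-tubes")
```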

The filesystem on which CS-cyclo-tubes/ is stored may not be robust enough for your purposes. You may want to consider an alternative journaling filesystem, with a backup regimen in place.

You may be accessing records in the CryoSPARC database, which is stored outside the CS-cyclo-tubes/ directory.

I am not sure. The damage to the project directory may extend beyond just the JSON files.
Do you have a backup copy of the CS-cyclo-tubes/ directory?

Thank you for your answers.

Unfortunately, due to the size of the dataset, we were obliged to store this project on an external hard disk. As you suggest, this might not be robust enough.

Okay, I see what you mean. Is there a possibility that the CryoSPARC database contains something that could restore this project?

Unfortunately, due to the large size of the dataset, we don’t have any backups of it… I was thinking of editing these JSON files as a last resort, and I might try it if there’s no other option.

Thank you very much for your help.