Missing jobs after transferring project

This topic has been raised before, but I could not find a solution, so I am asking for help. My situation is similar to this post.

We had a workstation mounted to a shared filesystem that crashed. I installed CryoSPARC 4.7 on a different workstation mounted to the same shared filesystem, deleted the cs.lock file in my project folder, and attached the project to the new instance. I did not move any of the files. The project is over 20 TB in size and has 867 jobs, but after attaching it only 64 jobs appear. All the workspaces are present, but some of them have no jobs. When I originally attached the project, there was a loading bar for about 30 minutes before it was marked complete. Others have noted that a project this large can take a long time to attach, so I have now waited two days, but there are still only 64 jobs visible. Can you please help me get my project back?

I also ran the command cryosparcm filterlog command_core -l ERROR and this was the result:

[marcell@msg-app4 ~]$ cryosparcm filterlog command_core -l ERROR
2025-05-09 13:52:29,359 import_project_run   ERROR    | Unable to import project from /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2
2025-05-09 13:52:29,359 import_project_run   ERROR    | Traceback (most recent call last):
2025-05-09 13:52:29,359 import_project_run   ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4618, in import_project_run
2025-05-09 13:52:29,359 import_project_run   ERROR    |     warning = import_jobs(jobs_manifest, abs_path_export_project_dir, new_project_uid, owner_user_id, notification_id) or warning
2025-05-09 13:52:29,359 import_project_run   ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4861, in import_jobs
2025-05-09 13:52:29,359 import_project_run   ERROR    |     job_doc_data = json.load(openfile, object_hook=json_util.object_hook)
2025-05-09 13:52:29,359 import_project_run   ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 293, in load
2025-05-09 13:52:29,359 import_project_run   ERROR    |     return loads(fp.read(),
2025-05-09 13:52:29,359 import_project_run   ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 359, in loads
2025-05-09 13:52:29,359 import_project_run   ERROR    |     return cls(**kw).decode(s)
2025-05-09 13:52:29,359 import_project_run   ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/decoder.py", line 340, in decode
2025-05-09 13:52:29,359 import_project_run   ERROR    |     raise JSONDecodeError("Extra data", s, end)
2025-05-09 13:52:29,359 import_project_run   ERROR    | json.decoder.JSONDecodeError: Extra data: line 6080 column 2 (char 241418)
2025-05-09 13:52:32,247 run                  ERROR    | POST-RESPONSE-THREAD ERROR at import_project_run
2025-05-09 13:52:32,247 run                  ERROR    | Traceback (most recent call last):
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/commandcommon.py", line 73, in run
2025-05-09 13:52:32,247 run                  ERROR    |     self.target(*self.args)
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4618, in import_project_run
2025-05-09 13:52:32,247 run                  ERROR    |     warning = import_jobs(jobs_manifest, abs_path_export_project_dir, new_project_uid, owner_user_id, notification_id) or warning
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4861, in import_jobs
2025-05-09 13:52:32,247 run                  ERROR    |     job_doc_data = json.load(openfile, object_hook=json_util.object_hook)
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 293, in load
2025-05-09 13:52:32,247 run                  ERROR    |     return loads(fp.read(),
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 359, in loads
2025-05-09 13:52:32,247 run                  ERROR    |     return cls(**kw).decode(s)
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/decoder.py", line 340, in decode
2025-05-09 13:52:32,247 run                  ERROR    |     raise JSONDecodeError("Extra data", s, end)
2025-05-09 13:52:32,247 run                  ERROR    | json.decoder.JSONDecodeError: Extra data: line 6080 column 2 (char 241418)
2025-05-12 19:44:15,145 heartbeat_manager    ERROR    | HTTPSConnectionPool(host='get.cryosparc.com', port=443): Max retries exceeded with url: /heartbeat/XXX (Caused by ProxyError('Cannot connect to proxy.', TimeoutError('timed out')))
2025-05-12 19:45:25,345 heartbeat_manager    ERROR    | HTTPSConnectionPool(host='get.cryosparc.com', port=443): Max retries exceeded with url: /heartbeat/XXX (Caused by ProxyError('Cannot connect to proxy.', TimeoutError('timed out')))
2025-05-12 19:49:40,454 heartbeat_manager    ERROR    | HTTPSConnectionPool(host='get.cryosparc.com', port=443): Max retries exceeded with url: /heartbeat/XXX (Caused by ProxyError('Cannot connect to proxy.', TimeoutError('timed out')))
2025-05-12 19:50:50,627 heartbeat_manager    ERROR    | HTTPSConnectionPool(host='get.cryosparc.com', port=443): Max retries exceeded with url: /heartbeat/XXX (Caused by ProxyError('Cannot connect to proxy.', TimeoutError('timed out')))

@Marcell Please can you inspect the unfiltered command_core log for additional (including non-ERROR) lines that may reveal further details, such as a specific job ID, relevant to the JSONDecodeError around 13:52 on May 9. You can browse the log with the command

cryosparcm log command_core | less

or try

cryosparcm log command_core | grep '2025-05-09 13:52:'

Thanks for looking into this. I think this is the relevant piece. I also included a few jobs that loaded successfully, in case that helps.

2025-05-09 13:52:28,394 import_jobs          INFO     | Inserted job document in 0.67s...
2025-05-09 13:52:28,394 import_jobs          INFO     | Inserting streamlogs into jobs...
2025-05-09 13:52:28,561 import_jobs          INFO     | Done. Inserted 299 streamlogs in 0.17s...
2025-05-09 13:52:28,561 import_jobs          INFO     | Imported J165 into P1 in 0.83s...
2025-05-09 13:52:28,566 import_jobs          INFO     | Uploading image data for J166...
2025-05-09 13:52:29,282 import_jobs          INFO     | Done. Uploaded 197 files in 0.72s
2025-05-09 13:52:29,359 import_project_run   ERROR    | Unable to import project from /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2
2025-05-09 13:52:29,359 import_project_run   ERROR    | Traceback (most recent call last):
2025-05-09 13:52:29,359 import_project_run   ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4618, in import_project_run
2025-05-09 13:52:29,359 import_project_run   ERROR    |     warning = import_jobs(jobs_manifest, abs_path_export_project_dir, new_project_uid, owner_user_id, notification_id) or warning
2025-05-09 13:52:29,359 import_project_run   ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4861, in import_jobs
2025-05-09 13:52:29,359 import_project_run   ERROR    |     job_doc_data = json.load(openfile, object_hook=json_util.object_hook)
2025-05-09 13:52:29,359 import_project_run   ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 293, in load
2025-05-09 13:52:29,359 import_project_run   ERROR    |     return loads(fp.read(),
2025-05-09 13:52:29,359 import_project_run   ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 359, in loads
2025-05-09 13:52:29,359 import_project_run   ERROR    |     return cls(**kw).decode(s)
2025-05-09 13:52:29,359 import_project_run   ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/decoder.py", line 340, in decode
2025-05-09 13:52:29,359 import_project_run   ERROR    |     raise JSONDecodeError("Extra data", s, end)
2025-05-09 13:52:29,359 import_project_run   ERROR    | json.decoder.JSONDecodeError: Extra data: line 6080 column 2 (char 241418)
2025-05-09 13:52:29,383 dump_project         INFO     | Exporting project P1
2025-05-09 13:52:29,417 dump_project         INFO     | Exported project P1 to /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2/project.json in 0.03s
2025-05-09 13:52:32,247 run                  ERROR    | POST-RESPONSE-THREAD ERROR at import_project_run
2025-05-09 13:52:32,247 run                  ERROR    | Traceback (most recent call last):
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/commandcommon.py", line 73, in run
2025-05-09 13:52:32,247 run                  ERROR    |     self.target(*self.args)
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4618, in import_project_run
2025-05-09 13:52:32,247 run                  ERROR    |     warning = import_jobs(jobs_manifest, abs_path_export_project_dir, new_project_uid, owner_user_id, notification_id) or warning
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4861, in import_jobs
2025-05-09 13:52:32,247 run                  ERROR    |     job_doc_data = json.load(openfile, object_hook=json_util.object_hook)
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 293, in load
2025-05-09 13:52:32,247 run                  ERROR    |     return loads(fp.read(),
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 359, in loads
2025-05-09 13:52:32,247 run                  ERROR    |     return cls(**kw).decode(s)
2025-05-09 13:52:32,247 run                  ERROR    |   File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/decoder.py", line 340, in decode
2025-05-09 13:52:32,247 run                  ERROR    |     raise JSONDecodeError("Extra data", s, end)
2025-05-09 13:52:32,247 run                  ERROR    | json.decoder.JSONDecodeError: Extra data: line 6080 column 2 (char 241418)
**custom thread exception hook caught something
**** handle exception rc
Traceback (most recent call last):
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_compute/jobs/runcommon.py", line 2306, in run_with_except_hook
    run_old(*args, **kw)
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/commandcommon.py", line 73, in run
    self.target(*self.args)
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4618, in import_project_run
    warning = import_jobs(jobs_manifest, abs_path_export_project_dir, new_project_uid, owner_user_id, notification_id) or warning
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4861, in import_jobs
    job_doc_data = json.load(openfile, object_hook=json_util.object_hook)
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 293, in load
    return loads(fp.read(),
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 359, in loads
    return cls(**kw).decode(s)
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 6080 column 2 (char 241418)

Traceback (most recent call last):
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_compute/jobs/runcommon.py", line 2306, in run_with_except_hook
    run_old(*args, **kw)
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/commandcommon.py", line 73, in run
    self.target(*self.args)
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4618, in import_project_run
    warning = import_jobs(jobs_manifest, abs_path_export_project_dir, new_project_uid, owner_user_id, notification_id) or warning
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4861, in import_jobs
    job_doc_data = json.load(openfile, object_hook=json_util.object_hook)
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 293, in load
    return loads(fp.read(),
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/__init__.py", line 359, in loads
    return cls(**kw).decode(s)
  File "/wynton/home/chenglab/marcell/cryosparc_install/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 6080 column 2 (char 241418)
2025-05-09 14:44:14,610 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-09 15:04:35,935 update_project_size  INFO     | Starting PostResponseThread for update project size for P1
2025-05-09 15:04:35,937 update_project_size_run INFO     | Beginning update project size for P1
2025-05-09 15:37:20 info                 INFO     | Handling signal: quit
2025-05-09 15:37:20 info                 INFO     | Worker exiting (pid: 1193651)

It seems to me like it’s getting stuck at J166 or J167 (I really don’t know what I am talking about though). Would it be helpful if I uploaded the .json files for those jobs?

Possibly. Do you have a way of sharing the files? Otherwise, I can send you a direct message about sharing arrangements.
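
In the meantime, if you would like to check locally which job.json trips the parser, a rough sketch along these lines may help (this is not an official CryoSPARC tool; it assumes Python 3 and uses the project path from your log). It should reproduce the "Extra data" error and print the offending file:

# Unofficial sketch: try to parse each job.json in the project directory and
# report any file that raises a JSONDecodeError (e.g. "Extra data").
import json, pathlib

project_dir = pathlib.Path("/wynton/group/cheng/marcell/Janelia_2/CS-janelia-2")
for doc in sorted(project_dir.glob("J*/job.json")):
    try:
        with open(doc) as handle:
            json.load(handle)
    except json.JSONDecodeError as e:
        print(f"{doc}: {e}")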

Here is a folder with some JSON files. I put in the project.json, workspaces.json, and job_manifest.json from the project level, and I added the job.json files from jobs J166 and J167. I renamed those copies; in their respective folders they are still named job.json.

This download link only seems to work if it’s copy-pasted into the browser; let me know if there is an issue.

https://limewire.com/d/TgAcG#OehdmTR3c4

Thanks @Marcell I fetched the files.


Have there been multiple attempts to attach the project to the newly installed, v4.7 CryoSPARC instance?

I did try it a second time. After the first attempt, I detached, then reattached. In both cases, it got stuck on the same job.

Was the second attempt performed before or after

2025-05-09 13:52:28,566 import_jobs          INFO     | Uploading image data for J166...
2025-05-09 13:52:29,282 import_jobs          INFO     | Done. Uploaded 197 files in 0.72s
2025-05-09 13:52:29,359 import_project_run   ERROR    | Unable to import project from /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2

It was performed after that, i.e. the log above was from the first attempt. I just looked at the log from the second attempt, and there may be another clue:

2025-05-12 13:36:35,150 import_jobs          INFO     | Uploading image data for J165...
2025-05-12 13:36:35,656 import_jobs          INFO     | Done. Uploaded 176 files in 0.51s
2025-05-12 13:36:35,661 import_jobs          INFO     | Inserted job document in 0.51s...
2025-05-12 13:36:35,661 import_jobs          INFO     | Inserting streamlogs into jobs...
2025-05-12 13:36:35,790 import_jobs          INFO     | Done. Inserted 299 streamlogs in 0.13s...
2025-05-12 13:36:35,790 import_jobs          INFO     | Imported J165 into P2 in 0.64s...
2025-05-12 13:36:36,329 layout_tree          WARNING  | Job P2 J159 parents include non-existent job J65
2025-05-12 13:36:36,329 layout_tree          WARNING  | Job P2 J159 parents include non-existent job J94
2025-05-12 13:36:36,329 layout_tree          WARNING  | Job P2 J148 parents include non-existent job J65
2025-05-12 13:36:36,329 layout_tree          WARNING  | Job P2 J148 parents include non-existent job J94
2025-05-12 13:36:36,329 layout_tree          WARNING  | Job P2 J138 parents include non-existent job J65
2025-05-12 13:36:36,329 layout_tree          WARNING  | Job P2 J138 parents include non-existent job J94
2025-05-12 13:36:36,329 layout_tree          WARNING  | Job P2 J129 parents include non-existent job J65
2025-05-12 13:36:36,329 layout_tree          WARNING  | Job P2 J129 parents include non-existent job J94
2025-05-12 13:36:36,329 layout_tree          WARNING  | Job P2 J119 parents include non-existent job J77
2025-05-12 13:36:36,329 layout_tree          WARNING  | Job P2 J119 parents include non-existent job J99
2025-05-12 13:36:36,330 layout_tree          WARNING  | Job P2 J100 parents include non-existent job J99
2025-05-12 13:36:36,330 layout_tree          WARNING  | Job P2 J15 parents include non-existent job J5
2025-05-12 13:36:36,330 layout_tree          WARNING  | Job P2 J10 parents include non-existent job J9
2025-05-12 13:36:36,330 layout_tree          WARNING  | Job P2 J127 parents include non-existent job J21
2025-05-12 13:36:36,732 import_project_run   WARNING  | Failed laying out tree in P2: 'J5'
2025-05-12 13:36:36,733 import_project_run   WARNING  | Imported project from /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2 as P2 in 3976.71s with errors.
2025-05-12 13:36:36,760 dump_project         INFO     | Exporting project P2
2025-05-12 13:36:36,764 dump_project         INFO     | Exported project P2 to /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2/project.json in 0.00s
2025-05-12 13:36:36,764 update_project_size_run INFO     | Beginning update project size for P2
2025-05-12 14:03:16,141 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-12 14:10:30,089 update_project_size  INFO     | Starting PostResponseThread for update project size for P2
2025-05-12 14:10:30,107 update_project_size_run INFO     | Beginning update project size for P2
2025-05-12 14:12:30,633 update_project_size  INFO     | Starting PostResponseThread for update project size for P2
2025-05-12 14:12:30,697 update_project_size_run INFO     | Beginning update project size for P2
2025-05-12 15:03:17,392 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-12 15:25:53,512 dump_project         INFO     | Exporting project P2
2025-05-12 15:25:53,514 dump_project         INFO     | Exporting project P1
2025-05-12 15:25:53,516 dump_project         INFO     | Exported project P2 to /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2/project.json in 0.00s
2025-05-12 15:25:53,516 update_project_size_run INFO     | Finished updating project size for P2 in 6556.75s
2025-05-12 15:25:53,517 dump_project         INFO     | Exported project P1 to /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2/project.json in 0.00s
2025-05-12 15:25:53,517 update_project_size_run INFO     | Finished updating project size for P1 in 12152.48s
2025-05-12 15:25:53,517 update_all_job_sizes_run INFO     | Finished updating all job sizes (0 jobs updated, 1 projects updated)
2025-05-12 15:35:47,875 dump_project         INFO     | Exporting project P2
2025-05-12 15:35:47,878 dump_project         INFO     | Exported project P2 to /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2/project.json in 0.00s
2025-05-12 15:35:47,878 update_project_size_run INFO     | Finished updating project size for P2 in 4997.18s
2025-05-12 15:35:47,878 dump_project         INFO     | Exporting project P2
2025-05-12 15:35:47,881 dump_project         INFO     | Exported project P2 to /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2/project.json in 0.00s
2025-05-12 15:35:47,881 update_project_size_run INFO     | Finished updating project size for P2 in 5117.77s
2025-05-12 16:03:17,467 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-12 17:03:18,411 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-12 18:03:18,520 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-12 19:03:18,699 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-12 19:44:15,145 heartbeat_manager    ERROR    | HTTPSConnectionPool(host='get.cryosparc.com', port=443): Max retries exceeded with url: /heartbeat/2d62cbc6-2c0b-11f0-880c-732536936247 (Caused by ProxyError('Cannot connect to proxy.', TimeoutError('timed out')))
2025-05-12 19:44:15,145 heartbeat_manager    WARNING  | Error connecting to cryoSPARC license server during instance heartbeat.
2025-05-12 19:45:25,345 heartbeat_manager    ERROR    | HTTPSConnectionPool(host='get.cryosparc.com', port=443): Max retries exceeded with url: /heartbeat/2d62cbc6-2c0b-11f0-880c-732536936247 (Caused by ProxyError('Cannot connect to proxy.', TimeoutError('timed out')))
2025-05-12 19:45:25,345 heartbeat_manager    WARNING  | Error connecting to cryoSPARC license server during instance heartbeat.
2025-05-12 19:49:40,454 heartbeat_manager    ERROR    | HTTPSConnectionPool(host='get.cryosparc.com', port=443): Max retries exceeded with url: /heartbeat/2d62cbc6-2c0b-11f0-880c-732536936247 (Caused by ProxyError('Cannot connect to proxy.', TimeoutError('timed out')))
2025-05-12 19:49:40,454 heartbeat_manager    WARNING  | Error connecting to cryoSPARC license server during instance heartbeat.
2025-05-12 19:50:50,627 heartbeat_manager    ERROR    | HTTPSConnectionPool(host='get.cryosparc.com', port=443): Max retries exceeded with url: /heartbeat/2d62cbc6-2c0b-11f0-880c-732536936247 (Caused by ProxyError('Cannot connect to proxy.', TimeoutError('timed out')))
2025-05-12 19:50:50,627 heartbeat_manager    WARNING  | Error connecting to cryoSPARC license server during instance heartbeat.
2025-05-12 20:03:19,119 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-12 21:03:19,333 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-12 22:03:19,737 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-12 23:03:20,505 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 00:03:21,457 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 01:03:22,425 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 02:03:22,729 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 03:03:22,878 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 04:03:23,793 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 05:03:24,603 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 06:03:24,744 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 07:03:24,846 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 08:03:25,624 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 09:03:25,829 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 10:03:26,034 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 11:03:26,253 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 12:03:26,882 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 12:21:12,438 update_project_size  INFO     | Starting PostResponseThread for update project size for P2
2025-05-13 12:21:12,439 update_project_size_run INFO     | Beginning update project size for P2
2025-05-13 13:03:28,040 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 14:03:29,002 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 14:38:54,881 dump_project         INFO     | Exporting project P2
2025-05-13 14:38:54,887 dump_project         INFO     | Exported project P2 to /wynton/group/cheng/marcell/Janelia_2/CS-janelia-2/project.json in 0.01s
2025-05-13 14:38:54,887 update_project_size_run INFO     | Finished updating project size for P2 in 8262.45s
2025-05-13 15:03:29,289 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 16:03:30,112 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 17:03:30,264 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 18:03:30,667 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 19:03:30,887 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 20:03:31,143 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 21:03:31,960 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 22:03:32,236 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-13 23:03:32,385 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 00:03:33,189 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 01:03:33,309 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 02:03:34,099 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 03:03:34,368 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 04:03:35,249 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 05:03:36,003 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 06:03:36,664 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 07:03:37,577 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 08:03:37,739 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 09:03:38,599 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 10:03:39,170 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 11:03:39,676 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 12:03:39,715 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 13:03:39,932 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 14:03:40,186 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-05-14 14:10:04 info                 INFO     | Handling signal: quit
2025-05-14 14:10:04 info                 INFO     | Worker exiting (pid: 2856084)
2025-05-14 14:10:04,925 interrupt            INFO     |  === EXITED === 
2025-05-14 14:10:05 info                 INFO     | Shutting down: Master

Those jobs do exist in the project folder, and there are files in the respective folders.

Hi, I have the same problem. :frowning: After attaching a project (here P6), I initially got 69 jobs, but when I refreshed, it dropped to 0, and now nothing shows up for the attached project. I checked the command_core log and found an error, but I do not quite know how to solve it. Could anyone give me a hand? Thanks!

2025-10-07 14:40:06,422 export_project       INFO     | Exporting J80 in P6
2025-10-07 14:40:06,422 dump_job_database    INFO     | Request to export P6 J80
2025-10-07 14:40:06,423 dump_job_database    INFO     |    Exporting job to /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/J80
2025-10-07 14:40:06,424 dump_job_database    INFO     |    Exporting all of job's images in the database to /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/J80/gridfs_data...
2025-10-07 14:40:06,617 dump_job_database    INFO     |    Writing 232 database images to /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/J80/gridfs_data/gridfsdata_0
2025-10-07 14:40:06,618 dump_job_database    INFO     |    Done. Exported 232 images in 0.19s
2025-10-07 14:40:06,618 dump_job_database    INFO     |    Exporting all job's streamlog events...
2025-10-07 14:40:06,628 dump_job_database    INFO     |    Done. Exported 1 files in 0.01s
2025-10-07 14:40:06,628 dump_job_database    INFO     |    Exporting job metafile...
2025-10-07 14:40:06,629 dump_job_database    INFO     |    Creating .csg file for particles
2025-10-07 14:40:06,632 dump_job_database    INFO     |    Creating .csg file for class_averages
2025-10-07 14:40:06,635 dump_job_database    INFO     |    Done. Exported in 0.01s
2025-10-07 14:40:06,635 dump_job_database    INFO     |    Updating job manifest...
2025-10-07 14:40:06,635 dump_job_database    INFO     |    Done. Updated in 0.00s
2025-10-07 14:40:06,635 dump_job_database    INFO     | Exported P6 J80 in 0.21s
2025-10-07 14:40:06,637 export_project       INFO     | Exporting J9 in P6
2025-10-07 14:40:06,637 dump_job_database    INFO     | Request to export P6 J9
2025-10-07 14:40:06,638 dump_job_database    INFO     |    Exporting job to /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/J9
2025-10-07 14:40:06,638 dump_job_database    INFO     |    Exporting all of job's images in the database to /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/J9/gridfs_data...
2025-10-07 14:40:06,668 dump_job_database    INFO     |    Writing 16 database images to /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/J9/gridfs_data/gridfsdata_0
2025-10-07 14:40:06,668 dump_job_database    INFO     |    Done. Exported 16 images in 0.03s
2025-10-07 14:40:06,668 dump_job_database    INFO     |    Exporting all job's streamlog events...
2025-10-07 14:40:06,762 dump_job_database    INFO     |    Done. Exported 1 files in 0.09s
2025-10-07 14:40:06,762 dump_job_database    INFO     |    Exporting job metafile...
2025-10-07 14:40:06,764 dump_job_database    INFO     |    Creating .csg file for particles
2025-10-07 14:40:06,766 dump_job_database    INFO     |    Creating .csg file for micrographs
2025-10-07 14:40:06,769 dump_job_database    INFO     |    Creating .csg file for templates
2025-10-07 14:40:06,772 dump_job_database    INFO     |    Done. Exported in 0.01s
2025-10-07 14:40:06,772 dump_job_database    INFO     |    Updating job manifest...
2025-10-07 14:40:06,773 dump_job_database    INFO     |    Done. Updated in 0.00s
2025-10-07 14:40:06,773 dump_job_database    INFO     | Exported P6 J9 in 0.14s
2025-10-07 14:40:06,781 dump_project         INFO     | Exporting project P6
2025-10-07 14:40:06,782 dump_project         INFO     | Exported project P6 to /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/project.json in 0.00s
2025-10-07 14:40:06,782 dump_workspaces      INFO     | Exporting all workspaces in P6...
2025-10-07 14:40:06,783 dump_workspaces      INFO     | Exported all workspaces in P6 to /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/workspaces.json in 0.00s
2025-10-07 14:40:06,784 export_project       WARNING  | Exported jobs in P6 with errors
2025-10-07 15:04:15,997 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-07 16:04:16,443 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-07 17:04:16,816 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-07 18:04:17,695 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-07 19:04:17,716 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-07 20:04:18,684 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-07 21:04:18,806 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-07 22:04:19,545 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-07 23:04:20,290 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 00:04:21,031 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 01:04:21,302 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 02:04:21,986 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 03:04:22,730 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 04:04:23,280 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 05:04:24,131 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 06:04:24,610 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 07:04:25,080 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 08:04:25,609 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 09:04:26,346 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 10:04:27,081 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 11:04:27,847 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 12:04:28,654 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 13:04:28,782 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 14:04:29,462 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 15:04:30,298 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 16:04:30,971 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 17:04:31,768 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 18:04:32,627 background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2025-10-08 18:12:59,293 wrapper              ERROR    | JSONRPC ERROR at set_user_viewed_project
2025-10-08 18:12:59,293 wrapper              ERROR    | Traceback (most recent call last):
2025-10-08 18:12:59,293 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 196, in wrapper
2025-10-08 18:12:59,293 wrapper              ERROR    |     res = func(*args, **kwargs)
2025-10-08 18:12:59,293 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 1208, in set_user_viewed_project
2025-10-08 18:12:59,293 wrapper              ERROR    |     update_project(project_uid, {'last_accessed' : {'name' : get_username_by_id(user_id), 'accessed_at' : datetime.datetime.utcnow()}}, operation='$set', export=False)
2025-10-08 18:12:59,293 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 187, in wrapper
2025-10-08 18:12:59,293 wrapper              ERROR    |     return func(*args, **kwargs)
2025-10-08 18:12:59,293 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 250, in wrapper
2025-10-08 18:12:59,293 wrapper              ERROR    |     assert os.path.isfile(
2025-10-08 18:12:59,293 wrapper              ERROR    | AssertionError: validation error: lock file for P5 at /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/CS-caeel-ad318-mo-graphene-grid-anti-gfp-3/cs.lock absent or otherwise inaccessible. 
2025-10-08 18:13:00,435 wrapper              ERROR    | JSONRPC ERROR at set_user_viewed_workspace
2025-10-08 18:13:00,435 wrapper              ERROR    | Traceback (most recent call last):
2025-10-08 18:13:00,435 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 196, in wrapper
2025-10-08 18:13:00,435 wrapper              ERROR    |     res = func(*args, **kwargs)
2025-10-08 18:13:00,435 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 1234, in set_user_viewed_workspace
2025-10-08 18:13:00,435 wrapper              ERROR    |     update_workspace(project_uid, workspace_uid, {'last_accessed' : {'name' : get_username_by_id(user_id), 'accessed_at' : datetime.datetime.utcnow()}}, operation='$set', export=False)
2025-10-08 18:13:00,435 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 187, in wrapper
2025-10-08 18:13:00,435 wrapper              ERROR    |     return func(*args, **kwargs)
2025-10-08 18:13:00,435 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 250, in wrapper
2025-10-08 18:13:00,435 wrapper              ERROR    |     assert os.path.isfile(
2025-10-08 18:13:00,435 wrapper              ERROR    | AssertionError: validation error: lock file for P5 at /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/CS-caeel-ad318-mo-graphene-grid-anti-gfp-3/cs.lock absent or otherwise inaccessible. 
2025-10-08 18:13:02,372 wrapper              ERROR    | JSONRPC ERROR at set_user_viewed_project
2025-10-08 18:13:02,372 wrapper              ERROR    | Traceback (most recent call last):
2025-10-08 18:13:02,372 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 196, in wrapper
2025-10-08 18:13:02,372 wrapper              ERROR    |     res = func(*args, **kwargs)
2025-10-08 18:13:02,372 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 1208, in set_user_viewed_project
2025-10-08 18:13:02,372 wrapper              ERROR    |     update_project(project_uid, {'last_accessed' : {'name' : get_username_by_id(user_id), 'accessed_at' : datetime.datetime.utcnow()}}, operation='$set', export=False)
2025-10-08 18:13:02,372 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 187, in wrapper
2025-10-08 18:13:02,372 wrapper              ERROR    |     return func(*args, **kwargs)
2025-10-08 18:13:02,372 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 250, in wrapper
2025-10-08 18:13:02,372 wrapper              ERROR    |     assert os.path.isfile(
2025-10-08 18:13:02,372 wrapper              ERROR    | AssertionError: validation error: lock file for P5 at /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/CS-caeel-ad318-mo-graphene-grid-anti-gfp-3/cs.lock absent or otherwise inaccessible. 
2025-10-08 18:13:05,622 wrapper              ERROR    | JSONRPC ERROR at request_delete_project
2025-10-08 18:13:05,622 wrapper              ERROR    | Traceback (most recent call last):
2025-10-08 18:13:05,622 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 196, in wrapper
2025-10-08 18:13:05,622 wrapper              ERROR    |     res = func(*args, **kwargs)
2025-10-08 18:13:05,622 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 250, in wrapper
2025-10-08 18:13:05,622 wrapper              ERROR    |     assert os.path.isfile(
2025-10-08 18:13:05,622 wrapper              ERROR    | AssertionError: validation error: lock file for P5 at /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/CS-caeel-ad318-mo-graphene-grid-anti-gfp-3/cs.lock absent or otherwise inaccessible. 
2025-10-08 18:13:15,659 wrapper              ERROR    | JSONRPC ERROR at archive_project
2025-10-08 18:13:15,659 wrapper              ERROR    | Traceback (most recent call last):
2025-10-08 18:13:15,659 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 196, in wrapper
2025-10-08 18:13:15,659 wrapper              ERROR    |     res = func(*args, **kwargs)
2025-10-08 18:13:15,659 wrapper              ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 250, in wrapper
2025-10-08 18:13:15,659 wrapper              ERROR    |     assert os.path.isfile(
2025-10-08 18:13:15,659 wrapper              ERROR    | AssertionError: validation error: lock file for P5 at /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/CS-caeel-ad318-mo-graphene-grid-anti-gfp-3/cs.lock absent or otherwise inaccessible. 
2025-10-08 18:13:44,820 import_project       INFO     | Importing project from /data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2
2025-10-08 18:13:45,921 import_project       INFO     | Created project P8
2025-10-08 18:13:45,926 import_workspaces    INFO     | Created workspace W1 in P8
2025-10-08 18:13:46,081 import_jobs          INFO     | Inserting jobs into project...
2025-10-08 18:13:46,081 import_jobs          INFO     | Uploading image data for J1...
2025-10-08 18:13:46,081 import_jobs          INFO     | Done. Uploaded 0 files in 0.00s
2025-10-08 18:13:46,086 import_jobs          INFO     | Inserted job document in 0.01s...
2025-10-08 18:13:46,086 import_jobs          INFO     | Inserting streamlogs into jobs...
2025-10-08 18:13:46,088 import_project_run   ERROR    | Unable to import project from /data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2
2025-10-08 18:13:46,088 import_project_run   ERROR    | Traceback (most recent call last):
2025-10-08 18:13:46,088 import_project_run   ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4618, in import_project_run
2025-10-08 18:13:46,088 import_project_run   ERROR    |     warning = import_jobs(jobs_manifest, abs_path_export_project_dir, new_project_uid, owner_user_id, notification_id) or warning
2025-10-08 18:13:46,088 import_project_run   ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4904, in import_jobs
2025-10-08 18:13:46,088 import_project_run   ERROR    |     events_doc_data = BSON(openfile.read()).decode()
2025-10-08 18:13:46,088 import_project_run   ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/site-packages/bson/__init__.py", line 1446, in decode
2025-10-08 18:13:46,088 import_project_run   ERROR    |     return decode(self, codec_options)
2025-10-08 18:13:46,088 import_project_run   ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/site-packages/bson/__init__.py", line 1078, in decode
2025-10-08 18:13:46,088 import_project_run   ERROR    |     return cast("Union[dict[str, Any], _DocumentType]", _bson_to_dict(data, opts))
2025-10-08 18:13:46,088 import_project_run   ERROR    | bson.errors.InvalidBSON: not enough data for a BSON document
2025-10-08 18:13:46,090 dump_project         INFO     | Exporting project P8
2025-10-08 18:13:46,092 dump_project         INFO     | Exported project P8 to /data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/project.json in 0.00s
2025-10-08 18:13:46,094 run                  ERROR    | POST-RESPONSE-THREAD ERROR at import_project_run
2025-10-08 18:13:46,094 run                  ERROR    | Traceback (most recent call last):
2025-10-08 18:13:46,094 run                  ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 73, in run
2025-10-08 18:13:46,094 run                  ERROR    |     self.target(*self.args)
2025-10-08 18:13:46,094 run                  ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4618, in import_project_run
2025-10-08 18:13:46,094 run                  ERROR    |     warning = import_jobs(jobs_manifest, abs_path_export_project_dir, new_project_uid, owner_user_id, notification_id) or warning
2025-10-08 18:13:46,094 run                  ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4904, in import_jobs
2025-10-08 18:13:46,094 run                  ERROR    |     events_doc_data = BSON(openfile.read()).decode()
2025-10-08 18:13:46,094 run                  ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/site-packages/bson/__init__.py", line 1446, in decode
2025-10-08 18:13:46,094 run                  ERROR    |     return decode(self, codec_options)
2025-10-08 18:13:46,094 run                  ERROR    |   File "/usr/local/cryosparc4/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/site-packages/bson/__init__.py", line 1078, in decode
2025-10-08 18:13:46,094 run                  ERROR    |     return cast("Union[dict[str, Any], _DocumentType]", _bson_to_dict(data, opts))
2025-10-08 18:13:46,094 run                  ERROR    | bson.errors.InvalidBSON: not enough data for a BSON document
**custom thread exception hook caught something
**** handle exception rc
Traceback (most recent call last):
  File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_compute/jobs/runcommon.py", line 2306, in run_with_except_hook
    run_old(*args, **kw)
  File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 73, in run
    self.target(*self.args)
  File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4618, in import_project_run
    warning = import_jobs(jobs_manifest, abs_path_export_project_dir, new_project_uid, owner_user_id, notification_id) or warning
  File "/usr/local/cryosparc4/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 4904, in import_jobs
    events_doc_data = BSON(openfile.read()).decode()
  File "/usr/local/cryosparc4/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/site-packages/bson/__init__.py", line 1446, in decode
    return decode(self, codec_options)
  File "/usr/local/cryosparc4/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.10/site-packages/bson/__init__.py", line 1078, in decode
    return cast("Union[dict[str, Any], _DocumentType]", _bson_to_dict(data, opts))
bson.errors.InvalidBSON: not enough data for a BSON document


@HanW Please can you

  1. confirm whether you are concerned about the project at
    /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/
    
    and/or the project at
    /data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/
    
  2. describe the relationship between these two directories, which have the same name but different absolute paths?
  3. post the outputs of the commands
    df -Th /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/
    df -Th /data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/
    

Thanks for the reply!

The project was originally at the following location, which is also where I detached it from:

/data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/ 

But since this disk was almost full, I decided to copy it to

/data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/

So when I tried to attach the copied project in the second directory (in the same instance), I got 0 jobs. I also tried to attach the project in the first directory to see if I had missed something, but even the project in the first directory could not be attached correctly.

And here are the outputs of the commands:

df -Th /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/
Filesystem     Type  Size  Used Avail Use% Mounted on
/dev/sdb1      ext4  7,3T  6,9T  4,0K 100% /data/8TBssd1
df -Th /data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/
Filesystem     Type  Size  Used Avail Use% Mounted on
/dev/sde       ext4   13T   11T  1,2T  91% /data/14TBdisk2

It’s possible that a disk-full event corrupted the project directory. A corrupt project directory may lead to an aborted, incomplete project attachment, and an incompletely attached project may in turn cause an incomplete job_manifest.json to be written to the project directory. A partial recovery of the project may still be possible.
Please can you post the outputs of these commands

cat /data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/job_manifest.json
cat /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/job_manifest.json

On a side note: the error

2025-10-08 18:13:05,622 wrapper              ERROR    | AssertionError: validation error: lock file for P5 at /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/CS-caeel-ad318-mo-graphene-grid-anti-gfp-3/cs.lock

suggests that a CryoSPARC project directory may have been created inside another project directory. Was this a mistake?
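
If it helps to verify, a rough (unofficial) sketch to spot nested project directories is to list any cs.lock files found below the two paths mentioned above, for example with Python 3:

# Unofficial sketch: list cs.lock files anywhere below the two project
# directories; a hit below the top level would indicate a project-in-project.
import pathlib

for base in ("/data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2",
             "/data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2"):
    for lock in pathlib.Path(base).rglob("cs.lock"):
        print(lock)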

Thanks for the reply!

Here are the outputs

cat /data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/job_manifest.json
{
    "jobs": [
        "J1"
    ]
cat /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/job_manifest.json
{
    "jobs": [
        "J1",
        "J10",
        "J11",
        "J12",
        "J13",
        "J14",
        "J15",
        "J16",
        "J17",
        "J18",
        "J19",
        "J2",
        "J20",
        "J21",
        "J22",
        "J23",
        "J24",
        "J25",
        "J26",
        "J27",
        "J28",
        "J29",
        "J3",
        "J30",
        "J31",
        "J32",
        "J33",
        "J34",
        "J35",
        "J36",
        "J37",
        "J38",
        "J39",
        "J4",
        "J40",
        "J41",
        "J42",
        "J43",
        "J44",
        "J45",
        "J46",
        "J47",
        "J48",
        "J49",
        "J5",
        "J50",
        "J51",
        "J52",
        "J53",
        "J54",
        "J55",
        "J56",
        "J57",
        "J58",
        "J59",
        "J6",
        "J60",
        "J61",
        "J62",
        "J63",
        "J64",
        "J65",
        "J66",
        "J67",
        "J68",
        "J69",
        "J7",
        "J70",
        "J71",
        "J72",
        "J73",
        "J74",
        "J75",
        "J76",
        "J77",
        "J78",
        "J79",
        "J8",
        "J80",
        "J9"
    ]

To answer your question “CryoSPARC project directory may have been created inside another project directory. Was this a mistake?”: it may have happened because, when I found I could not attach the project at /data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/ properly, I tried to reattach the original project at /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/… I am sorry, I cannot really recall exactly what I did.

Please can you:

  1. save the following Python code to a file create_manifest.py on your CryoSPARC master computer:
    project_dir = "/data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2"
    unique_prefix = "mjqmyuZxyg"
    manifest_path = f"/tmp/{unique_prefix}.job_manifest.json"
    failed_job_docs_path = f"/tmp/{unique_prefix}.failed.json"
    
    import bson, json, pathlib
    
    jobs = []
    failed_job_dirs = {}
    
    project_path = pathlib.Path(project_dir)
    
    for doc in project_path.glob('J*/job.json'):
        job_dir = doc.parts[-2]
        try: 
            assert job_dir[1:].isdigit(), f"{doc} is not a valid job document path."
            with open(doc) as jhandle:
                data = json.load(jhandle)
            assert job_dir == data['uid'], f"Job uid does not match directory name {job_dir}."
            with open(project_path / job_dir / 'events.bson', 'rb') as bhandle:
                events = bson.decode(bhandle.read())
            jobs.append(data['uid'])
        except Exception as e:
            failed_job_dirs[job_dir] = str(e)
    
    with open(manifest_path, 'w') as mhandle:
        json.dump({'jobs': sorted(jobs)}, mhandle, indent=4) 
    
    if failed_job_dirs:
        with open(failed_job_docs_path, 'w') as fhandle:
            json.dump(failed_job_dirs, fhandle, indent=4)
    
    [script modified 2025-10-17: indentation]
  2. run the script with the command
    cryosparcm call python create_manifest.py
    
  3. post the output of the command
    cat /tmp/mjqmyuZxyg*.json

Hi,

Here is the output:

cat /tmp/mjqmyuZxyg*.json
{
    "J43": "objsize too large",
    "J68": "Unterminated string starting at: line 1409 column 16 (char 43965)",
    "J42": "objsize too large",
    "J15": "objsize too large"
}{
    "jobs": [
        "J1",
        "J10",
        "J11",
        "J12",
        "J13",
        "J14",
        "J16",
        "J17",
        "J18",
        "J19",
        "J2",
        "J20",
        "J21",
        "J22",
        "J23",
        "J24",
        "J25",
        "J26",
        "J27",
        "J28",
        "J29",
        "J3",
        "J30",
        "J31",
        "J32",
        "J33",
        "J34",
        "J35",
        "J36",
        "J37",
        "J38",
        "J39",
        "J4",
        "J40",
        "J41",
        "J44",
        "J45",
        "J46",
        "J47",
        "J48",
        "J49",
        "J5",
        "J50",
        "J51",
        "J52",
        "J53",
        "J54",
        "J55",
        "J56",
        "J57",
        "J58",
        "J59",
        "J6",
        "J60",
        "J61",
        "J62",
        "J63",
        "J64",
        "J65",
        "J66",
        "J67",
        "J69",
        "J7",
        "J70",
        "J71",
        "J72",
        "J73",
        "J74",
        "J75",
        "J76",
        "J77",
        "J78",
        "J79",
        "J8",
        "J80",
        "J9"
    ]

Thanks @HanW. Next you could try running these commands (as the Linux user owning the CryoSPARC instance)

# change into the project directory
cd /data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/
# replace the truncated job manifest
cp /tmp/mjqmyuZxyg.job_manifest.json job_manifest.json
# backup the workspaces doc
mv workspaces.json workspaces.json.$(date +%s)
# use the workspaces doc from the old version of the project directory
cp /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/workspaces.json .

and then attaching the project stored at

/data/14TBdisk2/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/

In case this procedure does not attach the project, including all jobs except J15, J42, J43, and J68, please post the related command_core log entries and the output of the command

cat /data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/workspaces.json
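
As an optional side check before copying, a rough sketch (assuming Python 3; I am not certain whether workspaces.json is always stored as a single JSON document, so treat any parse error as information to post rather than proof of damage) to confirm the old workspaces.json can still be read and parsed:

# Unofficial sketch: check whether the workspaces.json on the old (full) disk
# can be opened and parsed; print the exception if it cannot.
import json

path = "/data/8TBssd1/cryosparcuser/CS-caeel-ad318-mo-graphene-grid-anti-gfp-2/workspaces.json"
try:
    with open(path) as handle:
        json.load(handle)
    print(f"{path} parsed OK")
except (OSError, json.JSONDecodeError) as e:
    print(f"{path}: {e}")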