No jobs showing up after recreating database

Hi there,
I recently had a hard disk failure on the disk containing the home directory and the cryosparc database. I recovered the disk following this guide (restoring the superblock from a backup on the disk), but the cryosparc_database got corrupted. There was no suitable backup of the database (huge mistake, I know), and I was unable to restore it with mongod. I tried deleting the database and creating an empty one following this solution, which seems to have worked. Trying to attach P1 then failed due to a live session that was running at the time of the disk failure, so I deleted the session folder from the project directory. After this, P1 was successfully (?) attached, and the workspaces and live sessions appear:


The jobs, however, do not show up within the workspaces, even though the statistics (number of completed/failed/killed jobs) show up correctly in the workspace details:

Trying to run any new job fails because it is assigned a job ID that already exists (starting at J1). Running cryosparcm test workers also fails for the same reason.

Is there a way to salvage my earlier jobs? I think starting a new project should work for any later data processing, but if possible, it would be nice to get my earlier results back.
Job log from the failed worker test:


================= CRYOSPARCW =======  2022-07-25 11:01:55.677029  =========
Project P1 Job J1
Master biopc6.bcbp.gu.se Port 61002
===========================================================================
========= monitor process now starting main process
MAINPROCESS PID 14836
MAIN PID 14836
imports.run cryosparc_compute.jobs.jobregister
========= monitor process now waiting for main process
========= sending heartbeat
========= sending heartbeat
TIFFReadDirectory: Warning, Bogus "StripByteCounts" field, ignoring and calculating from imagelength.
TIFFReadDirectory: Warning, Bogus "StripByteCounts" field, ignoring and calculating from imagelength.
[... the same TIFFReadDirectory warning repeated many more times ...]
========= sending heartbeat
TIFFFetchDirectory: Can not read TIFF directory count.
TIFFReadDirectory: Failed to read directory at offset 1572864.
/data1/szabolcs/2022-07-04_Westenhoff_PaPhy-FL-initial_EPU/Images-Disc1/GridSquare_3686077/Data/FoilHole_4773925_Data_3688194_3688196_20220705_190027_Fractions.tif: Not a TIFF or MDI file, bad magic number 0 (0x0).
/data1/szabolcs/2022-07-04_Westenhoff_PaPhy-FL-initial_EPU/Images-Disc1/GridSquare_3686077/Data/FoilHole_4773937_Data_3688194_3688196_20220705_190546_Fractions.tif: Not a TIFF or MDI file, bad magic number 0 (0x0).
***************************************************************
========= sending heartbeat
[... heartbeat messages repeated ...]
min: 60566.393555 max: 73381.872070
min: 1030643.386719 max: 1112529.113281
min: 1030643.386719 max: 1112529.113281
min: 252635.568848 max: 283566.743652
min: 60659.390869 max: 73373.327881
min: 60553.894531 max: 73431.792969
***************************************************************
Traceback (most recent call last):
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/jobs/imports/run.py", line 870, in run_import_movies_or_micrographs
    result.get()
  File "/home/analys/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/multiprocessing/pool.py", line 657, in get
    raise self._value
  File "/home/analys/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/jobs/imports/run.py", line 788, in header_check_worker
    eer_upsamp_factor=eer_upsamp_factor)
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/jobs/imports/run.py", line 677, in read_movie_header
    shape = tiff.read_tiff_shape(abs_path)
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/blobio/tiff.py", line 69, in read_tiff_shape
    tif = libtiff.TIFF.open(path)
  File "/home/analys/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/site-packages/libtiff/libtiff_ctypes.py", line 501, in open
    raise TypeError('Failed to open file ' + repr(filename))
TypeError: Failed to open file b'/data1/szabolcs/2022-07-04_Westenhoff_PaPhy-FL-initial_EPU/Images-Disc1/GridSquare_3686077/Data/FoilHole_4773923_Data_3688194_3688196_20220705_185941_Fractions.tif'
Traceback (most recent call last):
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/jobs/imports/run.py", line 870, in run_import_movies_or_micrographs
    result.get()
  File "/home/analys/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/multiprocessing/pool.py", line 657, in get
    raise self._value
  File "/home/analys/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/jobs/imports/run.py", line 788, in header_check_worker
    eer_upsamp_factor=eer_upsamp_factor)
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/jobs/imports/run.py", line 677, in read_movie_header
    shape = tiff.read_tiff_shape(abs_path)
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/blobio/tiff.py", line 69, in read_tiff_shape
    tif = libtiff.TIFF.open(path)
  File "/home/analys/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/site-packages/libtiff/libtiff_ctypes.py", line 501, in open
    raise TypeError('Failed to open file ' + repr(filename))
TypeError: Failed to open file b'/data1/szabolcs/2022-07-04_Westenhoff_PaPhy-FL-initial_EPU/Images-Disc1/GridSquare_3686077/Data/FoilHole_4773925_Data_3688194_3688196_20220705_190027_Fractions.tif'
Traceback (most recent call last):
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/jobs/imports/run.py", line 870, in run_import_movies_or_micrographs
    result.get()
  File "/home/analys/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/multiprocessing/pool.py", line 657, in get
    raise self._value
  File "/home/analys/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/jobs/imports/run.py", line 788, in header_check_worker
    eer_upsamp_factor=eer_upsamp_factor)
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/jobs/imports/run.py", line 677, in read_movie_header
    shape = tiff.read_tiff_shape(abs_path)
  File "/home/analys/cryosparc/cryosparc_master/cryosparc_compute/blobio/tiff.py", line 69, in read_tiff_shape
    tif = libtiff.TIFF.open(path)
  File "/home/analys/cryosparc/cryosparc_master/deps/anaconda/envs/cryosparc_master_env/lib/python3.7/site-packages/libtiff/libtiff_ctypes.py", line 501, in open
    raise TypeError('Failed to open file ' + repr(filename))
TypeError: Failed to open file b'/data1/szabolcs/2022-07-04_Westenhoff_PaPhy-FL-initial_EPU/Images-Disc1/GridSquare_3686077/Data/FoilHole_4773937_Data_3688194_3688196_20220705_190546_Fractions.tif'
========= main process now complete.
========= monitor process now complete.

Hi @Boszlacs ,

Thanks for the detailed report. My initial question would be whether you can access job data via the MongoDB command line:

  1. cryosparcm mongo
  2. > db.jobs.count()
  3. > db.events.count()
  4. exit
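
If it helps to see which projects those documents belong to, the same database can also be queried from the interactive Python shell (cryosparcm icli), which exposes a db handle; a minimal sketch that groups job documents by project:

# Count job documents per project_uid (run inside cryosparcm icli).
for row in db.jobs.aggregate([{"$group": {"_id": "$project_uid", "jobs": {"$sum": 1}}}]):
    print(row["_id"], row["jobs"])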

In the project directory that you’ve attached, are you able to see folders for each job and a job.json for each?

- Suhail

Hi,

Running the MongoDB commands returns 8 jobs: the 3 failed jobs (instance testing) from the incorrectly restored P1 and the 5 currently running jobs (so far without problems) from a new project I started. I cannot make sense of the number of events, but I had about 700 jobs in P1 in total, if that helps.

(base) analys@biopc6:~$ cryosparcm mongo
MongoDB shell version v3.6.23
connecting to: mongodb://biopc6.kemi.uu.se:61001/meteor?authSource=admin&gssapiServiceName=mongodb
Implicit session: session { "id" : UUID("c0b4d60c-6086-4676-8d9f-5b92553f209a") }
MongoDB server version: 3.6.23
Welcome to the MongoDB shell.
For interactive help, type "help".
For more comprehensive documentation, see
	http://docs.mongodb.org/
Questions? Try the support group
	http://groups.google.com/group/mongodb-user
meteor:PRIMARY> db.jobs.count()
8
meteor:PRIMARY> db.events.count()
2664
meteor:PRIMARY> exit
bye

I can see the job folders in the project directory, and I have found a job.json file in the couple that I have checked.



You may want to examine the command_core log to check on success or failure of the project’s attachment:

cryosparcm log command_core
cryosparcm filterlog command_core -l ERROR
cryosparcm filterlog command_core -f import_project
cryosparcm filterlog command_core -f import_workspaces
cryosparcm filterlog command_core -f import_jobs

After checking the command_core log, I see this error message being written constantly:

2023-09-29 15:39:05,151 wrapper          	ERROR	| Traceback (most recent call last):
2023-09-29 15:39:05,151 wrapper          	ERROR	|   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 195, in wrapper
2023-09-29 15:39:05,151 wrapper          	ERROR	| 	res = func(*args, **kwargs)
2023-09-29 15:39:05,151 wrapper          	ERROR	|   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 7203, in get_job_status
2023-09-29 15:39:05,151 wrapper          	ERROR	| 	return get_job(project_uid, job_uid, 'status')['status']
2023-09-29 15:39:05,151 wrapper          	ERROR	|   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 186, in wrapper
2023-09-29 15:39:05,151 wrapper          	ERROR	| 	return func(*args, **kwargs)
2023-09-29 15:39:05,151 wrapper          	ERROR	|   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 5784, in get_job
2023-09-29 15:39:05,151 wrapper          	ERROR	| 	raise ValueError(f"{project_uid} {job_uid} does not exist.")
2023-09-29 15:39:05,151 wrapper          	ERROR	| ValueError: P1 J717 does not exist.
2023-09-29 15:39:06,289 wrapper          	ERROR	| JSONRPC ERROR at get_job_status
2023-09-29 15:39:06,289 wrapper          	ERROR	| Traceback (most recent call last):
2023-09-29 15:39:06,289 wrapper          	ERROR	|   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 195, in wrapper
2023-09-29 15:39:06,289 wrapper          	ERROR	| 	res = func(*args, **kwargs)
2023-09-29 15:39:06,289 wrapper          	ERROR	|   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 7203, in get_job_status
2023-09-29 15:39:06,289 wrapper          	ERROR	| 	return get_job(project_uid, job_uid, 'status')['status']
2023-09-29 15:39:06,289 wrapper          	ERROR	|   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 186, in wrapper
2023-09-29 15:39:06,289 wrapper          	ERROR	| 	return func(*args, **kwargs)
2023-09-29 15:39:06,289 wrapper          	ERROR	|   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 5784, in get_job
2023-09-29 15:39:06,289 wrapper          	ERROR	| 	raise ValueError(f"{project_uid} {job_uid} does not exist.")
2023-09-29 15:39:06,289 wrapper          	ERROR	| ValueError: P1 J717 does not exist.

This persists even after detaching P1 and deleting it from the database. As far as I can tell, this is all the log contains, and no import_project/workspaces/jobs entry can be found. J717 is a directory containing only an empty job.log file, and it is among the last jobs launched in P1 (720 jobs total).
I have given up on P1 for now and am working in a newly created project directory, but this constant error message worries me.

Hi @Boszlacs, we typically see this issue when a Live session was improperly paused and is in an inconsistent state. To identify this problem, use CryoSPARC’s interactive Python shell:

cryosparcm icli

Enter the following Python code at the prompt:

sessions = list(
    db.workspaces.find(
        {
            "status": "paused",
            "$or": [
                {"rtp_childs": {"$exists": True, "$ne": []}},
                {"rtp_workers": {"$exists": True, "$ne": {}}},
            ]
        },
        {"project_uid": 1, "session_uid": 1, "rtp_childs": 1, "rtp_workers": 1},
    )
)
print(sessions)

If you see any output other than [], there are inconsistent sessions.

To fix the issue, enter the following code at the prompt:

db.workspaces.update_many(
    {
        "status": "paused",
        "$or": [
            {"rtp_childs": {"$exists": True, "$ne": []}},
            {"rtp_workers": {"$exists": True, "$ne": {}}},
        ],
    },
    {"$set": {"rtp_childs": [], "rtp_workers": {}}},
)
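
To double-check that the cleanup took effect, the diagnostic query from above can be re-run; an empty list means no inconsistent sessions remain (this just repeats the earlier filter with a smaller projection):

# Re-run the diagnostic query; [] means nothing is left to fix.
remaining = list(
    db.workspaces.find(
        {
            "status": "paused",
            "$or": [
                {"rtp_childs": {"$exists": True, "$ne": []}},
                {"rtp_workers": {"$exists": True, "$ne": {}}},
            ],
        },
        {"project_uid": 1, "session_uid": 1},
    )
)
print(remaining)  # expect []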

Enter exit() to quit the interactive prompt.

Future versions of CryoSPARC will address this issue so this workaround will no longer be required. Let me know how that goes!


Hi @Nfrasser,

Thanks a lot for the tip! This has indeed fixed the error I was seeing. The session in question was the one running at the time of the crash.
I have tried attaching the problematic project again, but I still see the same issue - the workspaces show up but they are empty.

@Boszlacs Please can you inspect your command_core log for errors:

cryosparcm filterlog command_core -l ERROR
cryosparcm filterlog command_core -f import_workspaces
cryosparcm filterlog command_core -f import_jobs
cryosparcm filterlog command_core -f import_project

(base) analys@biopc6:~$ cryosparcm filterlog command_core -l ERROR
2023-10-12 12:41:37,687 wrapper              ERROR    | JSONRPC ERROR at dump_workspaces
2023-10-12 12:41:37,687 wrapper              ERROR    | Traceback (most recent call last):
2023-10-12 12:41:37,687 wrapper              ERROR    |   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 195, in wrapper
2023-10-12 12:41:37,687 wrapper              ERROR    |     res = func(*args, **kwargs)
2023-10-12 12:41:37,687 wrapper              ERROR    |   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 258, in wrapper
2023-10-12 12:41:37,687 wrapper              ERROR    |     assert not project['detached'], f"validation error: project {project_uid} is detached"
2023-10-12 12:41:37,687 wrapper              ERROR    | AssertionError: validation error: project P3 is detached
2023-10-12 13:46:18,920 wrapper              ERROR    | JSONRPC ERROR at attach_project
2023-10-12 13:46:18,920 wrapper              ERROR    | Traceback (most recent call last):
2023-10-12 13:46:18,920 wrapper              ERROR    |   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 195, in wrapper
2023-10-12 13:46:18,920 wrapper              ERROR    |     res = func(*args, **kwargs)
2023-10-12 13:46:18,920 wrapper              ERROR    |   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/command_core/__init__.py", line 8604, in attach_project
2023-10-12 13:46:18,920 wrapper              ERROR    |     assert project_instance_uid != get_instance_uid(), "attach error: project already locked to current instance"
2023-10-12 13:46:18,920 wrapper              ERROR    | AssertionError: attach error: project already locked to current instance
2023-10-12 14:01:24,735 wrapper              ERROR    | JSONRPC ERROR at dump_workspaces
2023-10-12 14:01:24,735 wrapper              ERROR    | Traceback (most recent call last):
2023-10-12 14:01:24,735 wrapper              ERROR    |   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 195, in wrapper
2023-10-12 14:01:24,735 wrapper              ERROR    |     res = func(*args, **kwargs)
2023-10-12 14:01:24,735 wrapper              ERROR    |   File "/home/analys/cryosparc/cryosparc_master/cryosparc_command/commandcommon.py", line 258, in wrapper
2023-10-12 14:01:24,735 wrapper              ERROR    |     assert not project['detached'], f"validation error: project {project_uid} is detached"
2023-10-12 14:01:24,735 wrapper              ERROR    | AssertionError: validation error: project P3 is detached

(base) analys@biopc6:~$ cryosparcm filterlog command_core -f import_workspaces
2023-10-12 14:01:12,822 import_workspaces    INFO     | Created workspace W1 in P5
2023-10-12 14:01:12,906 import_workspaces    INFO     | Created workspace W2 in P5
2023-10-12 14:01:12,997 import_workspaces    INFO     | Created workspace W8 in P5
2023-10-12 14:01:13,107 import_workspaces    INFO     | Created workspace W16 in P5
2023-10-12 14:01:13,206 import_workspaces    INFO     | Created workspace W17 in P5
2023-10-12 14:01:13,274 import_workspaces    INFO     | Created workspace W29 in P5
2023-10-12 14:01:13,381 import_workspaces    INFO     | Created workspace W32 in P5
2023-10-12 14:01:13,479 import_workspaces    INFO     | Created workspace W33 in P5
2023-10-12 14:01:13,592 import_workspaces    INFO     | Created workspace W14 in P5
2023-10-12 14:01:13,614 import_workspaces    INFO     | CryoSPARC Live session detected. Importing S12 into W14 in P5
2023-10-12 14:01:13,615 import_workspaces    INFO     | CryoSPARC Live S12: Uploading project image data...
2023-10-12 14:01:13,628 import_workspaces    INFO     | CryoSPARC Live S12: Done. Uploaded 0 files in 0.01s
2023-10-12 14:01:13,639 import_workspaces    INFO     | CryoSPARC Live S12: Inserting exposures into session...
2023-10-12 14:01:13,649 import_workspaces    WARNING  | CryoSPARC Live S12: Unable to locate exported exposure document
2023-10-12 14:01:13,650 import_workspaces    INFO     | CryoSPARC Live S12: Completed Import in 0.06s
2023-10-12 14:01:13,726 import_workspaces    INFO     | Created workspace W28 in P5
2023-10-12 14:01:13,759 import_workspaces    INFO     | CryoSPARC Live session detected. Importing S28 into W28 in P5
2023-10-12 14:01:13,759 import_workspaces    INFO     | CryoSPARC Live S28: Uploading project image data...
2023-10-12 14:12:57,528 import_workspaces    INFO     | CryoSPARC Live S28: Done. Uploaded 98563 files in 703.77s
2023-10-12 14:12:57,528 import_workspaces    INFO     | CryoSPARC Live S28: Inserting exposures into session...
2023-10-12 14:13:20,734 import_workspaces    INFO     | CryoSPARC Live S28: Done. Inserted 13501 exposures in 23.21s...
2023-10-12 14:13:20,734 import_workspaces    INFO     | CryoSPARC Live S28: Completed Import in 727.00s
2023-10-12 14:13:20,738 import_workspaces    INFO     | Created workspace W30 in P5
2023-10-12 14:13:20,740 import_workspaces    INFO     | CryoSPARC Live session detected. Importing S29 into W30 in P5
2023-10-12 14:13:20,740 import_workspaces    INFO     | CryoSPARC Live S29: Uploading project image data...
2023-10-12 14:13:59,577 import_workspaces    INFO     | CryoSPARC Live S29: Done. Uploaded 7779 files in 38.84s
2023-10-12 14:13:59,577 import_workspaces    INFO     | CryoSPARC Live S29: Inserting exposures into session...
2023-10-12 14:14:01,097 import_workspaces    INFO     | CryoSPARC Live S29: Done. Inserted 1065 exposures in 1.52s...
2023-10-12 14:14:01,097 import_workspaces    INFO     | CryoSPARC Live S29: Completed Import in 40.36s

(base) analys@biopc6:~$ cryosparcm filterlog command_core -f import_jobs
2023-10-12 14:14:01,157 import_jobs          INFO     | Inserting jobs into project...
2023-10-12 14:14:01,158 import_jobs          INFO     | Uploading image data for J1...
2023-10-12 14:14:01,192 import_jobs          INFO     | Done. Uploaded 7 files in 0.03s
2023-10-12 14:14:01,347 import_jobs          INFO     | Inserted job document in 0.19s...
2023-10-12 14:14:01,348 import_jobs          INFO     | Inserting streamlogs into jobs...
2023-10-12 14:14:01,348 import_jobs          INFO     | Done. Inserted 0 streamlogs in 0.00s...
2023-10-12 14:14:01,349 import_jobs          INFO     | Imported J1 into P5 in 0.19s...
2023-10-12 14:14:01,350 import_jobs          INFO     | Uploading image data for J2...
2023-10-12 14:14:01,351 import_jobs          INFO     | Done. Uploaded 0 files in 0.00s
2023-10-12 14:14:01,354 import_jobs          INFO     | Inserted job document in 0.00s...
2023-10-12 14:14:01,354 import_jobs          INFO     | Inserting streamlogs into jobs...
2023-10-12 14:14:01,498 import_jobs          INFO     | Done. Inserted 2 streamlogs in 0.14s...
2023-10-12 14:14:01,498 import_jobs          INFO     | Imported J2 into P5 in 0.15s...
2023-10-12 14:14:01,500 import_jobs          INFO     | Uploading image data for J3...
2023-10-12 14:14:02,111 import_jobs          INFO     | Done. Uploaded 102 files in 0.61s
2023-10-12 14:14:02,116 import_jobs          INFO     | Inserted job document in 0.62s...
2023-10-12 14:14:02,117 import_jobs          INFO     | Inserting streamlogs into jobs...
2023-10-12 14:14:02,117 import_jobs          INFO     | Done. Inserted 0 streamlogs in 0.00s...
2023-10-12 14:14:02,117 import_jobs          INFO     | Imported J3 into P5 in 0.62s...

(base) analys@biopc6:~$ cryosparcm filterlog command_core -f import_project
2023-10-12 14:01:11,879 import_project       INFO     | Importing project from /data1/szabolcs/P1
2023-10-12 14:01:12,665 import_project       INFO     | Created project P5

(base) analys@biopc6:~$

Jobs J1-J3 do not show up in the UI. The contents of the job directories seem inconsistent; for example, J1 was originally an Import Movies job, but its job.json looks like a worker test job, and the associated W32 was created for instance testing right after the hard drive crash, when the database got corrupted.
Job directories:

(base) analys@biopc6:/data1/szabolcs/P1/J1$ ls -lah
total 704K
drwxrwxrwx   4 1024 users 4,0K jul 25  2022 .
drwxrwxrwx 730 1024 users  20K okt 12 14:08 ..
-rwxrwxrwx   1 1024 users   18 okt 12 14:01 events.bson
-rwxrwxrwx   1 1024 users  761 jul 25  2022 failed_movies.cs
drwxrwxrwx   2 1024 users 4,0K jul 25  2022 gridfs_data
drwxrwxrwx   2 1024 users 284K jul 25  2022 imported
-rwxrwxrwx   1 1024 users 340K jul 25  2022 imported_movies.cs
-rwxrwxrwx   1 1024 users 4,2K okt 12 14:01 job.json
-rwxrwxrwx   1 1024 users  21K jul 25  2022 job.log
-rwxrwxrwx   1 1024 users  283 jul 28  2022 P1_J1_failed_movies.csg
-rwxrwxrwx   1 1024 users  372 jul 28  2022 P1_J1_imported_movies.csg
(base) analys@biopc6:/data1/szabolcs/P1/J1$ ls -lah ../J2
total 44K
drwxrwxrwx   3 1024 users 4,0K okt 12 13:57 .
drwxrwxrwx 730 1024 users  20K okt 12 14:08 ..
-rwxrwxrwx   1 1024 users  425 okt 12 14:01 events.bson
drwxrwxrwx   2 1024 users 4,0K okt 12 13:57 gridfs_data
-rwxrwxrwx   1 1024 users  11K okt 12 14:01 job.json
(base) analys@biopc6:/data1/szabolcs/P1/J1$ ls -lah ../J3
total 5,0M
drwxrwxrwx   4 1024 users 4,0K okt 18  2022 .
drwxrwxrwx 730 1024 users  20K okt 12 14:08 ..
drwxrwxrwx   2 1024 users 1,1M jul 25  2022 ctfestimated
-rwxrwxrwx   1 1024 users   18 okt 12 14:01 events.bson
-rwxrwxrwx   1 1024 users 974K jul 25  2022 exposures_ctf_estimated.cs
drwxrwxrwx   2 1024 users 4,0K jul 25  2022 gridfs_data
-rwxrwxrwx   1 1024 users 1,5K mar  7  2023 J3_exposures.csg
-rwxrwxrwx   1 1024 users  30K okt 12 14:01 job.json
-rwxrwxrwx   1 1024 users  14K jul 25  2022 job.log
-rwxrwxrwx   1 1024 users 1,5K sep 14  2022 P1_J3_exposures.csg
-rwxrwxrwx   1 1024 users 2,9M jul 25  2022 P1_J3_passthrough_exposures.cs
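
To survey what the on-disk metadata claims each job is, a short script over the project directory can print the uid, job_type, and status recorded in every job.json (a sketch, assuming the project lives at /data1/szabolcs/P1 and each job directory contains a job.json like the one below):

import json
from pathlib import Path

# Report what each job.json records, to spot mismatches with the folder contents.
project_dir = Path("/data1/szabolcs/P1")
for job_json in sorted(project_dir.glob("J*/job.json")):
    doc = json.loads(job_json.read_text())
    print(job_json.parent.name, doc.get("uid"), doc.get("job_type"), doc.get("status"))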

J1 job.json:

{
    "project_uid": "P4",
    "uid": "J1",
    "PID_main": null,
    "PID_monitor": null,
    "PID_workers": [],
    "bench": {},
    "children": [],
    "cloned_from": null,
    "cluster_job_custom_vars": {},
    "cluster_job_id": null,
    "cluster_job_monitor_event_id": null,
    "cluster_job_monitor_last_run_at": null,
    "cluster_job_monitor_retries": 0,
    "cluster_job_status": null,
    "cluster_job_status_code": null,
    "cluster_job_submission_script": null,
    "completed_at": null,
    "completed_count": 0,
    "created_at": {
        "$date": 1693417196297
    },
    "created_by_job_uid": null,
    "created_by_user_id": "64ef8007a234bfd23e6cff9f",
    "deleted": true,
    "description": "Enter a description.",
    "enable_bench": false,
    "errors_build_inputs": {},
    "errors_build_params": {},
    "errors_run": [],
    "experiment_worker_path": null,
    "failed_at": null,
    "generate_intermediate_results": false,
    "heartbeat_at": null,
    "input_slot_groups": [],
    "instance_information": {},
    "interactive": false,
    "interactive_hostname": "",
    "interactive_port": null,
    "intermediate_results_size_bytes": 0,
    "intermediate_results_size_last_updated": {
        "$date": 1697110164659
    },
    "is_ancestor_of_final_result": false,
    "is_experiment": false,
    "is_final_result": false,
    "job_dir": "J1",
    "job_dir_size": 3687040,
    "job_dir_size_last_updated": {
        "$date": 1697110164659
    },
    "job_type": "instance_launch_test",
    "killed_at": null,
    "last_accessed": {
        "name": "analys",
        "accessed_at": {
            "$date": 1697111387336
        }
    },
    "last_intermediate_data_cleared_amount": 0,
    "last_intermediate_data_cleared_at": null,
    "last_scheduled_at": null,
    "last_updated": {
        "$date": 1697111424766
    },
    "launched_at": null,
    "output_group_images": {},
    "output_result_groups": [],
    "output_results": [],
    "params_base": {
        "use_all_gpus": {
            "type": "boolean",
            "value": true,
            "title": "Benchmark all available GPUs",
            "desc": "If enabled, benchmark all available GPUs on the target. This option may not work when submitting to a cluster resource manager.",
            "order": 0,
            "section": "resource_settings",
            "advanced": false,
            "hidden": true
        },
        "gpu_num_gpus": {
            "type": "number",
            "value": 0,
            "title": "Number of GPUs to benchmark",
            "desc": "The number of GPUs to request from the scheduler.",
            "order": 1,
            "section": "resource_settings",
            "advanced": false,
            "hidden": true
        },
        "use_ssd": {
            "type": "boolean",
            "value": false,
            "title": "Use SSD for Tests",
            "desc": "Whether or not to use the SSD on the worker for the tests.",
            "order": 2,
            "section": "resource_settings",
            "advanced": false,
            "hidden": true
        }
    },
    "params_secs": {
        "resource_settings": {
            "title": "Resource Settings",
            "desc": "",
            "order": 0
        }
    },
    "params_spec": {},
    "parents": [],
    "priority": 0,
    "project_uid_num": 1,
    "queue_index": null,
    "queue_message": null,
    "queue_status": null,
    "queued_at": null,
    "queued_job_hash": null,
    "queued_to_lane": "",
    "resources_allocated": {},
    "resources_needed": {
        "slots": {
            "CPU": 1,
            "GPU": 0,
            "RAM": 1
        },
        "fixed": {
            "SSD": false
        }
    },
    "run_as_user": null,
    "running_at": null,
    "started_at": null,
    "status": "building",
    "title": "New Job J1",
    "tokens_acquired_at": null,
    "tokens_requested_at": null,
    "type": "instance_launch_test",
    "ui_tile_height": 1,
    "ui_tile_images": [],
    "ui_tile_width": 1,
    "uid_num": 1,
    "version": "v4.3.0",
    "waiting_at": null,
    "workspace_uids": [
        "W32"
    ],
    "ui_layouts": {},
    "imported": true,
    "last_exported": {
        "$date": 1697111395060
    }
}

Please can you describe the contexts of

  1.  2023-10-12 12:41:37,687 wrapper              ERROR    | AssertionError: validation error: project P3 is detached
    
  2.  2023-10-12 13:46:18,920 wrapper              ERROR    | AssertionError: attach error: project already locked to current instance
    

and post the output of the cryosparcm icli command

cryosparcm icli # enter the interactive cli
[p for p in db.projects.find({}, {'_id': 0, 'uid': 1, 'project_dir': 1, 'created_at': 1, 'imported_at': 1, 'import_status': 1})]
exit() # exit the icli

In [1]: [p for p in db.projects.find({}, {'_id': 0, 'uid': 1, 'project_dir': 1, 'created_at': 1, 'imported_at': 1, 'import_status': 1})]
Out[1]: 
[{'uid': 'P1',
  'project_dir': '/data1/szabolcs/P1',
  'created_at': datetime.datetime(2022, 7, 25, 8, 51, 43, 59000),
  'import_status': 'complete',
  'imported_at': datetime.datetime(2023, 8, 30, 19, 16, 40, 526000)},
 {'uid': 'P2',
  'project_dir': '/data1/szabolcs/CS-pabphp',
  'created_at': datetime.datetime(2023, 8, 31, 11, 6, 45, 511000)},
 {'uid': 'P3',
  'project_dir': '/data1/szabolcs//P1',
  'created_at': datetime.datetime(2022, 7, 25, 8, 51, 43, 59000),
  'import_status': 'complete',
  'imported_at': datetime.datetime(2023, 9, 7, 20, 10, 4, 832000)},
 {'uid': 'P4',
  'project_dir': '/data1/szabolcs/P1',
  'created_at': datetime.datetime(2022, 7, 25, 8, 51, 43, 59000),
  'import_status': 'complete',
  'imported_at': datetime.datetime(2023, 10, 5, 7, 29, 34, 411000)},
 {'uid': 'P5',
  'project_dir': '/data1/szabolcs/P1',
  'created_at': datetime.datetime(2022, 7, 25, 8, 51, 43, 59000),
  'import_status': 'complete',
  'imported_at': datetime.datetime(2023, 10, 12, 12, 1, 12, 425000)}]

I am not sure about the error messages - P3 was one of the attempts at reattaching /data1/szabolcs/P1, but it was detached and deleted from the database with the corresponding buttons in the UI.

Thanks. Please can you also run this command in the icli:

[p for p in db.projects.find({}, {'_id': 0, 'uid': 1, 'project_dir': 1, 'archived': 1, 'detached': 1, 'deleted': 1, 'imported': 1})]

In [1]: [p for p in db.projects.find({}, {'_id': 0, 'uid': 1, 'project_dir': 1, 'archived': 1, 'detached': 1, 'deleted': 1, 'imported': 1})]
Out[1]: 
[{'uid': 'P1',
  'project_dir': '/data1/szabolcs/P1',
  'deleted': True,
  'archived': False,
  'detached': True,
  'imported': True},
 {'uid': 'P2',
  'project_dir': '/data1/szabolcs/CS-pabphp',
  'deleted': False,
  'archived': False,
  'detached': False},
 {'uid': 'P3',
  'project_dir': '/data1/szabolcs//P1',
  'deleted': True,
  'archived': False,
  'detached': True,
  'imported': True},
 {'uid': 'P4',
  'project_dir': '/data1/szabolcs/P1',
  'deleted': True,
  'archived': False,
  'detached': True,
  'imported': True},
 {'uid': 'P5',
  'project_dir': '/data1/szabolcs/P1',
  'deleted': False,
  'archived': False,
  'detached': False,
  'imported': True}]

Thanks. Maybe this query can shed some light on the mismatching job types that you observed for some job directories:

[j for j in db.jobs.find({"project_uid": {"$nin": ["P2"]}},{"_id": 0, "project_uid": 1, "uid": 1, "job_type": 1, "status": 1, "created_at": 1})]

In [1]: [j for j in db.jobs.find({"project_uid": {"$nin": ["P2"]}},{"_id": 0, "project_uid": 1, "uid": 1, "job_type": 1, "status": 1, "created_at": 1})]
Out[1]: 
[{'project_uid': 'P1',
  'uid': 'J3',
  'created_at': datetime.datetime(2023, 8, 30, 19, 50, 24, 365000),
  'job_type': 'patch_motion_correction_multi',
  'status': 'building'},
 {'project_uid': 'P1',
  'uid': 'J1',
  'created_at': datetime.datetime(2023, 8, 30, 17, 39, 56, 297000),
  'job_type': 'instance_launch_test',
  'status': 'building'},
 {'project_uid': 'P1',
  'uid': 'J2',
  'created_at': datetime.datetime(2023, 8, 30, 19, 47, 47, 854000),
  'job_type': 'import_movies',
  'status': 'failed'},
 {'project_uid': 'P3',
  'uid': 'J3',
  'created_at': datetime.datetime(2023, 8, 30, 19, 50, 24, 365000),
  'job_type': 'patch_motion_correction_multi',
  'status': 'building'},
 {'project_uid': 'P3',
  'uid': 'J1',
  'created_at': datetime.datetime(2023, 8, 30, 17, 39, 56, 297000),
  'job_type': 'instance_launch_test',
  'status': 'building'},
 {'project_uid': 'P3',
  'uid': 'J2',
  'created_at': datetime.datetime(2023, 8, 30, 19, 47, 47, 854000),
  'job_type': 'import_movies',
  'status': 'failed'},
 {'project_uid': 'P4',
  'uid': 'J3',
  'created_at': datetime.datetime(2023, 8, 30, 19, 50, 24, 365000),
  'job_type': 'patch_motion_correction_multi',
  'status': 'building'},
 {'project_uid': 'P4',
  'uid': 'J1',
  'created_at': datetime.datetime(2023, 8, 30, 17, 39, 56, 297000),
  'job_type': 'instance_launch_test',
  'status': 'building'},
 {'project_uid': 'P4',
  'uid': 'J2',
  'created_at': datetime.datetime(2023, 8, 30, 19, 47, 47, 854000),
  'job_type': 'import_movies',
  'status': 'failed'},
 {'project_uid': 'P5',
  'uid': 'J3',
  'created_at': datetime.datetime(2023, 8, 30, 19, 50, 24, 365000),
  'job_type': 'patch_motion_correction_multi',
  'status': 'building'},
 {'project_uid': 'P5',
  'uid': 'J1',
  'created_at': datetime.datetime(2023, 8, 30, 17, 39, 56, 297000),
  'job_type': 'instance_launch_test',
  'status': 'building'},
 {'project_uid': 'P5',
  'uid': 'J2',
  'created_at': datetime.datetime(2023, 8, 30, 19, 47, 47, 854000),
  'job_type': 'import_movies',
  'status': 'failed'}]