Can't delete workspaces or projects (v2)

Hi,

Workspaces and projects can’t be deleted in v2. Regardless of whether they are empty or not, attempted deletion gives an error message saying “can’t delete non-empty workspace”.

This is evidently a bug when applied to an empty workspace or project, but regardless I think users should be able to delete non-empty workspaces/projects.

If I want to delete a project, I shouldn’t have to delete each individual experiment and workspace within it - just give me a couple of confirmation dialogs to make sure that I am definitely, 100% sure I want to delete the project (or have a trashcan mechanism for deleted projects/workspaces, where they are not irretrievably gone until one empties the trash).

I also wonder whether it might be worth having a mechanism to archive old projects (I guess whether it is worth it depends on how efficiently they can be compressed).

It would also be good to have a disk-cleanup job type to automatically delete intermediate files (and/or to have an option when running a job to control how much intermediate stuff is kept).
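
In the meantime, something like this at least shows where the space is going; the project path below is just a placeholder, and the J* glob assumes the usual one-directory-per-job layout:

    # List the largest job directories under one project, biggest first
    # (/data/cryosparc_projects/P3 is a placeholder path; -h sorting needs GNU sort)
    du -sh /data/cryosparc_projects/P3/J* | sort -rh | head -n 20

Of course that only tells you where the space is, not which files are safe to delete, which is why a proper cleanup job would be much nicer.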

Cheers
Oli

Same here. It is also easy to later confuse the workspaces that were created by mistake with new ones.

Hi Oli,

Deleting workspaces (after confirmation) and automated disk-cleanups are in the pipeline! Archiving projects will be something that is covered during the disk-cleanup procedure (removing process files and keeping metadata). There are a lot of edge cases to consider, so we’ll be adding features iteratively.

Thanks,
Stephan

Glad to hear it, thanks Stephan! :slight_smile:

Cheers
Oli

Non-empty project/workspace deletion is now possible (v2.2+)! A disk-cleanup utility is still in the pipeline.

Hi @stephan,

Deleting projects still doesn’t seem to work - I click delete, confirm deletion, and the dialog does not disappear, nor do any files get deleted.

Any word on when the disk cleanup tool might appear? My project directory is getting rather overweight.

Cheers
Oli

This is also the case for certain workspaces - it will delete a few jobs and then stall indefinitely (and I can tell it is not doing anything because nothing is changing in the relevant directories).

Cheers
Oli

Sometimes, if I cancel and then repeat the request to delete the workspace or project a few times, it works for some reason.

For example I have one particular project where deleting workspaces just doesn’t work - it will only delete one job at a time from each workspace. Happy to provide any info needed to debug.

Is there a timeline for this feature?
Given the immense amount of data being used in cryoEM, data management is a necessity.

Any development on the disk/database cleanup features? The ever-growing database is unmanageable; a RELION-style local database would seem more sensible (each user would manage their own database, written in their own user space, instead of one global database).

Dear @istv01, in v2.11 we have released several features for data management, including clearing of intermediate results and import/export of projects, jobs, and results. Thanks!
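
If you prefer the shell, clearing intermediate results can also be scripted through cryosparcm cli. Treat the line below as a sketch: the function name and signature are assumptions and may differ between versions, so please check the cli reference for your install first.

    # Clear intermediate results for all jobs in project P2
    # (function name/signature assumed; verify against your version's cli reference)
    cryosparcm cli "clear_intermediate_results('P2')"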

I have deleted 2 projects, but windows for both keep popping up that read “clearing intermediate results for all jobs in Px”. How do I clear this up?

Hi @acd,

If this is a notification, you can go to the notification manager (found inside the resource manager) and “clear” the orphaned notification. This sometimes happens when functions are terminated during execution and don’t get to clear their notifications themselves.

Yes, that is exactly what happened. This cleared it up - thank you!

I can click and delete projects from the UI, but they persist. Then if I try another import, it says the jobs are already linked and it fails - but the projects it refers to are ones I supposedly deleted. If I could clear the whole database of projects and start over, I would. Would that require a reinstallation of cryoSPARC from scratch? Thanks for any clues for a newbie.

Hi @GeorgeP,

Could you send over screenshots or error logs?

Hi,

I am not the original poster, but I can confirm this is still an issue with version 2.15.0. We run the master node on RHEL 7 and the workers on CentOS 7. In line with what previous posts in this thread have described, we see that users are unable to remove projects, or many of their jobs, if the number of jobs under a project is large.

The screenshot I uploaded shows two things of importance:
- Project P4 should have been deleted, yet it still shows as present, with 345 GB of data, in the Data Management tab on the right. The data is also still there on the filesystem.
- A job to clear intermediate results from P7 was hanging.

We have tried to troubleshoot this by ensuring we did not run out of space (we didn’t) and by restarting the master cryoSPARC process several times. This did not fix the issue, and we are unclear why there is a discrepancy between the state of the filesystem, the supposedly deleted state of the project, and what is shown in the Data Management tab. It seems that the cryoSPARC database does not update to correctly reflect the state of projects.
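
For reference, the checks we ran were roughly along these lines (the mount point and project path are placeholders for our actual locations):

    # Confirm the volume holding the projects is not full
    df -h /data

    # Confirm what is actually still on disk for the supposedly deleted project
    du -sh /data/cryosparc_projects/P4

    # Restart the master processes
    cryosparcm restart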

Can you please clarify why this happens and what the correct procedure is for deleting projects to regain the space?

Thanks!

George
[Screenshot: ahmadP3projectdelectionissue]

Hi George, are you still getting this issue? Sometimes deleting workspaces, projects or intermediate results takes a long time depending on the project size, but it definitely shouldn’t hang forever.

If you’ve got a notification that’s been hanging for a while, please try the following to help me troubleshoot:

  1. Restart cryoSPARC’s command_core module from the command line:
    cryosparcm restart command_core
    
  2. Start logging command_core output and leave it running in the background:
    cryosparcm log command_core
    
  3. In cryoSPARC, open the Notification Manager and clear all active notifications
  4. Delete/Clear the desired project, workspace or job
  5. Leave it running for a few minutes. When it seems like it’s hanging, go back to the log and press Ctrl+C to stop logging.
  6. Copy the full output and send it to me (see the note below this list for an easy way to capture it to a file).
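
A note on step 6: if copying straight out of the terminal is awkward, you can also watch the log and save a copy at the same time by piping it through tee (assuming a standard Linux shell); this is just a convenience on top of step 2:

    # Tail the command_core log and write a copy to command_core.log
    cryosparcm log command_core | tee command_core.log

Then attach command_core.log here once the deletion has been stalled for a while.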

Let me know if you run into any trouble with any of that,

Nick
