libcufft.so.10: cannot open shared object file

Hi,

Now I’m facing another problem, which I think is related to the GPU setup. When I try to run 3D refinement or motion correction, I get this error:

3D refinement:

Traceback (most recent call last):
  File "cryosparc_master/cryosparc_compute/run.py", line 83, in cryosparc_compute.run.main
  File "/home/victor/cryosparc/cryosparc_worker/cryosparc_compute/jobs/jobregister.py", line 442, in get_run_function
    runmod = importlib.import_module(".."+modname, __name__)
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 1174, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "cryosparc_master/cryosparc_compute/jobs/refine/newrun.py", line 16, in init cryosparc_compute.jobs.refine.newrun
  File "/home/victor/cryosparc/cryosparc_worker/cryosparc_compute/engine/__init__.py", line 8, in <module>
    from .engine import *  # noqa
  File "cryosparc_master/cryosparc_compute/engine/engine.py", line 9, in init cryosparc_compute.engine.engine
  File "cryosparc_master/cryosparc_compute/engine/cuda_core.py", line 12, in init cryosparc_compute.engine.cuda_core
  File "/home/victor/cryosparc/cryosparc_worker/cryosparc_compute/skcuda_internal/fft.py", line 9, in <module>
    from . import gpufft
ImportError: libcufft.so.10: cannot open shared object file: No such file or directory

Motion correction:

License is valid.

Launching job on lane default target c105627.dhcp.swmed.org ...

Running job on master node hostname c105627.dhcp.swmed.org

[CPU:  164.2 MB  Avail: 370.57 GB]
Job J6 Started

[CPU:  164.2 MB  Avail: 370.57 GB]
Master running v4.2.1, worker running v4.2.1

[CPU:  164.3 MB  Avail: 370.57 GB]
Working in directory: /Bobcat/230501_JD116-R582/CS-230501-jd116-r582-cluster/J6

[CPU:  164.3 MB  Avail: 370.57 GB]
Running on lane default

[CPU:  164.3 MB  Avail: 370.57 GB]
Resources allocated: 

[CPU:  164.3 MB  Avail: 370.57 GB]
  Worker:  c105627.dhcp.swmed.org

[CPU:  164.3 MB  Avail: 370.57 GB]
  CPU   :  [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]

[CPU:  164.3 MB  Avail: 370.57 GB]
  GPU   :  [0, 1]

[CPU:  164.3 MB  Avail: 370.57 GB]
  RAM   :  [0, 1, 2, 3]

[CPU:  164.3 MB  Avail: 370.57 GB]
  SSD   :  False

[CPU:  164.3 MB  Avail: 370.57 GB]
--------------------------------------------------------------

[CPU:  164.3 MB  Avail: 370.57 GB]
Importing job module for job type patch_motion_correction_multi...

[CPU:  199.5 MB  Avail: 370.37 GB]
Job ready to run

[CPU:  199.5 MB  Avail: 370.37 GB]
***************************************************************

[CPU:  199.9 MB  Avail: 370.38 GB]
Job will process this many movies:  500

[CPU:  199.9 MB  Avail: 370.38 GB]
parent process is 156841

[CPU:  158.8 MB  Avail: 370.36 GB]
Calling CUDA init from 156897

[CPU:  158.8 MB  Avail: 370.36 GB]
Calling CUDA init from 156898

[CPU:  200.4 MB  Avail: 370.41 GB]
Child process with PID 156897 terminated unexpectedly with exit code 1.

[CPU:  200.4 MB  Avail: 370.41 GB]
Child process with PID 156898 terminated unexpectedly with exit code 1.

[CPU:  201.1 MB  Avail: 370.40 GB]
--------------------------------------------------------------

[CPU:  201.1 MB  Avail: 370.40 GB]
Compiling job outputs...

[CPU:  201.1 MB  Avail: 370.40 GB]
Passing through outputs for output group micrographs from input group movies

[CPU:  201.1 MB  Avail: 370.40 GB]
This job outputted results ['micrograph_blob_non_dw', 'micrograph_thumbnail_blob_1x', 'micrograph_thumbnail_blob_2x', 'micrograph_blob', 'background_blob', 'rigid_motion', 'spline_motion']

[CPU:  201.1 MB  Avail: 370.40 GB]
  Loaded output dset with 0 items

[CPU:  201.1 MB  Avail: 370.40 GB]
Passthrough results ['movie_blob', 'gain_ref_blob', 'mscope_params']

[CPU:  201.1 MB  Avail: 370.40 GB]
  Loaded passthrough dset with 500 items

[CPU:  201.1 MB  Avail: 370.40 GB]
  Intersection of output and passthrough has 0 items

[CPU:  201.1 MB  Avail: 370.40 GB]
Passing through outputs for output group micrographs_incomplete from input group movies

[CPU:  201.1 MB  Avail: 370.40 GB]
This job outputted results ['micrograph_blob']

[CPU:  201.1 MB  Avail: 370.40 GB]
  Loaded output dset with 500 items

[CPU:  201.1 MB  Avail: 370.40 GB]
Passthrough results ['movie_blob', 'gain_ref_blob', 'mscope_params']

[CPU:  201.1 MB  Avail: 370.40 GB]
  Loaded passthrough dset with 500 items

[CPU:  201.1 MB  Avail: 370.40 GB]
  Intersection of output and passthrough has 500 items

[CPU:  201.2 MB  Avail: 370.40 GB]
Checking outputs for output group micrographs

[CPU:  201.2 MB  Avail: 370.40 GB]
Checking outputs for output group micrographs_incomplete

[CPU:  201.2 MB  Avail: 370.40 GB]
Updating job size...

[CPU:  201.2 MB  Avail: 370.40 GB]
Exporting job and creating csg files...

[CPU:  201.2 MB  Avail: 370.41 GB]
***************************************************************

[CPU:  201.2 MB  Avail: 370.41 GB]
Job complete. Total time 35.70s

Output from nvidia-smi:

NVIDIA-SMI 465.19.01    Driver Version: 465.19.01    CUDA Version: 11.3  

nvcc points to this file; I’m not sure whether that is a problem:

$ which nvcc
/usr/local/cuda/bin/nvcc
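The ImportError means the dynamic loader was asked for a library named exactly libcufft.so.10 and could not find one anywhere on its search path. A couple of quick checks (a sketch only — the extension directory is taken from the traceback, and the exact .so filename may differ on your install):

```shell
# What cuFFT sonames does the loader currently know about?
ldconfig -p | grep libcufft || echo "no libcufft registered with the loader"

# Which shared-library dependencies of the worker's FFT extension are
# unresolved? (directory taken from the traceback; filename may differ)
ldd /home/victor/cryosparc/cryosparc_worker/cryosparc_compute/skcuda_internal/*.so 2>/dev/null \
  | grep "not found" || echo "no unresolved deps found (or the path differs)"
```

If the second command prints something like `libcufft.so.10 => not found`, the worker's prebuilt extensions were linked against a newer toolkit than the CUDA installation visible on this machine.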

Were you able to resolve the previous problems?

Please post CryoSPARC worker environment details (from a Linux shell specially prepared as described in the link).

@wtempel, the previous problem is completely fixed (database, interface access, and license validation).

Here are the worker environment details:

$ eval $(/home/victor/cryosparc/cryosparc_worker/bin/cryosparcw env)
$ env | grep PATH
LIBRARY_PATH=
LD_LIBRARY_PATH=/usr/local/cuda-10.0/lib64:/home/victor/cryosparc/cryosparc_worker/deps/external/cudnn/lib:/usr/local/relion-3/lib:/usr/local/cuda/lib:/usr/local/cuda/lib64:/usr/local/cuda-9.2/lib:/usr/local/cuda-9.2/lib64:/usr/local/cuda-9.1/lib:/usr/local/cuda-9.1/lib64:/usr/local/cuda-8.0/lib:/usr/local/cuda-8.0/lib64:/usr/local/cuda-7.5/lib:/usr/local/cuda-7.5/lib64:/usr/local/IMOD/lib
CPATH=
PATH=/usr/local/cuda-10.0/bin:/home/victor/cryosparc/cryosparc_worker/bin:/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/bin:/home/victor/cryosparc/cryosparc_worker/deps/anaconda/condabin:/home/victor/cryosparc/cryosparc_master/bin:/usr/local/EMAN_2.21/condabin:/usr/local/relion-3/bin:/usr/local/mpich-3.2.1/bin:/usr/local/cuda/bin:/usr/local/IMOD/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/var/lib/snapd/snap/bin:/usr/local/motioncorr_v2.1/bin:/usr/local/Gctf_v1.06/bin:/usr/local/Gctf_v0.50/bin:/usr/local/ResMap:/usr/local/summovie_1.0.2/bin:/usr/local/unblur_1.0.2/bin:/usr/local/EMAN_2.21/bin:/home/victor/.local/bin:/home/victor/bin
CRYOSPARC_PATH=/home/victor/cryosparc/cryosparc_worker/bin
PYTHONPATH=/home/victor/cryosparc/cryosparc_worker
CRYOSPARC_CUDA_PATH=/usr/local/cuda-10.0
$ which nvcc
/usr/local/cuda-10.0/bin/nvcc
$ python -c "import pycuda.driver; print(pycuda.driver.get_version())"
(10, 0, 0)
$ uname -a
Linux c105627 3.10.0-1160.24.1.el7.x86_64 #1 SMP Thu Apr 8 19:51:47 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
$ free -g
              total        used        free      shared  buff/cache   available
Mem:            376          12           1           0         362         362
Swap:             9           0           9
$ nvidia-smi
Wed May  3 10:36:41 2023       
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 465.19.01    Driver Version: 465.19.01    CUDA Version: 11.3     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  On   | 00000000:1A:00.0 Off |                  N/A |
| 29%   26C    P8     8W / 250W |     45MiB / 11019MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   1  NVIDIA GeForce ...  On   | 00000000:1B:00.0 Off |                  N/A |
| 30%   27C    P8    10W / 250W |      1MiB / 11019MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   2  NVIDIA GeForce ...  On   | 00000000:60:00.0 Off |                  N/A |
| 30%   27C    P8     1W / 250W |      1MiB / 11019MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   3  NVIDIA GeForce ...  On   | 00000000:61:00.0 Off |                  N/A |
| 29%   28C    P8    16W / 250W |      1MiB / 11019MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   4  NVIDIA GeForce ...  On   | 00000000:B1:00.0 Off |                  N/A |
| 29%   27C    P8     7W / 250W |      1MiB / 11019MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   5  NVIDIA GeForce ...  On   | 00000000:B2:00.0 Off |                  N/A |
| 29%   26C    P8     1W / 250W |      1MiB / 11019MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   6  NVIDIA GeForce ...  On   | 00000000:DA:00.0 Off |                  N/A |
| 28%   25C    P8     7W / 250W |      1MiB / 11019MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   7  NVIDIA GeForce ...  On   | 00000000:DB:00.0 Off |                  N/A |
| 29%   26C    P8     1W / 250W |      1MiB / 11019MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      3345      G   /usr/bin/X                         27MiB |
|    0   N/A  N/A      4012      G   /usr/bin/gnome-shell               15MiB |
+-----------------------------------------------------------------------------+

Logs:

$ cryosparcm log command_core

2023-05-02 18:54:19,243 COMMAND.DATA         dump_project         INFO     | Exporting project P6
2023-05-02 18:54:19,244 COMMAND.DATA         dump_project         INFO     | Exported project P6 to /data1/VICTOR_RAW_DATA/P6/project.json in 0.00s
2023-05-02 18:54:19,943 COMMAND.DATA         dump_project         INFO     | Exporting project P7
2023-05-02 18:54:19,945 COMMAND.DATA         dump_project         INFO     | Exported project P7 to /data2/CRYOSPARC-DATA/P7/project.json in 0.00s
2023-05-02 18:54:22,520 COMMAND.DATA         dump_project         INFO     | Exporting project P8
2023-05-02 18:54:22,522 COMMAND.DATA         dump_project         INFO     | Exported project P8 to /data1/VICTOR_RAW_DATA/220518_PG86_R596_Xsslink/cryosparc_project/P8/project.json in 0.00s
2023-05-02 18:54:23,265 COMMAND.DATA         dump_project         INFO     | Exporting project P9
2023-05-02 18:54:23,266 COMMAND.DATA         dump_project         INFO     | Exported project P9 to /data1/VICTOR_RAW_DATA/P9/project.json in 0.00s
2023-05-02 18:54:23,911 COMMAND.DATA         dump_project         INFO     | Exporting project P10
2023-05-02 18:54:23,914 COMMAND.DATA         dump_project         INFO     | Exported project P10 to /data1/VICTOR_RAW_DATA/P10/project.json in 0.00s
2023-05-02 18:54:26,107 COMMAND.DATA         dump_project         INFO     | Exporting project P11
2023-05-02 18:54:26,109 COMMAND.DATA         dump_project         INFO     | Exported project P11 to /data1/VICTOR_RAW_DATA/P11/project.json in 0.00s
2023-05-02 18:54:26,510 COMMAND.DATA         dump_project         INFO     | Exporting project P12
2023-05-02 18:54:26,511 COMMAND.DATA         dump_project         INFO     | Exported project P12 to /data1/VICTOR_RAW_DATA/221020_MTG14_tRNA/P12/project.json in 0.00s
2023-05-02 18:54:27,114 COMMAND.DATA         dump_project         INFO     | Exporting project P13
2023-05-02 18:54:27,116 COMMAND.DATA         dump_project         INFO     | Exported project P13 to /data1/VICTOR_RAW_DATA/230119_JD116-CS-proccessing/P13/project.json in 0.00s
2023-05-02 19:31:44,958 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-02 19:39:45,389 COMMAND.DATA         dump_project         INFO     | Exporting project P14
2023-05-02 19:39:45,395 COMMAND.DATA         dump_project         INFO     | Exported project P14 to /Bobcat/230406-JD116_monomer/P14/project.json in 0.01s
2023-05-02 19:39:45,877 COMMAND.DATA         dump_project         INFO     | Exporting project P15
2023-05-02 19:39:45,882 COMMAND.DATA         dump_project         INFO     | Exported project P15 to /Bobcat/230501_JD116-R582/CS-230501-jd116-r582-/project.json in 0.01s
2023-05-02 20:31:45,257 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-02 21:31:45,398 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-02 22:31:46,122 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-02 23:31:46,128 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 00:31:47,124 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 01:31:47,625 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 02:31:48,401 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 03:31:48,910 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 04:31:49,049 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 05:31:49,286 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 06:31:49,524 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 07:31:50,363 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 08:31:51,083 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 09:31:51,556 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
2023-05-03 10:31:52,536 COMMAND.BG_WORKER    background_worker    INFO     | License does not have telemetry enabled; will re-check license in 1 hour.
Waiting for data... (interrupt to abort)

Error in 3D refinement:

$ cryosparcm joblog P14 J96



================= CRYOSPARCW =======  2023-05-02 18:14:05.777105  =========
Project P14 Job J96
Master c105627.dhcp.swmed.org Port 39002
===========================================================================
========= monitor process now starting main process at 2023-05-02 18:14:05.777159
MAINPROCESS PID 157830
MAIN PID 157830
refine.newrun cryosparc_compute.jobs.jobregister
**** handle exception rc
Traceback (most recent call last):
  File "cryosparc_master/cryosparc_compute/run.py", line 83, in cryosparc_compute.run.main
  File "/home/victor/cryosparc/cryosparc_worker/cryosparc_compute/jobs/jobregister.py", line 442, in get_run_function
    runmod = importlib.import_module(".."+modname, __name__)
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 1174, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "cryosparc_master/cryosparc_compute/jobs/refine/newrun.py", line 16, in init cryosparc_compute.jobs.refine.newrun
  File "/home/victor/cryosparc/cryosparc_worker/cryosparc_compute/engine/__init__.py", line 8, in <module>
    from .engine import *  # noqa
  File "cryosparc_master/cryosparc_compute/engine/engine.py", line 9, in init cryosparc_compute.engine.engine
  File "cryosparc_master/cryosparc_compute/engine/cuda_core.py", line 12, in init cryosparc_compute.engine.cuda_core
  File "/home/victor/cryosparc/cryosparc_worker/cryosparc_compute/skcuda_internal/fft.py", line 9, in <module>
    from . import gpufft
ImportError: libcufft.so.10: cannot open shared object file: No such file or directory
set status to failed
========= monitor process now waiting for main process
========= main process now complete at 2023-05-02 18:14:09.727332.
========= monitor process now complete at 2023-05-02 18:14:09.732920.

Error in patch motion correction:

$ cryosparcm joblog P15 J6

ssing/queues.py", line 245, in _feed
    send_bytes(obj)
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/connection.py", line 411, in _send_bytes
    self._send(header + buf)
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe
Traceback (most recent call last):
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/queues.py", line 245, in _feed
    send_bytes(obj)
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/connection.py", line 411, in _send_bytes
    self._send(header + buf)
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe
Traceback (most recent call last):
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/queues.py", line 245, in _feed
    send_bytes(obj)
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/connection.py", line 411, in _send_bytes
    self._send(header + buf)
  File "/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe
/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/site-packages/numpy/core/fromnumeric.py:3372: RuntimeWarning: Mean of empty slice.
  return _methods._mean(a, axis=axis, dtype=dtype,
/home/victor/cryosparc/cryosparc_worker/deps/anaconda/envs/cryosparc_worker_env/lib/python3.8/site-packages/numpy/core/_methods.py:170: RuntimeWarning: invalid value encountered in double_scalars
  ret = ret.dtype.type(ret / rcount)
========= main process now complete at 2023-05-02 18:09:13.903234.
========= monitor process now complete at 2023-05-02 18:09:14.002068.

I hope this helps.

What is the output of the following command?

ls -l /usr/local/cuda-10.0/lib64/

@wtempel the output is:

[victor@c105627 ~]$ ls -l /usr/local/cuda-10.0/lib64/
total 1528784
lrwxrwxrwx. 1 root root        19 May  4  2019 libaccinj64.so -> libaccinj64.so.10.0
lrwxrwxrwx. 1 root root        23 May  4  2019 libaccinj64.so.10.0 -> libaccinj64.so.10.0.130
-rwxr-xr-x. 1 root root   7407024 May  4  2019 libaccinj64.so.10.0.130
lrwxrwxrwx. 1 root root        17 May  4  2019 libcublas.so -> libcublas.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libcublas.so.10.0 -> libcublas.so.10.0.130
-rwxr-xr-x. 1 root root  70796360 May  4  2019 libcublas.so.10.0.130
-rw-r--r--. 1 root root  88190630 May  4  2019 libcublas_static.a
-rw-r--r--. 1 root root    695156 May  4  2019 libcudadevrt.a
lrwxrwxrwx. 1 root root        17 May  4  2019 libcudart.so -> libcudart.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libcudart.so.10.0 -> libcudart.so.10.0.130
-rwxr-xr-x. 1 root root    495736 May  4  2019 libcudart.so.10.0.130
-rw-r--r--. 1 root root    955082 May  4  2019 libcudart_static.a
lrwxrwxrwx. 1 root root        16 May  4  2019 libcufft.so -> libcufft.so.10.0
lrwxrwxrwx. 1 root root        20 May  4  2019 libcufft.so.10.0 -> libcufft.so.10.0.145
-rwxr-xr-x. 1 root root 103177128 May  4  2019 libcufft.so.10.0.145
-rw-r--r--. 1 root root 123979550 May  4  2019 libcufft_static.a
-rw-r--r--. 1 root root 109454136 May  4  2019 libcufft_static_nocallback.a
lrwxrwxrwx. 1 root root        17 May  4  2019 libcufftw.so -> libcufftw.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libcufftw.so.10.0 -> libcufftw.so.10.0.145
-rwxr-xr-x. 1 root root    561192 May  4  2019 libcufftw.so.10.0.145
-rw-r--r--. 1 root root     33250 May  4  2019 libcufftw_static.a
lrwxrwxrwx. 1 root root        18 May  4  2019 libcuinj64.so -> libcuinj64.so.10.0
lrwxrwxrwx. 1 root root        22 May  4  2019 libcuinj64.so.10.0 -> libcuinj64.so.10.0.130
-rwxr-xr-x. 1 root root   7792472 May  4  2019 libcuinj64.so.10.0.130
-rw-r--r--. 1 root root     31954 May  4  2019 libculibos.a
lrwxrwxrwx. 1 root root        17 May  4  2019 libcurand.so -> libcurand.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libcurand.so.10.0 -> libcurand.so.10.0.130
-rwxr-xr-x. 1 root root  60806128 May  4  2019 libcurand.so.10.0.130
-rw-r--r--. 1 root root  60723962 May  4  2019 libcurand_static.a
lrwxrwxrwx. 1 root root        19 May  4  2019 libcusolver.so -> libcusolver.so.10.0
lrwxrwxrwx. 1 root root        23 May  4  2019 libcusolver.so.10.0 -> libcusolver.so.10.0.130
-rwxr-xr-x. 1 root root 139257368 May  4  2019 libcusolver.so.10.0.130
-rw-r--r--. 1 root root  72147850 May  4  2019 libcusolver_static.a
lrwxrwxrwx. 1 root root        19 May  4  2019 libcusparse.so -> libcusparse.so.10.0
lrwxrwxrwx. 1 root root        23 May  4  2019 libcusparse.so.10.0 -> libcusparse.so.10.0.130
-rwxr-xr-x. 1 root root  59078736 May  4  2019 libcusparse.so.10.0.130
-rw-r--r--. 1 root root  67262190 May  4  2019 libcusparse_static.a
-rw-r--r--. 1 root root  12722350 May  4  2019 liblapack_static.a
-rw-r--r--. 1 root root    967976 May  4  2019 libmetis_static.a
lrwxrwxrwx. 1 root root        15 May  4  2019 libnppc.so -> libnppc.so.10.0
lrwxrwxrwx. 1 root root        19 May  4  2019 libnppc.so.10.0 -> libnppc.so.10.0.130
-rwxr-xr-x. 1 root root    553320 May  4  2019 libnppc.so.10.0.130
-rw-r--r--. 1 root root     26216 May  4  2019 libnppc_static.a
lrwxrwxrwx. 1 root root        17 May  4  2019 libnppial.so -> libnppial.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libnppial.so.10.0 -> libnppial.so.10.0.130
-rwxr-xr-x. 1 root root  10556304 May  4  2019 libnppial.so.10.0.130
-rw-r--r--. 1 root root  14040866 May  4  2019 libnppial_static.a
lrwxrwxrwx. 1 root root        17 May  4  2019 libnppicc.so -> libnppicc.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libnppicc.so.10.0 -> libnppicc.so.10.0.130
-rwxr-xr-x. 1 root root   3956688 May  4  2019 libnppicc.so.10.0.130
-rw-r--r--. 1 root root   4450012 May  4  2019 libnppicc_static.a
lrwxrwxrwx. 1 root root        18 May  4  2019 libnppicom.so -> libnppicom.so.10.0
lrwxrwxrwx. 1 root root        22 May  4  2019 libnppicom.so.10.0 -> libnppicom.so.10.0.130
-rwxr-xr-x. 1 root root   1348432 May  4  2019 libnppicom.so.10.0.130
-rw-r--r--. 1 root root    949372 May  4  2019 libnppicom_static.a
lrwxrwxrwx. 1 root root        18 May  4  2019 libnppidei.so -> libnppidei.so.10.0
lrwxrwxrwx. 1 root root        22 May  4  2019 libnppidei.so.10.0 -> libnppidei.so.10.0.130
-rwxr-xr-x. 1 root root   7215416 May  4  2019 libnppidei.so.10.0.130
-rw-r--r--. 1 root root   9207224 May  4  2019 libnppidei_static.a
lrwxrwxrwx. 1 root root        16 May  4  2019 libnppif.so -> libnppif.so.10.0
lrwxrwxrwx. 1 root root        20 May  4  2019 libnppif.so.10.0 -> libnppif.so.10.0.130
-rwxr-xr-x. 1 root root  47194064 May  4  2019 libnppif.so.10.0.130
-rw-r--r--. 1 root root  51287500 May  4  2019 libnppif_static.a
lrwxrwxrwx. 1 root root        16 May  4  2019 libnppig.so -> libnppig.so.10.0
lrwxrwxrwx. 1 root root        20 May  4  2019 libnppig.so.10.0 -> libnppig.so.10.0.130
-rwxr-xr-x. 1 root root  25033264 May  4  2019 libnppig.so.10.0.130
-rw-r--r--. 1 root root  27175822 May  4  2019 libnppig_static.a
lrwxrwxrwx. 1 root root        16 May  4  2019 libnppim.so -> libnppim.so.10.0
lrwxrwxrwx. 1 root root        20 May  4  2019 libnppim.so.10.0 -> libnppim.so.10.0.130
-rwxr-xr-x. 1 root root   6197800 May  4  2019 libnppim.so.10.0.130
-rw-r--r--. 1 root root   6150298 May  4  2019 libnppim_static.a
lrwxrwxrwx. 1 root root        17 May  4  2019 libnppist.so -> libnppist.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libnppist.so.10.0 -> libnppist.so.10.0.130
-rwxr-xr-x. 1 root root  16604560 May  4  2019 libnppist.so.10.0.130
-rw-r--r--. 1 root root  18732154 May  4  2019 libnppist_static.a
lrwxrwxrwx. 1 root root        17 May  4  2019 libnppisu.so -> libnppisu.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libnppisu.so.10.0 -> libnppisu.so.10.0.130
-rwxr-xr-x. 1 root root    544592 May  4  2019 libnppisu.so.10.0.130
-rw-r--r--. 1 root root     10690 May  4  2019 libnppisu_static.a
lrwxrwxrwx. 1 root root        17 May  4  2019 libnppitc.so -> libnppitc.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libnppitc.so.10.0 -> libnppitc.so.10.0.130
-rwxr-xr-x. 1 root root   2884112 May  4  2019 libnppitc.so.10.0.130
-rw-r--r--. 1 root root   3145290 May  4  2019 libnppitc_static.a
lrwxrwxrwx. 1 root root        15 May  4  2019 libnpps.so -> libnpps.so.10.0
lrwxrwxrwx. 1 root root        19 May  4  2019 libnpps.so.10.0 -> libnpps.so.10.0.130
-rwxr-xr-x. 1 root root   8424408 May  4  2019 libnpps.so.10.0.130
-rw-r--r--. 1 root root   9580914 May  4  2019 libnpps_static.a
lrwxrwxrwx. 1 root root        17 May  4  2019 libnvblas.so -> libnvblas.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libnvblas.so.10.0 -> libnvblas.so.10.0.130
-rwxr-xr-x. 1 root root    596080 May  4  2019 libnvblas.so.10.0.130
lrwxrwxrwx. 1 root root        18 May  4  2019 libnvgraph.so -> libnvgraph.so.10.0
lrwxrwxrwx. 1 root root        22 May  4  2019 libnvgraph.so.10.0 -> libnvgraph.so.10.0.130
-rwxr-xr-x. 1 root root  88921848 May  4  2019 libnvgraph.so.10.0.130
-rw-r--r--. 1 root root 186926198 May  4  2019 libnvgraph_static.a
lrwxrwxrwx. 1 root root        17 May  4  2019 libnvjpeg.so -> libnvjpeg.so.10.0
lrwxrwxrwx. 1 root root        21 May  4  2019 libnvjpeg.so.10.0 -> libnvjpeg.so.10.0.130
-rwxr-xr-x. 1 root root   1089608 May  4  2019 libnvjpeg.so.10.0.130
-rw-r--r--. 1 root root   1001070 May  4  2019 libnvjpeg_static.a
lrwxrwxrwx. 1 root root        25 May  4  2019 libnvrtc-builtins.so -> libnvrtc-builtins.so.10.0
lrwxrwxrwx. 1 root root        29 May  4  2019 libnvrtc-builtins.so.10.0 -> libnvrtc-builtins.so.10.0.130
-rwxr-xr-x. 1 root root   4612768 May  4  2019 libnvrtc-builtins.so.10.0.130
lrwxrwxrwx. 1 root root        16 May  4  2019 libnvrtc.so -> libnvrtc.so.10.0
lrwxrwxrwx. 1 root root        20 May  4  2019 libnvrtc.so.10.0 -> libnvrtc.so.10.0.130
-rwxr-xr-x. 1 root root  20332456 May  4  2019 libnvrtc.so.10.0.130
lrwxrwxrwx. 1 root root        18 May  4  2019 libnvToolsExt.so -> libnvToolsExt.so.1
lrwxrwxrwx. 1 root root        22 May  4  2019 libnvToolsExt.so.1 -> libnvToolsExt.so.1.0.0
-rwxr-xr-x. 1 root root     37240 May  4  2019 libnvToolsExt.so.1.0.0
lrwxrwxrwx. 1 root root        14 May  4  2019 libOpenCL.so -> libOpenCL.so.1
lrwxrwxrwx. 1 root root        16 May  4  2019 libOpenCL.so.1 -> libOpenCL.so.1.1
-rwxr-xr-x. 1 root root     27096 May  4  2019 libOpenCL.so.1.1
drwxr-xr-x. 2 root root      4096 May  4  2019 stubs

Interesting. Instead of wondering why there is a libcufft.so.10.0 link but no libcufft.so.10 link, you might want to just try:

  1. updating the NVIDIA driver to v520 or higher, then
  2. installing CUDA toolkit v11.8 (not v12), then
  3. running
    /path/to/cryosparc_worker/bin/cryosparcw newcuda /path/to/cuda-11.8
  4. re-running the refinement and motion correction jobs
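Why the missing link matters: the dynamic loader matches sonames exactly, so a libcufft.so.10.0 link cannot satisfy a request for libcufft.so.10 — the soname the worker's prebuilt extensions were apparently linked against (it comes from a newer toolkit than CUDA 10.0). A minimal temp-dir sketch, not the real CUDA layout:

```shell
# Recreate the CUDA 10.0 lib64 naming in a scratch directory
tmp=$(mktemp -d)
touch "$tmp/libcufft.so.10.0.145"                    # the actual library file
ln -s libcufft.so.10.0.145 "$tmp/libcufft.so.10.0"   # the link CUDA 10.0 ships

# A binary whose recorded dependency is "libcufft.so.10" needs that exact
# filename to exist; the 10.0 link does not help it:
[ -e "$tmp/libcufft.so.10" ] && echo "soname found" || echo "soname missing"  # -> soname missing
rm -rf "$tmp"
```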

@wtempel, I solved the problem by:

Upgrading the NVIDIA driver to version 530.30.02-1.el7:

yum install nvidia-driver-latest-dkms.x86_64
nvidia-smi
Mon May  8 15:28:16 2023       
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.30.02              Driver Version: 530.30.02    CUDA Version: 12.1     |

Upgrading CUDA to 11.8:

yum install cuda-11-8.x86_64

Eventually I ended up with both 11.8 and 12.0 installed, so I used:

sudo update-alternatives --display cuda

sudo update-alternatives --config cuda

to point nvcc to CUDA 11.8 instead of 12.0. Now the nvcc version is:

nvcc -V
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2022 NVIDIA Corporation
Built on Wed_Sep_21_10:33:58_PDT_2022
Cuda compilation tools, release 11.8, V11.8.89
Build cuda_11.8.r11.8/compiler.31833905_0
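For reference, `update-alternatives --config cuda` ultimately just repoints one symlink between the side-by-side toolkit installs (on NVIDIA's rpm layout this is /usr/local/cuda via /etc/alternatives/cuda — layout assumed, not verified here). The mechanism can be sketched in a scratch directory:

```shell
# Simulate what switching the "cuda" alternative does: swap a single
# symlink between installed toolkits (temp-dir sketch, not /usr/local)
tmp=$(mktemp -d)
mkdir "$tmp/cuda-11.8" "$tmp/cuda-12.0"
ln -s cuda-12.0 "$tmp/cuda"      # initially the newest toolkit is selected
ln -sfn cuda-11.8 "$tmp/cuda"    # the effect of choosing 11.8 in --config
readlink "$tmp/cuda"             # -> cuda-11.8
rm -rf "$tmp"
```

A non-interactive equivalent of the --config prompt would be `sudo update-alternatives --set cuda /usr/local/cuda-11.8` (the exact alternative name depends on how the rpm registered it).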

I didn’t run:

/path/to/cryosparc_worker/bin/cryosparcw newcuda /path/to/cuda-11.8

Now I can run both 3D refinement and Patch motion correction.

Thanks for helping! :slightly_smiling_face: