NVIDIA Support

AI model inference is available on a wide range of NVIDIA GPUs, whether consumer-grade (e.g. a GeForce RTX 3060), data center-grade (e.g. an A100), or embedded (the Jetson platform).

To enable inference on these GPUs using the Nx AI Manager, you need to set up the machine by installing compatible NVIDIA driver, CUDA toolkit, and cuDNN SDK versions based on the compute capability of the GPU.

To install the NVIDIA drivers, refer to this page: https://www.nvidia.com/en-us/drivers/

To install the CUDA toolkit, refer to this page: https://developer.nvidia.com/cuda-downloads

To install cuDNN SDK, refer to this page: https://developer.nvidia.com/cudnn

To determine the minimum CUDA version to install on your machine, please refer to this table and this one. Preferably, install the latest version that is compatible with your GPU.
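
As a quick sanity check, you can see which CUDA toolkit version is currently installed (assuming nvcc is on your PATH) by running:

nvcc --version

The output ends with a line such as "Cuda compilation tools, release 12.x, ...", which indicates the installed toolkit version.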

Operating conditions

For the AI Manager to work on a machine with NVIDIA GPUs, the machine needs to have CUDA version 11.x, 12.x, or 13.x and cuDNN properly installed, and the following conditions need to be met:

On X86_64 (both Ubuntu and Windows)

  1. NVIDIA drivers, CUDA toolkit and cuDNN installed.

  2. The nvidia-smi command must be installed, compatible with the installed NVIDIA drivers, and working properly (see the check below).
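
A simple way to confirm this is to run nvidia-smi and check that it lists your GPU, the driver version, and the supported CUDA version without errors:

nvidia-smi

If the command fails, for example with a driver/library version mismatch error, reinstall or update the NVIDIA drivers before proceeding.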

On AARCH64 (Jetson platforms only)

  • A compatible JetPack version needs to be installed. You can verify that by running this command:

    dpkg-query --showformat='${Version}' --show nvidia-l4t-core

The command returns the version of the installed NVIDIA L4T core package, which corresponds to the installed JetPack release.
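
For example, on a Jetson device running JetPack 5.x the output is an L4T version string similar to the following (the exact value depends on the installed release):

35.4.1-20230801124926

No output, or a "dpkg-query: no packages found matching nvidia-l4t-core" error, means JetPack is not installed on the device.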

Troubleshooting

For systems using NVIDIA's JetPack SDK, especially recent installations, the networkoptix-metavms user might not automatically be added to the render group. This group membership is essential for the Network Optix AI Manager plugin to fully utilize NVIDIA GPUs for hardware acceleration. While this process will be automated in a future Nx Server release, for now, you can manually add the user to the render group by following these steps:

1. Check if the 'render' Group Exists

First, verify whether the render group exists on your system:

getent group render
  • Expected Output

    If the render group exists, you will see output similar to:

    render:x:104:username

    This indicates that the group exists and lists the users currently in the group.

  • No Output

    If there's no output, the render group does not exist on your system. In this case, there is no need to continue with the next steps.

2. Add 'networkoptix-metavms' to the 'render' Group

  • Run the following command to add the user to the render group:

    sudo usermod -aG render networkoptix-metavms

    Explanation of the Command:

    • sudo runs the command with administrative privileges.

    • usermod is used to modify user accounts.

    • -aG appends the user to the specified group(s) without removing them from others.

    • render is the group you're adding the user to.

    • networkoptix-metavms is the username for the Network Optix VMS user.
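
  • Optionally, the check from step 1 and this command can be combined into a single line, so the user is only added when the render group actually exists:

    getent group render > /dev/null && sudo usermod -aG render networkoptix-metavms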

3. Verify the Group Membership

Confirm that the networkoptix-metavms user has been added to the render group:

groups networkoptix-metavms
  • Expected Output

    The command will list all groups the user is a part of. You should see render included in the list.
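
    For example, the output may look similar to the following (the exact set of groups varies by system):

    networkoptix-metavms : networkoptix-metavms video render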

4. Restart the Network Optix Service

For the changes to take effect, restart the Network Optix media server service:

sudo systemctl restart networkoptix-mediaserver.service
  • This command restarts the service, allowing it to recognize the updated group permissions.
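
  • Optionally, you can check that the service came back up correctly after the restart:

    sudo systemctl status networkoptix-mediaserver.service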

If the AI Manager doesn't work even when the conditions above are met, please refer to our general troubleshooting section.
