# NVIDIA Support

<table><thead><tr><th width="153.22265625">AI Accelerator</th><th>Support Level</th><th>CPU Architecture</th><th>Operating System</th><th>API/driver version</th></tr></thead><tbody><tr><td>NVIDIA dGPU</td><td>Full support</td><td>x86_64</td><td>Ubuntu 20+ and Windows</td><td>CUDA 11, CUDA 12, CUDA 13</td></tr><tr><td>NVIDIA Jetson</td><td>Full support</td><td>aarch64</td><td>Ubuntu 20+</td><td>JetPack 5, JetPack 6, JetPack 7</td></tr></tbody></table>

AI model inference is available on a wide range of NVIDIA GPUs, whether consumer-grade (e.g., GeForce RTX 3060), data-center-grade (e.g., A100), or embedded (the Jetson platform).

To enable inference on GPUs using the Nx AI Manager, set up the machine by installing NVIDIA driver and CUDA Toolkit versions that are compatible with the [compute capability](https://developer.nvidia.com/cuda-gpus) of the GPU.
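As an illustration, recent drivers can report the compute capability directly via `nvidia-smi --query-gpu=name,compute_cap --format=csv,noheader`. The snippet below shows how that CSV line could be parsed in plain shell; the sample line is assumed output for an RTX 3060, not queried from a live GPU:

```shell
# Sample line as `nvidia-smi --query-gpu=name,compute_cap --format=csv,noheader`
# might print it (assumed output for illustration).
sample='NVIDIA GeForce RTX 3060, 8.6'

# Strip everything up to and including the last ", " to keep the compute capability.
compute_cap=${sample##*, }
echo "$compute_cap"   # prints: 8.6
```

A compute capability of 8.6 would then be cross-referenced against the CUDA compatibility tables linked above.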

{% hint style="warning" %}
Nx AI Manager supports only GPUs compatible with **CUDA Toolkit 11** or higher.
{% endhint %}

{% hint style="info" %}
To determine the minimal CUDA version to install on your machine, please refer to this [table](https://en.wikipedia.org/wiki/CUDA#GPUs_supported) and this [one](https://docs.nvidia.com/deeplearning/cudnn/backend/v9.14.0/reference/support-matrix.html). Preferably, install the latest version that is compatible with the GPU.

To determine which GPU driver version to install based on your desired CUDA version (or vice versa), please check out this [table](https://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html#id6).
{% endhint %}

## Installation procedure

{% hint style="info" %}
**Please make sure you install GPU driver and CUDA Toolkit *versions* that are compatible with each other.**
{% endhint %}

### x86\_64 Windows

1. Go to <https://www.nvidia.com/en-us/drivers/>, and select your GPU model and OS. Then, download the driver installer, and run it.
2. Go to <https://developer.nvidia.com/cuda-toolkit-archive> and choose the CUDA version compatible with your installed GPU drivers. \
   (To find out the recommended CUDA toolkit to install, please run `nvidia-smi` in PowerShell. The version will be printed in the top right corner of the output.)\
   After a successful installation, the CUDA toolkit should be available at: `C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\`

### x86\_64 Ubuntu

1. Install the GPU drivers by running `sudo ubuntu-drivers autoinstall`, then reboot the machine.
2. Go to <https://developer.nvidia.com/cuda-toolkit-archive> and install the CUDA version compatible with your installed GPU drivers. \
   (To find out the recommended CUDA toolkit to install, please run `nvidia-smi` in a terminal window. The version will be printed in the top right corner of the output.)
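After both installs, it is worth sanity-checking that the driver-reported CUDA version is at least the toolkit version you installed. The snippet below parses a sample `nvidia-smi` banner line; the header string is assumed output for illustration, and on a real machine it would come from running `nvidia-smi` itself:

```shell
# The first banner line of `nvidia-smi` reports the driver version and the
# highest CUDA version it supports. Sample line (assumed for illustration):
header='| NVIDIA-SMI 550.54.15    Driver Version: 550.54.15    CUDA Version: 12.4     |'

# Extract the CUDA version field with sed.
cuda_version=$(printf '%s\n' "$header" | sed -n 's/.*CUDA Version: \([0-9.]*\).*/\1/p')
echo "$cuda_version"   # prints: 12.4
```

If the toolkit you installed reports a higher version (via `nvcc --version`) than the driver supports, upgrade the driver before using the Nx AI Manager.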

## Troubleshooting

For systems using NVIDIA's JetPack SDK, especially recent installations, the `networkoptix-metavms` user might not automatically be added to the `render` group. This group membership is essential for the Network Optix AI Manager plugin to fully utilize NVIDIA GPUs for hardware acceleration. While this process will be automated in a future Nx Server release, for now, you can manually add the user to the `render` group by following these steps:

#### 1. Check if the 'render' Group Exists

First, verify whether the `render` group exists on your system:

```bash
getent group render
```

* **Expected Output**

  If the `render` group exists, you will see output similar to:

  ```
  render:x:104:username
  ```

  This indicates that the group exists and lists the users currently in the group.
* **No Output**

  If there's **no output**, the `render` group does not exist on your system. In this case, there's no need to continue with the next steps.
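For scripting, the member list in the `getent` output can be pulled out with `cut`. The entry below is a sample mirroring the expected output shown above, not a live query:

```shell
# `getent group render` prints colon-separated fields:
# name:password:GID:member-list. Sample entry (assumed for illustration):
entry='render:x:104:username'

# The fourth field is the comma-separated list of group members.
members=$(echo "$entry" | cut -d: -f4)
echo "$members"   # prints: username
```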

#### 2. Add 'networkoptix-metavms' to the 'render' Group

* Run the following command to add the user to the `render` group:

  ```bash
  sudo usermod -aG render networkoptix-metavms
  ```

  **Explanation of the Command:**

  * `sudo` runs the command with administrative privileges.
  * `usermod` is used to modify user accounts.
  * `-aG` appends the user to the specified group(s) without removing them from others.
  * `render` is the group you're adding the user to.
  * `networkoptix-metavms` is the username for the Network Optix VMS user.

#### 3. Verify the Group Membership

Confirm that the `networkoptix-metavms` user has been added to the `render` group:

```bash
groups networkoptix-metavms
```

* **Expected Output**

  The command will list all groups the user is a part of. You should see `render` included in the list.
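If you prefer a scriptable check, membership can be tested by matching the word `render` in the `groups` output. The sample line below is assumed output for illustration; on a real system it would come from `groups networkoptix-metavms`:

```shell
# Sample output of `groups networkoptix-metavms` (assumed for illustration).
groups_line='networkoptix-metavms : networkoptix-metavms video render'

# -w matches `render` only as a whole word; -q keeps grep silent.
if echo "$groups_line" | grep -qw render; then
  echo "user is in the render group"
else
  echo "user is NOT in the render group"
fi
```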

#### 4. Restart the Network Optix Service

For the changes to take effect, restart the Network Optix media server service:

```bash
sudo systemctl restart networkoptix-mediaserver.service
```

* This command restarts the service, allowing it to recognize the updated group permissions.

{% hint style="info" %}
If the AI Manager doesn't work even when the conditions above are met, please refer to our [general troubleshooting section](https://nx.docs.scailable.net/support-and-troubleshooting/how-to-get-support).
{% endhint %}
