
OpenVino Support



To use the OpenVino runtime, you can select it during plugin installation on compatible hardware. If you want to preselect the GPU or NPU runtime, make sure you have the latest Intel drivers and libtbb installed on your machine. For the Intel drivers, follow the installation instructions for your device (links at the bottom of this page).

Also make sure you have the latest libtbb installed:

sudo apt install libtbb12 ocl-icd-libopencl1
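Optionally, you can verify that the Intel compute runtime is visible through OpenCL before continuing. This sketch assumes a Debian/Ubuntu system and uses the clinfo diagnostic tool, which is not installed by default:

```shell
# Install the OpenCL diagnostic tool (assumption: apt-based system)
sudo apt install clinfo

# List detected OpenCL platforms and devices; an Intel entry here
# indicates the compute runtime and ICD loader are set up correctly
clinfo | grep -E "Platform Name|Device Name"
```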

Add the user to the video and render groups:

sudo usermod -aG video $USER

sudo usermod -aG render $USER
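Group membership changes only take effect in new login sessions, so log out and back in (or reboot) after running the commands above. One way to confirm the groups were applied:

```shell
# Show the groups of the current session; "video" and "render"
# will only appear after a fresh login
id -nG "$USER"
```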

Then, add Environment="DEVICE_TYPE=GPU" (or Environment="DEVICE_TYPE=NPU" for the NPU) to /etc/systemd/system/networkoptix-metavms-mediaserver.service under the [Service] section:

# Network Optix Media Server

[Unit]
Description=Network Optix Media Server
After=network.target local-fs.target remote-fs.target
Requires=networkoptix-metavms-root-tool.service

[Service]
Environment="DEVICE_TYPE=GPU"
PermissionsStartOnly=true
ExecStartPre=/opt/networkoptix-metavms/mediaserver/lib/scripts/systemd_mediaserver_pre_start.sh
ExecStart=/opt/networkoptix-metavms/mediaserver/lib/scripts/systemd_mediaserver_start.sh
User=networkoptix-metavms
Group=networkoptix-metavms
Restart=always
TimeoutStopSec=120
KillMode=process
TasksMax=8191
LimitCORE=infinity

[Install]
WantedBy=multi-user.target
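After editing the unit file, systemd must reload its configuration before the new environment variable takes effect. A typical sequence, using the service name from the path above:

```shell
# Reload systemd unit definitions and restart the mediaserver
sudo systemctl daemon-reload
sudo systemctl restart networkoptix-metavms-mediaserver

# Verify the environment variable is applied to the service
systemctl show networkoptix-metavms-mediaserver --property=Environment
```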



- For GPU: https://github.com/intel/compute-runtime/releases/tag/25.09.32961.5
- For NPU: https://github.com/intel/linux-npu-driver/releases/tag/v1.16.0