# 🖥️ Hardware Compatibility
Platform-specific compatibility and testing status for `inference-models`.
## 🧪 Testing Coverage & Support Status
The following table shows the current testing and support status for different platforms:
| Platform | OS/Distribution | Installation Method | Support Status |
|---|---|---|---|
| CPU (x86_64) | Linux (general) | Bare-metal, Docker | ✅ Stable |
| CPU (x86_64) | macOS | Bare-metal, Docker | ✅ Stable |
| CPU (Apple Silicon) | macOS | Bare-metal, Docker | ✅ Stable |
| NVIDIA GPU | Ubuntu 22.04 LTS | Bare-metal, Docker | ✅ Stable |
| NVIDIA GPU | Ubuntu 24.04 LTS | Bare-metal, Docker | ✅ Stable |
| NVIDIA GPU | Other Linux distros | Bare-metal, Docker | ⚠️ Requires verification |
| Jetson (JetPack 6.1) | Ubuntu 22.04 (Jetson) | Docker | ✅ Stable |
| Jetson (JetPack 6.1) | Ubuntu 22.04 (Jetson) | Bare-metal | ⚠️ Experimental |
| Jetson (JetPack 5.1) | Ubuntu 20.04 (Jetson) | Docker (custom build) | ✅ Stable |
| Jetson (JetPack 5.1) | Ubuntu 20.04 (Jetson) | Bare-metal | ❌ Not possible |
| Windows | Windows 10/11 | Any | ❌ Not tested |
**Windows Support**

Windows is untested. The package may install and run on Windows, but we have not verified it on this platform; use at your own risk.
## 💻 CPU Support
### x86_64 / AMD64
Supported platforms:
- Linux - General distribution support
- macOS - Intel and Apple Silicon
Installation methods:
- ✅ Bare-metal - Direct pip/uv installation (see Installation Guide)
- ✅ Docker - Containerized deployment (see Docker Environments)
What works:
- All CPU-based models
- PyTorch CPU backend
- ONNX Runtime CPU backend
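The two CPU backends listed above can be probed at runtime. A minimal sketch, assuming PyTorch and/or ONNX Runtime are installed (both imports degrade gracefully if a library is missing):

```python
# Detect which CPU backends are importable in the current environment.
backends = {}

try:
    import torch
    backends["torch"] = "cpu"  # PyTorch always ships a CPU backend
except ImportError:
    backends["torch"] = None

try:
    import onnxruntime as ort
    # CPUExecutionProvider is bundled with every onnxruntime build
    backends["onnxruntime"] = "CPUExecutionProvider" in ort.get_available_providers()
except ImportError:
    backends["onnxruntime"] = None

print(backends)
```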
### Apple Silicon (M1/M2/M3/M4)
Supported platforms:
- macOS with Apple Silicon processors
Installation methods:
- ✅ Bare-metal - Direct pip/uv installation
- ✅ Docker - Containerized deployment
MPS (Metal Performance Shaders) Support:
- ⚠️ Experimental - MPS GPU acceleration available for select models only
- Supported models: RFDetr and other compatible architectures
- Limitations: Not all models support MPS; most run on CPU
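Because MPS support is model-dependent, a common pattern is to prefer MPS only when it is genuinely usable and fall back to CPU otherwise. A minimal sketch using PyTorch's standard MPS checks (the import is guarded so the snippet also runs where torch is absent):

```python
# Select MPS when available and built into this torch wheel, else fall back to CPU.
try:
    import torch
    use_mps = torch.backends.mps.is_available() and torch.backends.mps.is_built()
except ImportError:
    use_mps = False

device = "mps" if use_mps else "cpu"
print(f"selected device: {device}")
```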
## 🎮 NVIDIA GPU Support
### Tested Distributions
Stable support:
- ✅ Ubuntu 22.04 LTS - Fully tested and recommended
- ✅ Ubuntu 24.04 LTS - Fully tested and recommended
Other distributions:
- ⚠️ Requires verification - Other Linux distributions (Debian, RHEL, CentOS, etc.) should work but require testing in your specific environment
Installation methods:
- ✅ Bare-metal - Direct pip/uv installation (see Installation Guide)
- ✅ Docker - Containerized deployment (see Docker Environments)
Requirements:
- NVIDIA GPU with CUDA 11.8 or 12.x support
- Appropriate NVIDIA drivers installed
- For Docker: NVIDIA Container Toolkit
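The requirements above can be sanity-checked before installing. A sketch using standard NVIDIA tooling (exact output varies by driver version; the Docker image tag in the comment is an example):

```shell
# Check that the NVIDIA driver is installed and the GPU is visible to the host.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=name,driver_version --format=csv,noheader
else
  echo "nvidia-smi not found: install the NVIDIA driver first"
fi

# For Docker deployments, a typical Container Toolkit smoke test looks like:
#   docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```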
## 🤖 NVIDIA Jetson Support
### JetPack 6.1 (Recommended)
Supported devices:
- Jetson Orin AGX
- Jetson Orin NX
- Jetson Orin Nano
Installation methods:
| Method | Status | Description |
|---|---|---|
| Docker | ✅ Stable | Recommended for production use |
| Bare-metal | ⚠️ Experimental | pip install works but not extensively tested; use at your own risk |
Docker installation (recommended):
See Docker Environments for pre-built Jetson images.
Bare-metal installation (experimental):
```bash
# Use at your own risk - not extensively tested
uv pip install "inference-models[torch-jp6-cu126,onnx-jp6-cu126]"
```
**Bare-metal on Jetson**
Bare-metal installation on Jetson devices is experimental. While pip install should work, it has not been extensively tested. We recommend using Docker for production deployments.
### JetPack 5.1 (Legacy)
Supported devices:
- Jetson Orin AGX
- Jetson Orin NX
- Jetson Orin Nano
Installation methods:
| Method | Status | Description |
|---|---|---|
| Docker (custom build) | ✅ Stable | Custom Docker build required; see below |
| Bare-metal | ❌ Not possible | Verified to not work due to dependency conflicts |
Docker installation:
JetPack 5.1 requires a custom Docker build. See the Roboflow inference repository for Jetson 5.1 Dockerfiles.
**No Bare-metal Support for JetPack 5.1**
We have verified that bare-metal installation on JetPack 5.1 is not possible due to incompatible system dependencies and library conflicts. You must use the custom Docker build.
### TensorRT on Jetson
**Use JetPack TensorRT**

Jetson devices come with TensorRT pre-installed as part of JetPack. Do not install the `trt10` extra on Jetson platforms.
- Use the system TensorRT: `/usr/lib/aarch64-linux-gnu/libnvinfer.so`
- Installing PyPI TensorRT packages will cause conflicts
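A quick way to confirm the JetPack-provided TensorRT library is present is to probe the path above. A minimal sketch; it is only meaningful on a Jetson device and simply reports absence elsewhere:

```python
# Check for the system TensorRT shared library shipped with JetPack.
import ctypes
import os

LIBNVINFER = "/usr/lib/aarch64-linux-gnu/libnvinfer.so"

if os.path.exists(LIBNVINFER):
    ctypes.CDLL(LIBNVINFER)  # raises OSError if the library cannot actually be loaded
    print("system TensorRT found")
else:
    print("system TensorRT not found (expected on non-Jetson machines)")
```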
## 🚀 Next Steps
- Installation Guide - Detailed installation instructions
- Understand Core Concepts - Understand the architecture
- Supported Models - Browse available models