Microsoft.ML.OnnxRuntime.DirectML
1.22.0-dev-20250402-1105-67216c8996
See the version list below for details.
Installation

.NET CLI:
dotnet add package Microsoft.ML.OnnxRuntime.DirectML --version 1.22.0-dev-20250402-1105-67216c8996

Package Manager:
NuGet\Install-Package Microsoft.ML.OnnxRuntime.DirectML -Version 1.22.0-dev-20250402-1105-67216c8996

PackageReference (in the project file):
<PackageReference Include="Microsoft.ML.OnnxRuntime.DirectML" Version="1.22.0-dev-20250402-1105-67216c8996" />

Central package management (version in Directory.Packages.props, reference in the project file):
<PackageVersion Include="Microsoft.ML.OnnxRuntime.DirectML" Version="1.22.0-dev-20250402-1105-67216c8996" />
<PackageReference Include="Microsoft.ML.OnnxRuntime.DirectML" />

Paket CLI:
paket add Microsoft.ML.OnnxRuntime.DirectML --version 1.22.0-dev-20250402-1105-67216c8996

F# Interactive / script:
#r "nuget: Microsoft.ML.OnnxRuntime.DirectML, 1.22.0-dev-20250402-1105-67216c8996"

Cake Addin:
#addin nuget:?package=Microsoft.ML.OnnxRuntime.DirectML&version=1.22.0-dev-20250402-1105-67216c8996&prerelease

Cake Tool:
#tool nuget:?package=Microsoft.ML.OnnxRuntime.DirectML&version=1.22.0-dev-20250402-1105-67216c8996&prerelease
About
ONNX Runtime is a cross-platform machine-learning inferencing accelerator.
ONNX Runtime can enable faster customer experiences and lower costs, supporting models from deep-learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine-learning libraries such as scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with a range of hardware, drivers, and operating systems, and delivers optimal performance by leveraging hardware accelerators where applicable, alongside graph optimizations and transforms.
Learn more at https://onnxruntime.ai.
NuGet Packages
ONNX Runtime Native packages
Microsoft.ML.OnnxRuntime
- Native libraries for all supported platforms
- CPU Execution Provider
- CoreML Execution Provider on macOS/iOS
- XNNPACK Execution Provider on Android/iOS
Microsoft.ML.OnnxRuntime.Gpu
- Windows and Linux
- TensorRT Execution Provider
- CUDA Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.DirectML
- Windows
- DirectML Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.QNN
- 64-bit Windows
- QNN Execution Provider
- CPU Execution Provider
Intel.ML.OnnxRuntime.OpenVino
- 64-bit Windows
- OpenVINO Execution Provider
- CPU Execution Provider
Other packages
Microsoft.ML.OnnxRuntime.Managed
- C# language bindings
Microsoft.ML.OnnxRuntime.Extensions
- Custom operators for pre/post processing on all supported platforms.
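Once the DirectML package is installed, the DirectML execution provider is enabled per session through `SessionOptions`. The sketch below is illustrative rather than taken from the package's documentation: the model path `model.onnx`, the input shape, and the use of the first input name are placeholder assumptions, while `SessionOptions.AppendExecutionProvider_DML`, `InferenceSession`, `DenseTensor<float>`, and `NamedOnnxValue` are types and methods from the Microsoft.ML.OnnxRuntime C# API.

```csharp
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Prefer the DirectML execution provider (deviceId 0 = default adapter);
// operators DirectML does not support fall back to the bundled CPU provider.
using var options = new SessionOptions();
options.AppendExecutionProvider_DML(0);

// "model.onnx" is a placeholder path for illustration.
using var session = new InferenceSession("model.onnx", options);

// Build a dummy input tensor; real input names and shapes depend on the model.
var inputName = session.InputMetadata.Keys.First();
var tensor = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
var inputs = new[] { NamedOnnxValue.CreateFromTensor(inputName, tensor) };

// Run inference; each result exposes its output name and tensor value.
using var results = session.Run(inputs);
```

Appending the DirectML provider before creating the session matters: providers are consulted in registration order, so DirectML handles every node it can and the CPU provider covers the rest.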
Compatible frameworks

Product | Compatible and computed target frameworks |
---|---|
.NET | net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows (all computed) |
.NET Core | netcoreapp2.0, netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, netcoreapp3.1 (all computed) |
.NET Standard | netstandard2.0, netstandard2.1 (compatible) |
.NET Framework | net461, net462, net463, net47, net471, net472, net48, net481 (all computed) |
MonoAndroid | monoandroid (computed) |
MonoMac | monomac (computed) |
MonoTouch | monotouch (computed) |
native | native (compatible) |
Tizen | tizen40, tizen60 (computed) |
Xamarin.iOS | xamarinios (computed) |
Xamarin.Mac | xamarinmac (computed) |
Xamarin.TVOS | xamarintvos (computed) |
Xamarin.WatchOS | xamarinwatchos (computed) |
Dependencies

.NETCoreApp 0.0
- Microsoft.AI.DirectML (>= 1.15.4)
- Microsoft.ML.OnnxRuntime.Managed (>= 1.22.0-dev-20250402-1105-67216c8996)

.NETFramework 0.0
- Microsoft.AI.DirectML (>= 1.15.4)
- Microsoft.ML.OnnxRuntime.Managed (>= 1.22.0-dev-20250402-1105-67216c8996)

.NETStandard 0.0
- Microsoft.AI.DirectML (>= 1.15.4)
- Microsoft.ML.OnnxRuntime.Managed (>= 1.22.0-dev-20250402-1105-67216c8996)

native 0.0
- Microsoft.AI.DirectML (>= 1.15.4)
GitHub repositories (8)
Showing the top 8 popular GitHub repositories that depend on Microsoft.ML.OnnxRuntime.DirectML:
- babalae/better-genshin-impact — 📦 BetterGI · Better Genshin Impact: auto pickup | auto dialogue | fully automated fishing (AI) | fully automated Genius Invokation | auto woodcutting | auto domain runs | auto gathering/mining/farming | one-click daily routine | full-combo rhythm game — UI automation testing tools for Genshin Impact
- stakira/OpenUtau — Open singing synthesis platform / open-source UTAU successor
- Babyhamsta/Aimmy — Universal Second Eye for Gamers with Impairments (Universal AI Aim Aligner - ONNX/YOLOv8 - C#)
- microsoft/ai-dev-gallery — An open-source project for Windows developers to learn how to add AI with local models and APIs to Windows apps
- codeproject/CodeProject.AI-Server — A self-contained service that software developers can include in, and distribute with, their applications to augment them with the power of AI
- TensorStack-AI/OnnxStack — C# Stable Diffusion using ONNX Runtime
- Particle1904/DatasetHelpers — Dataset helper program to automatically select, rescale, and tag datasets (composed of images and text) for machine-learning training
- net2cn/Real-ESRGAN_GUI — Real-ESRGAN-based super-resolution model inference GUI written in C#
Version History

Version | Downloads | Last updated |
---|---|---|
1.22.0-dev-20250409-1104-89... | 0 | 4/9/2025 |
1.22.0-dev-20250404-1104-82... | 0 | 4/4/2025 |
1.22.0-dev-20250402-1105-67... | 0 | 4/2/2025 |
1.22.0-dev-20250327-1104-1f... | 0 | 3/27/2025 |
1.22.0-dev-20250326-1104-51... | 0 | 3/26/2025 |
1.22.0-dev-20250311-1103-33... | 0 | 3/11/2025 |
1.22.0-dev-20250310-1103-fe... | 0 | 3/10/2025 |
1.22.0-dev-20250308-1204-98... | 0 | 3/8/2025 |
1.22.0-dev-20250306-1203-b5... | 0 | 3/6/2025 |
1.22.0-dev-20250305-1203-78... | 0 | 3/6/2025 |
1.22.0-dev-20250303-0628-da... | 0 | 3/7/2025 |
1.21.0-dev-20250228-1912-be... | 0 | 3/4/2025 |
1.16.0-dev-20230908-1021-a9... | 2 | 9/9/2023 |
1.16.0-dev-20230818-1253-d6... | 2 | 8/19/2023 |
1.16.0-dev-20230510-1258-36... | 2 | 5/11/2023 |
1.16.0-dev-20230508-0100-5e... | 1 | 5/9/2023 |
1.16.0-dev-20230505-1257-e1... | 2 | 5/6/2023 |
1.15.1 | 105 | 6/16/2023 |
1.15.0-dev-20230503-1259-d5... | 2 | 5/4/2023 |
Release Definition:
Branch: refs/heads/main
Commit: 67216c89965731898a252b23cbcc681a0465c540
Build: https://aiinfra.visualstudio.com/Lotus/_build/results?buildId=737901