Microsoft.ML.OnnxRuntime.Managed
1.22.0-dev-20250327-1104-1f70fc2531
.NET CLI:
dotnet add package Microsoft.ML.OnnxRuntime.Managed --version 1.22.0-dev-20250327-1104-1f70fc2531
Package Manager:
NuGet\Install-Package Microsoft.ML.OnnxRuntime.Managed -Version 1.22.0-dev-20250327-1104-1f70fc2531
PackageReference:
<PackageReference Include="Microsoft.ML.OnnxRuntime.Managed" Version="1.22.0-dev-20250327-1104-1f70fc2531" />
Central package management (Directory.Packages.props):
<PackageVersion Include="Microsoft.ML.OnnxRuntime.Managed" Version="1.22.0-dev-20250327-1104-1f70fc2531" />
<PackageReference Include="Microsoft.ML.OnnxRuntime.Managed" />
Paket CLI:
paket add Microsoft.ML.OnnxRuntime.Managed --version 1.22.0-dev-20250327-1104-1f70fc2531
F# Interactive:
#r "nuget: Microsoft.ML.OnnxRuntime.Managed, 1.22.0-dev-20250327-1104-1f70fc2531"
Cake:
#addin nuget:?package=Microsoft.ML.OnnxRuntime.Managed&version=1.22.0-dev-20250327-1104-1f70fc2531&prerelease
#tool nuget:?package=Microsoft.ML.OnnxRuntime.Managed&version=1.22.0-dev-20250327-1104-1f70fc2531&prerelease
About
ONNX Runtime is a cross-platform machine-learning inferencing accelerator.
ONNX Runtime can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, and XGBoost. It is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable, alongside graph optimizations and transforms.
Learn more at https://onnxruntime.ai/.
NuGet Packages
ONNX Runtime Native packages
Microsoft.ML.OnnxRuntime
- Native libraries for all supported platforms
- CPU Execution Provider
- CoreML Execution Provider on macOS/iOS
- XNNPACK Execution Provider on Android/iOS
Microsoft.ML.OnnxRuntime.Gpu
- Windows and Linux
- TensorRT Execution Provider
- CUDA Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.DirectML
- Windows
- DirectML Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.QNN
- 64-bit Windows
- QNN Execution Provider
- CPU Execution Provider
Intel.ML.OnnxRuntime.OpenVino
- 64-bit Windows
- OpenVINO Execution Provider
- CPU Execution Provider
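The native packages above differ mainly in which execution providers they bundle. As a rough orientation, the sketch below shows how an execution provider is requested through SessionOptions when creating a session with the managed API. It assumes the Microsoft.ML.OnnxRuntime.Gpu package is referenced, a CUDA-capable device 0 is available, and "model.onnx" is a placeholder path.
using Microsoft.ML.OnnxRuntime;
// Assumption: Microsoft.ML.OnnxRuntime.Gpu is referenced and CUDA device 0 exists.
// The CPU Execution Provider stays registered as a fallback for unsupported nodes.
using var options = new SessionOptions();
options.AppendExecutionProvider_CUDA(0);
// With Microsoft.ML.OnnxRuntime.DirectML, options.AppendExecutionProvider_DML(0)
// plays the same role on Windows.
using var session = new InferenceSession("model.onnx", options);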
Other packages
Microsoft.ML.OnnxRuntime.Managed
- C# language bindings (a minimal usage sketch follows this list)
Microsoft.ML.OnnxRuntime.Extensions
- Custom operators for pre/post processing on all supported platforms.
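For the managed bindings themselves, here is a minimal end-to-end sketch. It assumes a model file named "model.onnx" with a single float input named "input" of shape [1, 3, 224, 224]; the path, input name, and shape are placeholders, and one of the native packages listed above must also be referenced to supply the runtime binaries.
using System;
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;
// Placeholder model path; the default constructor uses the CPU Execution Provider.
using var session = new InferenceSession("model.onnx");
// Build a dummy NCHW float tensor matching the assumed input shape.
var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
var inputs = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("input", input)
};
// Run inference and print the size of each output tensor.
using var results = session.Run(inputs);
foreach (var result in results)
    Console.WriteLine($"{result.Name}: {result.AsTensor<float>().Length} values");
The session and the result collection both wrap disposable native resources, hence the using declarations.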
Product | Versions (compatible and additional computed target framework versions) |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 is compatible. net8.0-android was computed. net8.0-android34.0 is compatible. net8.0-browser was computed. net8.0-ios was computed. net8.0-ios17.2 is compatible. net8.0-maccatalyst was computed. net8.0-maccatalyst17.2 is compatible. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
.NETStandard 2.0
- System.Memory (>= 4.5.5)
net8.0
- System.Memory (>= 4.5.5)
- System.Numerics.Tensors (>= 9.0.0)
net8.0-android34.0
- System.Memory (>= 4.5.5)
- System.Numerics.Tensors (>= 9.0.0)
net8.0-ios17.2
- System.Memory (>= 4.5.5)
- System.Numerics.Tensors (>= 9.0.0)
net8.0-maccatalyst17.2
- System.Memory (>= 4.5.5)
- System.Numerics.Tensors (>= 9.0.0)
GitHub repositories (16)
Showing the top 16 popular GitHub repositories that depend on Microsoft.ML.OnnxRuntime.Managed:
- dotnet/machinelearning: ML.NET is an open source and cross-platform machine learning framework for .NET.
- babalae/better-genshin-impact: 📦 BetterGI · Better Genshin Impact: auto pickup, auto dialogue, fully automatic fishing (AI), fully automatic Genius Invokation TCG, auto wood chopping, auto domain farming, auto gathering/mining/tilling, one-click routines, full-combo rhythm game. UI automation testing tools for Genshin Impact.
- Webreaper/Damselfly: Damselfly is a server-based photograph management app. The goal of Damselfly is to index an extremely large collection of images and allow easy search and retrieval of those images using metadata such as IPTC keyword tags, as well as folder and file names. Damselfly includes support for object/face detection.
- wangfreexx/wangfreexx-tianruoocr-cl-paddle: A local build of the open-source Tianruo OCR, using the Chinese-lite and PaddleOCR recognition frameworks.
- rocksdanister/weather: Windows native weather app powered by DirectX12 animations.
- genielabs/HomeGenie: HomeGenie, the programmable automation intelligence.
- techwingslab/yolov5-net: YOLOv5 object detection with C#, ML.NET, ONNX.
- microsoft/onnxruntime-training-examples: Examples for using ONNX Runtime for model training.
- sstainba/Yolov8.Net: A .NET 6 implementation to use Yolov5 and Yolov8 models via the ONNX Runtime.
- unoplatform/Uno.Samples: A collection of code samples for the Uno Platform.
- cassiebreviu/StableDiffusion: Inference Stable Diffusion with C# and ONNX Runtime.
- FaceONNX/FaceONNX: Face recognition and analytics library based on deep neural networks and ONNX Runtime.
- TensorStack-AI/OnnxStack: C# Stable Diffusion using ONNX Runtime.
- guojin-yan/YoloDeployCsharp: Deploying Yolov8-det, Yolov8-pose, Yolov8-cls, and Yolov8-seg models with the C# programming language.
- Particle1904/DatasetHelpers: Dataset Helper program to automatically select, rescale, and tag datasets (composed of images and text) for machine learning training.
- Vincentzyx/VinXiangQi: Xiangqi (Chinese chess) syncing tool based on Yolov5.
Release definition:
Branch: refs/heads/main
Commit: 1f70fc25319fea9c25c3c19f5f93090dfe30721d
Build: https://aiinfra.visualstudio.com/Lotus/_build/results?buildId=731643