Microsoft.ML.OnnxRuntime.Managed
1.22.0-dev-20250311-1103-333fbdb4a1
Prefix Reserved
dotnet add package Microsoft.ML.OnnxRuntime.Managed --version 1.22.0-dev-20250311-1103-333fbdb4a1
NuGet\Install-Package Microsoft.ML.OnnxRuntime.Managed -Version 1.22.0-dev-20250311-1103-333fbdb4a1
<PackageReference Include="Microsoft.ML.OnnxRuntime.Managed" Version="1.22.0-dev-20250311-1103-333fbdb4a1" />
paket add Microsoft.ML.OnnxRuntime.Managed --version 1.22.0-dev-20250311-1103-333fbdb4a1
#r "nuget: Microsoft.ML.OnnxRuntime.Managed, 1.22.0-dev-20250311-1103-333fbdb4a1"
// Install Microsoft.ML.OnnxRuntime.Managed as a Cake Addin
#addin nuget:?package=Microsoft.ML.OnnxRuntime.Managed&version=1.22.0-dev-20250311-1103-333fbdb4a1&prerelease

// Install Microsoft.ML.OnnxRuntime.Managed as a Cake Tool
#tool nuget:?package=Microsoft.ML.OnnxRuntime.Managed&version=1.22.0-dev-20250311-1103-333fbdb4a1&prerelease
About
ONNX Runtime is a cross-platform machine-learning inferencing accelerator.
ONNX Runtime can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, XGBoost, etc. ONNX Runtime is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable alongside graph optimizations and transforms.
Learn more in the ONNX Runtime documentation.
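Microsoft.ML.OnnxRuntime.Managed contains the C# API surface; the native runtime itself comes from one of the native packages listed below. A minimal sketch of loading a model and running a single inference, where the model path, input name, and tensor shape are placeholder assumptions:

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Load a model; "model.onnx" is a placeholder path.
using var session = new InferenceSession("model.onnx");

// The input name "input" and the 1x3x224x224 shape are illustrative assumptions.
var inputTensor = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
// ... fill inputTensor with preprocessed data ...

var inputs = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("input", inputTensor)
};

// Run inference and copy the first output to a managed array.
using var results = session.Run(inputs);
float[] output = results.First().AsEnumerable<float>().ToArray();
```

The actual input and output names and shapes of a model can be inspected through session.InputMetadata and session.OutputMetadata.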
NuGet Packages
ONNX Runtime Native packages
Microsoft.ML.OnnxRuntime
- Native libraries for all supported platforms
- CPU Execution Provider
- CoreML Execution Provider on macOS/iOS
- XNNPACK Execution Provider on Android/iOS
Microsoft.ML.OnnxRuntime.Gpu
- Windows and Linux
- TensorRT Execution Provider
- CUDA Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.DirectML
- Windows
- DirectML Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.QNN
- 64-bit Windows
- QNN Execution Provider
- CPU Execution Provider
Intel.ML.OnnxRuntime.OpenVino
- 64-bit Windows
- OpenVINO Execution Provider
- CPU Execution Provider
Other packages
Microsoft.ML.OnnxRuntime.Managed
- C# language bindings
Microsoft.ML.OnnxRuntime.Extensions
- Custom operators for pre/post processing on all supported platforms.
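Which execution providers can be enabled at runtime is determined by the native package that is referenced; the provider is requested through SessionOptions before the session is created, and the CPU provider remains the fallback. A minimal sketch, assuming the GPU package, a CUDA-capable device 0, and a local model.onnx:

```csharp
using Microsoft.ML.OnnxRuntime;

var options = new SessionOptions();

// CUDA execution provider: requires Microsoft.ML.OnnxRuntime.Gpu and a
// CUDA-capable GPU; device index 0 is an assumption.
options.AppendExecutionProvider_CUDA(0);

// Alternatively, with the Microsoft.ML.OnnxRuntime.DirectML package on Windows:
// options.AppendExecutionProvider_DML(0);

// The CPU execution provider is the implicit fallback for unsupported operators.
using var session = new InferenceSession("model.onnx", options);
```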
Frameworks

Product | Compatible and additional computed target framework versions |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 is compatible. net8.0-android was computed. net8.0-android34.0 is compatible. net8.0-browser was computed. net8.0-ios was computed. net8.0-ios17.2 is compatible. net8.0-maccatalyst was computed. net8.0-maccatalyst17.2 is compatible. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies

.NETStandard 2.0
- System.Memory (>= 4.5.5)

net8.0
- System.Memory (>= 4.5.5)
- System.Numerics.Tensors (>= 9.0.0)

net8.0-android34.0
- System.Memory (>= 4.5.5)
- System.Numerics.Tensors (>= 9.0.0)

net8.0-ios17.2
- System.Memory (>= 4.5.5)
- System.Numerics.Tensors (>= 9.0.0)

net8.0-maccatalyst17.2
- System.Memory (>= 4.5.5)
- System.Numerics.Tensors (>= 9.0.0)
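On the net8.0 targets the package additionally pulls in System.Numerics.Tensors, so its SIMD-accelerated TensorPrimitives helpers are available next to the runtime. A small illustrative sketch, assuming the logits array came from an earlier session.Run call (the values below are placeholders):

```csharp
using System;
using System.Numerics.Tensors;

// Placeholder for the float[] output produced by session.Run(...).
float[] logits = { 0.1f, 2.3f, -0.7f, 1.9f };

// Argmax over the logits using the vectorized helper.
int predictedClass = TensorPrimitives.IndexOfMax(logits);
Console.WriteLine($"Predicted class: {predictedClass}");
```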
GitHub repositories (16)
Showing the top 5 popular GitHub repositories that depend on Microsoft.ML.OnnxRuntime.Managed:
- dotnet/machinelearning
  ML.NET is an open source and cross-platform machine learning framework for .NET.
- babalae/better-genshin-impact
  📦BetterGI · Better Genshin Impact: auto pickup | auto dialogue | fully automated fishing (AI) | fully automated Genius Invokation TCG | auto woodcutting | auto domain runs | auto gathering/mining/overworld farming | one-stop daily routine | full-combo rhythm game. UI Automation Testing Tools For Genshin Impact
- Webreaper/Damselfly
  Damselfly is a server-based Photograph Management app. The goal of Damselfly is to index an extremely large collection of images, and allow easy search and retrieval of those images, using metadata such as the IPTC keyword tags, as well as the folder and file names. Damselfly includes support for object/face detection.
- wangfreexx/wangfreexx-tianruoocr-cl-paddle
  A local build of the open-source Tianruo OCR, using the Chinese-lite and PaddleOCR recognition frameworks.
- rocksdanister/weather
  Windows native weather app powered by DirectX12 animations
Release Def:
Branch: refs/heads/main
Commit: 333fbdb4a1161e1a3a8a119bf584ed9549fe9e0f
Build: https://aiinfra.visualstudio.com/Lotus/_build/results?buildId=707557