Replies: 9 comments 9 replies
-
If you can link me to an OpenVino converted model I can give it a shot.
-
Hi saddam213, I appreciate your message. Below are my steps:
git clone https://github.com/microsoft/onnxruntime.git
cd onnxruntime
.\build.bat --config RelWithDebInfo --use_openvino CPU_FP16 --build_shared_lib --build_nuget --skip_tests
This will generate two .nupkg files. You can download them from my Google Drive: https://drive.google.com/drive/folders/1VkzSIg6x6fVS6A0BcCY34YPMe9iqZ9my?usp=drive_link
git clone https://github.com/cassiebreviu/StableDiffusion.git
cd StableDiffusion
Example:
# In StableDiffusion folder
nuget add C:\Users\kimi0\Desktop\ONNX\onnxruntime\build\Windows\RelWithDebInfo\RelWithDebInfo\Microsoft.ML.OnnxRuntime.OpenVino.1.17.0-dev-20231212-0151-ccf3b2054b.nupkg -Source C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion
nuget add C:\Users\kimi0\Desktop\ONNX\onnxruntime\build\Windows\RelWithDebInfo\RelWithDebInfo\Microsoft.ML.OnnxRuntime.Managed.1.17.0-dev-20231212-0151-ccf3b2054b.nupkg -Source C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion
dotnet add package Microsoft.ML.OnnxRuntime.Managed -v 1.17.0-dev-20231212-0151-ccf3b2054b -s C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion
dotnet add package Microsoft.ML.OnnxRuntime.OpenVino -v 1.17.0-dev-20231212-0151-ccf3b2054b -s C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion
# In StableDiffusion.ML.OnnxRuntime folder
nuget add C:\Users\kimi0\Desktop\ONNX\onnxruntime\build\Windows\RelWithDebInfo\RelWithDebInfo\Microsoft.ML.OnnxRuntime.OpenVino.1.17.0-dev-20231212-0151-ccf3b2054b.nupkg -Source C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion.ML.OnnxRuntime
nuget add C:\Users\kimi0\Desktop\ONNX\onnxruntime\build\Windows\RelWithDebInfo\RelWithDebInfo\Microsoft.ML.OnnxRuntime.Managed.1.17.0-dev-20231212-0151-ccf3b2054b.nupkg -Source C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion.ML.OnnxRuntime
dotnet add package Microsoft.ML.OnnxRuntime.Managed -v 1.17.0-dev-20231212-0151-ccf3b2054b -s C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion.ML.OnnxRuntime
dotnet add package Microsoft.ML.OnnxRuntime.OpenVino -v 1.17.0-dev-20231212-0151-ccf3b2054b -s C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion.ML.OnnxRuntime
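The repeated `nuget add` / `-s` source flags above can also be captured once in a `nuget.config` next to each project, so `dotnet add package` and `dotnet restore` pick up the locally built packages automatically. A sketch (the `local-packages` key name is my own choice, not from this thread):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Folder feed holding the locally built OnnxRuntime .nupkg files -->
    <add key="local-packages" value="C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion" />
    <!-- Keep nuget.org for all other dependencies -->
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>
```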
sessionOptions.AppendExecutionProvider_OpenVINO(@"CPU_FP16");
sessionOptions.AppendExecutionProvider_CPU();
2023-12-12 10:08:12.9542620 [E:onnxruntime:, inference_session.cc:1887 onnxruntime::InferenceSession::Initialize] Encountered unknown exception in Initialize()
Unhandled exception. Microsoft.ML.OnnxRuntime.OnnxRuntimeException: [ErrorCode:RuntimeException] Encountered unknown exception in Initialize()
at Microsoft.ML.OnnxRuntime.InferenceSession.Init(String modelPath, SessionOptions options, PrePackedWeightsContainer prepackedWeightsContainer)
at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath, SessionOptions options)
at StableDiffusion.ML.OnnxRuntime.UNet.Inference(String prompt, StableDiffusionConfig config) in C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion.ML.OnnxRuntime\UNet.cs:line 92
at StableDiffusion.Program.Main(String[] args) in C:\Users\kimi0\Desktop\ONNX\stablediffusiononnx\StableDiffusion\Program.cs:line 90
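For what it's worth, one way to keep the app usable while debugging this (my own sketch, not code from the thread; `modelPath` is a placeholder) is to attempt the OpenVINO EP and rebuild the session with the default CPU provider when initialization throws:

```csharp
using System;
using Microsoft.ML.OnnxRuntime;

static class SessionFactory
{
    // Try OpenVINO first; if session creation fails (e.g. the unknown
    // exception in Initialize() above), fall back to the default CPU EP.
    public static InferenceSession Create(string modelPath)
    {
        try
        {
            var options = new SessionOptions();
            options.AppendExecutionProvider_OpenVINO(@"CPU_FP16");
            options.AppendExecutionProvider_CPU();
            return new InferenceSession(modelPath, options);
        }
        catch (OnnxRuntimeException ex)
        {
            Console.WriteLine($"OpenVINO init failed, falling back to CPU: {ex.Message}");
            return new InferenceSession(modelPath, new SessionOptions());
        }
    }
}
```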
Thank you so much!!
-
Ok, managed to get some progress. Using the nuget packages you built I get the same error, but I think this is because you have compiled them for … Using the model here (Float16) the exception does not throw. However I suspect if you recompile the nuget for …
-
Thank you!!
-
Hi @kimi0230 - Wondering if you made progress on this. There are a lot of new laptops and desktops without a GPU, and OpenVino should make image generation faster on them. Regards,
-
Also, take a look at this: https://github.com/rupeshs/fastsdcpu
-
I wonder if the CPU-optimized Olive models will be accelerated with OpenVino.
…On Wed, Dec 20, 2023 at 10:11 AM Adam Clark ***@***.***> wrote:
OpenVino seems to work for the OnnxStack backend. Hopefully it's part of the production OnnxRuntime.Managed package so I can include it.
-
I found out later that the models I created work only if you have DirectML and a GPU, due to the specific optimizations in the graph.
I can create specific Olive-optimized CPU-based F16 models, but the user will need some way to load the correct model based on whether they have a GPU or not.
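That selection could be sketched roughly like this (my own sketch, not code from the thread; it assumes the Microsoft.ML.OnnxRuntime.DirectML package is referenced, that two model folders exist side by side, and that appending the DML EP throws on machines without a usable GPU):

```csharp
using System;
using Microsoft.ML.OnnxRuntime;

static class ModelSelector
{
    // Probe for DirectML by trying to append the DML EP; on failure,
    // fall back to the CPU-optimized model and the default CPU EP.
    public static (string ModelPath, SessionOptions Options) Select(
        string gpuModelPath, string cpuModelPath)
    {
        var options = new SessionOptions();
        try
        {
            options.AppendExecutionProvider_DML(0); // device id 0
            return (gpuModelPath, options);         // DirectML-optimized graph
        }
        catch (Exception)                           // DML EP unavailable here
        {
            options = new SessionOptions();
            options.AppendExecutionProvider_CPU();
            return (cpuModelPath, options);         // CPU-optimized graph
        }
    }
}
```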
…On Wed, Dec 20, 2023 at 11:16 AM Adam Clark ***@***.***> wrote:
I can't run Olive F16 models at all on my CPU (i7-12700KF).
But the F32 models should work. I can whip up a console app to test with this weekend.
-
Made a UI build using the OpenVino nuget you posted. Using this model stable-diffusion-1.5-onnx-fp16 I can successfully generate images, however it's painfully slow: 200 seconds each. I do not have the OpenVino toolkit installed, so that could be the issue. Posting a link to the binaries in case you want to test further. OpenVino Branch:
-
Hi there,
Thank you for your dedication and contributions.
It looks like OnnxStack supports DirectML, CPU, Cuda, and CoreML, but not OpenVINO.
I tried to add the OpenVINO EP to the C# implementation of Stable Diffusion by Cassie Breviu here: Stable Diffusion with C# and ONNX Runtime.
But it encountered an unknown exception, similar to this issue: microsoft/onnxruntime#18152
Have you encountered this issue, or do you have any suggestions?
Thank you so much!!