Add adapter classes for Microsoft.Extensions.AI #1022
Nukepayload2 started this conversation in Ideas
Replies: 1 comment 1 reply
-
Stephen Toub added support for these classes in #964; does that work for you, or are there some bits missing?
-
As an app developer, I need to let end users control where the LLM model runs. For Llama models running on the local computer, in the application's process, we use LlamaSharp. If the user wants to use an AI service on the intranet, we use Ollama. If LlamaSharp implemented https://www.nuget.org/packages/Microsoft.Extensions.AI.Abstractions , the model-provider logic would be much easier to implement.
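To illustrate the request, here is a minimal sketch of how a shared `IChatClient` abstraction would let the app swap providers at runtime. `LlamaSharpChatClient` is the hypothetical adapter this discussion asks for (it does not exist in LlamaSharp at the time of writing); `OllamaChatClient` ships in the Microsoft.Extensions.AI.Ollama package, and the exact method names have varied across the library's preview releases:

```csharp
using System;
using Microsoft.Extensions.AI;

// Choose a backend based on end-user configuration; both sides would
// implement the same IChatClient interface from
// Microsoft.Extensions.AI.Abstractions.
bool useLocalModel = /* read from app settings */ true;

IChatClient client = useLocalModel
    // Hypothetical adapter over LlamaSharp's in-process inference.
    ? new LlamaSharpChatClient("models/llama-3.gguf")
    // Real client from Microsoft.Extensions.AI.Ollama, pointed at an
    // intranet Ollama server (host and model name are illustrative).
    : new OllamaChatClient(new Uri("http://intranet-host:11434"), "llama3");

// The rest of the app talks only to IChatClient, so the provider
// choice is isolated to the construction code above.
var response = await client.GetResponseAsync("Hello!");
Console.WriteLine(response);
```

With such an adapter, the "model provider" switch collapses to a single factory decision, and features like `UseFunctionInvocation` or logging middleware from Microsoft.Extensions.AI would apply uniformly to both backends.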