
artificial intelligence - Adding temperature when using semanticKernel and OLLAMA connector in C# - Stack Overflow


I have this code that works:

 public ChatOllama(string? model = null, string? systemPrompt = null, Uri? endpoint = null, TimeSpan? timeout = null)
 {
     SystemPrompt = systemPrompt ?? SystemPrompt;
     string ollamaModel = string.IsNullOrEmpty(model) ? DefaultModel : model.ToLower().Trim();
     Uri ollamaEndpoint = endpoint ?? DefaultUri;
     TimeSpan ollamaTimeout = timeout ?? DefaultHttpTimeout;

     // HttpClient pointed at the Ollama server
     this.ollamaEndpoint = new HttpClient
     {
         Timeout = ollamaTimeout,
         BaseAddress = ollamaEndpoint,
     };

     // Register the Ollama chat-completion connector with Semantic Kernel
     var builder = Kernel.CreateBuilder();
     builder.Services.AddOllamaChatCompletion(
         ollamaModel,
         httpClient: this.ollamaEndpoint
     );

     var kernel = builder.Build();
     m_chatService = kernel.GetRequiredService<IChatCompletionService>();
     History.AddSystemMessage(SystemPrompt);
 }

How can I set the Ollama temperature when connecting to Ollama through Microsoft's Semantic Kernel library?
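One possible approach, sketched under the assumption that the same `Microsoft.SemanticKernel.Connectors.Ollama` package that provides `AddOllamaChatCompletion` also exposes an `OllamaPromptExecutionSettings` class: temperature is not set when the connector is registered, but per request, by passing execution settings to the chat service. This reuses the `m_chatService` and `History` members from the constructor above.

```csharp
using Microsoft.SemanticKernel.Connectors.Ollama;

// Per-request sampling settings (assumed API; verify against the
// connector version you have installed)
var settings = new OllamaPromptExecutionSettings
{
    Temperature = 0.2f, // lower = more deterministic output
};

// Pass the settings alongside the chat history on each call
var reply = await m_chatService.GetChatMessageContentAsync(
    History,
    executionSettings: settings);
```

Since the settings travel with each call rather than the kernel, you could store an `OllamaPromptExecutionSettings` field in `ChatOllama` and accept a `temperature` parameter in the constructor, applying it to every request.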
