I'm trying to follow this tutorial:
https://learn.microsoft.com/en-us/dotnet/ai/quickstarts/quickstart-local-ai
This is my source code:
using Microsoft.Extensions.AI;
using Microsoft.Extensions.AI.Ollama;

namespace Chat
{
    internal class Program
    {
        static async Task Main(string[] args)
        {
            IChatClient chatClient =
                new OllamaChatClient(new Uri("http://localhost:11434/"), "phi3:mini");

            // Start the conversation with context for the AI model
            List<ChatMessage> chatHistory = new();

            while (true)
            {
                // Get user prompt and add to chat history
                Console.WriteLine("Your prompt:");
                var userPrompt = Console.ReadLine();
                chatHistory.Add(new ChatMessage(ChatRole.User, userPrompt));

                // Stream the AI response and add to chat history
                Console.WriteLine("AI Response:");
                var response = "";
                await foreach (var item in
                    chatClient.CompleteStreamingAsync(chatHistory))
                {
                    Console.Write(item.Text);
                    response += item.Text;
                }

                chatHistory.Add(new ChatMessage(ChatRole.Assistant, response));
                Console.WriteLine();
            }
        }
    }
}
The problem is that I'm getting these error messages:
The type or namespace name 'Ollama' does not exist in the namespace
'Microsoft.Extensions.AI' (are you missing an assembly reference?)
'IChatClient' does not contain a definition for
'CompleteStreamingAsync' and no accessible extension method
'CompleteStreamingAsync' accepting a first argument of type
'IChatClient' could be found (are you missing a using directive or an assembly
reference?)
I have included these NuGet packages:
<PackageReference Include="Microsoft.Extensions.AI" Version="9.3.0-preview.1.25114.11" />
<PackageReference Include="Microsoft.Extensions.AI.Abstractions" Version="9.3.0-preview.1.25114.11" />
<PackageReference Include="Microsoft.Extensions.AI.Ollama" Version="9.3.0-preview.1.25114.11" />
asked 2 days ago by OlavT, edited yesterday by desertnaut
1 Answer
There have been some breaking changes since the tutorial was published.
Here is a corrected version of the code, which works with the latest versions of the NuGet packages:
using Microsoft.Extensions.AI;

namespace Chat
{
    internal class Program
    {
        static async Task Main(string[] args)
        {
            IChatClient client =
                new ChatClientBuilder(new OllamaChatClient(new Uri("http://localhost:11434"), "phi4"))
                    .UseFunctionInvocation()
                    .Build();

            // Start the conversation with context for the AI model
            List<ChatMessage> chatHistory = new();

            while (true)
            {
                // Get user prompt and add to chat history
                Console.WriteLine("Your prompt:");
                var userPrompt = Console.ReadLine();
                chatHistory.Add(new ChatMessage(ChatRole.User, userPrompt));

                // Stream the AI response and add to chat history
                Console.WriteLine("AI Response:");
                var responseString = "";
                var response = client.GetStreamingResponseAsync(chatHistory);
                await foreach (var update in response)
                {
                    Console.Write(update);
                    responseString += update;
                }

                chatHistory.Add(new ChatMessage(ChatRole.Assistant, responseString));
                Console.WriteLine();
            }
        }
    }
}
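For context on the two compile errors: in recent Microsoft.Extensions.AI previews, OllamaChatClient moved into the Microsoft.Extensions.AI namespace (so the `using Microsoft.Extensions.AI.Ollama;` directive no longer resolves), and the streaming method was renamed from CompleteStreamingAsync to GetStreamingResponseAsync. If you don't need streaming, the non-streaming counterpart is GetResponseAsync. A minimal sketch, assuming the same local Ollama endpoint and a model you have already pulled:

```csharp
using Microsoft.Extensions.AI;

// Non-streaming variant, assuming Ollama is running locally on the
// default port with the phi4 model available.
IChatClient client = new OllamaChatClient(new Uri("http://localhost:11434"), "phi4");

List<ChatMessage> chatHistory = new()
{
    new ChatMessage(ChatRole.User, "Why is the sky blue?")
};

// GetResponseAsync is the renamed non-streaming call, just as
// GetStreamingResponseAsync replaced CompleteStreamingAsync.
var response = await client.GetResponseAsync(chatHistory);
Console.WriteLine(response.Text);
```

Because these packages are still in preview, method and type names can shift between versions, so check the release notes for the exact preview you have pinned.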