I've hosted Ollama on an EC2 Linux machine.
I'm able to call the Ollama api/chat endpoint directly using the code below, and it returns a response.
import requests
import json

url = "<urlhere>"

payload = json.dumps({
    "model": "llama3.2:latest",
    "messages": [
        {
            "role": "user",
            "content": "why is the sky blue?"
        }
    ]
})
headers = {
    'Content-Type': 'application/json'
}

response = requests.request("POST", url, headers=headers, data=payload, verify=False)
print(response.text)
However, when I use the Ollama Python client I get the error "ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible."
from ollama import Client

client = Client(
    host="<urlhere>",
    headers={'Content-Type': 'application/json'}
)

response = client.chat(model="llama3.2:latest", messages=[
    {
        "role": "user",
        "content": "why is the sky blue?"
    }
])
# the chat response has no .text attribute; the reply is under message.content
print(response['message']['content'])
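For reference, here is a minimal sketch of how I understand the Client is meant to be configured, assuming the host should be only the base URL of the server (scheme, hostname and port, 11434 by default) rather than the full /api/chat path, since the client appends the API path itself. The host value below is a placeholder, not my real setup:

from ollama import Client

# hypothetical base URL of the EC2 instance; the client adds /api/chat itself
client = Client(host="http://<ec2-host>:11434")

response = client.chat(
    model="llama3.2:latest",
    messages=[{"role": "user", "content": "why is the sky blue?"}],
)
print(response['message']['content'])

Is the difference between passing the full endpoint URL and passing only the base host likely to be what causes the connection error, or is something else going on?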
I'm new to Ollama, so can someone help me understand the error? Thank you!