
perl - AI::Ollama::Client and 'ollama/ollama-curated.yaml' - Stack Overflow


I installed AI::Ollama::Client on Strawberry Perl 5.38 and started the Ollama server.

But when I try to connect to the Ollama server using AI::Ollama::Client:

use strict;
use warnings;
use feature 'say';   # needed for say() below
use AI::Ollama::Client;
my $client = AI::Ollama::Client->new(
    server => 'http://127.0.0.1:11434',
);

my $info = $client->listModels()->get;
for my $model ($info->models->@*) {
    say $model->model; # llama2:latest
}

I get this error:

Could not open 'ollama/ollama-curated.yaml' for reading: No such file or directory at C:/Dev/Perl/strawberry-perl-5.38.2.2-64bit-portable/perl/site/lib/YAML/PP/Lexer.pm line 141. at C:/Dev/Perl/strawberry-perl-5.38.2.2-64bit-portable/perl/site/lib/YAML/PP/Loader.pm line 94.

Any idea how to fix it?


asked Jan 29 at 10:00 by dave, edited Jan 29 at 12:17 by toolic

1 Answer

I see the files in the distro, but I don't see anything that would cause them to be installed (copied). This appears to be a bug in the distribution, and a ticket has been filed.

As a workaround, you can copy the file from the distro into your project, and use the following:

use AI::Ollama::Client qw( );
use FindBin            qw( $RealBin );
use YAML::PP           qw( );

my $yaml_parser = YAML::PP->new( boolean => 'JSON::PP' );

# $RealBin (imported from FindBin above) is the directory containing the script.
my $schema = $yaml_parser->load_file( "$RealBin/ollama-curated.yaml" );

my $client = AI::Ollama::Client->new(
   server => ...,
   schema => $schema,
);
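Before running the script, it is worth confirming that the copied spec file actually sits next to it. A minimal shell sketch of that check (the file name comes from the error message; the "copy it next to the script" layout is the assumption made by the workaround above):

```shell
# Check that the spec file copied from the AI::Ollama::Client
# distribution is present in the current (script) directory.
if [ -e "./ollama-curated.yaml" ]; then
  echo "spec found"
else
  echo "spec missing"
  echo "copy ollama/ollama-curated.yaml from the AI::Ollama::Client distribution into this directory"
fi
```

If the file is missing, `load_file` will die with the same "No such file or directory" error as the original bug, so the check fails fast with a clearer message.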