pytorch - Output inconsistency when using LLM batch inference compared to single input - Stack Overflow
I found that a single LLM input produces different output logits when it is merged into a batch for inference. Besides,
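The excerpt does not say why the logits differ, but a commonly cited cause is that batched matrix-multiply kernels reduce sums in a different order (and with different padding) than the single-input path, and floating-point addition is not associative. A minimal pure-Python sketch of that non-associativity (the root-cause mechanism, not the questioner's actual model code):

```python
# Floating-point addition is not associative: summing the same
# numbers in a different order can change the result slightly.
# Batched GEMM kernels often use a different reduction order than
# the single-input path, which is one common explanation for small
# logit differences between batched and unbatched LLM inference.
vals = [0.1, 0.2, 0.3]

left_to_right = (vals[0] + vals[1]) + vals[2]  # one reduction order
right_to_left = vals[0] + (vals[1] + vals[2])  # another order

print(left_to_right)                   # 0.6000000000000001
print(right_to_left)                   # 0.6
print(left_to_right == right_to_left)  # False
```

In practice this means batched and single-input logits should be compared with a tolerance (e.g. `torch.allclose`) rather than exact equality.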
admin · 12 hours ago