
linux - How to avoid SIGKILL when dealing with huge arrays in a C++ program - Stack Overflow


I have a C++ program that can potentially handle very big data sets. I'm getting a SIGKILL, and I'm trying to avoid it and handle the error correctly.

I'm pretty sure the SIGKILL happens when allocating certain arrays. I debugged it with gdb and the program crashes at that point.

The new expression is inside a try...catch block, but no exception is thrown. The program just crashes. I would like to handle the case where the requested data is too big in a graceful manner, but it's proving to be harder than expected.

The code I'm using is more or less like this:

// Inside a member function; m_array is a double* member and
// sizeOfArray is the requested number of elements.
int result = 0;
try
{
  m_array = new double[sizeOfArray];
}
catch (const std::bad_alloc &e)
{
  result = -1;   // allocation failed
}
catch (const std::length_error &e)
{
  result = -1;   // requested size too large
}

return result;

If result != 0, I handle the situation, write info to the logs, and so on. Why is no exception thrown, and why is SIGKILL raised instead? Is there a way to avoid the SIGKILL?
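For context, the calling side looks roughly like this (a minimal sketch; allocateArray and logError are placeholder names for illustration, not the real identifiers in my code):

// Rough sketch of the caller; allocateArray() wraps the try/catch shown
// above and logError() stands in for my logging code.
int rc = allocateArray(sizeOfArray);
if (rc != 0)
{
    logError("allocation of " + std::to_string(sizeOfArray) + " doubles failed");
    return;   // abort the operation gracefully instead of crashing
}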

The data size requested is absurdly large for my PC, but not so much for higher-performance machines. I just need to handle the error without a crash. I'm running Rocky Linux.
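For what it's worth, one idea I could try is a rough pre-check of available physical memory before attempting the allocation. A minimal sketch, assuming sysconf(_SC_AVPHYS_PAGES) and _SC_PAGE_SIZE report usable numbers on Rocky Linux; this is purely advisory and I have not verified it prevents the SIGKILL:

#include <unistd.h>
#include <cstddef>

// Returns true if the request looks larger than the physical memory that is
// currently free; the OS may still refuse or kill the process later.
bool probablyTooBig(std::size_t sizeOfArray)
{
    long pages    = sysconf(_SC_AVPHYS_PAGES);   // available physical pages
    long pageSize = sysconf(_SC_PAGE_SIZE);      // bytes per page
    if (pages < 0 || pageSize < 0)
        return false;                            // no info, assume it's fine
    std::size_t freeBytes = static_cast<std::size_t>(pages) *
                            static_cast<std::size_t>(pageSize);
    // Compare in elements to avoid overflowing sizeOfArray * sizeof(double).
    return sizeOfArray > freeBytes / sizeof(double);
}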
