I have a C++ program that can potentially handle very big data sets. I'm getting a SIGKILL, and I'm trying to avoid that and handle the error correctly.
I'm pretty sure the SIGKILL happens when allocating certain arrays; I debugged with gdb and the program dies at that point.
The new expression is inside a try...catch block, but no exception is thrown: the process is simply killed. I would like to handle the case where the requested data is too big in a graceful manner, but it's proving harder than expected.
The code I'm using is more or less like this:
int result = 0;
try
{
    // sizeOfArray can be huge for large data sets
    m_array = new double[sizeOfArray];
}
catch (const std::bad_alloc &e)
{
    result = -1;
}
catch (const std::length_error &e)
{
    result = -1;
}
return result;
If result != 0, the caller handles the failure, writes the details to the logs, and so on.
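For context, the calling side does roughly this (allocateArray wraps the snippet above; allocateArray and logError are simplified stand-ins for the real function and logger names):

    // Hypothetical caller, simplified from the real code:
    int result = allocateArray(sizeOfArray);
    if (result != 0)
    {
        // Graceful path: record the failure and abort the operation
        logError("could not allocate %zu doubles", sizeOfArray);
        return result;
    }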
Why is no exception thrown, and why is the process sent SIGKILL instead? Is there a way to avoid the SIGKILL? The data size requested is absurdly large for my PC, but not for higher-performance machines, so I need to handle the error without a crash. I'm running Rocky Linux.
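One variant I could also try is the nothrow form of new, which reports failure with a null pointer instead of an exception. A minimal sketch, assuming the same m_array and sizeOfArray as above (whether this avoids the SIGKILL is part of what I'm asking):

    #include <new>  // for std::nothrow

    // Sketch only: nothrow new returns nullptr on failure instead of throwing
    m_array = new (std::nothrow) double[sizeOfArray];
    if (m_array == nullptr)
    {
        result = -1;
    }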