Process killed by SIGKILL

I have a process that is killed immediately after the program is executed. Below is the code of the compiled executable: a small program that reads several graphs, represented by numbers, from standard input (usually redirected from a description file) and finds the minimum spanning tree of each graph using Prim's algorithm (it does not print the results yet, it just computes the solution).

    #include <cstdlib>
    #include <iostream>
    using namespace std;

    const int MAX_NODOS = 20000;
    const int infinito = 10000;

    int nnodos;
    int nAristas;
    int G[MAX_NODOS][MAX_NODOS];
    int solucion[MAX_NODOS][MAX_NODOS];
    int menorCoste[MAX_NODOS];
    int masCercano[MAX_NODOS];

    void leeGrafo(){
        if (nnodos < 0 || nnodos > MAX_NODOS){
            cerr << "Numero de nodos (" << nnodos << ") no valido\n";
            exit(0);
        }
        // start with every edge at "infinity"
        for (int i = 0; i < nnodos; i++)
            for (int j = 0; j < nnodos; j++)
                G[i][j] = infinito;
        int A, B, P;
        for (int i = 0; i < nAristas; i++){
            cin >> A >> B >> P;
            G[A][B] = P;
            G[B][A] = P;
        }
    }

    void prepararEstructuras(){
        // output graph
        for (int i = 0; i < nnodos; i++)
            for (int j = 0; j < nnodos; j++)
                solucion[i][j] = infinito;
        for (int i = 1; i < nnodos; i++){
            masCercano[i] = 0;        // closest tree node
            menorCoste[i] = G[0][i];  // lowest known cost to reach i
        }
    }

    void prim(){
        prepararEstructuras();
        int min, k;
        for (int i = 1; i < nnodos; i++){
            min = menorCoste[1];
            k = 1;
            for (int j = 2; j < nnodos; j++){
                if (menorCoste[j] < min){
                    min = menorCoste[j];
                    k = j;
                }
            }
            solucion[k][masCercano[k]] = G[k][masCercano[k]];
            menorCoste[k] = infinito;
            for (int j = 1; j < nnodos; j++){
                if (G[k][j] < menorCoste[j] && menorCoste[j] != infinito){
                    menorCoste[j] = G[k][j];
                    masCercano[j] = k;
                }
            }
        }
    }

    void output(){
        for (int i = 0; i < nnodos; i++){
            for (int j = 0; j < nnodos; j++)
                cout << G[i][j] << ' ';
            cout << endl;
        }
    }

    int main(){
        while (true){
            cin >> nnodos;
            cin >> nAristas;
            if (nnodos == 0 && nAristas == 0)
                break;
            leeGrafo();
            output();
            prim();
        }
    }
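For reference, the input format is a node count and an edge count, then one "A B P" line per weighted edge, with two zeros terminating the input. For example, this describes a single graph with three nodes and three edges:

    3 3
    0 1 5
    1 2 2
    0 2 7
    0 0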

I found out that I should use strace to see what is happening, and this is what I get:

 execve("./412", ["./412"], [/* 38 vars */] <unfinished ...> +++ killed by SIGKILL +++ Killed 

I run Ubuntu, and this is the first time I have seen this type of error. The program should stop after reading two zeros in a row from the input, which I can guarantee are present in the graph description file. The problem also occurs even if I run the program without redirecting input from my graph file.

+6
2 answers

Although I'm not 100% sure that this is the problem, take a look at the sizes of your global arrays:

    const int MAX_NODOS = 20000;
    int G[MAX_NODOS][MAX_NODOS];
    int solucion[MAX_NODOS][MAX_NODOS];

Assuming int is 4 bytes, you'll need:

 20000 * 20000 * 4 bytes * 2 = ~3.2 GB 

Firstly, you may not even have that much memory. Secondly, if you are on a 32-bit system, the OS will most likely not allow a single process that much address space.

Assuming you are on 64-bit (and assuming you have enough memory), the solution would be to allocate all of this dynamically at runtime.
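For example, a minimal sketch of what that could look like with std::vector (reusing the names from the question; reservar is just an illustrative helper to call once nnodos has been read):

    #include <vector>

    int nnodos;
    std::vector<std::vector<int>> G;        // adjacency matrix, now on the heap
    std::vector<std::vector<int>> solucion; // MST matrix, now on the heap

    // Illustrative helper: size both matrices to the actual node count,
    // filled with "infinito", once nnodos is known.
    void reservar(){
        const int infinito = 10000;
        G.assign(nnodos, std::vector<int>(nnodos, infinito));
        solucion.assign(nnodos, std::vector<int>(nnodos, infinito));
    }

The rest of the program can keep indexing G[i][j] and solucion[i][j] as before, but each graph now occupies only nnodos * nnodos ints instead of the full 20000 * 20000.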

+8

Each of G and solucion holds 400,000,000 ints, which on most machines is about 1.6 GB each. If you do not have that much (virtual) memory available (3.2 GB and counting) and permission to use it (try ulimit -d; that works in bash on Mac OS X 10.7.2), your process cannot even start, and it is killed by SIGKILL (which cannot be trapped; not that the process is really running yet anyway).
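If you want to inspect that limit from inside a program instead of the shell, here is a small sketch using the POSIX getrlimit call; as far as I know, RLIMIT_DATA is the same limit that bash's ulimit -d reports (in kilobytes there, in bytes here):

    #include <sys/resource.h>
    #include <cstdio>

    int main(){
        struct rlimit rl;
        // RLIMIT_DATA limits the size of the process's data segment.
        if (getrlimit(RLIMIT_DATA, &rl) == 0){
            if (rl.rlim_cur == RLIM_INFINITY)
                std::printf("data segment limit: unlimited\n");
            else
                std::printf("data segment limit: %llu bytes\n",
                            (unsigned long long)rl.rlim_cur);
        }
        return 0;
    }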

+6

Source: https://habr.com/ru/post/904866/

