So, I ran a static code analyzer over some C code, and one thing that surprised me was a warning about:
```c
int val;
scanf("%d", &val);
```
which said that for large enough input, this could lead to a segfault. And indeed, it can. Now the fix is simple enough (specify a field width, since we know how many digits an int can have at most, depending on the architecture), but I wonder why this happens in the first place, and why it isn't regarded as a bug in libc (and a simple one at that)?
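For reference, the fix I mean looks something like this (a minimal sketch; the width of 10 is an assumption based on a 32-bit int, whose largest value 2147483647 has 10 digits):

```c
#include <stdio.h>

int main(void)
{
    int val;
    /* The field width bounds how many characters %d may consume
       (at most 10 here), so arbitrarily long input can no longer
       be scanned unboundedly. 10 digits assumes a 32-bit int; a
       value with a leading '-' would need a width of 11. */
    if (scanf("%10d", &val) == 1)
        printf("read %d\n", val);
    return 0;
}
```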
I assume there are reasons for this behavior that I am simply not aware of?
Edit: Since the question doesn't seem to be clear enough, here is a bit more explanation: the analyzer does not warn about scanf in general, but about this particular use of it: reading a number with %d without a width specified.
So here is a minimal working example:
```c
#include <stdlib.h>
#include <stdio.h>

int main(void)
{
    int val;
    scanf("%d", &val);
    printf("Number not large enough.\n");
    return 0;
}
```
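(This assumes the file is saved as test.c and compiled to a binary named test, e.g. with `gcc -o test test.c`, so that it matches the cmd in the script below.)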
We can trigger the segfault by feeding it a giant number (generated, for example, with Python):
```python
import subprocess

cmd = "./test"
p = subprocess.Popen(cmd, stdin=subprocess.PIPE, shell=True)
# Feed a huge run of digits as one number; communicate() needs
# bytes in Python 3. If the program does not segfault, make the
# number larger.
p.communicate(b"9" * 50000000)
```