What are the main reasons why programming language runtimes use stacks?

Many programming language runtimes use a stack as their primary storage structure (JVM bytecode, for example, is stack-oriented).
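To make the JVM example concrete, here is a trivial method together with (roughly) the stack-oriented bytecode `javap -c` emits for it; the class and method names are illustrative:

```java
// Every operation shuffles its operands through the JVM's operand stack.
class Add {
    static int add(int a, int b) {
        return a + b;
        // javap -c shows approximately:
        //   iload_0   // push a onto the operand stack
        //   iload_1   // push b
        //   iadd      // pop both, push a + b
        //   ireturn   // pop the top of the stack and return it
    }
}
```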

Off the top of my head, I see the following advantages:

  • Simple structure (pop / push), trivial to implement
  • Most processors are optimized for stack operations anyway, so they are very fast.
  • Fewer problems with memory fragmentation: allocating is just moving the stack pointer up, and freeing a whole block is simply resetting the pointer to its previous offset.
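The last point can be sketched in a few lines. This is a minimal bump-pointer allocator (all names are illustrative, not any runtime's actual API): allocation bumps an offset, and freeing an entire frame is one pointer reset.

```java
// Sketch of bump-pointer stack allocation: no free lists, no
// fragmentation -- just an offset that moves up and down.
public class BumpStack {
    private final byte[] memory = new byte[1024];
    private int top = 0;                  // next free offset

    int alloc(int size) {                 // "push": bump the pointer
        int addr = top;
        top += size;
        return addr;
    }

    void resetTo(int savedTop) {          // "pop": free everything above
        top = savedTop;
    }

    public static void main(String[] args) {
        BumpStack s = new BumpStack();
        int frame = s.top;                // remember the offset before a "call"
        s.alloc(16);                      // locals for the callee
        s.alloc(8);                       // a temporary
        s.resetTo(frame);                 // one pointer move frees it all
        System.out.println(s.top);        // prints 0
    }
}
```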

Is the list complete or am I missing something? Are there any programming language runtimes that don't use stacks for storage at all?


I'll just leave a link here to one of the most insightful developers alive (and still active), a Hotspot JVM architect:

When you look at a CPU executing bytecodes directly, you will see a lot of hardware complexity around the basic problems of execution (I'm skipping many obvious examples, but here's one: caching the top of the stack is painful, because every stack operation directly depends on the one before it).

http://www.azulsystems.com/blog/cliff-click/2010-04-21-un-bear-able
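The dependency problem mentioned above is easy to see in a toy interpreter (a sketch with made-up opcodes, not any real VM): every instruction reads and writes the same stack top, so consecutive operations form a serial chain.

```java
// Toy stack machine: each opcode depends on the stack pointer and the
// stack top written by the previous opcode -- the serial dependency
// that makes stack code hard to execute in parallel in hardware.
class StackVM {
    static final int PUSH = 0, ADD = 1, MUL = 2;

    static int run(int[] code) {
        int[] stack = new int[64];
        int sp = 0;                               // stack pointer
        for (int pc = 0; pc < code.length; ) {
            switch (code[pc++]) {
                case PUSH: stack[sp++] = code[pc++]; break;
                case ADD:  stack[sp - 2] += stack[sp - 1]; sp--; break;
                case MUL:  stack[sp - 2] *= stack[sp - 1]; sp--; break;
            }
        }
        return stack[sp - 1];                     // result is on top
    }

    public static void main(String[] args) {
        // (2 + 3) * 4
        System.out.println(run(new int[]{PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL}));
        // prints 20
    }
}
```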


Stacks also map directly onto call/return semantics: entering and leaving a scope is just a push/pop. The main complication is concurrency/coroutines, since every concurrent thread of execution needs a stack of its own; Go, for instance, gives each goroutine its own small, growable stack.

" " , (, , ). , , , , , , .



The stack is also used to pass parameters between methods. Typically the caller pushes the parameters onto the stack, and the callee then knows where to find them (at a negative offset from the current stack pointer).
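A small sketch of that convention (the class and methods are illustrative, not a real calling convention): the caller pushes arguments, and the callee locates argument `i` of `n` at `sp - n + i`, i.e. at a negative offset from the current stack pointer.

```java
// Caller pushes arguments; callee reads them relative to the stack pointer.
class CallStack {
    int[] stack = new int[64];
    int sp = 0;                // stack pointer

    void push(int v) { stack[sp++] = v; }

    // Argument i of n lives at offset -(n - i) from the stack pointer.
    int arg(int n, int i) { return stack[sp - n + i]; }

    public static void main(String[] args) {
        CallStack s = new CallStack();
        s.push(10);            // caller pushes the first parameter
        s.push(32);            // and the second
        // "callee": two args, at offsets -2 and -1 from sp
        System.out.println(s.arg(2, 0) + s.arg(2, 1)); // prints 42
    }
}
```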


Source: https://habr.com/ru/post/1785767/

