Is delete equivalent to delete [] for basic data types?

While reviewing code, I saw that a colleague had written double* d = new double[foo]; and then called delete d . I told them that they should change it to delete [] d . They replied that the brackets are not needed for basic data types. I did not agree.

So I thought I would prove my point with an experiment:

    #include <iostream>

    #define NUM_VALUES 10000

    int main(int argc, char** argv)
    {
        int i = 0;
        while (true)
        {
            std::cout << i << "|";
            double* d = new double[NUM_VALUES];
            std::cout << ((void*)d) << std::endl;
            for (int j = 0; j < NUM_VALUES; j++)
                d[j] = j;
            delete d;   // deliberately the mismatched form under discussion
            i++;
        }
        return 0;
    }

Not only does memory usage not increase, but every allocation lands at the same address! (Visual Studio 2010.) Is this a quirk of the Visual Studio compiler, or is this part of the standard?

+6
5 answers

If you use new , you must use delete .

If you use new [] , you must use delete [] .

They mean different things. double* d = new double(); allocates and constructs a single double. delete d; destroys and deallocates that one double. double* d = new double[NUM_VALUES]; allocates NUM_VALUES doubles, and delete [] d destroys and deallocates each of those NUM_VALUES doubles.

+10

I told them that they should change it to delete [] d

You were right. Using the wrong delete form gives undefined behavior.

Is this a quirk of the Visual Studio compiler? Or is this part of the standard?

It is quite likely that an implementation obtains memory from the same place for "single object" and "array" allocations, in which case using the wrong form of delete happens to work for trivial types. It is also quite likely that freeing a block and then allocating the same amount again reuses the same memory, which explains the repeated address. However, none of this is guaranteed by the standard.

+5

You are absolutely right in your disagreement.

As has been said many times, new/delete and new[]/delete[] are two separate, completely independent mechanisms for managing dynamic memory. They must not be mixed (e.g. by using delete on memory allocated with new[] ), no matter what type they are used with.

The two mechanisms can be physically different, i.e. they can use completely different memory layouts, completely different memory pools, and even rely on completely different system-level calls.

Not to mention that at the language level the raw memory allocation functions used by these mechanisms (the operator new() family and friends) are independently replaceable, which means that even if thoughtlessly mixing these operators appears to "work" for basic types in your implementation, it can easily be broken by replacing the raw memory allocation functions.

+1

First, the accepted answer to this question, delete vs delete [] , quotes the standard saying that the behavior is undefined. That should satisfy your colleague. There is more discussion here: Delete [] equal to delete? .

If he is not convinced, you can remind him that the global new/delete , as well as the global new[]/delete[] , can each be replaced as a pair, so even if the mismatched combination new[]/delete happens not to crash for basic types on VS2010's implementation, there is absolutely no guarantee that it will not fail on another implementation.

For example, we replace the global new[]/delete[] in debug mode to plant hidden "end of array" markers that let us check usage. Naturally, this only works if delete[] is called on double arrays created with new[] .

However, as an old-time C++er, I think I know where your colleague's confusion comes from. Consider the simplest implementation of new/delete and new[]/delete[] : C malloc/free as the backend plus direct calls to constructors and destructors. It is easy to see that under such a simple implementation, which is what the original C++ implementations were, delete and delete[] were equivalent for types without destructors. This gave rise to a certain folklore, which may be the source of your colleague's claim, but it was never actually guaranteed and does not hold today.

+1

For reference, the reason it works is because in your implementation:

  • They both get their memory from the same source.
  • As an optimization for double (and presumably some other trivially destructible types), new[] does not prepend the requested array size to the memory. For types with destructors, delete[] needs to know how many objects to destroy, so the size must be stored somewhere.

The fact that this is how a particular implementation behaves is not a good reason to rely on this behavior.

The main reason the code would be wrong on some implementations is this: if new[] allocates space for the array plus space for its size and returns a pointer to the array, then delete[] subtracts the space used for the size before releasing the allocation, while delete does not. Your colleague is betting that he will never meet an implementation that does this for new double[] .

This required count cannot, in general, be recovered by a C++ implementation from non-standard functions such as malloc_size() , since those return the size of the block used to satisfy the allocation, not the requested size. So you should generally expect new[] and delete[] to use some trick of this sort, and it is merely a matter of implementation quality whether they manage to avoid it for trivially destructible types.

Even if the implementation is smart enough to avoid the extra space for a double array, it may well have a debugging mode that deliberately detects the mismatch anyway.

+1

Source: https://habr.com/ru/post/958580/

