Why is it bad practice to distribute the debug version of an application in .NET?

Reading this question, the first comment, by @Cody Gray, says:

Umm, you do know that you're not supposed to redistribute the Debug version, right?

That worries me. In Visual Studio, I usually develop my applications in Debug mode, and when I need to distribute the executable, all I do is zip the .exe and the required .dll files from the bin\Debug folder.

Why is this a bad idea?

What is the difference between this and doing exactly the same thing in Release mode?

Edit:
I asked this question a while ago, but I just wanted to edit it to add one difference I have since noticed:

Lines that use Debug.Assert in the code for verification disappear entirely when the code is compiled in Release mode, so that is another difference.
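For illustration, a minimal sketch (the class and values here are just placeholders): Debug.Assert is marked [Conditional("DEBUG")], so the compiler removes the call when the DEBUG symbol is not defined, which is the default for a Release build.

    using System;
    using System.Diagnostics;

    class OrderProcessor
    {
        public void Process(int quantity)
        {
            // Debug.Assert is marked [Conditional("DEBUG")], so this call (and the
            // evaluation of its arguments) is stripped out of a Release build.
            Debug.Assert(quantity > 0, "Quantity must be positive");

            // Any check the program actually relies on must therefore stay as
            // ordinary code, because it has to survive the Release build too.
            if (quantity <= 0)
                throw new ArgumentOutOfRangeException("quantity");

            // ... real processing ...
        }
    }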

+4
5 answers

It depends on what language you used to develop your program. If you use C++, you get the /RTC (runtime error check) overhead and the Edit + Continue support code. They significantly slow down the generated code and make it more likely that your application dies with a stack overflow if you use recursion. Runtime errors raised by the check code can be hard to diagnose without a debugger attached.

If you use VB.NET, then running the Debug build without a debugger makes it easy to trigger a memory leak. A flaw in the Edit + Continue support code causes a WeakReference leak for every instance of a class containing a WithEvents event. Your application will eventually die with an OutOfMemory exception.

If you use C#, then not a whole lot goes wrong; the JIT compiler simply cannot generate optimized machine code, and garbage collection is not as efficient. Your program will run more slowly and consume more memory than necessary. The same applies to VB.NET and C++/CLI.
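As a rough illustration (a sketch; the class name is just a placeholder), you can inspect an assembly's DebuggableAttribute to see whether it was built with the JIT optimizer disabled, which is what a default Debug build does:

    using System;
    using System.Diagnostics;
    using System.Reflection;

    class BuildCheck
    {
        static void Main()
        {
            Assembly asm = Assembly.GetExecutingAssembly();
            var attr = (DebuggableAttribute)Attribute.GetCustomAttribute(
                asm, typeof(DebuggableAttribute));

            // A default Debug build emits DebuggableAttribute with the JIT optimizer
            // disabled; a Release build either omits the attribute or leaves the
            // optimizer enabled.
            bool looksLikeDebug = attr != null && attr.IsJITOptimizerDisabled;
            Console.WriteLine(looksLikeDebug
                ? "Looks like a Debug build (JIT optimization disabled)."
                : "Looks like a Release build (JIT optimization enabled).");
        }
    }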

Perf is usually foremost on a programmer's mind when writing code, so shipping the Debug build is a bit sacrilegious. A significant number of programs, however, are completely throttled by I/O: the disk, the network card, or the database server. In that case the raw processor perf doesn't matter much.

+8

I think performance is the issue; this post contains more details.

+1

A pure C# or VB.NET application can run on any computer that has the .NET Framework redistributable installed, but a C++ or C++/CLI application (or a mixed-mode one) requires the VC redistributable package, which does not contain the debug versions of the necessary libraries. Assuming your users' PCs do not have Visual Studio installed, only the redistributable package, I would say you risk the debug version of your program simply not running there.
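As a rough sketch (it relies on the debug CRT naming convention, e.g. msvcr100d.dll; adjust the name check for your toolset version), a mixed-mode application can list its loaded native modules to see whether any debug CRT DLLs are being pulled in:

    using System;
    using System.Diagnostics;

    class LoadedCrtCheck
    {
        static void Main()
        {
            // The debug C runtime DLLs end in "d.dll" (for example msvcr100d.dll,
            // msvcp100d.dll) and ship with Visual Studio, not with the VC redist.
            foreach (ProcessModule module in Process.GetCurrentProcess().Modules)
            {
                string name = module.ModuleName.ToLowerInvariant();
                if (name.StartsWith("msvc") && name.EndsWith("d.dll"))
                    Console.WriteLine("Debug CRT loaded: " + name);
            }
        }
    }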

+1

There are also legal reasons. Quote: "Note that debug versions of an application are not redistributable, and that none of the debug versions of the various Visual C++ dynamic-link libraries are redistributable." From Redistributing Microsoft Visual C++ 6.0 Applications.

And this one is for Visual Studio 2010, "Determining Which DLLs to Redistribute": "You cannot redistribute all of the files that are included in Visual Studio; you are only allowed to redistribute the files specified in Redist.txt. Debug versions of applications and the various Visual C++ debug DLLs are not redistributable." From Determining Which DLLs to Redistribute.

+1

In terms of performance, I would be interested to see some metrics comparing Debug and Release.

I'm sure the performance difference varies depending on what the application is doing, but my guess is that most of the time the end user wouldn't notice the difference.
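For anyone curious, a minimal sketch of the kind of CPU-bound micro-benchmark you could compile once as Debug and once as Release and compare (the workload is an arbitrary placeholder):

    using System;
    using System.Diagnostics;

    class DebugVsReleaseBenchmark
    {
        static void Main()
        {
            Stopwatch sw = Stopwatch.StartNew();

            // Arbitrary CPU-bound work; an I/O-bound workload would show far
            // less difference between the two configurations.
            double sum = 0;
            for (int i = 1; i < 50000000; i++)
                sum += Math.Sqrt(i);

            sw.Stop();
            Console.WriteLine("sum = {0}, elapsed = {1} ms", sum, sw.ElapsedMilliseconds);
        }
    }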

0

Source: https://habr.com/ru/post/1341095/
