Are all languages used on .NET equally performant?

I know the marketing ("sales") answer to this question is yes, but is it technically true?

The Common Language Runtime (CLR) is built around an intermediate language designed for imperative programming (IP), and this has obvious consequences when working in a declarative programming (DP) style.

So, how efficient is a language based on a paradigm other than the imperative style when implemented on the CLR?

I also get the feeling that the step towards DP adds another level of abstraction that might not map efficiently onto the runtime; would that be a fair comment?

I did some simple tests in F# and it all looks great, but am I missing something that only shows up when the programs get more complicated?

+4

5 answers

In the end, all programming languages are compiled into the native machine code of the processor they run on, so the same questions could be asked of any language in general (not just those that compile to MSIL).

For languages that are essentially just syntactic variants of each other (e.g. C# and VB.NET), I would not expect a big difference. But once the languages diverge far enough (e.g. C# versus F#), you cannot really make a fair comparison, because you cannot write two "equivalent" nontrivial code samples in both languages anyway.

+2

There is no guarantee that different languages produce the same IL for equivalent code, so I can say with confidence that there is no guarantee that all .NET languages will be equally performant.

However, where they do produce the same IL output, there is no difference.
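You can check this kind of claim yourself: even within one language, two "equivalent" methods usually compile to different IL. Below is an illustrative sketch (the class and method names are made up for the example) that uses reflection to pull each method's raw IL bytes and compare them:

```csharp
using System;
using System.Linq;
using System.Reflection;

public static class IlPeek
{
    // Two methods with the same observable behavior, written in different styles.
    public static int SumLoop(int[] xs)
    {
        int total = 0;
        foreach (var x in xs) total += x;
        return total;
    }

    public static int SumLinq(int[] xs) => xs.Sum();

    public static void Main()
    {
        // GetMethodBody().GetILAsByteArray() returns the method's raw IL stream.
        byte[] loopIl = typeof(IlPeek).GetMethod(nameof(SumLoop)).GetMethodBody().GetILAsByteArray();
        byte[] linqIl = typeof(IlPeek).GetMethod(nameof(SumLinq)).GetMethodBody().GetILAsByteArray();

        Console.WriteLine($"loop IL: {loopIl.Length} bytes, linq IL: {linqIl.Length} bytes");
        Console.WriteLine(loopIl.SequenceEqual(linqIl) ? "identical IL" : "different IL");
    }
}
```

The two byte arrays come out different here, which is exactly the point: same behavior, different IL, and therefore no guarantee of identical performance.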

+8

First of all, the wide range of languages on the .NET platform definitely contains languages that generate code with different performance, so not all languages are equally efficient. They all compile to the same intermediate language (IL), but the generated code can differ: some languages may rely on Reflection or the Dynamic Language Runtime (DLR), and so on.

However, it is true that the BCL (and any other library) performs the same no matter which language you call it from. This means that if you use a library that does expensive computation or rendering, and you do not perform complex calculations yourself, it really doesn't matter which language you call it from.

I think the best way to think about the problem is not in terms of languages, but in terms of the different features and programming styles available in those languages. Some of them are listed below:

  • Unsafe code. You can use unsafe code in C++/CLI as well as in C#. This is probably the most efficient way to write certain operations, but you lose some safety guarantees.

  • Statically typed, imperative. This is the usual programming style in C# and VB.NET, but you can also write imperative code in F#. Notably, many tail-recursive F# functions are compiled into statically typed, imperative IL code, so this category also covers some F# functions.

  • Statically typed, functional. This is what most F# programs use. The generated code is very different from the imperative category, but it is still statically typed, so there is no significant performance penalty. Comparing imperative and functional code is somewhat tricky, since the optimal implementation looks completely different in each style.

  • Dynamically typed. Languages such as IronPython and IronRuby use the Dynamic Language Runtime (DLR), which implements dynamic method invocation and the like. This is somewhat slower than statically typed code (though the DLR is optimized in many ways). Note that code written using C# 4.0's dynamic also falls into this category.

There are many other languages that may not fall into any of these categories; however, I believe the list above covers the most common cases (and definitely covers all of the Microsoft languages).
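Three of the styles above can even coexist in a single C# file, which makes the contrast easy to see. This is a minimal sketch (the data and names are illustrative, not from the question): an imperative loop, a functional-style LINQ pipeline, and a dynamically typed call whose member resolution is deferred to the DLR at runtime:

```csharp
using System;
using System.Linq;

public static class Styles
{
    public static void Main()
    {
        int[] data = { 1, 2, 3, 4, 5 };

        // Statically typed, imperative: an explicit loop, compiled to direct IL.
        int imperative = 0;
        foreach (var x in data) imperative += x;

        // Statically typed, functional style: a LINQ pipeline over the same data.
        int functional = data.Where(x => x % 2 == 1).Sum()
                       + data.Where(x => x % 2 == 0).Sum();

        // Dynamically typed: the '+' here is bound at runtime through the DLR,
        // which is the same machinery IronPython/IronRuby calls go through.
        dynamic boxed = 10;
        int viaDlr = boxed + 5;

        Console.WriteLine($"{imperative} {functional} {viaDlr}");
    }
}
```

All three produce 15, but the generated code differs sharply: the loop is a handful of IL instructions, the LINQ version allocates delegates and iterators, and the dynamic version goes through runtime call-site binding.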

+7

I am sure there are scenarios where idiomatic code is a little more efficient written in one .NET language than in another. However, stepping back a little: why does it matter? Do you have a performance goal? Even within a single language you often have choices that affect performance, and sometimes you have to trade performance against maintainability or development time. Unless you have a target for what constitutes acceptable performance, it is impossible to assess whether any performance differences between languages are significant or trivial.

In addition, compilers evolve, so today's relative performance will not necessarily hold tomorrow. The JIT compiler is evolving too. Even processor designs vary and evolve, so the same JITted native code can perform differently on processors with different cache hierarchies, pipeline depths, branch prediction, etc.

Having said all that, there are a couple of broad rules that largely hold:

  • Differences in algorithms are likely to make a bigger difference than differences between compilers (at least when comparing statically typed languages running on the CLR).
  • For problems that are easy to parallelize, languages that simplify the use of multiple processors/cores offer an easy way to speed up your code.
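The second rule is cheap to demonstrate even in plain C#. This is a minimal sketch (the range size is arbitrary) using the thread-local overload of Parallel.For, which spreads an embarrassingly parallel reduction across cores with very little extra code:

```csharp
using System;
using System.Threading.Tasks;

public static class ParallelSum
{
    public static void Main()
    {
        const int n = 1_000_000;
        long total = 0;
        object gate = new object();

        // Each worker accumulates into its own local sum, then merges it
        // under a lock exactly once, avoiding per-iteration contention.
        Parallel.For(0, n,
            () => 0L,                                // per-task local state
            (i, state, local) => local + i,          // body: accumulate locally
            local => { lock (gate) total += local; } // merge once per task
        );

        Console.WriteLine(total); // sum of 0..n-1 = n*(n-1)/2
    }
}
```

The sequential version is a one-line loop; the parallel version costs only the localInit/localFinally plumbing, which is exactly the "easy way to speed up your code" the bullet point describes.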
+4

A language can simply be thought of as a front end for IL code, so the only difference between languages is whether their compilers produce the same IL, or more or less efficient IL.

From most of what I have read online, it seems that the managed C++ compiler does the best job of optimizing IL code, although I have not seen anything showing a dramatic difference between the main C#/C++/VB.NET languages.

You can even try compiling the following to IL and comparing the output:

F#

    #light
    open System
    printfn "Hello, World!\n"
    Console.ReadKey(true) |> ignore

C#

    // Hello1.cs
    public class Hello1
    {
        public static void Main()
        {
            System.Console.WriteLine("Hello, World!");
            System.Console.ReadKey(true);
        }
    }
0

Source: https://habr.com/ru/post/1306457/

