Using Visual Studio 2013 Pro, I created my first small console application. It needs to launch quickly on every computer in our network, which ranges from Windows XP SP3 up to Windows 8.1.
Therefore, I built the application targeting .NET 2.0, but, to my surprise, the Windows 8 machine complained that it first had to install/activate .NET 3.5. On Windows 8, .NET 3.5 is not activated by default!
When I target 2.0, it works on all of our XP and W7 machines, but not on W8. When I target 4.0, it works on all W8 machines and some W7 machines, but not on XP.
It is the same application. Is it really impossible to tell the OS to use whatever .NET infrastructure is already available?
I tried adding the configuration files "MyAppName.exe.config" and "App.config" to the project, containing:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <startup>
    <supportedRuntime version="v2.0.50727"/>
    <supportedRuntime version="v4.0"/>
  </startup>
</configuration>
I changed the order of the supportedRuntime entries and also tried targeting .NET 2.0, 3.0, 3.5, and the 3.5 Client Profile. But whenever the target is below .NET 4, W8 continues to request activation of .NET 3.5.
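For illustration, the reordered variant I tried looked roughly like this. As far as I understand, supportedRuntime lists CLR versions (v2.0.50727 hosts .NET 2.0 through 3.5, v4.0 hosts .NET 4.x) in order of preference, so this version should prefer the 4.0 CLR and fall back to the 2.0 CLR:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <startup>
    <!-- prefer the .NET 4 CLR when installed, otherwise fall back to the 2.0 CLR -->
    <supportedRuntime version="v4.0"/>
    <supportedRuntime version="v2.0.50727"/>
  </startup>
</configuration>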
How can I get this small application running on XP SP3, W7 and W8, without having to install any additional frameworks when the client has .NET 2.0 or higher?