Some guys at work ran into an interesting problem yesterday: since upgrading to VS2010, they found none of their breakpoints were being hit when debugging their apps.
Without giving too much away, my work consists of writing .NET apps which run on top of a custom platform application we’ve developed. These apps are compiled against .NET 2.0. Debugging typically involves setting our platform app as the startup program, then debugging into our code from there. Trust me, this is relevant. Also relevant – none of our developers working on Vista or Windows 7 machines had any issue – just the Windows 2003/XP crowd.
Something about the combination of Visual Studio 2010, .NET 2.0, and Windows XP meant that we could no longer debug into our applications.
So anyway, after a bit of Googling, I managed to find a workaround, thanks to an anonymous Microsoft employee. The issue seems to be that when debugging into managed code invoked from a native exe, Visual Studio will assume you’re debugging in .NET 4.0. As a result, all your .NET 2.0 code will run, but the debugger won’t hit any of your breakpoints.
The fix is to give the debugger a helpful hint, in the form of a .exe.config file in the directory you’re launching your startup exe from. In handy enumerated form:
1. Find the exact version of .NET 2.0 installed on your PC. One way to do this is via Internet Explorer; there are probably myriad other ways of doing it. Note the version down.
2. Create a <YourProgram>.exe.config file in your program’s directory with the following contents:

<?xml version="1.0"?>
<configuration>
  <startup>
    <supportedRuntime version="v2.0.<whatever you noted down>" />
  </startup>
</configuration>
3. Debug to your heart’s content!
Step 3 is essential.
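For what it’s worth, I’d guess the Internet Explorer trick in step 1 works because IE’s user agent string on those machines advertises the installed CLR versions as ".NET CLR x.y.zzzzz" tokens. Purely as a sketch (the user agent string below is a made-up example, not from a real machine), here’s how you’d pull the 2.0 version out of one:

```python
import re

# Example user agent string, as IE 8 on XP might report it with
# .NET 2.0 and 3.5 installed (illustrative, not captured from a real PC)
ua = ("Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; "
      ".NET CLR 2.0.50727; .NET CLR 3.5.30729)")

# Grab the full 2.0.x build number -- this is the value you note down
match = re.search(r"\.NET CLR (2\.0\.\d+)", ua)
print(match.group(1))  # 2.0.50727
```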
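If you have to roll this config file out to a lot of developer machines, scripting step 2 is easy enough. A minimal sketch (the exe name "MyPlatformApp.exe" and version "2.0.50727" are placeholders – substitute whatever you noted down in step 1):

```python
# Sketch: generate a <YourProgram>.exe.config next to the startup exe,
# telling the VS2010 debugger which CLR the process will actually load.
CONFIG_TEMPLATE = """<?xml version="1.0"?>
<configuration>
  <startup>
    <supportedRuntime version="v{version}" />
  </startup>
</configuration>
"""

def write_config(exe_path, version):
    """Write the .exe.config for the given exe and return its path."""
    config_path = exe_path + ".config"
    with open(config_path, "w") as f:
        f.write(CONFIG_TEMPLATE.format(version=version))
    return config_path

# Placeholder exe name and version -- use your own
path = write_config("MyPlatformApp.exe", "2.0.50727")
print(path)
```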
Note that I still have absolutely no idea why the original issue should arise only on Windows 2003 and XP machines. If anyone a) reads this and b) has an idea why, please let me know.