Can optimizations affect the ability to debug a VC++ app using its PDB?

Posted 2020-02-04 06:45

In order to properly debug release builds, a PDB file is needed. Can the PDB file become less usable when the compiler applies different kinds of optimizations (FPO, PGO, intrinsic functions, inlining, etc.)? If so, is the effect of optimization severe, or does it merely cause adjacent lines of code to get mixed up?

(I'm using VC2005, and will always choose debuggability over optimized performance - but the question is general.)

4 Answers
小情绪 Triste · 2020-02-04 06:57

In addition to local variables, the 'this' pointer is typically optimized away in optimized builds. This can sometimes be worked around by going up the call stack to a frame where the object pointer or reference still exists as a variable that has not been optimized away.
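
To make that concrete, here is a minimal sketch (all names invented, assuming a typical optimized x86 build): the member function is trivially inlinable, so inside it 'this' may never exist on the stack at all, while the caller one frame up still holds the object as an ordinary local you can inspect.

    #include <cstdio>

    // Hypothetical sketch (invented names): under an optimized build the compiler
    // will often inline Widget::Area() and keep `this` only in a register, or drop
    // it entirely, so the debugger may show it as unavailable inside Area().
    // One frame up, in Draw(), `widget` is usually still a plain local to inspect.
    struct Widget {
        int w = 3, h = 4;
        int Area() const { return w * h; }
    };

    void Draw() {
        Widget widget;
        std::printf("%d\n", widget.Area());  // step into Area(): `this` may be gone
    }

    int main() {
        Draw();
        return 0;
    }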

In general, single-stepping in an optimized build usually works more or less, and lets you see what logical decisions the code makes. Examining the data on which those decisions are based is usually much more complicated.

来,给爷笑一个 · 2020-02-04 06:58

Yes. It can be severe at times, although that's usually more the result of inlining or reordering of code.

Local variables also may not be displayed accurately in the watch window: they may exist only in registers, and they may be shown incorrectly when you switch stack frames.
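
A small sketch of what that looks like in practice (hypothetical code; the exact behavior depends on the compiler and optimization level):

    #include <cstdio>

    // Hypothetical sketch: with optimizations on, `sum` and `i` typically live only
    // in registers (the whole loop may even be folded to n*(n+1)/2), so the watch
    // window can show them as unavailable, or show stale values after you switch
    // to a different stack frame and back.
    static int SumTo(int n) {
        int sum = 0;
        for (int i = 1; i <= n; ++i)
            sum += i;
        return sum;
    }

    int main() {
        std::printf("%d\n", SumTo(10));
        return 0;
    }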

再贱就再见 · 2020-02-04 07:01

Optimization can severely impact debugging on any platform (not just VC's PDB files).

Exactly for the reasons you mentioned, function inlining can in some cases completely confuse which instructions belong to which function (since sometimes they effectively belong to both).
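
As a hypothetical illustration (invented names), a helper this small is almost certain to be inlined into its caller, and the debugger's line information gets blurry right at that boundary:

    #include <cstdio>

    // Hypothetical sketch (invented names): Clamp() is small enough that the
    // optimizer will almost certainly inline it into Process(). A breakpoint on
    // the clamping logic may then appear to sit inside Process(), and
    // single-stepping can bounce between the two functions' source lines,
    // because the same machine instructions effectively belong to both.
    static int Clamp(int v, int lo, int hi) {
        if (v < lo) return lo;
        if (v > hi) return hi;
        return v;
    }

    int Process(int raw) {
        return Clamp(raw, 0, 255) * 2;
    }

    int main() {
        std::printf("%d\n", Process(300));
        return 0;
    }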

Another common optimization is frame-pointer omission (-fomit-frame-pointer in GCC, /Oy in MSVC), which means the generated code no longer maintains a frame pointer. This is fine in itself: it frees up an extra register (EBP on x86) for other work. But it makes it nigh-impossible to unwind the stack to see what is actually going on, and just as hard to find local variables and function parameters on the stack.
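
For illustration only (invented names), here's the sort of call chain a debugger has to reconstruct; the comments describe what frame-pointer omission changes:

    #include <cstdio>

    // Hypothetical sketch (invented names) of the call chain the debugger has to
    // reconstruct. With frame-pointer omission there is no saved EBP chain linking
    // Leaf() back to Middle() and Outer(), so the debugger depends on unwind
    // metadata (e.g. FPO records in a PDB) to walk the stack and locate locals
    // and parameters.
    static void Leaf(int x)   { std::printf("leaf: %d\n", x); }
    static void Middle(int x) { Leaf(x + 1); }
    static void Outer(int x)  { Middle(x + 1); }

    int main() {
        Outer(1);
        return 0;
    }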

In general: Don't expect to get useful debug information out of "release" builds. If debugging is that important, even on release, then you should be "releasing" debug builds instead.

叼着烟拽天下 · 2020-02-04 07:07

Yes, optimized code is less debuggable. Not only is some information missing, some information will be very misleading.

The biggest issue in my opinion is local variables. The compiler may use the same stack address or register for multiple variables throughout a function. As other posters mentioned, sometimes even figuring out what the "this" pointer is can take a bit of time. When debugging optimized code you may see the current line jumping around as you single step, since the compiler reorganized the generated code. If you use PGO, this jumping around will probably just get worse.
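
Here's a hypothetical sketch (invented names; exact behavior depends on the compiler) of the slot/register reuse described above:

    #include <cstdio>

    // Hypothetical sketch: `before` and `after` have non-overlapping lifetimes,
    // so the optimizer may give them the same register or stack slot. While
    // stepping through the second half of the function, the watch window can
    // then show `before` with `after`'s value, which is misleading rather than
    // merely missing.
    static int Transform(int x) {
        int before = x * x;
        std::printf("before=%d\n", before);

        int after = x + 100;
        std::printf("after=%d\n", after);
        return after;
    }

    int main() {
        Transform(7);
        return 0;
    }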

FPO shouldn't affect debuggability too much provided you have a PDB, since the PDB contains all the information necessary to unwind the stack across FPO frames. FPO can, however, be a problem for tools that need to take stack traces without symbols. For many projects, the performance benefit of FPO nowadays doesn't outweigh the hit to diagnosability; for that reason, Microsoft decided not to build Windows Vista with FPO enabled (http://blogs.msdn.com/larryosterman/archive/2007/03/12/fpo.aspx).

I prefer to debug unoptimized code, but that isn't always possible - some problems only reproduce with optimized code, customer crash dumps come from the released build, and deploying a private debug build sometimes isn't an option. Often, when debugging optimized code, I use the disassembly view - disassembly never lies.

This all applies to WinDbg, since I do all my native-code debugging with it. Visual Studio's debugger might handle some of these cases better.
