Tracking Time Spent in Debugger

Posted 2020-07-16 03:09

Question:

[ [ EDIT 2x ] I think I worded my original question poorly, so I have moved it down below and rewritten exactly what I am trying to get at, for future readers. ]

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
[ New, Shiny, Clear Question with Better Wording ]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

I have a loop that is running for a simulation / gaming framework. This loop has several places in it where it needs to ascertain how much time - in reality - has passed, so that the logic within these special places - specifically, Rendering and Updating - can work correctly. The loop can optionally run as a Fixed Time Step (when unfixedupdate / unfixedrender is false) or not.

The problem arises when Breakpoint-based Debugging is done at any point in the application, since the loop uses stopwatches to figure out how much realtime has passed (so that physics and animation move at a realistic speed, rather than at whatever rate the computer can churn out frames).

It looks (roughly) like this, using multiple stopwatches for each 'part' of the application loop that needs to know how much time has passed since that 'part' last occurred:

while ( RunningTheSimulation ) {
    /* ... Events and other fun awesome stuff */

    TimeSpan updatedifference = new TimeSpan( updatestopwatch.ElapsedTicks );

    if ( unfixedupdate || updatedifference > updateinterval ) {

        Time = new GameTime( updatedifference,
                                    new TimeSpan( gamestopwatch.ElapsedTicks ) );
        Update( Time );
        ++updatecount;
        updatestopwatch.Reset( );
        updatestopwatch.Start( );
    }

    TimeSpan renderdifference = new TimeSpan( renderstopwatch.ElapsedTicks );

    if ( unfixedrender || renderdifference > renderinterval ) {

        Time = new GameTime( renderdifference,
                                 new TimeSpan( gamestopwatch.ElapsedTicks ) );
        Render( Time );
        ++rendercount;
        renderstopwatch.Reset( );
        renderstopwatch.Start( );

    }
}   
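One common game-loop workaround (my suggestion, not something from the original post) is to clamp each measured delta to a maximum plausible frame time, so that a long pause at a breakpoint is absorbed as one ordinary long frame instead of being fed into the simulation. A minimal sketch, where MaxDelta is an assumed tuning constant:

```csharp
using System;

static class DeltaClamp
{
    // Assumed upper bound on a "believable" frame time; tune to taste.
    static readonly TimeSpan MaxDelta = TimeSpan.FromMilliseconds( 100 );

    // Returns the measured delta, capped so that a 17-second pause at a
    // breakpoint comes back as a single 100 ms frame.
    public static TimeSpan Clamp( TimeSpan measured )
    {
        return measured > MaxDelta ? MaxDelta : measured;
    }
}

// Usage inside the loop, replacing the raw stopwatch read:
// TimeSpan updatedifference =
//     DeltaClamp.Clamp( new TimeSpan( updatestopwatch.ElapsedTicks ) );
```

This does not subtract debugger time exactly, but it bounds the damage: after resuming, the simulation advances by at most one clamped step.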

Some info about the variables:

updatestopwatch is a Stopwatch for the time spent outside of the Update() function,

renderstopwatch is a Stopwatch for the time spent outside the Render() function, and

gamestopwatch is a Stopwatch for the total elapsed time of the simulation/game itself.

The problem arises when I debug anywhere in the application. Because the stopwatches are measuring realtime, the Simulation will be completely thrown off by any kind of Breakpoint-based debugging because the Stopwatches will keep counting time, whether or not I'm debugging the application. I am not using the Stopwatches to measure performance: I am using them to keep track of time between re-occurrences of Update, Render, and other events like the ones illustrated above. This gets extremely frustrating when I breakpoint and analyze and fix an error in Update(), but then the Render() time is so completely off that any display of the results of the simulation is vengefully kicked in the face.

That said, when I stop debugging entirely it's obviously not a problem, but I have a lot of development work to do and I'm going to be debugging for a long time, so just pretending that this isn't inconvenient won't work, unfortunately. =[

I looked at Performance Counters, but I can't seem to wrap my head around how to get them to work in the context of what I'm trying to do: Render and Update can contain any amount of arbitrary code (they're specific to whatever simulation is running on top of this while loop), which means I can't steadily do a PerformanceCounter.Increment() for the individual components of the loop.

I'll keep poking around System.Diagnostics and other .NET namespaces, but so far I've turned up blanks on how to just "ignore" the time spent in the attached Debugger...

Anyone have any ideas or insight?

[ [ EDITS 5x ] Corrected misspellings and made sure the formatting on everything was correct. Sorry about that. ]

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
[ Original, less-clear question ]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

I have a constant loop running in a C# application, which I debug all the time. I am currently using a Stopwatch, but I could use any other mechanism to track the passing time. My problem begins when I do things like use Breakpoints somewhere during this loop (enclosed in a typical while (true) { ... }):

if ( unfixedupdate || updatedifference > updateinterval ) 
{
    Time = new GameTime( updatedifference, 
                        new TimeSpan( gamestopwatch.ElapsedTicks ) );
    Update( Time );
    ++updatecount;
    updatestopwatch.Reset( );
    updatestopwatch.Start( );
}

The time measures itself fine, but it measures actual real time - including any time I spent debugging. Which means if I'm poking around for 17 seconds after updatestopwatch.Reset(), this gets compounded onto my loop and - for at least 1 rerun of that loop - I have to deal with the extra time I spent in real time factoring into all of my calculations.

Is there any way I can hook into the debugger to know when it's freezing the application, so I can measure that time and subtract it accordingly? As tagged, I'm using .NET and C# for this, but anything related to Visual Studio might also help get me going in the right direction.

[ EDIT ] To provide more information, I am using several stopwatches (for update, rendering, and a few other events, all in a different message queue). If I set a breakpoint inside Update(), or in any other part of the application, the stopwatches will accurately measure the Real Time spent between these. This includes time I spend debugging various completely unrelated components of my application which are called downstream of Update() or Render() or Input() etc. Obviously the simulation's Timing (controlled by the GameTime parameter passed into the toplevel Update, Render, etc. functions) won't work properly if the CPU only took 13 ms to finish the update function but I spent 13 extra seconds debugging (looking at variables and then continuing the simulation); the problem is that the other stopwatches will suddenly account for those 13 extra seconds of time. If it still doesn't make sense, I'll chime in again.

Answer 1:

Use performance counters instead. The process CPU time should give a good indication (though not as accurate as a realtime stopwatch) and should not interfere with debugging.
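A minimal sketch of this idea, using Process.TotalProcessorTime rather than raw Win32 performance counters (my choice of API, not spelled out in the answer). While the debugger has the process broken, its threads are suspended and accumulate no CPU time, so this clock effectively "pauses" with you:

```csharp
using System;
using System.Diagnostics;

class CpuClock
{
    readonly Process proc = Process.GetCurrentProcess( );
    TimeSpan last;

    public CpuClock( ) { last = proc.TotalProcessorTime; }

    // CPU time this process consumed since the previous call. Unlike a
    // Stopwatch, this does not advance while stopped at a breakpoint.
    public TimeSpan NextDelta( )
    {
        proc.Refresh( );
        TimeSpan now = proc.TotalProcessorTime;
        TimeSpan delta = now - last;
        last = now;
        return delta;
    }
}
```

The caveat, as the answer notes, is that CPU time is not wall time: sleeps, vsync waits, and I/O contribute nothing, so pacing logic built on it will run fast on an idle loop.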



Answer 2:

A possible solution to this would be to write your locals (or whatever data you are trying to debug) to the output using Debug.WriteLine().
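For example, instead of breaking inside Update(), the interesting state can be traced to the Output window without suspending the process, so the stopwatches keep measuring only real loop time (the helper and its parameters here are illustrative, not from the question):

```csharp
using System;
using System.Diagnostics;

static class LoopTrace
{
    // Writes to the debugger's Output window (in DEBUG builds) without
    // stopping the process, so realtime measurements stay honest.
    public static void TraceUpdate( int updatecount, TimeSpan updatedifference )
    {
        Debug.WriteLine(
            $"update #{updatecount}: delta = {updatedifference.TotalMilliseconds} ms" );
    }
}
```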



Answer 3:

You haven't explained what you are actually trying to debug, or why animation timers doing exactly what they are expected to do with elapsed time is a problem. Understand that when you break, time continues on. What is the actual problem?

Also, keep in mind that timings while debugging are not going to be anywhere near the measurement when running in release with optimizations turned on. If you'd like to measure time frame to frame, use a commercial profiling tool. Finding how long a method or function took is exactly what they were made for.

If you'd like to debug whether or not your animation works correctly, create a deterministic test where you supply the time rather than depending on the wall clock, using dependency injection and a time provider interface.

This article has a great example: https://www.toptal.com/qa/how-to-write-testable-code-and-why-it-matters
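The pattern from that article might look something like this (the interface and class names below are hypothetical, not from the post or the article): the loop asks an injected clock for elapsed time, and tests or debug sessions swap in a deterministic clock that a breakpoint can never distort.

```csharp
using System;
using System.Diagnostics;

// Hypothetical abstraction over "how much time passed since last tick".
interface ITimeProvider
{
    TimeSpan ElapsedSinceLastUpdate( );
}

// Production clock: wraps a Stopwatch, measuring real wall time.
sealed class StopwatchTimeProvider : ITimeProvider
{
    readonly Stopwatch sw = Stopwatch.StartNew( );

    public TimeSpan ElapsedSinceLastUpdate( )
    {
        TimeSpan t = sw.Elapsed;
        sw.Restart( );
        return t;
    }
}

// Test/debug clock: hands out a fixed, scripted delta, so every run of the
// simulation is deterministic regardless of how long you sit at a breakpoint.
sealed class FixedStepTimeProvider : ITimeProvider
{
    readonly TimeSpan step;
    public FixedStepTimeProvider( TimeSpan step ) { this.step = step; }
    public TimeSpan ElapsedSinceLastUpdate( ) => step;
}
```

Update() and Render() would then take an ITimeProvider (or the GameTime built from it) instead of reading the stopwatches directly.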