I'm new to programming and SpriteKit in general, and I'm interested in exploring the relationship between milliseconds and framerate, and how the update function acts as an intermediary between the two.
Framerate vs. milliseconds
Essentially, the main difference between framerate and time is that time always passes at a consistent rate, while framerate does not (it can dip during graphically intensive moments). However, time is usually checked and updated inside SKScene's update method (which is called once per frame), so I'm trying to figure out how elapsed time is calculated correctly when you don't know how many frames there will be in a second.
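To check my understanding, here's a minimal sketch of the difference as I see it (the ship and the speed of 100 points are made-up examples, not from the linked project):

import SpriteKit

// Framerate-dependent: moves 100 points every FRAME, so the ship
// slows down in real time whenever the framerate dips.
func moveShipPerFrame(ship: SKSpriteNode) {
    ship.position.x += 100
}

// Framerate-independent: moves 100 points every SECOND, because the
// per-frame step is scaled by the time elapsed since the last frame.
func moveShipPerSecond(ship: SKSpriteNode, dt: CFTimeInterval) {
    ship.position.x += 100 * CGFloat(dt)
}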
Example
I am currently looking at the update event of a space shooter game, where the update function is responsible for counting time intervals before spawning another alien. You can view the full code here: http://www.globalnerdy.com/2014/08/10/a-simple-shoot-em-up-game-with-sprite-kit-and-swift-part-one-last-things-first-the-complete-project/
// Called exactly once per frame as long as the scene is presented in a view
// and isn't paused
override func update(currentTime: CFTimeInterval) {
    var timeSinceLastUpdate = currentTime - lastUpdateTime
    lastUpdateTime = currentTime
    if timeSinceLastUpdate > 1 {
        timeSinceLastUpdate = 1.0 / 60.0
        lastUpdateTime = currentTime
    }
    updateWithTimeSinceLastUpdate(timeSinceLastUpdate)
}
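From what I can tell, the delta then feeds an accumulator that decides when to spawn. Here's a sketch of that pattern as I understand it (the property name, interval, and spawnAlien helper are my own guesses, not necessarily what the linked project uses):

var timeSinceAlienAdded: CFTimeInterval = 0

func updateWithTimeSinceLastUpdate(timeSinceLastUpdate: CFTimeInterval) {
    // Accumulate real elapsed time across frames...
    timeSinceAlienAdded += timeSinceLastUpdate
    // ...and spawn once a full interval has passed, however many
    // frames that took. spawnAlien() is a hypothetical helper.
    if timeSinceAlienAdded > 1.5 {
        spawnAlien()
        timeSinceAlienAdded = 0
    }
}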
Problem
I can't seem to figure out why timeSinceLastUpdate is set to 1.0 / 60.0. I know it has to do with reconciling framerate and seconds, but can someone explain it to me? Also, why are we allowed to use fractional values? I thought time intervals were of type Int.
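To make my confusion concrete, here's the arithmetic as I understand it: at a target of 60 frames per second, each frame should last 1/60 of a second.

let targetFramerate = 60.0
let frameDuration = 1.0 / targetFramerate    // ≈ 0.0167 seconds
let frameDurationMs = frameDuration * 1000   // ≈ 16.7 milliseconds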
More importantly, is the purpose of this to keep the gameplay from slowing down during dips in framerate? Thanks for reading!