Syncing iOS animation with a sound file

Posted 2019-02-15 22:42

I have some animations that need to appear on screen at very specific timings, which are stored in an SQLite database. My plan is to use an NSTimer to keep time and trigger each animation when its specific time is reached.

I'm thinking of using an NSTimer to count through the duration of the sound file and animation, then trigger an on-screen animation when certain points are reached. The problem is that the stored timings are very precise, e.g. 55.715000 seconds, and they need to stay in sync with an audio track that will be played alongside the animation.

Firstly, is this even possible? Secondly, how can I compare such precise timings? The problem I seem to be facing is that the code can't run quickly enough, and the time jumps by more than 0.001 of a second.

I have no knowledge of OpenGL ES or Cocos2d, and learning them is not really feasible on my timescale.

2 Answers
干净又极端
Answer 2 · 2019-02-15 23:10

If your visuals need to be exactly in sync with the audio (for example a music app where animations have to appear on the beats) then you need to use the following approach. It works on very old iPhone hardware and there is basically nothing that can go wrong at runtime.

  1. Pre-render each "frame" of your visuals so that each one is stored as a full screen image. This could be done on the desktop or you can run the render logic on the phone and then capture the output to a file as pixels.

  2. Once each frame is saved as an array of pixels, play the audio via the standard AVAudioPlayer APIs on iOS. This API takes care of the audio playback and reports a time that you will use to determine which video frame to display.

  3. Once audio is playing, get the "time" and divide it by your frame duration (i.e. multiply it by the framerate) to determine which image from an array of N images to display. Get the image data and wrap it up in a CGImageRef/UIImage; this will blit the image data to the screen in an optimal way.
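
A minimal sketch of step 3, assuming the pre-rendered frames are held in an `NSArray` of `UIImage`s (`self.frames`) and the audio was started with an `AVAudioPlayer` (`self.audioPlayer`); all of these property names are illustrative:

```objc
// Called periodically (e.g. from a timer or display link) while audio plays.
- (void)updateFrame
{
    // AVAudioPlayer reports the current playback position in seconds.
    NSTimeInterval t = self.audioPlayer.currentTime;

    // At 30 FPS the frame duration is 1/30 s, so dividing the playback
    // time by it (equivalently, multiplying by 30) gives the frame index.
    NSUInteger index = (NSUInteger)(t * 30.0);

    if (index < self.frames.count) {
        self.imageView.image = self.frames[index];
    }
}
```

Because the frame index is always derived from the audio clock rather than accumulated from timer ticks, any dropped callback self-corrects on the next one.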

If you would like to see working source code for this approach, take a look at AVSync. This example code shows the implementation that I used in my own app, iPractice, on the App Store. It is very fast and can run at 30 FPS even on the old iPhone 3G.

ゆ 、 Hurt°
Answer 3 · 2019-02-15 23:14

Regardless of how accurate your timings are, you aren't going to get the device to display more frequently than about 60Hz (16.7 ms per frame). As I see it, you have at least two options here:

1) Use a CADisplayLink callback to check the audio's playback progress and trigger each animation as it comes due.

Display link timers are created in a similar way to regular NSTimers:

- (void)viewDidLoad
{
    // ...
    self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(scheduleAnimations:)];
    // NSDefaultRunLoopMode pauses the link while the UI is tracking touches;
    // use NSRunLoopCommonModes if the callback must keep firing during scrolling.
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
}
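
One caveat worth adding: a `CADisplayLink` retains its target and keeps firing until it is invalidated, so it should be torn down when the view goes away, e.g.:

```objc
- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    // The display link retains self; invalidating it breaks the cycle
    // and stops the per-frame callbacks.
    [self.displayLink invalidate];
    self.displayLink = nil;
}
```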

- (void)scheduleAnimations:(CADisplayLink *)displayLink
{
    // Get the playback time from the player object.
    NSTimeInterval now = self.audioPlayer.currentTime;

    // Iterate the queue of timings (here self.timings, an NSMutableArray of
    // NSNumber timestamps sorted ascending; the property names and helper
    // below are placeholders for your own objects).
    while (self.timings.count > 0 && [self.timings[0] doubleValue] <= now) {
        // If the timestamp of a timing is equal to or less than the playback
        // time, create its animation for immediate execution...
        [self startAnimationForTiming:self.timings[0]];
        // ...and remove it from the queue.
        [self.timings removeObjectAtIndex:0];
    }
}

2) Create a collection of animations, setting their respective beginTime properties to the appropriate trigger time (or, if using implicit or block-based animations, use the delay parameter).

[CATransaction begin];
// iterate collection of timings
    CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"key"];
    // The property is beginTime (there is no startTime); it is an offset
    // from the current media time, not a plain delay in seconds.
    animation.beginTime = CACurrentMediaTime() + /* time to trigger animation */;
    animation.fillMode = kCAFillModeForwards;
    animation.removedOnCompletion = NO;
    [layer addAnimation:animation forKey:nil];
[CATransaction commit];
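
Note that `beginTime` is interpreted in the layer's own timebase. If the layer (or an ancestor) has a modified `speed` or `timeOffset`, convert the host time first; a sketch, where `triggerOffsetSeconds` is an illustrative variable holding the trigger time:

```objc
// Map "now" into the layer's timebase before adding the trigger offset.
CFTimeInterval layerNow = [layer convertTime:CACurrentMediaTime() fromLayer:nil];
animation.beginTime = layerNow + triggerOffsetSeconds;
```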