I was really excited to see iOS 6 supports the Web Audio API, since we make HTML5 games. However, I cannot get iOS 6 to play any sound at all using the Web Audio API with examples that work fine in desktop Chrome.
Here is an HTML5 game with touch controls that plays audio via the Web Audio API (if present - otherwise it falls back to HTML5 audio):
http://www.scirra.com/labs/sbios6b/
Edit: @Srikumar suggested some workarounds. I applied them in the version below. It still does not work!
http://www.scirra.com/labs/sbios6f/
Everything plays just fine on desktop Chrome, but iOS 6 emits no sound at all. I'm having trouble debugging it because I only do Windows development, and iOS 6 replaced the debug mode with the remote web inspector, which apparently is not available on Safari for Windows. Using a few alerts I did find that it correctly identifies the Web Audio API, uses it, detects no Vorbis support so falls back to AAC audio, decodes a buffer and then plays it; no errors are thrown, but I hear nothing. And, of course, I tried turning the volume up to max :)
There should not be a codec problem, because iOS 6 can play AAC just fine - you can browse directly to one of the .m4a files the game plays and it plays fine in Safari.
Looking at the Web Audio API examples here on iOS 6: http://chromium.googlecode.com/svn/trunk/samples/audio/samples.html - some of them work, and others don't. For example, the Chrome Audio Visualizer works, but Javascript Drone doesn't.
There must be some subtle incompatibility between Web Audio on iOS 6 and desktop Chrome. What am I missing?
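For reference, the path the game follows when it detects Web Audio support is roughly the following (a simplified sketch, not the actual engine code; the file name is a placeholder). This works in desktop Chrome but stays silent on iOS 6:

```
// Simplified sketch of the game's Web Audio path (placeholder file name).
var context = new webkitAudioContext();

var xhr = new XMLHttpRequest();
xhr.open("GET", "sound.m4a", true);
xhr.responseType = "arraybuffer";
xhr.onload = function () {
    // Decode the AAC data, then play it through a one-shot buffer source.
    context.decodeAudioData(xhr.response, function (buffer) {
        var source = context.createBufferSource();
        source.buffer = buffer;
        source.connect(context.destination);
        source.noteOn(0);   // pre-standard name for start(0)
    }, function () {
        // Never fires in my tests - no errors are thrown, yet iOS 6 stays silent.
        alert("decodeAudioData failed");
    });
};
xhr.send();
```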
Answering the original question, I can confirm problems with file formats on an iPhone 4S with iOS 6 and on Mac OS X. If an MP3 file is "not good" for Safari, the decoding goes bad and calling AudioContext.createBuffer(array, bool) gives you an error.
The strange thing is the error itself: "SYNTAX_ERR, DOM Exception 12", as pointed out by others. This makes me think it is a bug.
Same behavior on Mac OS X with Safari 6.0 (7536.25).
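To illustrate where the error surfaces (just a sketch; arrayBufferFromXHR stands for whatever encoded ArrayBuffer you loaded):

```
// Sketch: the old synchronous createBuffer(data, mixToMono) overload throws on
// files Safari dislikes instead of returning a usable AudioBuffer.
var context = new webkitAudioContext();
try {
    var buffer = context.createBuffer(arrayBufferFromXHR, false);
} catch (e) {
    alert(e);   // reports "SYNTAX_ERR: DOM Exception 12" for the "bad" MP3s
}
```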
This isn't an actual answer, just a direction to look if things still aren't working. iOS 6 has audio issues on some devices (particularly the 64 GB 4S manufactured during a particular period, though I've seen others, so it may not actually be hardware related) and will mysteriously stop playing some kinds of sounds (not ringtones or voice, for some reason, but many other sounds), and its volume sliders will vanish. I've found it notoriously difficult to debug, as it will usually (though not always - sometimes you can catch it) happen only when not connected to a power cord.
Look in the console for ASSERTION FAILURE messages from the VirtualAudio_Device and from various codecs. This may have nothing whatsoever to do with your particular issue, but then again, a bug in one area of the sound device may be related to another. At minimum, it's an area to investigate if nothing else is helping.
You can try to debug it using the Web Inspector on Safari 6 on a Mac.
It doesn't work out of the box for me, but with a few tries it can help narrow down the problem.
Apparently there is also the restriction that audio can only be triggered by a user action. I'm not sure this is the whole story, because some code that works on iOS 6 on an iPhone 4 doesn't play any sound on an iPad (also iOS 6).
Update: Some success with Web Audio on iPhone 4 + iOS 6. I found that currentTime remains stuck at 0 for a while as soon as you create a new audio context on iOS 6. In order to get it moving, you first need to perform a dummy API call (like createGainNode(), discarding the result). Sounds play only when currentTime starts to run, but scheduling sounds exactly at currentTime doesn't seem to work - they need to be a little bit into the future (e.g. 10 ms). You can use a createAudioContext helper (sketched below) to wait until the context is ready to make noise. A user action doesn't seem to be required on the iPhone, but no such success on the iPad just yet.
Subsequently, when playing a note, don't call .noteOn(ac.currentTime), but do .noteOn(ac.currentTime + 0.01) instead.
Please don't ask me why you have to do all that. That's just the way it is at the moment - i.e. crazy.
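The helper itself isn't included in this post; a minimal sketch of what it could look like, based purely on the description above (the 100 ms polling interval is an arbitrary choice):

```
// Sketch of a createAudioContext(callback) helper: make a dummy createGainNode()
// call, then poll until currentTime starts advancing before declaring it ready.
function createAudioContext(callback) {
    var ac = new webkitAudioContext();
    ac.createGainNode();            // dummy call to kick the clock on iOS 6
    (function poll() {
        if (ac.currentTime > 0) {
            callback(ac);           // context is ready to make noise
        } else {
            setTimeout(poll, 100);
        }
    })();
}

// Usage: schedule sounds slightly in the future, not exactly at currentTime.
createAudioContext(function (ac) {
    var source = ac.createBufferSource();
    source.buffer = someDecodedBuffer;      // assume you decoded this already
    source.connect(ac.destination);
    source.noteOn(ac.currentTime + 0.01);   // a little bit into the future
});
```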
Updated 2015 solution: if you are here working on a Web Audio problem with iOS 6+, I've found these links helpful.
- This is a good article with a code solution: http://matt-harrison.com/perfect-web-audio-on-ios-devices-with-the-web-audio-api/
- Here is an update to the API after the above solution article was written: https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API/Porting_webkitAudioContext_code_to_standards_based_AudioContext
- Below is my updated solution to the first article, using the changes from the second article. The issue I was having was iOS 7 Safari throwing a strange not-enough-arguments error. This fixed it:
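The exact snippet isn't reproduced in this post, but the pattern with the standards-based names looks roughly like this (the silent one-sample buffer played on touchend is the usual unlock technique, and start must be given its argument explicitly):

```
// Sketch: unlock Web Audio on iOS inside a touch event by playing a silent buffer,
// using the standards-based names with a webkit-prefixed fallback.
var AudioContextClass = window.AudioContext || window.webkitAudioContext;
var context = new AudioContextClass();
var unlocked = false;

function unlock() {
    if (unlocked) return;
    var buffer = context.createBuffer(1, 1, 22050);   // one silent sample
    var source = context.createBufferSource();
    source.buffer = buffer;
    source.connect(context.destination);
    // Pass 0 explicitly - calling start() with no argument is what triggers the
    // "not enough arguments" error on some older Safari builds.
    if (source.start) { source.start(0); } else { source.noteOn(0); }
    unlocked = true;
    window.removeEventListener('touchend', unlock, false);
}
window.addEventListener('touchend', unlock, false);
```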
I had trouble with all the simple solutions, especially when I wanted to play a sound multiple times.
So I'm using this js library: http://pupunzi.open-lab.com/2013/03/13/making-html5-audio-actually-work-on-mobile
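For what it's worth, if the problem with repeated playback is the one-shot nature of buffer source nodes, the usual pattern (independent of that library) is to create a fresh source node for every play and reuse the decoded buffer:

```
// Buffer source nodes can only be started once, so create a new one per play
// and reuse the same decoded AudioBuffer.
function playSound(context, buffer) {
    var source = context.createBufferSource();
    source.buffer = buffer;
    source.connect(context.destination);
    if (source.start) { source.start(0); } else { source.noteOn(0); }
}
```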
I managed to figure out a simple solution which I'm sure must have been documented elsewhere - but sometimes we have to spend hours figuring these things out for ourselves...
So it seems many tutorials (such as this one on html5rocks) instruct you to do the following steps:

1. Create an instance of window.AudioContext and, if that doesn't exist (which it doesn't on iOS), create window.webkitAudioContext.
2. Create an XMLHttpRequest to load your sound file.
3. On the load event, run context.decodeAudioData(....), then createBufferSource(), filling it with the decoded data, and finally source.start(0) to play the sound.

As others have pointed out, you must create the AudioContext (which, incidentally, you must store and use for the lifetime of the page) as a result of a user interaction (click or touchstart).

HOWEVER: for iOS to 'unlock' its audio capabilities you MUST have audio data available when you create the AudioContext. If you load the data asynchronously, there's nothing for it to play. It is not sufficient to merely create the AudioContext inside a click event.

Here are two solutions for reliable iOS playback:
1) You must load at least one sound file before you even initialize the AudioContext, and then run all the above steps for that sound file immediately within a single user interaction (eg click); see the first sketch below.
OR 2) Create a sound dynamically in memory and play it.
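A rough sketch of that first option (the file name and the one-time click listener are illustrative, not prescriptive):

```
// Option 1 sketch: fetch the file up front, then create the AudioContext and
// decode + play the preloaded data inside the first user interaction.
var preloadedData = null;

var xhr = new XMLHttpRequest();
xhr.open("GET", "unlock.m4a", true);   // placeholder file name
xhr.responseType = "arraybuffer";
xhr.onload = function () { preloadedData = xhr.response; };
xhr.send();

document.addEventListener("click", function onFirstClick() {
    document.removeEventListener("click", onFirstClick);
    var context = new (window.AudioContext || window.webkitAudioContext)();
    context.decodeAudioData(preloadedData, function (buffer) {
        var source = context.createBufferSource();
        source.buffer = buffer;
        source.connect(context.destination);
        if (source.start) { source.start(0); } else { source.noteOn(0); }
    });
});
```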
This is how I did that second option:
REMEMBER - MUST BE within click/touch event for iOS:
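My original snippet isn't reproduced here, so what follows is only a minimal sketch of the idea: build a short buffer in memory inside the handler and play it (the 50 ms length and quiet 440 Hz tone are arbitrary choices; fill with zeros if you want it silent):

```
// Option 2 sketch: inside the touch/click handler, synthesize a short buffer in
// memory and play it - this both makes a sound and unlocks audio on iOS.
document.addEventListener("touchstart", function unlockAndBeep() {
    document.removeEventListener("touchstart", unlockAndBeep);

    var context = new (window.AudioContext || window.webkitAudioContext)();
    var length = Math.floor(context.sampleRate * 0.05);        // ~50 ms
    var buffer = context.createBuffer(1, length, context.sampleRate);
    var data = buffer.getChannelData(0);
    for (var i = 0; i < data.length; i++) {
        data[i] = Math.sin(2 * Math.PI * 440 * i / context.sampleRate) * 0.1;
    }

    var source = context.createBufferSource();
    source.buffer = buffer;
    source.connect(context.destination);
    if (source.start) { source.start(0); } else { source.noteOn(0); }
});
```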
I imagine this is a common mistake - and I'm surprised after 3 years that nobody seems to have pointed this out or discovered it :-/