There are some bugs that you fix and brag about to your friends afterward, and then there are the ones that are just plain embarrassing. Unfortunately, I’m writing about one of the latter today. You see, this pitfall, though no more or less annoying than the others I cover in this series, is made all the worse by the fact that I should have known better. If you’ve ever listened to an episode of Coder Radio in which I mention the iOS Simulator, then you know that I always warn developers never to trust it; the reason is that your code isn’t really running on iOS but rather on a pseudo-OS X platform. Never ever trust the simulator. Ever.
Taking a step back, let’s look at the issue. I have a number of audio files that all play just fine on the iOS Simulator, but my beta testers were reporting that they couldn’t hear any audio at all on their devices. Odd, to say the least. I scoured my AVAudio implementation looking for the most obscure of bugs but found nothing. The good news, using “good” very loosely, is that after deleting and reinstalling the app I was able to reproduce the bug myself.
After a few hours of poring over LLDB output and various TestFlight logs, everything still looked fine: the audio URLs (the audio comes from a server) were all valid, there didn’t appear to be any NSZombie-related issues, and the audio files themselves seemed intact. I was totally lost.
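In hindsight, the one place this bug was willing to announce itself was in the error handling around playback: on a device, AVAudioPlayer’s throwing initializer will refuse data it can’t decode, while the simulator may play the very same bytes happily. Here’s a minimal sketch in Swift of what I mean; the names are placeholders, and the synchronous download is only there to keep the example short.

```swift
import AVFoundation

final class AudioController {
    // Keep a strong reference; AVAudioPlayer stops if it is deallocated.
    private var player: AVAudioPlayer?

    // The URL is a stand-in for the server-provided audio URL.
    func play(from url: URL) {
        do {
            // Synchronous download purely to keep the sketch short;
            // real code would fetch the data asynchronously.
            let data = try Data(contentsOf: url)
            player = try AVAudioPlayer(data: data)
            player?.prepareToPlay()
            player?.play()
        } catch {
            // On a device, an unsupported codec should fail here with an
            // error, while the simulator, backed by OS X's broader codec
            // support, may play the very same file without complaint.
            print("Audio playback failed: \(error)")
        }
    }
}
```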
Then an offhand comment I’d found on a StackOverflow post came back to me: the simulator uses OS X’s audio subsystem, which differs from iOS’s. In particular, OS X has access to a few more audio codecs than iOS does. As it turns out, the API I was interacting with offers different calls that return the audio in different formats. I changed the call to request the MP3-formatted files and was good to go. Facepalm.
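In code, the fix amounted to something like the following. The host, path, and “format” query parameter are hypothetical stand-ins for the actual API, but the idea is the same: explicitly request MP3, a codec iOS is guaranteed to handle, instead of whatever default the other call returned.

```swift
import Foundation

// Hypothetical sketch of the fix. The endpoint and "format" parameter
// are stand-ins for the real API, which offered the same audio in
// several encodings.
func audioURL(forEpisode id: String) -> URL? {
    var components = URLComponents(string: "https://api.example.com/audio/\(id)")
    // Request MP3 explicitly rather than the server's default format.
    components?.queryItems = [URLQueryItem(name: "format", value: "mp3")]
    return components?.url
}
```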
I hope you’ve enjoyed this look at another programming pitfall; please do leave any comments on Google+ or Twitter. Also, please keep in mind that I am always available for consulting projects.