React-native-track-player: iOS release mode : No sound played

Created on 6 Mar 2020 · 20 comments · Source: react-native-kit/react-native-track-player

Configuration

System:
OS: macOS 10.15.3
CPU: (8) x64 Intel(R) Core(TM) i7-4870HQ CPU @ 2.50GHz
Memory: 302.21 MB / 16.00 GB
Shell: 5.7.1 - /bin/zsh
Binaries:
Node: 10.16.3 - /usr/local/bin/node
Yarn: 1.17.0 - /usr/local/bin/yarn
npm: 6.13.7 - /usr/local/bin/npm
Watchman: 4.9.0 - /usr/local/bin/watchman
SDKs:
iOS SDK:
Platforms: iOS 13.2, DriverKit 19.0, macOS 10.15, tvOS 13.2, watchOS 6.1
Android SDK:
API Levels: 23, 24, 25, 26, 27, 28, 29
Build Tools: 23.0.1, 25.0.0, 26.0.3, 27.0.3, 28.0.2, 28.0.3, 29.0.2
System Images: android-22 | Google APIs ARM EABI v7a, android-22 | Google APIs Intel x86 Atom_64, android-26 | Google APIs Intel x86 Atom_64, android-26 | Google Play Intel x86 Atom, android-28 | Google Play Intel x86 Atom, android-Q | Google APIs Intel x86 Atom
IDEs:
Xcode: 11.3.1/11C504 - /usr/bin/xcodebuild
npmPackages:
react: 16.9.0 => 16.9.0
react-native: 0.61.5 => 0.61.5
npmGlobalPackages:
create-react-native-app: 1.0.0

"react-native-track-player": "^1.2.2"

Issue

I'm currently facing a weird issue where my mp3 files don't play on iOS in release mode. There is no crash at all; the sound simply doesn't play... It works in debug mode, though, and works perfectly on Android (both release and debug mode). Has anyone faced the same issue?

Code

    const onMount = async () => {
        await setupPlayer({ ...options, waitForBuffer: true });
        await registerPlaybackService(() => require("./service.js"));
    };

    useEffect(() => {
        // noinspection JSIgnoredPromiseFromCall
        onMount();
    }, []);

    let endOfQueueResolver: (value?: any) => void;
    TrackPlayer.addEventListener(
        "playback-queue-ended",
        () => endOfQueueResolver && endOfQueueResolver(),
    );

    const remount = async (newOptions?: PlayerOptions): Promise<void> => {
        destroy();
        // noinspection ES6MissingAwait
        setupPlayer(newOptions);
    };

    const add = async (sound: any, insertBeforeId?: string): Promise<void> => {
        const track = {
            url: sound,
            id: uuid.v4(),
            artist: "",
            title: "",
        };
        // noinspection ES6MissingAwait
        TPAdd(track, insertBeforeId);
    };

    const replay = async (): Promise<void> => seekTo(0).then(() => TPPlay());

    const play = async (): Promise<void> => {
        await TPPlay();
        await new Promise(resolve => {
            endOfQueueResolver = resolve;
        });
    };
iOS


All 20 comments

These are some of the issues I came across while working with this library on iOS.

  1. The url should be encoded with the encodeURI function, even if it's a local file.
  2. prepend the url with file://.
  3. iOS app's directory path changes on app update, build or reinstall. So you shouldn't be saving the absolute path of local files, instead save the relative path and build the full path before adding to queue.

Hopefully one of these helps you out.
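A minimal sketch combining points 1–3 above (the helper name is mine; the document directory would come at runtime from something like react-native-fs's `RNFS.DocumentDirectoryPath`, which is an assumption here):

```typescript
// Build a playable track URL from a path stored RELATIVE to the app's
// document directory. The absolute sandbox path changes between iOS app
// installs/updates, so only the relative part should ever be persisted.
const toTrackUrl = (documentDir: string, relativePath: string): string => {
  // 1. Rebuild the absolute path at runtime from the current document dir.
  const absolutePath = `${documentDir.replace(/\/$/, "")}/${relativePath}`;
  // 2. Prefix with the file:// scheme and percent-encode unsafe characters.
  return `file://${encodeURI(absolutePath)}`;
};
```

The resulting string (e.g. spaces encoded as `%20`) is what would be passed as the track's `url` when adding to the queue.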

@asurare are you using local or network files?

@Esirei Thanks for the comment, but I had to make it work so I ended up using another Library (react-native-sound).

@curiousdustin I was using local mp3 files.

I'm facing the same issue, using 2.0.0-rc13

@stefanos1019 are you able to reproduce the issue within the example app?

@curiousdustin sorry for my late response.

To be precise this is happening with rc13 when I’m using it in combination with a recorder (https://github.com/react-native-community/react-native-audio-toolkit).

The idea is that I record something and play it back.
Before something is recorded, everything works fine. When I record something and go back to my “player screen” to try to play something (either a local mp4 or an online mp4), the player jumps between the ready, playing and paused states and gets stuck at paused.

Adding the recorder library to the example app with version 1.2.3 results in the same problem.

Things I've tried that gave same results:

  • initialize (call setupPlayer and updateOptions) only once
  • destroy before recording something and call registerPlaybackService, setupPlayer and updateOptions before trying to use the player again
  • calling setupPlayer and updateOptions on every render of the "playerScreen"

So I think it's an issue on the recorder library's side. Maybe they are not releasing some resources, and the player assumes that something else is playing although it isn't?
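One thing worth ruling out in the "destroy, then set up again" attempt above is a race between teardown and re-setup. A hedged sketch (written against a minimal interface, not the library's API verbatim) of a remount that fully awaits teardown before creating a fresh player:

```typescript
// Minimal lifecycle interface standing in for the track-player module,
// so the ordering constraint is explicit and testable.
interface PlayerLifecycle {
  destroy(): Promise<void>;
  setupPlayer(options?: Record<string, unknown>): Promise<void>;
}

const remountSequenced = async (
  player: PlayerLifecycle,
  options?: Record<string, unknown>,
): Promise<void> => {
  await player.destroy(); // wait for the old instance to release its resources
  await player.setupPlayer(options); // only then create a fresh player
};
```

If `setupPlayer` runs while the old player is still releasing its audio session, the new player can end up in an undefined state, which would look exactly like the ready/playing/paused flapping described.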

I am getting the same issue: local mp3s don't work in release mode, but they do on the simulator and on Android. Also, all remote files work.

@stefanos1019, are you able to test a scenario similar to this to help narrow down the cause in your case:

  1. Run your app, and use audio toolkit to do your recording like normal. (Assuming the file gets stored to disk?)
  2. Modify your code to disable any use of audio toolkit.

Will the files generated by audio toolkit work in this scenario? Trying to determine if the issue is with the files, paths, or as you said the combination of 2 audio related plugins...

@scottfits, Can you attempt to reproduce your issue within the example app?

I'm facing the exact same issue as @scottfits and @IlinIgor - local files will play on Android (debug + release) and on iOS (debug), but not in iOS release builds (device + simulator).

I'm not doing any recording (not using react-native-audio-toolkit) and I suppose the other two also don't because they didn't mention it.

My analysis so far:

I logged the state in the function RNTrackPlayer.swift::getState (because I cannot log in JS since it's a release build):

  • When playing for the first time, getState shows idle twice and the state then seems to get stuck somewhere...
  • When playing a second time, getState shows loading twice, but then the state is not clearly defined, I guess, and I cannot start playing the track.
  • When switching tracks, it's most of the time the idle state that gets logged by getState.

@curiousdustin I tried to replicate it in the example app but without success so far - it works in the example app (with the latest commit from May 19). I'll try with the newest version of dev tomorrow.
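Since console logs aren't visible in a release build, a hedged alternative to patching RNTrackPlayer.swift::getState is to trace state transitions from the JS side and surface them in an Alert. A sketch (the event name "playback-state" matches the 1.x JS API; the wiring is commented out because it needs a React Native environment):

```typescript
// Collect every playback-state event so the full transition sequence can
// be shown later, e.g. via Alert.alert(), which works in release builds.
const makeStateTracer = () => {
  const transitions: string[] = [];
  // Pass this to TrackPlayer.addEventListener("playback-state", onState)
  const onState = ({ state }: { state: number | string }) => {
    transitions.push(String(state));
  };
  // e.g. Alert.alert("Player states", tracer.dump())
  const dump = () => transitions.join(" -> ");
  return { onState, dump };
};
```

This makes the "idle twice, then stuck" pattern visible without a debug build attached.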


I'm on the current dev branch

Anything odd about your file names? You could try the latest on the dev branch. Some changes related to parsing URLs and paths were just merged.

It now works with the latest on dev branch! 🎉

It looks like the newest commit on the dev branch (i.e. #950) fixed this issue.

I don't think my local file names were particularly odd, although they contained dashes and started with numbers. Perhaps that caused some issues in the release build?

However after updating and running pod install I got the following error:

Use of unresolved identifier 'RCTConvert'

I fixed it in #964

Has anyone managed to find a solution for this? @curiousdustin I'm facing the exact same issue. When the component loads, I can play audio from an S3 bucket just fine. However, when I record a piece of audio using react-native-audio-record, upload it to the S3 bucket, then take the returned URL and try to play it, react-native-track-player goes from buffering >> playing >> paused instantly and no audio is played.

@maxckelly did you manage to solve this somehow? I'm facing the same issue now.

@andordavoti
try this:

    await TrackPlayer.setupPlayer({
        iosCategory: 'playAndRecord',
        iosCategoryMode: 'default',
        iosCategoryOptions: [],
        waitForBuffer: true,
    });

@andordavoti - I did the above as well and it worked.

Unfortunately, it didn't work for me. I'm using @react-native-community/audio-toolkit, however, I found another workaround by just playing the audio recorded with the same lib when the recording is finished. Not perfect, but works for now.

Actually, the issue is not related to this repo. It's related to @react-native-community/audio-toolkit: after stopping a recording, they don't set the AVAudioSession active flag to false. I suggest you use react-native-audio, as they set the AVAudioSession inactive after stopping a recording:
https://github.com/jsierles/react-native-audio/blob/master/ios/AudioRecorderManager.m#L104

@ebrahimhassan121 unfortunately react-native-audio is no longer actively maintained, so that is no longer a good option. I opened an issue for what you described in @react-native-community/audio-toolkit. Thanks for providing these specifics!

Think I found a workaround. In RNAudioRecord.m I've changed the methods below. I tested this today and it all works. It would be good if we could put in a PR for this change.

On Start:

    // most audio players set session category to "Playback"; record won't work in that mode,
    // so set the session category to "Record" before recording
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];

On Stop:

    // revert the audio session to Playback
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];

RCT_EXPORT_METHOD(start) {
    RCTLogInfo(@"start");

    // most audio players set session category to "Playback", record won't work in this mode
    // therefore set session category to "Record" before recording
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];

    _recordState.mIsRunning = true;
    _recordState.mCurrentPacket = 0;

    if (_recordState.saveWavBoolean) {
        CFURLRef url = CFURLCreateWithString(kCFAllocatorDefault, (CFStringRef)_filePath, NULL);
        AudioFileCreateWithURL(url, kAudioFileWAVEType, &_recordState.mDataFormat, kAudioFileFlags_EraseFile, &_recordState.mAudioFile);
        CFRelease(url);
    }

    AudioQueueNewInput(&_recordState.mDataFormat, HandleInputBuffer, &_recordState, NULL, NULL, 0, &_recordState.mQueue);
    for (int i = 0; i < kNumberBuffers; i++) {
        AudioQueueAllocateBuffer(_recordState.mQueue, _recordState.bufferByteSize, &_recordState.mBuffers[i]);
        AudioQueueEnqueueBuffer(_recordState.mQueue, _recordState.mBuffers[i], 0, NULL);
    }
    AudioQueueStart(_recordState.mQueue, NULL);
}

RCT_EXPORT_METHOD(stop:(RCTPromiseResolveBlock)resolve
                  rejecter:(__unused RCTPromiseRejectBlock)reject) {
    RCTLogInfo(@"stop");
    if (_recordState.mIsRunning) {
        _recordState.mIsRunning = false;
        AudioQueueStop(_recordState.mQueue, true);
        AudioQueueDispose(_recordState.mQueue, true);
        if (_recordState.saveWavBoolean) {
            AudioFileClose(_recordState.mAudioFile);
        }
    }
    if (_recordState.saveWavBoolean) {
        resolve(_filePath);
    }
    // revert the audio session to Playback
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];

    if (_recordState.saveWavBoolean) {
        unsigned long long fileSize = [[[NSFileManager defaultManager] attributesOfItemAtPath:_filePath error:nil] fileSize];
        RCTLogInfo(@"file path %@", _filePath);
        RCTLogInfo(@"file size %llu", fileSize);
    }
}