
javascript - Alternative solution to require audio dynamically using expo-av - Stack Overflow


I'm developing an audio application where I download all the audio files at once and then play them using the expo-av library. The issue is that when I try to require an audio file, I get the following error:

Invalid call at line 77: require(audioUrl)
Failed building JavaScript bundle.

I'm aware that we cannot require files dynamically in React Native. Is there an alternative way to play the downloaded audio files?

The function where I require the downloaded audio file:

const _loadNewPlaybackInstance = async isPlaying => {
    if (playBackInstance != null) {
      await playBackInstance.unloadAsync()
      await setPlaybackInstance(null)
    }
    const source = require(audioUrl) // this is the dynamic require that fails to bundle
    const initialStatus = {
      shouldPlay: isPlaying,
      volume,
      rate: audioRate,
      shouldCorrectPitch: true,
      pitchCorrectionQuality: Audio.PitchCorrectionQuality.Low
    }
    const { sound, status } = await Audio.Sound.createAsync(
      source,
      initialStatus,
      _onPlaybackStatusUpdate
    )
    setPlaybackInstance(sound)
    _updateScreenForLoading(false)
  }


1 Answer


You are right: require is for static resources, not dynamic ones. According to the documentation here, the source parameter can take one of three forms.

  • source (object / number / Asset) -- The source of the sound. The following forms are supported:
    • A dictionary of the form { uri: string, headers?: { [string]: string }, overrideFileExtensionAndroid?: string } with a network URL pointing to a media file on the web, [...]
    • require('path/to/file') for an audio file asset in the source code directory.
    • An Asset object for an audio file asset.
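
For illustration, those three forms correspond to source values roughly like the following (a sketch only; the file paths are placeholders, and Asset comes from expo-asset):

import { Asset } from 'expo-asset'

// 1. A dictionary with a URL -- works for remote http(s):// files and local file:// paths
const fromUri = { uri: 'https://example.com/clip.mp3' }

// 2. A static require() of a file bundled with the app
const fromRequire = require('./assets/clip.mp3')

// 3. An Asset object wrapping a bundled file
const fromAsset = Asset.fromModule(require('./assets/clip.mp3'))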

You can use the first form for dynamic resources. You would do:

const source = { uri: audioUrl }, where audioUrl can just as well be a local resource, e.g. file:///path/to/resource.
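
Applied to the function in the question, a minimal sketch of the change could look like this (assuming audioUrl already holds either a remote URL or the file:// path of a file you downloaded earlier):

import { Audio } from 'expo-av'

const _loadNewPlaybackInstance = async isPlaying => {
    if (playBackInstance != null) {
      await playBackInstance.unloadAsync()
      setPlaybackInstance(null)
    }

    // Pass the URL/path as a { uri } object instead of calling require(),
    // which only resolves paths that are known statically at bundle time.
    const source = { uri: audioUrl }

    const initialStatus = {
      shouldPlay: isPlaying,
      volume,
      rate: audioRate,
      shouldCorrectPitch: true,
      pitchCorrectionQuality: Audio.PitchCorrectionQuality.Low
    }

    const { sound } = await Audio.Sound.createAsync(
      source,
      initialStatus,
      _onPlaybackStatusUpdate
    )
    setPlaybackInstance(sound)
    _updateScreenForLoading(false)
  }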
