# Let's make some noise!
In this section, we will guide you through the basic concepts of the Audio API. We are going to use core audio components such as `AudioContext` and `AudioBufferSourceNode` to play a sound from a file, which will help you develop a basic understanding of the library.
## Using audio context
Let's start by bootstrapping a simple application with a play button and creating our first instance of the `AudioContext` object.
```tsx
import React from 'react';
import { View, Button } from 'react-native';
import { AudioContext } from 'react-native-audio-api';

export default function App() {
  const handlePlay = async () => {
    const audioContext = new AudioContext();
  };

  return (
    <View style={{ flex: 1, justifyContent: 'center', alignItems: 'center' }}>
      <Button onPress={handlePlay} title="Play sound!" />
    </View>
  );
}
```
`AudioContext` is an object that controls both the creation of audio nodes and the execution of audio processing and decoding.
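The context also exposes a few properties we will rely on later in this guide. The snippet below is a minimal sketch of that surface; the property names are assumed to mirror the Web Audio API (only `currentTime` appears in the examples in this guide, so treat the rest as assumptions):

```tsx
import { AudioContext } from 'react-native-audio-api';

async function inspectContext() {
  const audioContext = new AudioContext();

  // Property names assumed to follow the Web Audio API.
  console.log(audioContext.sampleRate); // output sample rate in Hz
  console.log(audioContext.currentTime); // running clock (in seconds) used to schedule playback
  console.log(audioContext.state); // 'running', 'suspended' or 'closed'

  // Closing the context releases the underlying audio resources.
  await audioContext.close();
}
```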
## Loading an audio file
Before we can play anything, we need to gain access to some audio data. For the purpose of this guide, we will download it from a remote source using `fetch`.
```tsx
import React from 'react';
import { View, Button } from 'react-native';
import { AudioContext } from 'react-native-audio-api';

export default function App() {
  const handlePlay = async () => {
    const audioContext = new AudioContext();

    const audioBuffer = await fetch('https://software-mansion.github.io/react-native-audio-api/audio/music/example-music-01.mp3')
      .then((response) => response.arrayBuffer())
      .then((arrayBuffer) => audioContext.decodeAudioData(arrayBuffer));
  };

  return (
    <View style={{ flex: 1, justifyContent: 'center', alignItems: 'center' }}>
      <Button onPress={handlePlay} title="Play sound!" />
    </View>
  );
}
```
We have used the `decodeAudioData` method of the `BaseAudioContext`, which takes an `ArrayBuffer` of encoded audio data and decodes it into an `AudioBuffer` of raw audio data that can be used within our system.
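Network requests can fail, so outside of a guide you may want to guard the fetch-and-decode step. Here is a minimal sketch, built only from the `fetch` and `decodeAudioData` calls already shown above:

```tsx
import { AudioContext } from 'react-native-audio-api';

// A defensive variant of the loading step: network and decoding errors
// are caught and reported instead of rejecting the press handler.
async function loadAudioBuffer(audioContext: AudioContext, url: string) {
  try {
    const response = await fetch(url);
    if (!response.ok) {
      throw new Error(`Failed to fetch audio: ${response.status}`);
    }
    const arrayBuffer = await response.arrayBuffer();
    return await audioContext.decodeAudioData(arrayBuffer);
  } catch (error) {
    console.error('Could not load audio', error);
    return null;
  }
}
```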
## Play the audio
The final step is to create an `AudioBufferSourceNode`, connect it to the `AudioContext`'s destination, and start playing the sound. For the purpose of this guide, we will play the sound for just 10 seconds.
```tsx
import React from 'react';
import { View, Button } from 'react-native';
import { AudioContext } from 'react-native-audio-api';

export default function App() {
  const handlePlay = async () => {
    const audioContext = new AudioContext();

    const audioBuffer = await fetch('https://software-mansion.github.io/react-native-audio-api/audio/music/example-music-01.mp3')
      .then((response) => response.arrayBuffer())
      .then((arrayBuffer) => audioContext.decodeAudioData(arrayBuffer));

    const playerNode = audioContext.createBufferSource();
    playerNode.buffer = audioBuffer;

    playerNode.connect(audioContext.destination);
    playerNode.start(audioContext.currentTime);
    playerNode.stop(audioContext.currentTime + 10);
  };

  return (
    <View style={{ flex: 1, justifyContent: 'center', alignItems: 'center' }}>
      <Button onPress={handlePlay} title="Play sound!" />
    </View>
  );
}
```
And that's it! You have just played your first sound using react-native-audio-api.
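If you would rather keep the sound repeating instead of stopping it after 10 seconds, `AudioBufferSourceNode` also has a `loop` flag. The sketch below assumes that `loop` and the `AudioBuffer` type export mirror the Web Audio API:

```tsx
import { AudioContext, AudioBuffer } from 'react-native-audio-api';

// A sketch of looped playback. Assumes the `loop` flag and the
// `AudioBuffer` type export mirror the Web Audio API.
function playLooped(audioContext: AudioContext, audioBuffer: AudioBuffer) {
  const playerNode = audioContext.createBufferSource();
  playerNode.buffer = audioBuffer;
  playerNode.loop = true; // restart from the beginning when the buffer ends

  playerNode.connect(audioContext.destination);
  playerNode.start(audioContext.currentTime);

  // Keep the returned node around so you can call stop() on it later.
  return playerNode;
}
```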
## Summary
In this guide, we have learned how to create a simple audio player using `AudioContext` and `AudioBufferSourceNode`, as well as how to load audio data from a remote source. To sum up:

- `AudioContext` is the main object that controls the audio graph.
- The `decodeAudioData` method can be used to load audio data from a remote resource in the form of an `AudioBuffer`.
- `AudioBufferSourceNode` can be used with any `AudioBuffer`.
- In order to hear the sounds, we need to connect the source node to the destination node exposed by the `AudioContext` (see the sketch after this list).
- We can control the playback of the sound using the `start` and `stop` methods of the `AudioBufferSourceNode` (and other source nodes, which we will show later).
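As mentioned in the list above, the destination does not have to be the source node's direct neighbor. A common pattern is to route the source through a gain node to control volume. The sketch below assumes `createGain` and its `gain` parameter mirror the Web Audio API:

```tsx
import { AudioContext, AudioBuffer } from 'react-native-audio-api';

// A sketch of a slightly longer chain: source -> gain -> destination.
// Assumes createGain and gain.value mirror the Web Audio API.
function playQuietly(audioContext: AudioContext, audioBuffer: AudioBuffer) {
  const gainNode = audioContext.createGain();
  gainNode.gain.value = 0.25; // play at a quarter of the original volume
  gainNode.connect(audioContext.destination);

  const playerNode = audioContext.createBufferSource();
  playerNode.buffer = audioBuffer;
  playerNode.connect(gainNode); // route through the gain node, not the destination
  playerNode.start(audioContext.currentTime);
}
```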
## What's next?
In the next section, we will learn more about how the audio graph works, what audio parameters are, and how we can use them to create a simple piano keyboard.