UPDATE:
ReasonML + BuckleScript is now ReScript.
As the ecosystem has changed around those tools, this blog post is no longer accurate.
We’ve come quite far with our music player in ReasonReact. We created a React Context with state, a hook to manage the Context, and we started with our UI.
In this post, we will create a component for the music player control buttons, and we will finally implement the functionality to play the track via the DOM API.
You can find the complete code on GitHub.
Create Player Controls
Create src/PlayerControls.re:
open ReactUtils;

[@react.component]
let make = () => {
  let (
    playing,
    _trackList,
    currentTrackName,
    pauseTrack,
    _playTrack,
    playPreviousTrack,
    playNextTrack,
  ) =
    UseMusicPlayer.useMusicPlayer();

  <>
    <div className="box controls has-background-grey-dark">
      <div className="current-track has-text-light">
        <marquee> {s(currentTrackName)} </marquee>
      </div>
      <div className="buttons is-centered">
        <button
          className="button has-text-light has-background-grey-dark"
          onClick=playPreviousTrack
          disabled={
            switch (playing) {
            | Playing(_idx) => false
            | NotPlaying => true
            }
          }>
          <i className="fas fa-step-backward" />
        </button>
        <button
          className="button has-text-light has-background-grey-dark"
          onClick={_ => pauseTrack()}
          disabled={
            switch (playing) {
            | Playing(_idx) => false
            | NotPlaying => true
            }
          }>
          {
            switch (playing) {
            | Playing(_idx) => <i className="fas fa-pause" />
            | NotPlaying => <i className="fas fa-play" />
            }
          }
        </button>
        <button
          className="button has-text-light has-background-grey-dark"
          onClick=playNextTrack
          disabled={
            switch (playing) {
            | Playing(_idx) => false
            | NotPlaying => true
            }
          }>
          <i className="fas fa-step-forward" />
        </button>
      </div>
    </div>
  </>;
};
There’s nothing new here. We create a new component with the [@react.component] syntax. Then we load the Context with UseMusicPlayer.useMusicPlayer.
The JSX contains our HTML markup as well as some onClick functions. The logic for those functions lives in useMusicPlayer.
We disable all buttons if we are in the NotPlaying state.
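For reference, the playing value comes from the shared state we created in the earlier posts. Assuming that definition, it is a variant roughly like this:
/* src/SharedTypes.re (earlier post), a rough sketch, not the exact code */
type playing =
  | Playing(int) /* index of the track that is currently playing */
  | NotPlaying;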
Add the component to src/App.re:
open ReactUtils;

[@react.component]
let make = () =>
  <div className="section is-fullheight">
    <div className="container">
      <div className="column is-6 is-offset-4">
        <h1 className="is-size-2 has-text-centered">
          {s("Reason Music Player")}
        </h1>
        <br />
        <MusicPlayer>
          <TrackList />
          <PlayerControls /> // * new *
        </MusicPlayer>
      </div>
    </div>
  </div>;
Let The Music Play!
Everything works now - except there’s no music! 🎶
We’ll need some audio files to play, and we also need to play the music with new Audio().
Like in the original tutorial, I grabbed three random mp3 tracks from bensound.com.
I saved them in the src folder.
Webpack
Webpack will load the mp3 files. Install the file-loader plugin:
npm install file-loader --save-dev
Modify webpack.config.js:
const path = require('path')
const HtmlWebpackPlugin = require('html-webpack-plugin')

const outputDir = path.join(__dirname, 'build/')
const isProd = process.env.NODE_ENV === 'production'

module.exports = {
  entry: './src/Index.bs.js',
  mode: isProd ? 'production' : 'development',
  output: {
    path: outputDir,
    filename: 'Index.js',
  },
+  module: {
+    rules: [
+      {
+        test: /\.mp3$/,
+        loader: 'file-loader',
+      },
+    ],
+  },
  plugins: [
    new HtmlWebpackPlugin({
      template: './src/index.html',
      favicon: './src/favicon.ico',
      inject: false,
    }),
  ],
  devServer: {
    compress: true,
    contentBase: outputDir,
    port: process.env.PORT || 8000,
    historyApiFallback: true,
  },
}
Interop With The DOM API
We want to create a new Audio() HTML Element, which can play the music track.
bs-webapi is a library that provides bindings to the DOM and other Web APIs.
Unfortunately, HTMLAudioElement is still on the roadmap. That means we have to write the bindings ourselves.
Create src/JsAudio.re:
type audio;
[@bs.new] external make: string => audio = "Audio";
[@bs.send] external play: audio => unit = "play";
[@bs.send] external pause: audio => unit = "pause";
We use [@bs.new] to bind to the constructor and create a new instance of the HTML Audio Element, see here. Together with the external keyword, this gives us a make function that takes a string (the file URL) and returns an audio element.
We use [@bs.send] to bind the methods HTMLMediaElement.play() and HTMLMediaElement.pause(), see here.
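To see how these bindings fit together, here is a small usage sketch (the file name is only a placeholder, not a file from this project):
/* Usage sketch for the JsAudio bindings above. */
let player = JsAudio.make("some-track.mp3"); /* new Audio("some-track.mp3") */
JsAudio.play(player); /* player.play() */
JsAudio.pause(player); /* player.pause() */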
Now let’s adjust our state to include the audio element (src/SharedTypes.re):
type state = {
  tracks: musicTracks,
  playing,
  audioPlayer: JsAudio.audio, // * new *
};
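For context, musicTracks also comes from the earlier posts. Assuming that definition, it is roughly an array of track records:
/* src/SharedTypes.re (earlier post), a rough sketch, not the exact code */
type musicTrack = {
  name: string,
  file: string,
};
type musicTracks = array(musicTrack);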
We now modify src/MusicPlayer.re. First, we need to import our mp3 files; then we add our audio player element to state. We set up the audio player with an empty string.
Import the files with [@bs.module].
[@bs.module "./bensound-summer.mp3"] external summer: string = "default";
[@bs.module "./bensound-ukulele.mp3"] external ukulele: string = "default";
[@bs.module "./bensound-creativeminds.mp3"]
external creativeminds: string = "default";

let initialState: SharedTypes.state = {
  tracks: [|
    {name: "Benjamin Tissot - Summer", file: summer},
    {name: "Benjamin Tissot - Ukulele", file: ukulele},
    {name: "Benjamin Tissot - Creative Minds", file: creativeminds},
  |],
  playing: NotPlaying,
  audioPlayer: JsAudio.(make("")), // * new *
};
Note that we use a shorthand syntax to “open” the JsAudio module locally, see the Reason documentation here.
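In other words, the two calls below do the same thing; the local open only brings JsAudio’s names into scope for that single expression:
/* Both expressions create the same (empty) audio element. */
let a = JsAudio.make(""); /* fully qualified call */
let b = JsAudio.(make("")); /* local open, scoped to the parentheses */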
Every time we click on a “play” button, we fire off the PlayTrack(index) action (inside src/UseMusicPlayer.re). At the same time, we’d like to initialize the HTML Audio Element with the correct file.
/* src/MusicPlayer.re */
// previous code

let withPlayTrack = (state: SharedTypes.state, index) => {
  ...state,
  playing: Playing(index),
  audioPlayer: JsAudio.(make(state.tracks[index].file)), // * new *
};
We open the JsAudio module locally, then call the make function with the correct file name (which is a string).
Still, the app won’t play or pause a track with HTML Audio. We’ll need to employ the useEffect hook to invoke the “play” and “pause” functions from JsAudio:
/* src/MusicPlayer.re */
// previous code

[@react.component]
let make = (~children) => {
  let (state, dispatch) = React.useReducer(reducer, initialState);

  /* new */
  React.useEffect1(
    () => {
      switch (state.playing) {
      | Playing(_idx) => JsAudio.(state.audioPlayer |> play)
      | NotPlaying => JsAudio.(state.audioPlayer |> pause)
      };
      None; // (A)
    },
    [|state.playing|], // (B)
  );

  // JSX here
You can read more about ReasonReact’s Hooks API on the ReasonReact documentation website.
We have to explicitly state how many dependencies useEffect has and use the correct function (i.e., useEffect0, useEffect1).
We return None from the function (A). That means the effect has no cleanup function, which is fine in our case.
Alternatively, we could return a cleanup function that runs on unmount, for example: Some(() => Js.log("unmount"));.
At line B, we declare our dependencies.
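If we did want a cleanup step, we could return Some(...) instead of None. As a hypothetical variation (not part of the final app), the effect could pause the audio element whenever it is cleaned up:
/* Hypothetical variation of the effect above: return a cleanup function. */
React.useEffect1(
  () => {
    switch (state.playing) {
    | Playing(_idx) => JsAudio.(state.audioPlayer |> play)
    | NotPlaying => JsAudio.(state.audioPlayer |> pause)
    };
    /* The cleanup runs before the next effect and on unmount. */
    Some(() => JsAudio.(state.audioPlayer |> pause));
  },
  [|state.playing|],
);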
You can find the complete file on GitHub.
There’s still a minor problem. If we play a track and skip to another one, the current track doesn’t pause. Fix it in src/UseMusicPlayer.re:
// previous code

let playTrack = index =>
  switch (playing) {
  | Playing(idx) =>
    index === idx
      ? pauseTrack()
      : {
          JsAudio.(state.audioPlayer |> pause); // * new *
          MusicPlayer.PlayTrack(index) |> dispatch;
        }
  | NotPlaying => MusicPlayer.PlayTrack(index) |> dispatch
  };
// more code
The above code stops the currently playing track with JsAudio before it dispatches the new action.
Find the complete file on GitHub.
Recap
In this post, we learned how to import files and how to write BuckleScript bindings to use the DOM API with ReasonReact.
We applied useEffect in ReasonReact to trigger side effects.
During this blog post series, we built a music player app that uses the Web API with ReasonReact and hooks: useContext, useEffect, and useReducer.
I hope you had fun and learned something about ReasonML and BuckleScript.
Again, many thanks to James King for his original React.js tutorial.
Also, thanks to Yawar Amin and Florian Hammerschmidt for helping me out with translating the JavaScript code to ReasonML.
You can find the complete code on GitHub.