So, I have a working persistent music player in a project I'm building, thanks to this article:
And it's just great. But the information for the music is contained in the context file, like so:
const MusicPlayerContext = React.createContext([{}, () => {}]);
const MusicPlayerProvider = (props) => {
const [state, setState] = useState({
audioPlayer: new Audio(),
tracks: [
{
name: 'Baktun',
artist: 'RYKR',
file: Baktun,
artwork: BaktunArtwork
},
{
name: 'Bash',
artist: 'RYKR',
file: Bash,
artwork: BashArtwork
},
{
name: 'Frost',
artist: 'RYKR',
file: Frost,
artwork: FrostArtwork
},
{
name: 'Greyskull',
artist: 'RYKR',
file: Greyskull,
artwork: GreyskullArtwork
},
{
name: 'Spiral Up',
artist: 'RYKR',
file: SpiralUp,
artwork: SpiralUpArtwork
}
],
currentTrackIndex: null,
isPlaying: false,
})
return (
<MusicPlayerContext.Provider value={[state, setState]}>
{props.children}
</MusicPlayerContext.Provider>
)
}
export { MusicPlayerContext, MusicPlayerProvider }
...but rather than manually entering all of this information into the context file via a JS object, I would rather store the music in folders, with an MDX file per song and the artwork and audio file in there as well, and then inject that information into the context file, I assume via GraphQL.
I don't really know how to go about doing that, however, so any guidance or examples would be great, thank you :-)
I'm currently importing all of the audio and artwork files manually into the context file and manually inputting the track information as well, which is neither optimal nor scalable.
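What I have in mind is roughly a layout like this (folder and file names are just an illustration):
content/
  music/
    bash/
      bash.mdx   (frontmatter: name, artist, artwork, audio, etc.)
      bash.jpg
      bash.mp3
    frost/
      frost.mdx
      frost.jpg
      frost.mp3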
First you need to create the MDX nodes with gatsby-plugin-mdx.
plugins: [
{
resolve: `gatsby-source-filesystem`,
options: {
name: `tracks`,
path: `${__dirname}/src/tracks/`,
},
},
{
resolve: `gatsby-plugin-mdx`,
options: {
...
},
},
]
Then you can use useStaticQuery to query for tracks in the Provider.
const MusicPlayerProvider = (props) => {
const tracks = useStaticQuery(graphql`
query Tracks {
allMdx {
edges {
node {
...
}
}
}
}
`)
const [state, setState] = useState({
audioPlayer: new Audio(),
currentTrackIndex: null,
isPlaying: false,
})
return (
<MusicPlayerContext.Provider value={[{ tracks, ...state }, setState]}>
{props.children}
</MusicPlayerContext.Provider>
)
}
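Any component under the provider can then read the query result from context, roughly like this (just a sketch):
// assuming useContext and MusicPlayerContext are imported
const [{ tracks, isPlaying }, setState] = useContext(MusicPlayerContext)
const trackCount = tracks.allMdx.edges.length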
Hi @universse,
So I've added this query, which is the same query that I am using from your other example, to the context file:
const MusicPlayerContext = React.createContext([{}, () => {}])
const MusicPlayerProvider = (props) => {
const tracks = useStaticQuery(graphql`
query Tracks {
allMdx(filter: {fileAbsolutePath: {regex: "/content/music/"}}) {
totalCount
edges {
node {
fields {
slug
}
frontmatter {
name
artist
genre
bpm
artwork {
childImageSharp {
fluid(maxWidth: 1000) {
...GatsbyImageSharpFluid
}
}
}
alt
description
release(formatString: "MMMM Do, YYYY")
audio {
absolutePath
}
}
}
}
}
}
`)
const [state, setState] = useState({
audioPlayer: new Audio(),
currentTrackIndex: null,
isPlaying: false,
})
return (
<MusicPlayerContext.Provider value={[{ tracks, ...state }, setState]}>
{props.children}
</MusicPlayerContext.Provider>
)
}
export { MusicPlayerContext, MusicPlayerProvider }
...but without changing anything in the other files that access and use this context, I am getting this error in the browser:
...so I'm not sure what to do to fix this. I also have a custom hook called useMusicPlayer that uses this context, like so:
const useMusicPlayer = () => {
const [state, setState] = useContext(MusicPlayerContext)
// Play a specific track
function playTrack(index) {
if (index === state.currentTrackIndex) {
togglePlay()
} else {
state.audioPlayer.pause()
state.audioPlayer = new Audio(state.tracks[index].file)
state.audioPlayer.play()
setState(state => ({ ...state, currentTrackIndex: index, isPlaying: true }))
}
}
// Toggle play or pause
function togglePlay() {
if (state.isPlaying) {
state.audioPlayer.pause()
} else {
state.audioPlayer.play()
}
setState(state => ({ ...state, isPlaying: !state.isPlaying }))
}
// Play the previous track in the tracks array
function playPreviousTrack() {
const newIndex = ((state.currentTrackIndex + -1) % state.tracks.length + state.tracks.length) % state.tracks.length
playTrack(newIndex)
}
// Play the next track in the tracks array
function playNextTrack() {
const newIndex = (state.currentTrackIndex + 1) % state.tracks.length
playTrack(newIndex)
}
// Get the current time of the currently playing track
function currentTime() {
if (state.isPlaying) {
state.audioPlayer.currentTime()
}
}
return {
playTrack,
togglePlay,
currentTrackName:
state.currentTrackIndex !== null && state.tracks[state.currentTrackIndex].name,
currentTrackArtist:
state.currentTrackIndex !== null && state.tracks[state.currentTrackIndex].artist,
currentTrackArtwork:
state.currentTrackIndex !== null && state.tracks[state.currentTrackIndex].artwork,
currentTime,
trackList: state.tracks,
isPlaying: state.isPlaying,
playPreviousTrack,
playNextTrack,
}
}
export default useMusicPlayer
...which I finally try to use in a TrackList file, like so:
const TrackList = () => {
const {
trackList,
currentTrackName,
currentTrackArtist,
currentTrackArtwork,
playTrack,
isPlaying
} = useMusicPlayer()
return (
<>
{trackList.map((track, index) => (
<Card>
{/* <Artwork src={track.frontmatter.artwork} alt="Album Artwork."/> */}
<Artwork src={currentTrackArtwork} alt="Album Artwork."/>
<Button
whileHover={{ scale: 1.1 }}
whileTap={{ scale: 0.9 }}
onClick={() => playTrack(index)}
>
{
currentTrackName === track.frontmatter.name && isPlaying ?
<img src={PauseButton} />
:
<img src={PlayButton} />
}
</Button>
<Text>
<h1>{currentTrackName}</h1>
<h3>{currentTrackArtist}</h3>
</Text>
</Card>
))}
</>
)
}
...and all of these files worked fine when I was using static info hard-coded into the context file, but I am getting that error message from before.
So, I'm thinking that maybe I need to add track or tracks in there somewhere, but I'm not sure where I would do that. Or maybe the issue lies somewhere else, but I am unsure as to what the problem might be...
For your TrackList component, you need to map over trackList like so:
...
trackList.edges.map((track, index) => {
const { frontmatter } = track.node
})
...
Basically, because you are querying via GraphQL, the data structure of trackList has changed and you haven't updated <TrackList /> to reflect that.
Same for the useMusicPlayer hook. For example, tracks.edges.length instead of tracks.length, tracks.edges[index].node instead of tracks[index], etc.
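A rough sketch of that kind of change (the exact path depends on how you destructure the query result):
// before, with the hard-coded array
const name = state.tracks[index].name
// after, with the GraphQL result
const name = state.tracks.allMdx.edges[index].node.frontmatter.name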
Another way is to convert the new trackList data structure to match the current one in MusicPlayerProvider. That way you don't need to change the other files' code.
Not sure if I'm missing anything else, but you can try that first.
hmmmmmm....having a bit of trouble following exactly how to go about changing everything across multiple files. I also think that it would be nice to be able to convert the trackList
into what it needs to be to work with what has already been established. SO in my head I am thinking that one way to do that would be to create an array and then iterate through all the graphql data that has been returned from the static query to the keys of an object. Don't really know how to do that, tho. If you have any other ideas as to how I could go about this I am all ears. Trying to figure it out on my end as well.
Thinking along these lines, but have no idea how to go about it:
tracks: [
// iterate over all of the tracks created via the staticQuery
{
name: data.allMdx.edges.node.frontmatter.name,
artist: data.allMdx.edges.node.frontmatter.artist
},
{},
{}
],
So I guess I don't know how to manipulate the data I get back from the query very well, or even at all. I would think I need to inject info from the query into an array of objects called tracks, but I'm not sure how to even start doing that.
You can do that in MusicPlayerProvider:
const tracks = useStaticQuery(graphql`
...
`)
// need useMemo to avoid re-computation when state changes
const trackList = useMemo(
() =>
tracks.allMdx.edges.map(track => {
const { frontmatter } = track.node
const { name, artist } = frontmatter
return { name, artist }
}),
[tracks]
)
well, after all that help from you most of this is working! Thank you! XD
....except for a few remaining things...
For example, the album artwork shows up in the TrackList as expected if I use gatsby-image, which makes sense, but I can't get it to show up as the current track artwork in the MusicPlayer.
This is how I am creating the currentTrackArtwork in the useMusicPlayer hook:
currentTrackArtwork:
state.currentTrackIndex !== null && state.tracks[state.currentTrackIndex].artwork,
...but I'm thinking that since it is an image using gatsby-image's childImageSharp, I need to do something different here, I just don't know what needs to be changed. Since this is outside of JSX, I'm not sure how to alter it to work properly.
But the biggest and last hurdle I am facing is getting the audio files to actually play. I don't know if I am querying for them right or how to get them to work in the same way that they were working outside of GraphQL. I am querying for audio in my frontmatter, which is what I have labeled the field in the MDX file, like so:
---
name: Bash
artist: RYKR
genre: Electro
bpm: 124 bpm
artwork: bash.jpg
alt: Bash artwork.
audio: bash.mp3
description: What streamers and confetti sound like if they have souls.
release: 2019-02-01
---
...and the audio file is in the same folder as the MDX file, and the artwork is working using this approach as well, so I'm not sure what the difference is.
I'm querying the absolutePath of the audio, which is one of the options in the explorer, like so:
audio {
absolutePath
}
...but I don't know if this is actually getting the audio file, rather than just the audio file's location. I'm trying to understand how to work with audio and video files in GraphQL and Gatsby and MDX, but I'm still not clear on this.
This is the last piece to getting this working, so hopefully you might have some insight into how to do this, or where to look to figure out how.
For that you need to query for the path to the audio and the image source in the public folder.
// useMusicPlayer.js
const [state, setState] = useContext(MusicPlayerContext)
// query all mp3 and png files from /content/music/
const assets = useStaticQuery(graphql`
query Assets {
allFile(filter: {extension: {in: ["mp3", "jpg"]}, absolutePath: {regex: "/content/music/"}}) {
edges {
node {
publicURL
relativePath
}
}
}
}
`)
// convert to obj for fast lookup
const assetObj= useMemo(
() =>
assets.allFile.edges.reduce((obj, file) => {
const { publicURL, relativePath } = file.node
obj[relativePath] = publicURL
return obj
}, {}),
[assets]
)
const artwork = state.tracks[state.currentTrackIndex].artwork // bash.jpg
const currentTrackArtworkSrc = assetObj[artwork] // /static/bash-[some-hash].jpg
const audio = state.tracks[state.currentTrackIndex].audio // bash.mp3
const currentAudioSrc = assetObj[audio] // /static/bash-[some-hash].mp3
And you don't need to query absolutePath on audio; just audio is enough.
Hi @universse, apologies for not replying sooner, as work has been rough this past week.
So, I threw this into the app, and the part that trips the app up is this part:
const artwork = state.tracks[state.currentTrackIndex].artwork // bash.jpg
const currentTrackArtworkSrc = assetObj[artwork] // /static/bash-[some-hash].jpg
const audio = state.tracks[state.currentTrackIndex].audio // bash.mp3
const currentAudioSrc = assetObj[audio] // /static/bash-[some-hash].mp3
...which gives me the following error:
...so, I guess I don't understand where to use these new consts in the app... I tried this:
currentTrackArtwork:
state.currentTrackIndex !== null && currentTrackArtworkSrc,
...but that didn't seem to make any difference. How would something like currentTrackArtworkSrc be used in the rest of the files? What should it replace, if anything?
So in useMusicPlayer you currently have this:
return {
playTrack,
togglePlay,
currentTrackName:
state.currentTrackIndex !== null && state.tracks[state.currentTrackIndex].name,
currentTrackArtist:
state.currentTrackIndex !== null && state.tracks[state.currentTrackIndex].artist,
currentTrackArtwork:
state.currentTrackIndex !== null && state.tracks[state.currentTrackIndex].artwork,
currentTime,
trackList: state.tracks,
isPlaying: state.isPlaying,
playPreviousTrack,
playNextTrack,
}
Now it becomes
const assets = useStaticQuery(graphql`
query Assets {
allFile(filter: {extension: {in: ["mp3", "jpg"]}, absolutePath: {regex: "/content/music/"}}) {
edges {
node {
publicURL
relativePath
}
}
}
}
`)
// convert to obj for fast lookup
const assetObj= useMemo(
() =>
assets.allFile.edges.reduce((obj, file) => {
const { publicURL, relativePath } = file.node
obj[relativePath] = publicURL
return obj
}, {}),
[assets]
)
return {
playTrack,
togglePlay,
currentTrackName:
state.currentTrackIndex && state.tracks[state.currentTrackIndex].name,
currentTrackArtist:
state.currentTrackIndex && state.tracks[state.currentTrackIndex].artist,
currentTrackArtwork:
state.currentTrackIndex && assetObj[state.tracks[state.currentTrackIndex].artwork],
currentTrackAudio:
state.currentTrackIndex && assetObj[state.tracks[state.currentTrackIndex].audio],
currentTime,
trackList: state.tracks,
isPlaying: state.isPlaying,
playPreviousTrack,
playNextTrack,
}
hmmmm...still having the same issue:
...and we're still not using the constants that we created here:
const artwork = state.tracks[state.currentTrackIndex].artwork // bash.jpg
const currentTrackArtworkSrc = assetObj[artwork] // /static/bash-[some-hash].jpg
const audio = state.tracks[state.currentTrackIndex].audio // bash.mp3
const currentAudioSrc = assetObj[audio] // /static/bash-[some-hash].mp3
...so what are currentTrackArtworkSrc and currentAudioSrc for? Should we not use them somewhere? I tried this, but it did not work either:
currentTrackArtwork:
state.currentTrackIndex
&&
currentTrackArtworkSrc[state.tracks[state.currentTrackIndex].artwork],
currentTrackAudio:
state.currentTrackIndex
&&
currentAudioSrc[state.tracks[state.currentTrackIndex].audio],
Would you mind sharing your repo so I can take a look later?
But, of course, thank you XD
First, in the content/music folder, I guess you can safely delete the artwork and audio folders.
Here's the final code with some comments. I cleaned up the code a bit also.
// MusicPlayerContext
import React, { useState, useMemo } from 'react'
import { useStaticQuery, graphql } from 'gatsby'
const MusicPlayerContext = React.createContext([{}, () => {}])
const MusicPlayerProvider = props => {
const tracks = useStaticQuery(graphql`
query Tracks {
allMdx(filter: { fileAbsolutePath: { regex: "/content/music/" } }) {
edges {
node {
fields {
slug
}
frontmatter {
name
artist
genre
bpm
# ADD BASE HERE
artwork {
base
childImageSharp {
fluid(maxWidth: 1000) {
...GatsbyImageSharpFluid
}
}
}
alt
description
release(formatString: "MMMM Do, YYYY")
audio {
absolutePath
base
}
}
}
}
}
}
`)
// need useMemo to avoid re-computation when state changes
const trackList = useMemo(
() =>
tracks.allMdx.edges.map(track => {
const { frontmatter } = track.node
const {
name,
artist,
genre,
bpm,
artwork,
alt,
description,
audio,
} = frontmatter
return { name, artist, genre, bpm, artwork, alt, description, audio }
}),
[tracks]
)
const [state, setState] = useState({
audioPlayer: new Audio(),
tracks: trackList,
currentTrackIndex: null,
isPlaying: false,
})
return (
<MusicPlayerContext.Provider value={[state, setState]}>
{props.children}
</MusicPlayerContext.Provider>
)
}
export { MusicPlayerContext, MusicPlayerProvider }
// useMusicPlayer.js
import { useContext, useMemo } from 'react'
import { useStaticQuery, graphql } from 'gatsby'
import { MusicPlayerContext } from './MusicPlayerContext'
// frost.mp3 -> frost
function basename(name) {
return name.slice(0, name.lastIndexOf('.'))
}
const useMusicPlayer = () => {
const [state, setState] = useContext(MusicPlayerContext)
// query all mp3 and jpg files from /content/music/
const assets = useStaticQuery(graphql`
query Assets {
allFile(
filter: {
extension: { in: ["mp3", "jpg"] }
absolutePath: { regex: "/content/music/" }
}
) {
edges {
node {
publicURL
relativePath
}
}
}
}
`)
// convert to obj for fast lookup
const assetObj = useMemo(
() =>
assets.allFile.edges.reduce((obj, file) => {
const { publicURL, relativePath } = file.node
obj[relativePath] = publicURL
return obj
}, {}),
[assets]
)
// Play a specific track
function playTrack(index) {
if (index === state.currentTrackIndex) {
togglePlay()
} else {
state.audioPlayer.pause()
const base = state.tracks[index].audio.base // frost.mp3
const baseName = basename(base) // frost
// new Audio() does not support relative path
// hence the need for window.location.origin
const audioPlayer = new Audio(
`${window.location.origin}${assetObj[`${baseName}/${base}`]}`
) // new Audio('http://www.domain.com/static/frost-[hash].mp3')
audioPlayer.play()
setState(state => ({
...state,
currentTrackIndex: index,
isPlaying: true,
audioPlayer,
}))
}
}
// Toggle play or pause
function togglePlay() {
if (state.isPlaying) {
state.audioPlayer.pause()
} else {
state.audioPlayer.play()
}
setState(state => ({ ...state, isPlaying: !state.isPlaying }))
}
// Play the previous track in the tracks array
function playPreviousTrack() {
const newIndex =
(((state.currentTrackIndex + -1) % state.tracks.length) +
state.tracks.length) %
state.tracks.length
playTrack(newIndex)
}
// Play the next track in the tracks array
function playNextTrack() {
const newIndex = (state.currentTrackIndex + 1) % state.tracks.length
playTrack(newIndex)
}
// Get the current time of the currently playing track
function currentTime() {
if (state.isPlaying) {
state.audioPlayer.currentTime()
}
}
let currentTrackArtwork, currentTrackAudio
if (state.currentTrackIndex !== null) {
const base = state.tracks[state.currentTrackIndex].audio.base // frost.mp3
const baseName = basename(base) // frost
currentTrackArtwork =
assetObj[
`${baseName}/${state.tracks[state.currentTrackIndex].artwork.base}`
] // assetObj['frost/frost.jpg']
currentTrackAudio =
assetObj[
`${baseName}/${state.tracks[state.currentTrackIndex].audio.base}`
] // assetObj['frost/frost.mp3']
}
return {
playTrack,
togglePlay,
currentTrackName:
state.currentTrackIndex !== null &&
state.tracks[state.currentTrackIndex].name,
currentTrackArtist:
state.currentTrackIndex !== null &&
state.tracks[state.currentTrackIndex].artist,
currentTrackArtwork,
currentTrackAudio,
currentTime,
trackList: state.tracks,
isPlaying: state.isPlaying,
playPreviousTrack,
playNextTrack,
}
}
export default useMusicPlayer
WOW!!! You are amazing! This all works so well now, and I'm starting to understand things better as I go through the code. Things like regex and useMemo and a bunch of other stuff are all new to me, so thank you for showing me these things and how they can be used.
This is truly amazing XD !!!!
Do you have any way to be paid or tipped or anything? This is truly helpful to me :-)
Haha thanks but it's not necessary. Guess you can pay it forward next time :)
Hi @universse!
I'm trying to finish this music player off, but having trouble... dunno where to go, so I figured I'd ask you, if you don't mind.
Trying to make a progress bar that shows the current time in minutes and seconds on the left, a bar that visualizes the duration of the track that the listener can click on and/or drag on to move around in the track in the middle, and the total duration on the other side. This is very much like every music player that exists, I would imagine. Also want a volume slider, but I'll save that for later.
Anyway, I've been googling and trying a bunch of stuff out, and I got this far:
function getTime(time) {
if(!isNaN(time)) {
return Math.floor(time / 60) + ':' + ('0' + Math.floor(time % 60)).slice(-2)
}
}
useEffect(() => {
const currentTime = setInterval(() => {getTime(state.audioPlayer.currentTime)}, 1000)
return () => clearInterval(currentTime)
}, []);
...where the first function formats the time into minutes and seconds, and the second, the useEffect hook, tries to update the currentTime every second.
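Just to sanity-check the formatting, getTime seems to behave like this:
getTime(75)  // '1:15'
getTime(9)   // '0:09'
getTime(NaN) // undefined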
My issue is that I get the currentTime to update, but only when I hit the pause button, rather than updating every second. Same with the duration, which starts off as NaN, but the first time I hit pause it shows up.
I feel like I might be close, but not quite there yet, and I can't see what I'm missing or doing wrong.
Perhaps you could try this. You need to put currentTime into the component's state and update it.
const [currentTime, setCurrentTime] = useState(state.audioPlayer.currentTime)
useEffect(() => {
const timeoutId= setInterval(() => {
setCurrentTime(getTime(state.audioPlayer.currentTime))
}, 1000)
return () => clearInterval(timeoutId)
}, [state.audioPlayer]);
Also, I suggest you add https://github.com/facebook/react/tree/master/packages/eslint-plugin-react-hooks to your setup to avoid potential bugs.
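A typical setup for that plugin looks roughly like this (assuming you already have an ESLint config):
// .eslintrc.js
module.exports = {
  plugins: ['react-hooks'],
  rules: {
    'react-hooks/rules-of-hooks': 'error', // enforces the Rules of Hooks
    'react-hooks/exhaustive-deps': 'warn', // checks effect dependency arrays
  },
}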
Yup! That basically works! There is just a one-second delay after I click play on a new track before the seconds update to the new track, which makes sense, I think. I'm thinking maybe a conditional statement that looks for the track index to change and, if so, clears out the currentTime state instantly, or something along those lines...
So I needed to have a second useState then, which I'm thinking I can use to make the progress bar as well. I was not really clear on the idea of useState, but I'm assuming that I can have as many useStates as needed?
I will grab that hook linter as well, thank you again :-)
You can also try this:
const [currentTime, setCurrentTime] = useState(state.audioPlayer.currentTime)
// both formattedTime and progress are states derived from audioPlayer.currentTime
// so no need another useState
const formattedTime = getTime(currentTime)
const progress = currentTime / state.audioPlayer.duration
useEffect(() => {
const timeoutId= setInterval(() => {
setCurrentTime(state.audioPlayer.currentTime)
}, 1000)
return () => {
// clean up function run when state.audioPlayer changes
// reset currentTime to 0
setCurrentTime(0)
clearInterval(timeoutId)
}
}, [state.audioPlayer]);
Yeah, that's working a bit better now, except there is still a little bit of lag between clicking a song's play button and having the number reset to '0:00'... maybe I need some default placeholder values or something, but not a big deal...
Next up are the sliders: one to show and change the progress of the song, and another to change the volume. Doing research now, but if you have any thoughts or suggestions, I'm all ears :-)
For the progress slider, you can use <input type='range' />, something along these lines:
<input
max={state.audioPlayer.duration}
min='0'
step='1'
type='range'
value={currentTime}
onChange={e => {
setCurrentTime(e.target.value)
state.AudioPlayer.currentTime = e.target.value
}}
/>
As for volume, you can also use <input type='range' />. Anyway, it would be great if you could share your repo, because I am not sure how to go about doing it.
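Roughly, a volume slider could be something along these lines as well (an untested sketch; it assumes the component reads audioPlayer from MusicPlayerContext):
import React, { useContext, useState } from 'react'
import { MusicPlayerContext } from './MusicPlayerContext' // assumed path

function VolumeSlider() {
  const [{ audioPlayer }] = useContext(MusicPlayerContext)
  const [volume, setVolume] = useState(1) // HTMLAudioElement volume ranges from 0 to 1
  return (
    <input
      max='1'
      min='0'
      step='0.01'
      type='range'
      value={volume}
      onChange={e => {
        const newVolume = parseFloat(e.target.value)
        setVolume(newVolume)
        audioPlayer.volume = newVolume
      }}
    />
  )
}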
Here's the repo:
https://github.com/rchrdnsh/RYKR
Trying to wrap my head around using the state from useMusicPlayer in this example, but it's still a bit confusing for me :-/
hmmmm... so with your example, would I be adding this code to the useMusicPlayer hook file? I don't think I have access to state.AudioPlayer.currentTime outside of it, do I?
Would I make a function and name it something like this?:
function progressSlider() {
return (
<input
max={state.audioPlayer.duration}
min='0'
step='1'
type='range'
value={currentTime}
onChange={e => {
setCurrentTime(e.target.value)
state.AudioPlayer.currentTime = e.target.value
}}
/>
)
}
...then export it from the useMusicPlayer.js file and import it into the PlayerControls.js file?
I am doing that now, and the slider shows up but does not move with the currentTime; then, when I try to click on the slider to change the position of the music, I get the following error:
...which I am thinking means that I need to set up some state management in the PlayerControls.js file... not sure if that's correct, or how to do it, though...
ProgressSlider is a new component. It can access state.audioPlayer.currentTime via useContext.
function ProgressSlider() {
const [{ audioPlayer }] = useContext(MusicPlayerContext)
return (
<input
max={audioPlayer.duration}
min='0'
step='1'
type='range'
value={currentTime}
onChange={e => {
setCurrentTime(e.target.value)
// not AudioPlayer
state.audioPlayer.currentTime = e.target.value
}}
/>
)
}
hmmmm... so it's kinda working... a couple of things, though...
Here's a gif showing the number thing, and also the non-draggable 'thumb', as it seems to be called:
I hope that makes sense... I'm trying to figure it out myself, but as you know, I am not super good at this kind of stuff... 🤷♂️
Ah, I think the strange number is the time in seconds. So I guess you need to format it:
<input
max={audioPlayer.duration}
min='0'
step='1'
type='range'
value={currentTime} // this should have alr been formatted to minutes and seconds
onChange={e => {
setCurrentTime(getTime(e.target.value)) // getTime will format time to minutes and seconds, you probably already have it somewhere
state.audioPlayer.currentTime = e.target.value
}}
/>
Try this and see if you can drag the slider.
I tried doing exactly that earlier, but it did not, and still does not, seem to work, neither for the formatting nor for the slider control.
The following is the getTime code, as well as the currentTime and duration, among other things... the number does format eventually, but there is a delay of a few milliseconds or so, which is enough to be noticeable to the user (me! XD)...
// Transform the currentTime info into minutes and seconds
function getTime(time) {
if(!isNaN(time)) {
return Math.floor(time / 60) + ':' + ('0' + Math.floor(time % 60)).slice(-2)
}
}
// both formattedTime and progress are states derived from audioPlayer.currentTime
// so no need another useState
const formattedTime = getTime(currentTime)
const progress = currentTime / state.audioPlayer.duration
useEffect(() => {
const timeoutId= setInterval(() => {
setCurrentTime(getTime(state.audioPlayer.currentTime))
// setCurrentTime(formattedTime)
}, 1000)
return () => {
// clean up function run when state.audioPlayer changes
// reset currentTime to 0
setCurrentTime(0)
clearInterval(timeoutId)
}
}, [state.audioPlayer]);
// get and display the duration of the track, in minutes and seconds
const duration = getTime(state.audioPlayer.duration)
dunno if you can spot anything off in there...still working on adding a volume control as well :-)
I downloaded your code. I will refactor quite a bit. Will need some time.
I guess what you seem unclear about is the difference between sharing state and sharing stateful logic.
What you want is to share state about the music being played across all components. What your useMusicPlayer hook is doing is sharing stateful logic, which is unnecessary for your app.
// MusicPlayerContext
import React, { useState, useMemo, useContext, useEffect } from 'react'
import { useStaticQuery, graphql } from 'gatsby'
const MusicPlayerContext = React.createContext([{}, () => {}])
const MusicPlayerProvider = props => {
// COMMENT_ADDED
// query both tracks and assets since only one staticQuery per file
const { tracks, assets } = useStaticQuery(graphql`
query Tracks {
tracks: allMdx(
filter: { fileAbsolutePath: { regex: "/content/music/" } }
) {
edges {
node {
fields {
slug
}
frontmatter {
name
artist
genre
bpm
# ADD BASE HERE
artwork {
base
childImageSharp {
fluid(maxWidth: 1000) {
...GatsbyImageSharpFluid
}
}
}
alt
description
release(formatString: "MMMM Do, YYYY")
audio {
absolutePath
base
}
}
}
}
}
# query all mp3 and jpg files from /content/music/
assets: allFile(
filter: {
extension: { in: ["mp3", "jpg"] }
absolutePath: { regex: "/content/music/" }
}
) {
edges {
node {
publicURL
relativePath
}
}
}
}
`)
// need useMemo to avoid re-computation when state changes
const trackList = useMemo(
() =>
tracks.edges.map(track => {
const { frontmatter } = track.node
const {
name,
artist,
genre,
bpm,
artwork,
alt,
description,
audio,
} = frontmatter
return { name, artist, genre, bpm, artwork, alt, description, audio }
}),
[tracks]
)
const [state, setState] = useState({
audioPlayer: new Audio(),
// COMMENT_ADDED
// don't really need trackList in state
// tracks: trackList,
currentTrackIndex: null,
isPlaying: false,
})
const [currentTime, setCurrentTime] = useState(state.audioPlayer.currentTime)
// both formattedTime and progress are states derived from audioPlayer.currentTime
// so no need another useState
const formattedTime = getTime(currentTime)
const progress = currentTime / state.audioPlayer.duration
// get and display the duration of the track, in minutes and seconds
const formattedDuration = getTime(state.audioPlayer.duration)
useEffect(() => {
// COMMENT_ADDED
// reset currentTime to 0 when state.audioPlayer changes
setCurrentTime(0)
}, [state.audioPlayer])
useEffect(() => {
// COMMENT_ADDED
// if isPlaying, start the timer
if (state.isPlaying) {
const timeoutId = setInterval(() => {
setCurrentTime(currentTime => currentTime + 1)
}, 1000)
return () => {
// COMMENT_ADDED
// clear interval run when paused i.e. state.isPlaying is false
clearInterval(timeoutId)
}
}
}, [state.isPlaying])
// convert to obj for fast lookup
const assetObj = useMemo(
() =>
assets.edges.reduce((obj, file) => {
const { publicURL, relativePath } = file.node
obj[relativePath] = publicURL
return obj
}, {}),
[assets]
)
function playTrack(index) {
if (index === state.currentTrackIndex) {
togglePlay()
} else {
state.audioPlayer.pause()
const base = trackList[index].audio.base // frost.mp3
const baseName = basename(base) // frost
// new Audio() does not support relative path
// hence the need for window.location.origin
const audioPlayer = new Audio(
`${window.location.origin}${assetObj[`${baseName}/${base}`]}`
) // new Audio('http://www.domain.com/static/frost-[hash].mp3')
audioPlayer.play()
setState(state => ({
...state,
currentTrackIndex: index,
isPlaying: true,
audioPlayer,
}))
}
}
// Toggle play or pause
function togglePlay() {
if (state.isPlaying) {
state.audioPlayer.pause()
} else {
state.audioPlayer.play()
}
setState(state => ({ ...state, isPlaying: !state.isPlaying }))
}
// Play the previous track in the tracks array
function playPreviousTrack() {
const newIndex =
(((state.currentTrackIndex + -1) % trackList.length) + trackList.length) %
trackList.length
playTrack(newIndex)
}
// Play the next track in the tracks array
function playNextTrack() {
const newIndex = (state.currentTrackIndex + 1) % trackList.length
playTrack(newIndex)
}
let currentTrackName,
currentTrackArtist,
currentTrackArtwork,
currentTrackAudio
// COMMENT_ADDED
// simplify things a bit
if (state.currentTrackIndex !== null) {
const { currentTrackIndex } = state
const currentTrack = trackList[currentTrackIndex]
const base = currentTrack.audio.base // frost.mp3
const baseName = basename(base) // frost
currentTrackName = currentTrack.name
currentTrackArtist = currentTrack.artist
currentTrackArtwork = assetObj[`${baseName}/${currentTrack.artwork.base}`] // assetObj['frost/frost.jpg']
currentTrackAudio = assetObj[`${baseName}/${currentTrack.audio.base}`] // assetObj['frost/frost.mp3']
}
return (
<MusicPlayerContext.Provider
value={{
playTrack,
togglePlay,
currentTrackName,
currentTrackArtist,
currentTrackArtwork,
currentTrackAudio,
currentTime,
// COMMENT_ADDED
// setCurrentTime to be used by ProgressSlider
setCurrentTime,
formattedDuration,
formattedTime,
// volume,
audioPlayer: state.audioPlayer,
trackList,
isPlaying: state.isPlaying,
playPreviousTrack,
playNextTrack,
}}
>
{props.children}
</MusicPlayerContext.Provider>
)
}
// COMMENT_ADDED
// access global state from MusicPlayerContext
function useMusicPlayerState() {
return useContext(MusicPlayerContext)
}
export { useMusicPlayerState, MusicPlayerProvider }
// frost.mp3 -> frost
function basename(name) {
return name.slice(0, name.lastIndexOf('.'))
}
// Transform the currentTime info into minutes and seconds
function getTime(time) {
if (!isNaN(time)) {
return Math.floor(time / 60) + ':' + ('0' + Math.floor(time % 60)).slice(-2)
}
}
// TrackList.js
import React from 'react'
import styled from 'styled-components'
import { motion } from 'framer-motion'
import Img from 'gatsby-image'
import { H1, H2, H3 } from '../components/Typography'
import PlayButton from '../images/controls/play-button.svg'
import PauseButton from '../images/controls/pause-button.svg'
import { useMusicPlayerState } from './MusicPlayerContext'
// const TrackGrid = styled.div`
// margin: 1rem;
// display: grid;
// grid-template-rows: auto;
// grid-template-columns: 1fr 1fr 1fr;
// grid-gap: 1rem;
// border: 1px solid red;
// `
const Card = styled.div`
margin: 0;
padding: 0;
text-decoration: none;
line-height: 1;
background: black;
cursor: pointer;
box-sizing: border-box;
width: 100%;
height: auto;
display: block;
position: relative;
/* display: grid;
grid-template-columns: repeat(8, 1fr);
grid-template-rows: repeat(16, 1fr); */
background: #000;
transition: All 200ms ease;
z-index: 1;
/* border: 1px solid white; */
`
const Artwork = styled(Img)`
position: relative;
margin: 0;
padding: 0;
max-width: 25rem;
z-index: 1;
border-radius: 16px;
`
const Text = styled.div`
position: relative;
z-index: 1;
background: #222;
border-radius: 16px;
margin: -3rem 1rem 1rem 1rem;
padding: 1rem;
box-shadow: 0px 0px 16px rgba(0, 0, 0, 0.75);
> p {
font-size: 24px;
}
`
const Button = styled(motion.button)`
margin: -6rem 0 0 15rem;
padding: 1.5rem;
width: 6rem;
height: 6rem;
border: none;
border-radius: 3rem;
background: #333;
position: relative;
box-shadow: 0px 0px 16px rgba(0, 0, 0, 0.5);
z-index: 5;
:focus {
outline: none;
}
:active {
outline: none;
box-shadow: 0px 0px 16px white;
/* background: black; */
}
`
const TrackList = () => {
const {
trackList,
currentTrackName,
playTrack,
isPlaying,
} = useMusicPlayerState()
return (
<>
{trackList.map((track, index) => (
// COMMENT_ADDED: add key here
<Card key={index}>
<Artwork
fluid={track.artwork.childImageSharp.fluid}
alt={track.alt}
/>
<Button
whileHover={{ scale: 1.1 }}
whileTap={{ scale: 0.9 }}
onClick={() => playTrack(index)}
>
{currentTrackName === track.name && isPlaying ? (
<img src={PauseButton} alt="Pause Button" />
) : (
<img src={PlayButton} alt="Play Button" />
)}
</Button>
<Text>
<H1>{track.name}</H1>
{/* <h3>{track.artist}</h3> */}
{/* <p>{track.genre}</p> */}
{/* <p>{track.bpm}</p> */}
</Text>
</Card>
))}
</>
)
}
export default TrackList
// PlayerControls.js
import React, { useState } from 'react'
import styled from 'styled-components'
import { H1, H2, H3 } from '../components/Typography'
import { useMusicPlayerState } from './MusicPlayerContext'
import previousButton from '../images/controls/previous-button.svg'
import playButton from '../images/controls/play-button.svg'
import pauseButton from '../images/controls/pause-button.svg'
import nextButton from '../images/controls/next-button.svg'
const Button = styled.button`
margin: 1rem;
padding: 0.8rem;
width: 3rem;
height: 3rem;
border: none;
background: #333;
border-radius: 32px;
:focus {
outline: none;
/* box-shadow: 0px 0px 16px white; */
}
:active {
outline: none;
box-shadow: 0px 0px 16px white;
background: black;
}
`
const Image = styled.img`
margin: 0;
padding: 0;
height: 1.5rem;
width: 1.5rem;
`
const FlexStart = styled.div`
display: flex;
align-items: center;
justify-content: start;
`
const FlexCenter = styled.div`
display: flex;
align-items: center;
justify-content: center;
`
const TrackName = styled(H1)`
margin: 0 1rem;
padding: 0;
`
const TrackDuration = styled.h3`
margin: 0;
padding: 0.5rem;
`
const TrackArtwork = styled.img`
margin: 0 0.5rem;
padding: 0;
width: 4rem;
height: 4rem;
border: none;
`
// const ProgressBar = styled.progress`
// margin: 0 1rem;
// background-color: #333;
// `
// COMMENT_ADDED
// ProgressSlider is a standalone component
// it accesses global music state with the useMusicPlayerState hook
// jump to anywhere on the track using a progress bar style interface
function ProgressSlider() {
const {
audioPlayer,
currentTime,
formattedTime,
setCurrentTime,
} = useMusicPlayerState()
return (
<input
max={audioPlayer.duration}
min="0"
step="1"
type="range"
value={currentTime}
onChange={event => {
// COMMENT_ADDED
// convert event.target.value, a string, to number
const newTiming = parseInt(event.target.value, 10)
setCurrentTime(newTiming)
// not AudioPlayer
audioPlayer.currentTime = newTiming
}}
// COMMENT_ADDED
// onInput is not needed
/>
)
}
const Controls = () => {
// const [state, setState] = useContext(MusicPlayerContext)
// const [currentTime, setCurrentTime] = useState(state.audioPlayer.currentTime)
const {
isPlaying,
currentTrackName,
currentTrackArtwork,
currentTime,
formattedDuration,
formattedTime,
volume,
togglePlay,
playPreviousTrack,
playNextTrack,
} = useMusicPlayerState()
// const [time, setTime] = useState(currentTime)
// const handleChange = event => {
// setTime(event.target.value)
// }
// e => {
// setCurrentTime(e.target.value)
// state.AudioPlayer.currentTime = e.target.value
// }
return (
<>
<FlexStart>
<TrackArtwork src={currentTrackArtwork} alt="Album Artwork." />
<TrackName>{currentTrackName}</TrackName>
</FlexStart>
<FlexCenter>
<Button onClick={playPreviousTrack} disabled={!currentTrackName}>
<Image src={previousButton} alt="Previous Button" />
</Button>
<Button onClick={togglePlay} disabled={!currentTrackName}>
{isPlaying ? (
<Image src={pauseButton} alt="Pause Button" />
) : (
<Image src={playButton} alt="Play Button" />
)}
</Button>
<Button onClick={playNextTrack} disabled={!currentTrackName}>
<Image src={nextButton} alt="Next Button" />
</Button>
</FlexCenter>
<FlexCenter>
<TrackDuration>{formattedTime}</TrackDuration>
<ProgressSlider />
<TrackDuration>{formattedDuration}</TrackDuration>
</FlexCenter>
</>
)
}
export default Controls
I added some comments as well. Just search for COMMENT_ADDED. If there's any part you are confused about, feel free to ask for clarification.
hmmmmm....I had not heard of or thought about the difference between state and stateful logic, so thank you very much for that :-)
So, to clarify, state would be the currently playing track and all of its information, and stateful logic would be the music player that manipulates the currently playing track, like the progress bar, that allows the listener to navigate in time within the currently playing track?
I think I'm starting to understand this a bit better. Gonna need some time to go through what you have done, but it looks like absolute magic to my inexperienced eyes. The progress you have helped me achieve is truly amazing and has motivated me so much, and it is much much much appreciated XD
So I did this to create placeholder text for the formattedDuration:
const formattedDuration = state.isPlaying ? getTime(state.audioPlayer.duration) : '0:00'
...and it generally works, except I'm getting that same delay between updates when the song changes...like so:
I'm thinking I need a useState, but I'm not sure... maybe it can be done in the <ProgressSlider> itself.
Stateful logic involves how you manipulate that state, what happens when that state changes, those sorts of things. So let's say you want to create another music player: you can share that logic with the new player instead of rewriting it from scratch. On the other hand, if you share state between those two music players, they will play the same song, etc.
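A tiny illustration of the difference (the hook and component names here are made up):
import React, { useState, useContext } from 'react'

// Sharing stateful logic: every component calling this hook gets its OWN isPlaying.
function usePlayToggle() {
  const [isPlaying, setIsPlaying] = useState(false)
  return [isPlaying, () => setIsPlaying(playing => !playing)]
}

// Sharing state: every consumer reads the SAME isPlaying from one context,
// so toggling it anywhere updates all of them.
const PlayerStateContext = React.createContext({ isPlaying: false, toggle: () => {} })

function MiniPlayerButton() {
  const { isPlaying, toggle } = useContext(PlayerStateContext)
  return <button onClick={toggle}>{isPlaying ? 'Pause' : 'Play'}</button>
}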
I think state.audioPlayer.duration might be undefined during that delay. So maybe you can do:
const formattedDuration = getTime(state.audioPlayer.duration) || '0:00'
Yeah, that totally worked :-) ... also, this is something I have not seen yet, the logical OR operator, yes? (Just googled it.)
Logically speaking, this means:
thing 1 || thing 2
...do thing 1, but if that is not there, then do thing 2. Yes?
That is great to know :-)
Yep. If thing1 is a falsy value, like undefined, null, 0, or ''.
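For example:
undefined || '0:00' // '0:00'
null || '0:00'      // '0:00'
0 || '0:00'         // '0:00' (careful: 0 is falsy too)
'2:45' || '0:00'    // '2:45'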
Hey @universse, got one more thing I want to try to do, but I'm not sure how to go about it. I have many MDX files that are articles about music, and I am putting musical examples in them. There will be a varied number of examples in each article, but what I want is for them to be included in the audio context so that they play through the persistent player, stopping whatever might be playing. Trying to figure out how to do this, but I am a bit lost. If you have any thoughts or suggestions on where to even begin, I am all ears.
An example would be:
---
title: Music Stuff
category: Theory
---
# Heading
Some words and stuff.
<Audio src="example-audio.mp3"/>
More words and stuff.
Where the inline audio example would play through the persistent music player and be controlled by the player controls as well. I hope that makes sense.
Can you share your code for <Audio />?
I don't have anything yet for the <Audio/> tag.
I tried using a plain <audio> tag in an MDX file, and that didn't work; then I tried using gatsby-remark-audio, and that didn't work either, so I'm currently learning the Web Audio API to see if there is anything in there that could help me with this.
I'm also running into bugs in mdx itself, like these:
https://github.com/gatsbyjs/gatsby/issues/19785
https://github.com/gatsbyjs/gatsby/issues/19825
...which are roughly related to this, I think:
https://github.com/gatsbyjs/gatsby/issues/16242
...and I believe @ChristopherBiscardi is working on this issue, but I'm not sure when MDX will be viable again.
I'm thinking maybe wrapping the mdx template file in the audio context and then selecting all the audio tags and adding them to the context somehow? Not sure how to do that yet, though, or if that is even a good idea 🤔
I guess first you need a playArticleTrack function in MusicPlayerProvider.js, similar to the playTrack function.
function playArticleTrack(src) {
state.audioPlayer.pause()
const audioPlayer = new Audio(src)
audioPlayer.play()
setState(state => ({
...state,
currentTrackIndex: -1, // I assume article tracks are not parts of the original track list
isPlaying: true,
audioPlayer,
}))
}
Then in your Audio component:
function Audio ({ src }) {
const { playArticleTrack } = useMusicPlayerState()
return ...
}
And you need to wrap your MDX article with <MusicPlayerProvider />, using https://www.gatsbyjs.org/docs/browser-apis/#wrapPageElement
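Something roughly like this (the import path is just a guess at where your provider lives):
// gatsby-browser.js (and usually gatsby-ssr.js as well)
import React from 'react'
import { MusicPlayerProvider } from './src/player/MusicPlayerContext'

export const wrapPageElement = ({ element }) => (
  <MusicPlayerProvider>{element}</MusicPlayerProvider>
)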
Yes, the article tracks are not part of the original tracklist... so, I am trying to implement your suggestion, and I have done this so far in the Audio.js component:
import React from 'react'
import { useMusicPlayerState } from '../player/MusicPlayerContext'
function Audio ({ src }) {
const { playArticleTrack } = useMusicPlayerState()
return (
<audio controls onPlay={playArticleTrack} src={src}></audio>
)
}
export default Audio
...and I'm trying to implement that component in the index page first to try and get it to work, like so:
// ...other imports before these...
import Audio from '../components/Audio'
import Groove from './groove.mp3'
const IndexPage = () => (
<Container>
<SEO title="Home" />
<Audio src={Groove}></Audio>
<TrackList />
</Container>
)
and I get the following error message when I hit play on its controls:
...so I'm trying to wrap my head around what it means and what I need to do, but if you have any thoughts...
Can you try this and see how it goes?
function playArticleTrack(src) {
state.audioPlayer.pause()
const audioPlayer = new Audio(src)
audioPlayer.play()
setState(state => ({
...state,
currentTrackIndex: 0, // I change to 0 instead
isPlaying: true,
audioPlayer,
}))
}
If currentTrackIndex is -1, currentTrack is undefined.
So I tried that, and it sort of worked, except it seems now that whatever track is at index 0 is replaced by this other audio file, and then the app breaks when I try to play that other track again. It even shows the artwork of the track at index position 0 when I click play on the article Audio track in question...
I'm also thinking that I don't really even need a tracklist at all. Or, that is to say, what I really want is for audio files on many different pages and template pages (inside MDX files, etc.) to all simply play through the global, persistent music player, to all have metadata and artwork, and to all be added to a listening history of some sort.
So maybe making a tracklist is not the correct approach, and instead having an audio component that I can use anywhere on any page, which taps into the global music context and then adds itself to the playlist history, is more in line with what I need for this project...
hmmmm, so I set the currentTrackIndex to null when the article track plays, and it sort of works, although it keeps playing without sound after the file is done. The code looks like this:
function playArticleTrack(src) {
state.audioPlayer.pause()
const audioPlayer = new Audio(src)
audioPlayer.play()
setState(state => ({
...state,
// currentTrackIndex: 0, // I change to 0 instead
currentTrackIndex: null, // Trying null and it seems to work, although the player keeps going after it's done.
isPlaying: true,
audioPlayer,
}))
}
...is this ok to do? Will this cause other issues in the future that I am unaware of?
Also, are you on www.codementor.io or anything, or do you have any time or interest in helping me finish this project beyond open-source help? I realize I am quite in over my head, but I would love to get this done sooner rather than later and have a working product to help build my music businesses upon. If not, no worries, just thought I would ask you first before expanding my search :-)
Are you open to contacting over Whatsapp/Telegram?
I haven't used those before, but sure :-)
Before we get to that, though: I'm currently trying to link each track card in the trackList to open an MDX file about that track, but I'm not having much luck.
I'm trying to add a slug constant and then access it in the trackList to link to individual MDX files about each track, like so:
// need useMemo to avoid re-computation when state changes
const trackList = useMemo(
() =>
tracks.edges.map(track => {
// add slug to the data...
const slug = track.node.fields.slug
const { frontmatter } = track.node
const {
name,
artist,
genre,
bpm,
artwork,
alt,
description,
audio,
} = frontmatter
// then add slug to the return...
return slug, { name, artist, genre, bpm, artwork, alt, description, audio }
}),
[tracks]
)
...then use that to create the link to the programmatically created MDX file in the tracklist component, like so:
const TrackList = () => {
const {
trackList,
currentTrackName,
playTrack,
isPlaying,
} = useMusicPlayerState()
return (
<>
{trackList.map((track, index) => (
// use the track slug to link to the mdx file with the same slug...
<Card key={index} to={track.slug}>
<Artwork
fluid={track.artwork.childImageSharp.fluid}
alt={track.alt}
/>
<Button
whileHover={{ scale: 1.1 }}
whileTap={{ scale: 0.9 }}
onClick={() => playTrack(index)}
>
{currentTrackName === track.name && isPlaying ? (
<img src={PauseButton} alt="Pause Button" />
) : (
<img src={PlayButton} alt="Play Button" />
)
}
</Button>
<Text>
<H1>{track.name}</H1>
<H3>{track.artist}</H3>
{/* <p>{track.genre}</p> */}
{/* <p>{track.bpm}</p> */}
</Text>
<LinkButton>Learn More</LinkButton>
</Card>
))}
</>
)
}
export default TrackList
...but this is currently not working at all, and I get an error in the console that the value of track.slug is not defined:
So, I'm thinking that I'm not too far off here, but I'm not sure what I'm missing to make it work right, as I don't fully understand what I'm doing :-(
Is that a typo?
return { slug, name, artist, genre, bpm, artwork, alt, description, audio }
instead of
return slug, { name, artist, genre, bpm, artwork, alt, description, audio }
So, I'm a little confused... this looks like an object to me, due to the curly braces:
return { slug, name, artist, genre, bpm, artwork, alt, description, audio }
...but it's returning what looks like an array of data? There are no property: value pairs, just single words, so I thought that maybe it is similar in syntax to ES module imports and exports, where you have to name the imports and exports inside of curly braces if they are not the default export, which I also don't fully understand, to be honest. That's why I put the slug outside of the curly braces, as I don't understand why it would be inside or outside of them.
I will try to put it inside and see if that works :-)
It is called object shorthand. You can read more here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Object_initializer
Basically, it is a shortcut. So instead of
const obj = { a: a, b: b, c: c }
You can write
const obj = { a, b, c }
ooooooooh...that clears quite a few things up in my head :-)
ok, so now I'm trying to figure out how to get the play button on the individual track pages to work properly...
So, currently, there is a grid of pieces of music:
...and when the user clicks the play button, that piece will play. If a user clicks on the learn more button, then they are taken to a programmatically generated mdx page with an article describing the piece and how it was created:
...where there is also a play button on that page, in the title card, which I would like to be linked to the play button from the grid view of all the pieces. That way, if the user clicked the play button in the grid, then clicked the learn more button and navigated to the article explaining the piece, the play button there would reflect the current state of the music playing, meaning it would show the pause icon. But if the user did not click the play button and instead simply navigated to the individual article page, then the play button would show the play icon, and when it is clicked, that track would play.
So, I'm starting to do this by literally copying and pasting the play button from the tracklist component over into the article template:
const Track = ({ data }) => {
const {
currentTrackName,
playTrack,
isPlaying,
} = useMusicPlayerState()
const { frontmatter, timeToRead, body } = data.mdx
return (
<MDXProvider components={{...components }}>
<ArticleBlock>
<TitleBlock>
<TitleText>
<Title>{frontmatter.name}</Title>
<Subtitle>{frontmatter.artist}</Subtitle>
<Subtitle>{frontmatter.genre}</Subtitle>
<Subtitle>{frontmatter.bpm}</Subtitle>
{/* <Subtitle>{frontmatter.description}</Subtitle> */}
</TitleText>
<PlayButton
whileHover={{ scale: 1.1 }}
whileTap={{ scale: 0.9 }}
onClick={() => playTrack()}
>
{currentTrackName === frontmatter.name && isPlaying ? (
<img src={PauseButtonIcon} alt="Pause Button" />
) : (
<img src={PlayButtonIcon} alt="Play Button" />
)
}
</PlayButton>
<Artwork fluid={frontmatter.artwork.childImageSharp.fluid} alt="cool stuff." />
</TitleBlock>
<TitleScrim />
<TitleImage fluid={frontmatter.artwork.childImageSharp.fluid} alt="cool stuff." />
<Content>
<MDXRenderer>{body}</MDXRenderer>
</Content>
</ArticleBlock>
</MDXProvider>
)
}
...but when I click the play button I get the following error message:
in Chrome:
...and in Safari:
...so I'm thinking I need to tell it to select that particular track to be connected to the play button, but not sure how to go about doing that.
The problem is you are not passing the index to playTrack. To do that, you need to associate each individual track page with the correct index, and pass that index to the playTrack click handler of the play button.
Maybe you can do something like
// HERE get slug from pageContext
const Track = ({ data, pageContext: { slug } }) => {
const {
currentTrackName,
playTrack,
isPlaying,
trackList
} = useMusicPlayerState()
const { frontmatter, timeToRead, body } = data.mdx
const index = trackList.findIndex(track => track.slug === slug) // HERE - find the correct track index
return (
<MDXProvider components={{...components }}>
<ArticleBlock>
<TitleBlock>
<TitleText>
<Title>{frontmatter.name}</Title>
<Subtitle>{frontmatter.artist}</Subtitle>
<Subtitle>{frontmatter.genre}</Subtitle>
<Subtitle>{frontmatter.bpm}</Subtitle>
{/* <Subtitle>{frontmatter.description}</Subtitle> */}
</TitleText>
<PlayButton
whileHover={{ scale: 1.1 }}
whileTap={{ scale: 0.9 }}
onClick={() => playTrack(index)} // HERE
>
{currentTrackName === frontmatter.name && isPlaying ? (
<img src={PauseButtonIcon} alt="Pause Button" />
) : (
<img src={PlayButtonIcon} alt="Play Button" />
)
}
</PlayButton>
<Artwork fluid={frontmatter.artwork.childImageSharp.fluid} alt="cool stuff." />
</TitleBlock>
<TitleScrim />
<TitleImage fluid={frontmatter.artwork.childImageSharp.fluid} alt="cool stuff." />
<Content>
<MDXRenderer>{body}</MDXRenderer>
</Content>
</ArticleBlock>
</MDXProvider>
)
}
wow, so that...just...worked! Super awesome!
Now I'm working on the inline audio samples component again and I'm running into a couple issues. So far I have this:
import React from 'react'
import styled from 'styled-components'
import { motion } from 'framer-motion'
import PlayButton from '../images/controls/play-button.svg'
import PauseButton from '../images/controls/pause-button.svg'
import { useMusicPlayerState } from '../player/MusicPlayerContext'
const Button = styled(motion.button)`
// styles n' stuff...
`
const AudioBlock = styled.div`
// styles n' stuff...
`
const AudioText = styled.p`
// styles n' stuff...
`
function Audio ({ src, children }) {
const {
currentTrackName,
playArticleTrack,
isPlaying,
} = useMusicPlayerState()
const articleTrackName = src;
return (
<AudioBlock>
<Button
whileHover={{ scale: 1.13 }}
whileTap={{ scale: 0.9 }}
onClick={() => playArticleTrack(src) }
src={src}
>
{ currentTrackName === articleTrackName && isPlaying ? (
<img src={PauseButton} alt="Pause Button" />
) : (
<img src={PlayButton} alt="Play Button" />
)}
</Button>
<AudioText>{children}</AudioText>
</AudioBlock>
)
}
export default Audio
...where I'm trying to pass in the src as an audio file, and have the play button function as it always has in other instances.
Issue 1:
Issue 2:
Issue 3:
Issue 4 (actually, a more global issue, possibly): the currentTime function in the central player controls just keeps counting and does not stop, which it shouldn't do, of course.
So, I'm trying to amend the playArticleTrack() function like this:
function playArticleTrack(src) {
if (src === state.currentTrackIndex)
{
togglePlay()
} else {
state.audioPlayer.pause()
const audioPlayer = new Audio(src)
audioPlayer.play()
setState(state => ({
...state,
currentTrackIndex: src,
isPlaying: true,
audioPlayer,
}))
}
}
...so my Audio.js component can look like this, roughly:
const Audio = ({ src, children }) => {
const {
currentTrackName,
playArticleTrack,
isPlaying,
} = useMusicPlayerState()
return (
<Box>
<AudioBlock>
<Button
whileHover={{ scale: 1.13 }}
whileTap={{ scale: 0.9 }}
onClick={() => playArticleTrack(src) }
// onClick={togglePlay}
src={src}
>
{ currentTrackName === articleTrackName && isPlaying ? (
<img src={PauseButton} alt="Pause Button" />
) : (
<img src={PlayButton} alt="Play Button" />
)}
</Button>
<AudioText>{children}</AudioText>
</AudioBlock>
</Box>
)
}
...so it can be used like this in an MDX file:
<Audio src={Groove}>Example Groove That is Super Duper Awesome!</Audio>
...and this in the app itself:
...but it's not working right yet... when I had it a little simpler I couldn't get the pause button to show up while playing, and now I'm trying to more closely mirror the playTrack() function, but I'm not sure how to add the new srcs to the track index, I guess... also, this is currently happening with the current version of the playArticleTrack() function:
...but I'm not sure why that function is interested in that line of code.
The problem is that src is a string ("someurl.com/name.mp3") while currentTrackIndex is a number. currentTrack, which is trackList[currentTrackIndex], will be undefined when you use src as the currentTrackIndex. That's why the above line cannot be evaluated, since you are getting the audio property of undefined.
From what I can see, you need to refactor quite a big chunk of your music player functionalities to handle this new use case.
Yeah, that's what I was thinking as well... I think for now I can just use the <audio/> element, as it basically works, and then come back around to this later.
I have another, more pressing issue, however. I cannot get the project to build successfully using gatsby build, as new Audio() is not supported during SSR builds, so I need to figure out how to get around that so that the app can actually work outside of gatsby develop mode.
Here is the related issue:
https://github.com/gatsbyjs/gatsby/issues/19522
I asked @DSchau on Twitter the other day and he said this:
...so I'm currently trying to understand what he means, but I have many holes in my hooks knowledge as well as my general knowledge.
What I don't understand is how to set state inside of another hook... like... am I using useState inside of useEffect?
Going by what he said, this is all I can come up with:
useEffect(() =>{
[],
const [client, isClient] = useState();
})
...obviously this is wrong and doesn't work and is full of errors, but I'm not sure what he means in regard to how to set it up or how the syntax should work. If you have any thoughts, they, of course, would be much appreciated :-)
You can try this
const [state, setState] = useState({
audioPlayer: {}, // instead of new Audio()
currentTrackIndex: null,
isPlaying: false,
})
Hi @universse XD
So, @DSchau and @jlengstorf recommended using an `isClient` conditional approach to make this app actually build, and I am currently trying to make that approach work. They recommended doing something along these lines, in which I create a state hook to track whether the code is running on the client or not:
// ---------------------------------------------------------
// Establishing if the code is running on the client
// ---------------------------------------------------------
// state hook for setting a starting state for whether or not this code is running on the client
const [isClient, setIsClient] = useState(false)
// useEffect hook to set the client boolean to true
useEffect(() => {
setIsClient(true)
}, [])
...then set state for the `new Audio()` HTMLAudioElement using the setState function, like so:
// state hook for setting a starting state for the Audio() constructor which creates a new HTMLAudioElement
const [state, setState] = useState({})
// conditional statement to set up a few different things if the current code is being run on the client.
if (isClient) {
setState({
// a key value pair creating a new Audio() HTMLAudioElement using the constructor function
audioPlayer: new Audio(),
currentTrackIndex: null,
isPlaying: false,
})
}
...but this only ended up making other things break, so I am now trying to wrap as many things as needed in `isClient` conditionals, and I keep moving on to new errors each time I wrap something else, like setting the `currentTime`, for example:
// ---------------------------------------------------------
// Setting Current Time
// ---------------------------------------------------------
// A useState hook that will read and write the current time of the HTMLAudioElement
const [currentTime, setCurrentTime] = useState()
// if the code is running on the client then set the current time
if (isClient) {
setCurrentTime(state.audioPlayer.currentTime)
}
// This const declaration uses the getTime() function to properly format the current time of the currently playing audio file. If no audio is playing then the current time will display 0:00
const formattedTime = getTime(currentTime) || `0:00`
...but even though I did that I'm still getting error messages like this:
...which is odd, since that is what I just wrapped with an `isClient` conditional.
Anyway, the following is where my `AudioContext` file is at now... if you can see anything I need to fix using this approach, it would be much appreciated. @DSchau and @jlengstorf as well, if you have the time or energy to take a look XD
// ---------------------------------------------------------
// Audio Context
// ---------------------------------------------------------
import React, { useState, useMemo, useContext, useEffect } from 'react'
import { useStaticQuery, graphql } from 'gatsby'
const AudioContext = React.createContext([{}, () => {}])
const AudioProvider = props => {
// ---------------------------------------------------------
// Data Querying
// ---------------------------------------------------------
// This is a static query to get the information, audio and artwork for all the pieces of music, which are stored as folders containing mdx files with frontmatter and any associated files, like audio and image files, among others. It queries both the track information and the associated assets, since a single static query in this file is enough to do this properly.
const { tracks, assets } = useStaticQuery(graphql`
query Tracks {
tracks: allMdx(filter: { fileAbsolutePath: { regex: "/content/music/" } }
) {
edges {
node {
fields {
slug
}
frontmatter {
name
artist
genre
bpm
# ADD BASE HERE
artwork {
base
childImageSharp {
fluid(maxWidth: 1000) {
...GatsbyImageSharpFluid
}
}
}
alt
description
release(formatString: "MMMM Do, YYYY")
audio {
absolutePath
base
}
}
}
}
}
# query all mp3 and jpg files from /content/music/
assets: allFile(
filter: {
extension: { in: ["mp3", "jpg"] }
absolutePath: { regex: "/content/music/" }
}
) {
edges {
node {
publicURL
relativePath
}
}
}
}
`)
// This uses the map function to create an array of objects, where each object is a track and all its information and assets. We need to use `useMemo` to avoid re-computation when the state changes, as this information is all static.
const trackList = useMemo(
() =>
tracks.edges.map(track => {
const slug = track.node.fields.slug
const { frontmatter } = track.node
const {
name,
artist,
genre,
bpm,
artwork,
alt,
description,
audio,
} = frontmatter
return { slug, name, artist, genre, bpm, artwork, alt, description, audio }
}),
[tracks]
)
// ---------------------------------------------------------
// Utility Functions
// ---------------------------------------------------------
// This is a function to turn a file name, which is a string, into a new string without the file type at the end. For example: frost.mp3 -> frost
function basename(name) {
return name.slice(0, name.lastIndexOf('.'))
}
// A function that transforms the audio.currentTime value into human readable minutes and seconds
function getTime(time) {
if (!isNaN(time)) {
return Math.floor(time / 60) + ':' + ('0' + Math.floor(time % 60)).slice(-2)
}
}
// ---------------------------------------------------------
// Establishing if the code is running on the client
// ---------------------------------------------------------
// state hook for setting a starting state for whether or not this code is running on the client
const [isClient, setIsClient] = useState(false)
// useEffect hook to set the client boolean to true
useEffect(() => {
setIsClient(true)
}, [])
// ---------------------------------------------------------
// Creating the Audio Context
// ---------------------------------------------------------
// -------- ORIGINAL VERSION -------- //
// using a useState hook to set the state of the audio player, including the audioPlayer itself, the currentTrackIndex and the isPlaying boolean
// const [state, setState] = useState({
// audioPlayer: new Audio(),
// // audioPlayer: {}, // instead of new Audio()
// // don't really need trackList in state
// // tracks: trackList,
// currentTrackIndex: null,
// isPlaying: false,
// })
// -------- END ORIGINAL VERSION -------- //
// state hook for setting a starting state for the Audio() constructor which creates a new HTMLAudioElement
const [state, setState] = useState({})
// conditional statement to set up a few different things if the current code is being run on the client.
if (isClient) {
setState({
// a key value pair creating a new Audio() HTMLAudioElement using the constructor function
audioPlayer: new Audio(),
currentTrackIndex: null,
isPlaying: false,
})
}
// ---------------------------------------------------------
// Setting Current Time
// ---------------------------------------------------------
// -------- ORIGINAL VERSION -------- //
// A useState hook that will read and write the current time of the HTMLAudioElement
// const [currentTime, setCurrentTime] = useState(state.audioPlayer.currentTime)
// This const declaration uses the getTime() function to properly format the current time of the currently playing audio file. If no audio is playing then the current time will display 0:00
// const formattedTime = getTime(currentTime) || `0:00`
// -------- END ORIGINAL VERSION -------- //
// A useState hook that will read and write the current time of the HTMLAudioElement
const [currentTime, setCurrentTime] = useState()
// if the code is running on the client then set the current time
if (isClient) {
setCurrentTime(state.audioPlayer.currentTime)
}
// This const declaration uses the getTime() function to properly format the current time of the currently playing audio file. If no audio is playing then the current time will display 0:00
const formattedTime = getTime(currentTime) || `0:00`
// ---------------------------------------------------------
// Setting Duration
// ---------------------------------------------------------
// -------- ORIGINAL VERSION -------- //
// This const declaration uses the getTime() function to properly format the duration of the currently playing audio file. If no audio is playing then the duration will display 0:00
// const formattedDuration = getTime(state.audioPlayer.duration) || `0:00`
// This const declaration uses the getTime() function to properly format the duration of the currently playing audio file. If no audio is playing then the duration will display 0:00
// const formattedDuration = getTime(currentDuration) || `0:00`
// -------- END ORIGINAL VERSION -------- //
// A useState hook that will read and write the duration of the HTMLAudioElement
const [currentDuration, setCurrentDuration] = useState()
// if the code is running on the client then set the current duration
if (isClient) {
setCurrentDuration(state.audioPlayer.duration)
}
// This const declaration uses the getTime() function to properly format the duration of the currently playing audio file. If no audio is playing then the duration will display 0:00
const formattedDuration = getTime(currentDuration) || `0:00`
// ---------------------------------------------------------
// Setting Time
// ---------------------------------------------------------
// useEffect hook that will reset currentTime to 0 when state.audioPlayer changes
useEffect(() => {
setCurrentTime(0)
}, [state.audioPlayer])
useEffect(() => {
// if isPlaying is true, then this useEffect hook will start the timer of the currentTime and set it again every second, written in milliseconds
if (state.isPlaying) {
const timeoutId = setInterval(() => {
setCurrentTime(currentTime => currentTime + 1)
}, 1000)
return () => {
// clear the interval when paused, i.e. when state.isPlaying is false
clearInterval(timeoutId)
}
}
}, [state.isPlaying])
// convert to obj for fast lookup
const assetObj = useMemo(
() =>
assets.edges.reduce((obj, file) => {
const { publicURL, relativePath } = file.node
obj[relativePath] = publicURL
return obj
}, {}),
[assets]
)
// This function will Toggle between play or pause
function togglePlay() {
if (state.isPlaying) {
state.audioPlayer.pause()
} else {
state.audioPlayer.play()
}
// calls setState with an updater function that spreads the current state and flips the isPlaying boolean to the opposite of what it currently is
setState(state => ({ ...state, isPlaying: !state.isPlaying }))
}
// This function plays the current track
function playTrack(index) {
if (index === state.currentTrackIndex) {
togglePlay()
} else {
state.audioPlayer.pause()
const base = trackList[index].audio.base // frost.mp3
const baseName = basename(base) // frost
// new Audio() does not support relative path
// hence the need for window.location.origin
const audioPlayer = new Audio(
`${window.location.origin}${assetObj[`${baseName}/${base}`]}`
) // new Audio('http://www.domain.com/static/frost-[hash].mp3')
audioPlayer.play()
setState(state => ({
...state,
currentTrackIndex: index,
isPlaying: true,
audioPlayer,
}))
}
}
// Play the previous track in the tracks array
function playPreviousTrack() {
const newIndex =
(((state.currentTrackIndex + -1) % trackList.length) + trackList.length) %
trackList.length
playTrack(newIndex)
}
// Play the next track in the tracks array
function playNextTrack() {
const newIndex = (state.currentTrackIndex + 1) % trackList.length
playTrack(newIndex)
}
// This function is for pausing the main music that is being played and play the audio clip that has been selected somewhere in the application
function playArticleTrack(src) {
state.audioPlayer.pause()
const audioPlayer = new Audio(src)
audioPlayer.play()
setState(state => ({
...state,
currentTrackIndex: null, // I assume article tracks are not parts of the original track list
isPlaying: true,
audioPlayer,
}))
}
// function playArticleTrack(src) {
// if (src === state.currentTrackIndex)
// {
// togglePlay()
// } else {
// state.audioPlayer.pause()
// const audioPlayer = new Audio(src)
// audioPlayer.play()
// setState(state => ({
// ...state,
// currentTrackIndex: src,
// isPlaying: true,
// audioPlayer,
// }))
// }
// }
// function playArticleTrack(src) {
// state.audioPlayer.pause()
// const audioPlayer = new Audio(src)
// if (src === state.currentTrackIndex) {
// togglePlay()
// } else {
// state.audioPlayer.pause()
// }
// setState(state => ({
// ...state,
// // currentTrackIndex: 0, // I change to 0 instead
// // currentTrackIndex: null, // Trying null and it seems to work, although the player keeps going after it's done.
// currentTrackIndex: src,
// isPlaying: true,
// audioPlayer,
// }))
// }
let currentTrackName,
currentTrackArtist,
currentTrackArtwork,
currentTrackAudio
// simplify things a bit
if (isClient && state.currentTrackIndex !== null) {
// if (state.currentTrackIndex !== null) {
const { currentTrackIndex } = state
const currentTrack = trackList[currentTrackIndex]
const base = currentTrack.audio.base // frost.mp3
const baseName = basename(base) // frost
currentTrackName = currentTrack.name
currentTrackArtist = currentTrack.artist
currentTrackArtwork = assetObj[`${baseName}/${currentTrack.artwork.base}`] // assetObj['frost/frost.jpg']
currentTrackAudio = assetObj[`${baseName}/${currentTrack.audio.base}`] // assetObj['frost/frost.mp3']
}
return (
<AudioContext.Provider
value={{
playTrack,
playArticleTrack,
togglePlay,
currentTrackName,
currentTrackArtist,
currentTrackArtwork,
currentTrackAudio,
currentTime,
// setCurrentTime to be used by ProgressSlider
setCurrentTime,
formattedTime,
currentDuration,
setCurrentDuration,
formattedDuration,
// volume,
audioPlayer: state.audioPlayer,
trackList,
isPlaying: state.isPlaying,
playPreviousTrack,
playNextTrack,
}}
>
{props.children}
</AudioContext.Provider>
)
}
// access global state from MusicPlayerContext
function useAudioState() {
return useContext(AudioContext)
}
export { useAudioState, AudioProvider }
The `isClient` approach is fine but in your case it's over-complicating stuff a little bit. You don't need `new Audio()` as the initial state. An empty `{}` as suggested above is fine. You can try and see if it builds.
Hmmm... I tried that before but it didn't work... trying it again, the play buttons don't work and then I get this error message:
Also, I don't quite understand how one can have audio without `Audio()`...
So, to clarify, it seems to build, but then it doesn't work.
you can simplify a bit:
const [state, setState] = useState({})
useEffect(() => {
setState({
// a key value pair creating a new Audio() HTMLAudioElement using the constructor function
audioPlayer: new Audio(),
currentTrackIndex: null,
isPlaying: false,
})
}, [])
the error you're getting looks like it's not initializing once it loads in the browser
you can't play audio without a browser, and when Gatsby is building, there's no client — it's server-side rendering — which means Audio and other browser APIs don't exist
so what you're doing is saying, "build the site _without_ the Audio player, then initialize the Audio player once the site loads in a browser"
does that make sense?
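One extra thing worth noting with this pattern, as a sketch of the idea (reusing the names from the context file above): until that effect has swapped the placeholder `{}` for a real `Audio` element, anything that calls player methods should bail out quietly, otherwise the first click after hydration can throw.
// togglePlay with a guard for the placeholder state; same shape as the existing function, only the first check is new
function togglePlay() {
  // during SSR and the very first client render, audioPlayer is still the plain {} placeholder
  if (typeof state.audioPlayer.play !== 'function') return
  if (state.isPlaying) {
    state.audioPlayer.pause()
  } else {
    state.audioPlayer.play()
  }
  setState(state => ({ ...state, isPlaying: !state.isPlaying }))
}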
Hi @jlengstorf, thank you for taking some time to help me out :-)
So, I understand what you are saying conceptually very clearly, it's just how to go about making it work in the app, which is very challenging for me.
I copied and pasted your code snippet from the previous comment verbatim and got this error message:
...so I'm thinking that I clearly need to do one or (many) more things to refactor the code to make it work inside of Gatsby and get the code past the SSR build step.
So now I move down the page and add a `useEffect` hook to the `currentTime` declaration, like so:
// -------- `useEffect` VERSION -------- //
// A useState hook that will read and write the current time of the HTMLAudioElement
const [currentTime, setCurrentTime] = useState()
// useEffect hook to set the currentTime when the code is running in the client
useEffect(() => {
setCurrentTime(state.audioPlayer.currentTime)
}, [])
// This const declaration uses the getTime() function to properly format the current time of the currently playing audio file. If no audio is playing then the current time will display 0:00
const formattedTime = getTime(currentTime) || `0:00`
// -------- End `useEffect` VERSION -------- //
...and now I get this error message:
...so then I move down the page and add a `useEffect` hook to the `duration` declaration, like so:
// -------- `useEffect` VERSION -------- //
// A useState hook that will read and write the duration of the HTMLAudioElement
const [currentDuration, setCurrentDuration] = useState()
// useEffect hook to set the currentDuration when the code is running in the client
useEffect(() => {
setCurrentDuration(state.audioPlayer.duration)
}, [])
// This const declaration uses the getTime() function to properly format the duration of the currently playing audio file. If no audio is playing then the duration will display 0:00
const formattedDuration = getTime(currentDuration) || `0:00`
// -------- End `useEffect` VERSION -------- //
...and get this error message:
...so now I move down the page and try adding a `useEffect` hook to the if statement that this line of code is contained within, like so:
// -------- `useEffect` VERSION -------- //
useEffect(() => {
if (state.currentTrackIndex !== null) {
const { currentTrackIndex } = state
const currentTrack = trackList[currentTrackIndex]
const base = currentTrack.audio.base // frost.mp3
const baseName = basename(base) // frost
currentTrackName = currentTrack.name
currentTrackArtist = currentTrack.artist
currentTrackArtwork = assetObj[`${baseName}/${currentTrack.artwork.base}`] // assetObj['frost/frost.jpg']
currentTrackAudio = assetObj[`${baseName}/${currentTrack.audio.base}`] // assetObj['frost/frost.mp3']
}
},[])
// -------- END `useEffect` VERSION -------- //
...and now I get the same error message that I started with!
...only this time I already have a `useEffect` wrapped around the `currentTime` declaration, so............... I mean............. I............ just................
...maybe I need to wrap a `useEffect` hook inside another `useEffect`, possibly creating an infinite regression of `useEffect`s...? (I think that is humor, although it's hard for me to tell at this point ;-P)
...so I'm not exactly sure what to do next to make it all the way to the finish line, or what error message is waiting for me there, although getting the same error message after having seemingly solved that issue is stumping me at this point.
Can you try this
const [state, setState] = useState({
audioPlayer: {},
currentTrackIndex: null,
isPlaying: false,
})
const [currentTime, setCurrentTime] = useState(state.audioPlayer.currentTime)
useEffect(() => {
setState({
audioPlayer: new Audio(),
currentTrackIndex: null,
isPlaying: false,
})
}, [])
const formattedTime = getTime(currentTime) || `0:00`
const formattedDuration = getTime(state.audioPlayer.duration) || `0:00`
......
let currentTrackName
let currentTrackArtist
let currentTrackArtwork
let currentTrackAudio
if (state.currentTrackIndex !== null) {
const { currentTrackIndex } = state
const currentTrack = trackList[currentTrackIndex]
const base = currentTrack.audio.base // frost.mp3
const baseName = basename(base) // frost
currentTrackName = currentTrack.name
currentTrackArtist = currentTrack.artist
currentTrackArtwork = assetObj[`${baseName}/${currentTrack.artwork.base}`]
currentTrackAudio = assetObj[`${baseName}/${currentTrack.audio.base}`]
}
.......
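For what it's worth, if I'm reading this right, the reason this variant survives `gatsby build` is that the SSR pass only ever sees the `audioPlayer: {}` placeholder, so `new Audio()` is never evaluated in Node; the `useEffect` with an empty dependency array runs only after the page hydrates in the browser. During that first render, reading `duration` or `currentTime` off the empty object just gives `undefined`, which `getTime()` rejects, so the `|| '0:00'` fallbacks render until the real player is in place.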
Holy Shit!!!!!
Thank you @universse !!!!!
...and thank you @jlengstorf and @DSchau for taking time out to help me as well. Learning so much these days from giving people like yourselves :-)
@rchrdnsh I hardly helped; that was all @jlengstorf. Thanks for the kind words and thanks for using Gatsby 💜
Hi @DSchau XD,
With all due respect to @jlengstorf (and he certainly deserves maaaad respect), it was actually @universse's solution that ended up working for me... he is also generally a badass and very generous and giving of his time and expertise... dunno if he needs work and/or if Gatsby is hiring, but he is a legend nonetheless, at least to me :-)
But in the end, it is everybody together as a collective open source hive mind that really helps push open source forward, which is super awesome, and I couldn't do what I have done so far without everybody that has taken the time to help me out, including yourself and @jlengstorf and @universse and everybody else who gives of themselves to push web technology forward :-)
Actually, found a new issue @universse and @DSchau and @jlengstorf , but only with the live production version of the app running live on the web via netlify...
When I'm running the app in gatsby develop
mode locally the tracks play instantly (of course)
When I gatsby build
the app then gatsby serve
it the tracks play instantly (makes sense)
When I upload the site to Netlify and run it in production at https://rykr.netlify.com the tracks take roughly 8-10 seconds before they play, even though the progress bar starts moving right away. I thought at first that they were just taking some time to load, but it also happens when I replay a track that has been previously played, so I'm not sure where the issue even lies... is it a Gatsby issue? a Netlify issue? an HTML audio issue? all of the above?
Any thoughts would be much appreciated. Googling the issue has not helped much yet...
sounds like a loading issue to me — I'd try uploading plain HTML/vanilla JS to Netlify to see if the same delay happens. if that loads fast, then you can add Gatsby back in and see if it gets reintroduced.
I'd bet that `link rel="prefetch"` and caching are going to be your friends here
hmmmm... they are also pretty big files... in the 10 to 15 MB range, even as mp3... trying to figure out how to compress them further without too much audio loss, but yeah... maybe I need to look into libraries for audio and video streaming on the web...
@rchrdnsh Cloudinary is pretty good at this — they make it sound like they only support video, but I use it for e.g. Honkify and it's super easy + automatically resamples for smaller file size
hmmmm... so, I can make the audio files smaller myself, and Netlify is also a CDN... so is there any other feature of Cloudinary that makes it better than rolling my own? I'm not seeing any sort of audio or video streaming features listed, like breaking the audio up into pieces and loading the pieces into a buffer so that it can play instantly, but of course I could be missing it...
I like that it doesn't require me to roll my own, but the trade-offs may be different for you
full details on what they can do with audio: https://cloudinary.com/documentation/audio_transformations
Awesome, had not found that page, thank you :-)
Sooooooo...getting back to the whole preload and cache idea.........ummmmmm............how?
Been googling for Gatsby solutions but getting a bit lost...
I ask because there are really no links, per se, as the audio files are being gathered via a GraphQL static query, so I don't have any knowledge as to how to implement preloading and caching for assets from a GraphQL query, or if it should be done via something in the template, or via a webpack plugin or something...
...yeah...🤷🏼♂️
I'll let @DSchau take over for the caching question — I'm not sure what the current official Gatsby stance is on that
for preloading: https://developer.mozilla.org/en-US/docs/Web/HTML/Preloading_content
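In case it helps, here's a rough, untested sketch of what the prefetch idea could look like in this setup. It assumes `react-helmet` (via `gatsby-plugin-react-helmet`) is available in the project, and `TrackPrefetch` is a made-up component name; the URLs would come from the same `trackList`/`assetObj` lookup the provider already does for playback.
import React from 'react'
import { Helmet } from 'react-helmet'
// renders <link rel="prefetch"> hints for each track URL so the browser can start
// fetching the mp3s in the background before the user presses play
function TrackPrefetch({ trackUrls }) {
  return (
    <Helmet>
      {trackUrls.map(url => (
        <link key={url} rel="prefetch" href={url} />
      ))}
    </Helmet>
  )
}
export default TrackPrefetch
Rendering it next to `{props.children}` inside the provider would add one hint per track. Whether prefetch actually helps with 10-15 MB files is another question, since browsers treat these as low-priority hints; caching is the other half of the story.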
Hi @universse!
The site is now working not too badly with `gatsby-plugin-offline` turned on, which is nice.
Running into an odd issue in Safari, however. The `audioPlayer.duration` is showing up as `infinity:aN` instead of the duration, but it works fine in Chrome and Firefox...
Googled around a bit and found a couple things, but I don't understand them at all...
if you have a chance, would you mind looking at the issues and seeing if they make any sense to you? I'm gonna keep trying to wrap my head around it as well.
PS - I should also mention that it works in `gatsby develop` but not after `gatsby build` while running `gatsby serve`, or live on the internet via Netlify:
Hi @universse,
I think I figured out what's causing the `infinity:aN` issue... `gatsby-plugin-offline`. When I disable that plugin the duration works properly in Safari, but when I re-enable it and push it back to the server, the `infinity:aN` comes back... no idea how to fix it yet, but at least I'm starting to narrow it down... also found this article, but have yet to understand it at all in any way...
https://philna.sh/blog/2018/10/23/service-workers-beware-safaris-range-request/
Can you `console.log(state.audioPlayer.duration)`? I wonder what the un-formatted value is.
So, in develop mode or with `gatsby-plugin-offline` disabled, I'm getting the raw duration values, like so:
...in which duration is `NaN` with no audio selected or playing, but then changes to a numeric value while playing.
With `gatsby-plugin-offline` enabled, this happens:
...where the value is `NaN` before audio plays and `Infinity` after a play button has been pressed, and every time the interval runs the Infinity value is re-calculated? I think.
As a side note, I'm exploring the possibility of using a library like howler.js instead of continuing to roll my own, as they have accounted for many edge cases and have a nice suite of features built in, like streaming, chunking, sprite-ing, loading states, spatialization, etc...
There's even a nice little library called:
https://github.com/E-Kuerschner/useAudioPlayer
...which is a react hooks wrapper around howler, and the developer is super chill as well.
Been learning so much doing this though, so maybe even digging into the howler code and seeing if I can port any of the techniques over might be a good idea as well.
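For reference, and purely illustrative (not tested against this project): the basic Howler usage being considered looks roughly like this. The `html5: true` option makes Howler use the HTML5 Audio backend so large files can stream instead of being fully downloaded first, which is relevant to the 10-15 MB mp3s mentioned above; the file path is hypothetical.
import { Howl } from 'howler'
const howl = new Howl({
  src: ['/static/frost.mp3'], // hypothetical URL
  html5: true, // stream via the HTML5 Audio backend instead of buffering the whole file
})
howl.once('load', () => {
  console.log('duration in seconds:', howl.duration())
})
howl.play()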
Seems to be an issue with service worker serving audio. I'm not entirely sure how to fix that. I suggest you open a new issue for this.
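In the meantime, not a fix for the service worker itself, but it might explain the exact string you're seeing: `Infinity` passes the `!isNaN(time)` check in `getTime`, so `Math.floor(Infinity / 60)` prints "Infinity" while `Infinity % 60` is `NaN`, which becomes "aN" after the `.slice(-2)`. A small defensive tweak to the existing helper (just a sketch) would keep the display at 0:00 until the underlying issue is sorted:
// A function that transforms the audio.currentTime value into human readable minutes and seconds
function getTime(time) {
  // Number.isFinite rejects NaN, Infinity and undefined, so callers' `|| '0:00'` fallback renders instead of "Infinity:aN"
  if (Number.isFinite(time)) {
    return Math.floor(time / 60) + ':' + ('0' + Math.floor(time % 60)).slice(-2)
  }
}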