Implementing Dual Resolution Waveform Visualization with Wavesurfer.js #3538
Unanswered
cubancodepath asked this question in Q&A
Replies: 1 comment
This might get you started: https://wavesurfer.xyz/examples/?3fbd8d8b4b6c26435f312d5e9e09a07c

The main idea in that example is to share the same media element between two wavesurfer instances. That way, when you seek on the zoomed-out low-res waveform, the zoomed-in hi-res one will jump to the same position. However, since your hi-res waveform only renders a certain portion of the audio, you might have to pad the peaks array with zeroes and re-render on every timeupdate:

lowresWavesurfer.on('seek', async () => {
  hiresWavesurfer.setOptions({
    peaks: await getNewPeaks(lowresWavesurfer.getCurrentTime()),
    duration: yourZoomedSegmentDuration,
  })
})
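As a rough sketch of what a `getNewPeaks` helper might do internally, here is one way to place a hi-res peak segment into a zero-padded array covering the full track. Every name here is an assumption for illustration, not wavesurfer.js API; the peak rate and padding layout are guesses about how your services return data:

```javascript
// Hypothetical helper: embed a hi-res peak segment into a zero-filled array
// spanning the whole track, so the hi-res instance can render it at the
// correct offset. All names and units are assumptions, not wavesurfer.js API.
function padPeaks(segmentPeaks, segmentStart, segmentDuration, totalDuration) {
  // Peaks-per-second implied by the segment we received.
  const rate = segmentPeaks.length / segmentDuration
  // Zero-filled array covering the entire track at that same rate.
  const padded = new Array(Math.round(totalDuration * rate)).fill(0)
  // Copy the segment's peaks in at the position matching its start time.
  const offset = Math.round(segmentStart * rate)
  for (let i = 0; i < segmentPeaks.length; i++) {
    padded[offset + i] = segmentPeaks[i]
  }
  return padded
}
```

The padded array can then be passed as the `peaks` option with the track's full duration, so the hi-res detail lines up with the low-res overview.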
I'm working on a project involving audio visualization using WaveSurfer.js and I've encountered a specific challenge for which I'm seeking guidance or solutions. My goal is to render two waveforms for the same audio file: a low-resolution version that displays the entire audio and a second, "zoomed-in" version of a specific section of the audio, but with significantly higher resolution.
To achieve this, I am using two different services:
Low-Resolution Service: This service provides me with the peaks necessary to render the full, low-resolution waveform.
High-Resolution Service: I pass the start time and end time to this service to get the peaks for a specific section of the audio in greater detail.
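For illustration only, a request to the high-resolution service might be built like this; the endpoint path and query parameter names are pure assumptions, since the actual service interface isn't specified:

```javascript
// Hypothetical request-builder for the high-resolution peaks service:
// pass the selected start and end times as query parameters.
// The "/peaks" path and "start"/"end" parameter names are assumptions.
function hiResPeaksUrl(baseUrl, startTime, endTime) {
  const params = new URLSearchParams({
    start: String(startTime),
    end: String(endTime),
  })
  return `${baseUrl}/peaks?${params}`
}
```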
My question is: Is it possible to implement this functionality using WaveSurfer.js? I'm particularly interested in how to manage two instances of WaveSurfer (if necessary) to display both waveforms simultaneously and how to update the high-resolution waveform based on user selection in the low-resolution waveform.
Furthermore, is there any recommended design pattern or approach to synchronize playback and interaction between these two waveforms?
Any guidance, code suggestions, or references to similar examples would be greatly appreciated.
Thank you in advance for your help.