Description
I'm trying to visualise a period of N seconds of audio from an HLS stream handled by hls.js.
I've created an AudioContext and connected it to my media element correctly. I then constructed a processor node from the context using createScriptProcessor.
By binding a function to onaudioprocess, I grab each audioProcessingEvent.inputBuffer (an AudioBuffer) over N seconds and append them to each other, ultimately creating a single AudioBuffer representing a period of N seconds.
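Roughly, the capture side looks like this (a simplified sketch; the buffer size, channel count, and the value of N are placeholders rather than my exact setup):

```js
const N = 10; // seconds to capture (placeholder value)

const audioCtx = new AudioContext();
const mediaEl = document.querySelector('audio');
const source = audioCtx.createMediaElementSource(mediaEl);

// 4096-sample buffers, mono in/out (illustrative values)
const processor = audioCtx.createScriptProcessor(4096, 1, 1);
source.connect(processor);
processor.connect(audioCtx.destination);

const chunks = [];
let collectedSamples = 0;
const targetSamples = N * audioCtx.sampleRate;

processor.onaudioprocess = (event) => {
  if (collectedSamples >= targetSamples) {
    return;
  }
  // Copy the samples out, since the inputBuffer is reused between callbacks
  chunks.push(new Float32Array(event.inputBuffer.getChannelData(0)));
  collectedSamples += event.inputBuffer.length;
};

// Concatenate the collected chunks into a single AudioBuffer of roughly N seconds
function buildCapturedBuffer() {
  const combined = audioCtx.createBuffer(1, collectedSamples, audioCtx.sampleRate);
  const data = combined.getChannelData(0);
  let offset = 0;
  for (const chunk of chunks) {
    data.set(chunk, offset);
    offset += chunk.length;
  }
  return combined;
}
```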
I then pass the constructed AudioBuffer to WaveformData.createFromAudio with a scale of 128. The output waveform seems OK at a glance, although I'm not sure how to verify this...
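The call looks roughly like this (a sketch based on my reading of the createFromAudio options; passing the AudioBuffer via audio_buffer and the (err, waveform) callback shape are my assumptions):

```js
WaveformData.createFromAudio(
  {
    audio_context: audioCtx,
    audio_buffer: buildCapturedBuffer(), // the AudioBuffer assembled above
    scale: 128                           // samples per output waveform point
  },
  (err, waveform) => {
    if (err) {
      console.error(err);
      return;
    }
    // waveform should be a WaveformData instance covering ~N seconds
    console.log(waveform.length, waveform.duration);
  }
);
```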
However, I'm unable to render the waveform data using the canvas example from the README.
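For reference, my rendering code is essentially the README's canvas example (reproduced from memory, so treat it as a sketch):

```js
function drawWaveform(canvas, waveform) {
  const ctx = canvas.getContext('2d');
  const channel = waveform.channel(0);

  // Map an 8-bit sample value (roughly -128..127) to a y coordinate
  function scaleY(amplitude, height) {
    const range = 256;
    const offset = 128;
    return height - ((amplitude + offset) * height) / range;
  }

  ctx.beginPath();

  // Upper half: max values, left to right
  for (let x = 0; x < waveform.length; x++) {
    ctx.lineTo(x + 0.5, scaleY(channel.max_sample(x), canvas.height) + 0.5);
  }

  // Lower half: min values, right to left
  for (let x = waveform.length - 1; x >= 0; x--) {
    ctx.lineTo(x + 0.5, scaleY(channel.min_sample(x), canvas.height) + 0.5);
  }

  ctx.closePath();
  ctx.stroke();
  ctx.fill();
}
```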
Are there any tools I can use to verify that the data I've produced is correct? Or at least, is there anything in particular I should look for?
Should I normalise the data in the produced ArrayBuffer to between 0 and 1 before trying to render it? I've noticed there are lots of peaks and troughs.
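As a basic sanity check I've been logging the summary fields and the range of the min/max values, on the assumption that the samples should fall within an 8-bit range (roughly -128..127) rather than 0..1:

```js
function inspectWaveform(waveform) {
  console.log('length (points):', waveform.length);
  console.log('duration (s):', waveform.duration);
  console.log('sample rate:', waveform.sample_rate);
  console.log('scale (samples/point):', waveform.scale);

  const channel = waveform.channel(0);
  let min = Infinity;
  let max = -Infinity;

  for (let i = 0; i < waveform.length; i++) {
    min = Math.min(min, channel.min_sample(i));
    max = Math.max(max, channel.max_sample(i));
  }

  // For 8-bit waveform data I'd expect these to stay within roughly -128..127
  console.log('sample value range:', min, 'to', max);
}
```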
Furthermore, I've tried passing the generated waveform data to peaks.js. The duration it reports is correct, but no data points are displayed.
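This is roughly how I'm wiring it into peaks.js (a sketch; passing precomputed data via the waveformData option and waveform.toJSON() is my assumption about the current Peaks.js API, and the container IDs are placeholders):

```js
Peaks.init(
  {
    zoomview: { container: document.getElementById('zoomview-container') },
    overview: { container: document.getElementById('overview-container') },
    mediaElement: document.getElementById('audio'),
    waveformData: {
      json: waveform.toJSON() // precomputed data from WaveformData.createFromAudio
    }
  },
  (err, peaksInstance) => {
    if (err) {
      console.error(err);
      return;
    }
    console.log('peaks.js initialised', peaksInstance);
  }
);
```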