Blazor Audio Recorder.
Add audio recording features to your Blazor WebAssembly app by harnessing JavaScript interop.
Please view the live demo at https://arvinboggs.github.io/AudioRecorderLive
Project page: https://github.com/arvinboggs/AudioRecorder
At the time of writing this article, there are several things Blazor cannot do out of the box, and recording audio is one of them. Therefore, we will use JavaScript interop to achieve our goal.

- Open (or create) a Blazor WebAssembly project.
- Create a JavaScript file named `AudioRecorder.js` inside the `wwwroot/scripts` folder. Create the `scripts` folder if it does not exist yet.
Put the initial content of `AudioRecorder.js`:

```js
// AudioRecorder.js
var BlazorAudioRecorder = {};

(function () {

})();
```
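If the script is not yet referenced by the host page, Blazor will not be able to find `BlazorAudioRecorder` at runtime. A typical registration, assuming the default `wwwroot/index.html`, looks like this (placed alongside the existing script tags):

```html
<!-- wwwroot/index.html -->
<script src="scripts/AudioRecorder.js"></script>
```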
We will need Blazor to call our JavaScript functions later, and the only JavaScript objects visible to Blazor are global-level ones. That is why we declared `BlazorAudioRecorder` as a global variable. This variable will be the parent object for the functions that we will create later.

- Add code to start the recording.
```js
var mStream;
var mMediaRecorder;
var mAudioChunks;   // holds the recorded data chunks; the later snippets push into it

BlazorAudioRecorder.StartRecord = async function () {
    mStream = await navigator.mediaDevices.getUserMedia({ audio: true });
    mAudioChunks = [];
    mMediaRecorder = new MediaRecorder(mStream);
    mMediaRecorder.start();
};
```
The preceding code starts the recording but, by itself, does not produce anything useful yet. Note that `getUserMedia` prompts the user for microphone permission and only works in a secure context (HTTPS or localhost).
- Add a listener to capture data while recording. Insert this code before the `mMediaRecorder.start()` line:

```js
mMediaRecorder.addEventListener('dataavailable', vEvent => {
    mAudioChunks.push(vEvent.data);
});
```
- When the recording stops, we need to capture the final bits of data and convert the entire recording into a usable audio file. Add a `stop` listener, also inside `StartRecord` before the `mMediaRecorder.start()` line:

```js
mMediaRecorder.addEventListener('stop', () => {
    var pAudioBlob = new Blob(mAudioChunks, { type: "audio/mp3;" });
    var pAudioUrl = URL.createObjectURL(pAudioBlob);
});
```
- Add the function to stop the recording.

```js
BlazorAudioRecorder.StopRecord = function () {
    mMediaRecorder.stop();
    mStream.getTracks().forEach(pTrack => pTrack.stop());
};
```
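The markup in the next step also includes Pause and Resume buttons. The article does not list their JavaScript side, so the following is only a rough sketch: the function names `PauseRecord` and `ResumeRecord` are assumptions, not taken from the source, but `MediaRecorder` does expose matching `pause()` and `resume()` methods.

```js
// Hypothetical helpers (names assumed, not from the article).
BlazorAudioRecorder.PauseRecord = function () {
    // Suspends recording; chunks captured so far are kept
    mMediaRecorder.pause();
};

BlazorAudioRecorder.ResumeRecord = function () {
    // Continues recording into the same chunk list
    mMediaRecorder.resume();
};
```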
- It’s time to edit our `Index.razor` file. Add the buttons.

```razor
<!-- Index.razor -->
<button>Start Record</button>
<button>Pause</button>
<button>Resume</button>
<button>Stop</button>
<button>Download Audio</button>
```
- Then inject `IJSRuntime` at the top of the page.

```razor
@inject IJSRuntime mJS
```
Add a `Click` event handler to the Start Record button. `BlazorAudioRecorder.StartRecord` is the function from the JavaScript file that we created earlier.

```razor
<button @onclick="butRecordAudioStart_Click">Start Record</button>
```

```razor
@code {
    void butRecordAudioStart_Click()
    {
        mJS.InvokeVoidAsync("BlazorAudioRecorder.StartRecord");
    }
}
```
Likewise, add a `Click` event handler to each of the remaining buttons. Each event handler is just a single line of code that invokes its respective JavaScript function, as in the sketch below.
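As an illustrative sketch only: the handler names below, and the `PauseRecord`/`ResumeRecord` JavaScript functions, are assumptions following the pattern above, not taken from the article; `StopRecord` is the function defined earlier, and the Download Audio handler is covered in a later step.

```razor
<button @onclick="butRecordAudioPause_Click">Pause</button>
<button @onclick="butRecordAudioResume_Click">Resume</button>
<button @onclick="butRecordAudioStop_Click">Stop</button>

@code {
    // Each handler simply forwards to its JavaScript counterpart
    void butRecordAudioPause_Click()
    {
        mJS.InvokeVoidAsync("BlazorAudioRecorder.PauseRecord");
    }

    void butRecordAudioResume_Click()
    {
        mJS.InvokeVoidAsync("BlazorAudioRecorder.ResumeRecord");
    }

    void butRecordAudioStop_Click()
    {
        mJS.InvokeVoidAsync("BlazorAudioRecorder.StopRecord");
    }
}
```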
- In our JavaScript file, inside the `MediaRecorder` `stop` event, we need a way for JavaScript to pass the audio URL back to our Blazor code. First, we must pass an instance of our Razor class to JavaScript. The following code calls the `Initialize` function (which we will write later) in our JavaScript file, passing the current instance of our Razor class as the parameter.

```csharp
protected override async Task OnInitializedAsync()
{
    await base.OnInitializedAsync();
    await mJS.InvokeVoidAsync("BlazorAudioRecorder.Initialize", DotNetObjectReference.Create(this));
}
```
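One refinement not shown in the article: the `DotNetObjectReference` created above is never released. A minimal sketch, assuming we keep the reference in a field (the field name `mObjRef` is made up here) and dispose it with the component:

```razor
@implements IDisposable

@code {
    // Hypothetical field; not part of the original article
    DotNetObjectReference<Index>? mObjRef;

    protected override async Task OnInitializedAsync()
    {
        await base.OnInitializedAsync();
        mObjRef = DotNetObjectReference.Create(this);
        await mJS.InvokeVoidAsync("BlazorAudioRecorder.Initialize", mObjRef);
    }

    // Release the reference so the component can be garbage collected
    public void Dispose() => mObjRef?.Dispose();
}
```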
- Implement `BlazorAudioRecorder.Initialize` in our JavaScript file.

```js
// AudioRecorder.js
var mCaller;

BlazorAudioRecorder.Initialize = function (vCaller) {
    mCaller = vCaller;
};
```
- Now that we have an instance of our Razor class (in the form of a `DotNetObjectReference`), we can pass the audio URL to our Blazor code. Update the `stop` listener:

```js
mMediaRecorder.addEventListener('stop', () => {
    var pAudioBlob = new Blob(mAudioChunks, { type: "audio/mp3;" });
    var pAudioUrl = URL.createObjectURL(pAudioBlob);
    mCaller.invokeMethodAsync('OnAudioUrl', pAudioUrl);
});
```
In the preceding code, a call is made to `OnAudioUrl` (which we will code in the next step), passing the audio URL. Implement the `OnAudioUrl` function in our Razor file. The function must be `public` and must be decorated with the `JSInvokable` attribute so that it can be called from JavaScript.

```csharp
// Index.razor
string mUrl;

[JSInvokable]
public async Task OnAudioUrl(string vUrl)
{
    mUrl = vUrl;
    await InvokeAsync(() => StateHasChanged());
}
```
- Pass the audio URL to the `audio` HTML element. By doing this, the user can now play back the recording.

```razor
<audio controls autoplay src=@mUrl></audio>
```
- For the Download Audio button’s `click` event, make a call to JavaScript.

```csharp
void butDownloadBlob_Click()
{
    mJS.InvokeVoidAsync("BlazorAudioRecorder.DownloadBlob", mUrl, "MyRecording.mp3");
}
```
- Implement the `DownloadBlob` function in our JavaScript file. Note that in this “download”, everything happens in the browser and no server is involved.

```js
BlazorAudioRecorder.DownloadBlob = function (vUrl, vName) {
    // Create a link element
    const link = document.createElement("a");

    // Set the link's href to point to the Blob URL
    link.href = vUrl;
    link.download = vName;

    // Append link to the body
    document.body.appendChild(link);

    // Dispatch click event on the link
    // This is necessary as link.click() does not work on the latest Firefox
    link.dispatchEvent(
        new MouseEvent('click', {
            bubbles: true,
            cancelable: true,
            view: window
        })
    );

    // Remove the link from the body
    document.body.removeChild(link);
};
```