Dart, WASM and AssemblyScript - Oh my!

30 Jul 2021

[Image: Dart ♥ AssemblyScript logos]

Introduction

With the availability of first FFI and now WebAssembly interop in Dart, a whole new range of possibilities has opened up for using Dart to create rich, non-GUI applications on desktop OSes such as Linux, Windows and macOS, and even on "large embedded" devices such as the Raspberry Pi.

In this article, I want to cover how I have made use of the brand new Dart wasm package to build the beginnings of a Dart-based audio synthesizer on top of a rich existing codebase developed in AssemblyScript. It can also serve as a working example of how to use code written in AssemblyScript from Dart.

From Synth1 to as-audio

[Image: screenshot of Peter Salomonsen's WebAssembly music talk on YouTube]

This particular journey started for me when I first came across Peter Salomonsen's fantastic work on creating browser-based synthesizer music in a great presentation he gave back in February 2020. While Peter's AssemblyScript code (compiled to Wasm) was very cool, it did not really fit with the experimental Dart hardware-audio project I had started working on, so while I had a fun time playing with the online demo and learnt a lot from watching the talk and reading the code, I moved on to other things.

That is, until I happened to come across the in-progress Dart wasm package, at which point the stars aligned and I could see a path to making use of part of Peter's excellent wasm "synth1" codebase.

While I would have liked to use synth1 as is, it was intended to run in a web browser environment, which meant that some parts were not really relevant to what I was doing, and other parts, such as MIDI processing and sequencing, I was already handling in my own Dart code. So I decided to create a new mini project called as-audio where I could bring across, piece by piece, the parts of the Synth1 AssemblyScript codebase that I needed for my project and easily build them into a standalone wasm binary file.

Using AssemblyScript Wasm from Dart

While, even in its pre-published state, the wasm package had good documentation and an example of making use of wasm compiled from C++, there was understandably no documentation on how to use it with wasm compiled from AssemblyScript. So to first understand how to use AS wasm from Dart, I (and now you, dear reader) need to take a diversion into learning a bit about AssemblyScript itself.

Assembly what?

AssemblyScript describes itself as being: "Designed for WebAssembly: AssemblyScript targets WebAssembly's feature set specifically, giving developers low-level control over their code". In terms of syntax, I would describe the language as essentially a subset of TypeScript, with the subset focused on being strictly typed in a manner suitable for compiling into WebAssembly.

Another nice aspect of AS from my point of view is that, with a background in C, Java, JS, Kotlin and Dart, I found it quite straightforward to pick up. It really does have a simple-to-set-up toolchain for compiling into binary wasm (just npm install, as advertised on its website), and importantly it's also very flexible when it comes to things like memory management.

Given this article is focused on using wasm from Dart that just happens to be compiled from AS, I'll not go into further details on how I set up my as-audio project to build a binary .wasm file and instead direct the interested reader to the as-audio GitHub repo.

Dart and not yet Wasm

Once I had the initial version of my as-audio project set up to build a binary .wasm file, I was ready to start trying to use it from Dart. But before I made a start on calling the WebAssembly code, I decided I wanted a simpler starting reference point, so I first created an interface for the oscillator:

abstract class Oscillator {
  double next();
}

And then proceeded to create a Dart port of the AS code for what is something like the "hello world" of audio synthesis: a sine-wave oscillator:

import 'dart:math';

class DartSineOscillator implements Oscillator {
  int position = 0;
  final double frequency;
  final double sampleRate;

  DartSineOscillator(this.sampleRate, this.frequency);

  @override
  double next() {
    final ret = sin(pi * 2 * position / (1 << 16));
    position =
        ((position + (frequency / sampleRate) * 0x10000).toInt()) & 0xffff;

    return ret;
  }
}

Having this meant that I now had a way to generate sound samples and test their playback (more on that shortly) before I needed to deal with any complications with using Webassembly.
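
As a quick smoke test (this snippet is my own illustration, not code from the repo), the Dart port can be exercised on its own like so:

// Generate one second's worth of samples from a 440 Hz sine wave
// using the pure Dart oscillator defined above.
final osc = DartSineOscillator(44100, 440.0);
final samples = List<double>.generate(44100, (_) => osc.next());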

A rose by another name

Finally, at this point I could actually try to make use of the wasm module from Dart. As described in the excellent announcement/tutorial article on the new package:wasm, using a compiled wasm binary module is very straightforward:

import 'dart:io';
import 'package:wasm/wasm.dart';

var wasmfile = Platform.script.resolve(wasmfilepath);
var moduleData = File(wasmfile.path).readAsBytesSync();
WasmModule _wasmModule = WasmModule(moduleData);
WasmInstance _instance = _wasmModule.builder().build();

and likewise it shows that there is a very nice method in the package to get a list of all of a module's imports and exports:

print(_wasmModule.describe());

In the case of my as-audio module you get:

export global: var float32 SAMPLERATE
export global: const int32 SineOscillator
export function: int32 SineOscillator#get:position(int32)
export function: void SineOscillator#set:position(int32, int32)
export function: float32 SineOscillator#get:frequency(int32)
export function: void SineOscillator#set:frequency(int32, float32)
export function: float32 SineOscillator#next(int32)
export function: int32 SineOscillator#constructor(int32)
...

I'll come back to SAMPLERATE a little later, but for now compare the above to the AS source of the SineOscillator class:

export class SineOscillator {
    position: u32 = 0;
    frequency: f32 = 0;

    next(): f32 {
        let ret = sin(PI * 2 * (this.position as f32) / (1 << 16 as f32));
        this.position = (((this.position as f32) + (this.frequency / SAMPLERATE)
            * 0x10000 as f32) as u32) & 0xffff;

        return ret as f32;
    }
}

One thing you'll notice, which is not covered in the C code "Brotli" example in the article, is that if you compare the source code with the list of exports, the exported function names don't exactly match the ones in the AS code. This is because WebAssembly has no concept of OOP (Object-Oriented Programming), and so no concept of the classes that AS has. Because of this, the AS compiler "mangles" the names to encode both the class name and the method name in the resulting wasm function name.

This process should be quite familiar to anyone who has had to deal with interfacing C to C++ code and the similar name mangling C++ compilers use by default, though in the case of AS, the name-mangling scheme is very straightforward, as you can see above.

Another thing that is different when it comes to interfacing with AS-compiled wasm vs wasm compiled from C code is the curious case of the extra int32 argument added to the exported wasm functions. Again, this may be familiar to developers coming from an OOP background, as it's the "classic" OOP trick of adding an implicit this reference to every method call on an object. But where do you get the reference to the object? That, of course, comes from the return value of calling the object's class constructor, so to instantiate and then use a SineOscillator class in Dart you would have something like:

final cons = helper.getMethod(oscClassname, 'constructor');
// Calling an AssemblyScript constructor returns an i32, which is a
// "reference" to the object it creates.
_oscObjectRef = cons(0);
_setFrequency = helper.getMethod(oscClassname, 'set:frequency');

and the helper simply encodes our understanding of the AS name-mangling scheme:

dynamic getMethod(String className, String methodName) {
  return _instance.lookupFunction('$className#$methodName');
}

I intentionally used setting the frequency as the example above because it demonstrates that there is no magic to AS object properties: the AS compiler simply defines implicit getter and setter methods using a colon-separated name-mangling convention. This should feel quite familiar to Dart users, since Dart does pretty much the same thing. More details about AS's name mangling of exported functions, including how to customise it, can be found in its very nice documentation.
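
Pulling these pieces together, here is a minimal sketch (my own illustration, not code from the repo) of hiding the name mangling and the implicit object reference behind the same Oscillator interface used earlier for the pure Dart version:

// A hypothetical wrapper that hides the AS name mangling and the
// implicit object reference behind the Oscillator interface from earlier.
class WasmSineOscillator implements Oscillator {
  final dynamic _next; // the looked-up 'SineOscillator#next' export
  final int _objectRef; // the i32 "reference" returned by the AS constructor

  WasmSineOscillator._(this._next, this._objectRef);

  factory WasmSineOscillator(WasmInstance instance, double frequency) {
    dynamic lookup(String method) =>
        instance.lookupFunction('SineOscillator#$method');
    // Calling the constructor export returns the object "reference".
    final ref = lookup('constructor')(0) as int;
    // Property setters take the object reference plus the new value.
    lookup('set:frequency')(ref, frequency);
    return WasmSineOscillator._(lookup('next'), ref);
  }

  @override
  double next() => (_next(_objectRef) as num).toDouble();
}

With something like this in place, the wasm-backed oscillator can be swapped in anywhere the pure Dart one was used.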

A native FFI detour

Of course, generating audio samples is not much use without a means to play them, so I needed a way to send the audio samples to my computer's audio card. While there are already several audio plugins available for use in Flutter apps, unfortunately they are all focused on the admittedly more common use case of playing back audio files (usually compressed audio) or network streams, rather than small sets of audio samples generated in real time.

Because of this limitation in existing Flutter plugins, I had previously started building a Dart FFI wrapper (FFI to the rescue again!) for the standard Linux ALSA asound library. But while I did get initial playback working, I was very happy to find in the meantime that someone else had already done the same thing for libao, which not only provides a nicer/simpler API than raw ALSA but also supports the PulseAudio sound server as well as a number of other OSes (macOS, Windows, etc.), so I decided to make use of it for playback.

Making use of the libao package is about as easy as audio sample playback gets, essentially consisting of some initial setup of parameters and opening the playback device:

const bits = 16;
const channels = 2;
const rate = 44100;

final device = ao.openLive(
  driverId,
  bits: bits,
  channels: channels,
  rate: rate,
  matrix: 'R',
);

Followed by the actual playback of the buffer of audio samples we previously generated in our synth code:

for (var i = 0; i < rate; i++) {
  final sample = (osc.next() * volume * 32768.0).toInt();
  // Left = Right.
  buffer[4 * i] = buffer[4 * i + 2] = sample & 0xff;
  buffer[4 * i + 1] = buffer[4 * i + 3] = (sample >> 8) & 0xff;
}

ao.play(device, buffer);

A full yet minimal example of how to both call the AS wasm code from Dart and then pass that through to libao for playback can be found here. And I cover more about using Dart FFI with native libraries in a previous article.

Globals are Good for Guests too

Having worked out the basics of calling a wasm function from Dart, one initial roadblock I ran into was that the AS synth code made use of "guest globals", and I found that in its initial pre-published state the wasm package did not expose an API for them. Yet I was very pleasantly surprised that within a few days of my opening an issue about it, the feature was implemented by the Dart team!

So what are guest globals and why did I need them?

Well, as described in the wasmer documentation, they are, as the name suggests, global variables that WebAssembly code can expose for its host environment to read and write. In the case of the as-audio code, this was needed so that the host environment could set the audio sample rate used:

export let SAMPLERATE: f32 = 44100;
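
With the feature in place, setting the sample rate from the host side looks something like the following sketch (based on my understanding of the lookupGlobal API that was added; exact details may differ):

// Look up the guest global exported by the AS module and write to it
// from the host (Dart) side before generating any samples.
final samplerate = _instance.lookupGlobal('SAMPLERATE');
samplerate.value = 44100.0;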

NOT the need, no need for speed

While the AS-compiled wasm is likely reasonably performant (though I haven't actually done any benchmarking yet), I want to point out that my motivation for doing this was not performance, but rather the ability to reuse an existing, well-written and tested body of audio-processing code within my Dart application. Likewise, as pointed out by Michael Thomsen in the article announcing the wasm package:

However, because C modules are platform specific, distributing shared packages with native C modules is complicated: it requires either a universal build system, or distributing multiple binary modules (one for each desired platform). Distribution would be much easier if a single Wasm binary assembly format could be used across all platforms. Then, rather than compiling your library to platform-specific binary code for every target platform, you could compile it once to a Wasm binary module and run that everywhere. This would potentially open the door to easy distribution on pub.dev of packages that contain native code.

The path goes ever on

I have only just started on my plan to get all the oscillators, filters, effects and even instruments of Synth1 moved over to as-audio and then exposed to Dart, but if you would like to make use of it as is or keep track of my progress, the code is published and of course PRs are always welcome!

While I've covered here the basics of using AS-generated wasm from Dart through to a basic working example, this is not enough for actual usage in a real synthesizer application: the code all runs on the same single Dart event loop and makes use of a blocking audio output API, which means we would soon hear serious glitches in the audio output if we attempted to play continuously while generating the audio samples in real time.
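
The fix is to move sample generation onto its own thread of execution. As a tiny taste of the idea (a sketch of my own, not necessarily the approach a real synthesizer would take), here is roughly what that looks like:

import 'dart:isolate';

// Generate blocks of samples on a separate isolate so that a blocking
// playback call on the main isolate cannot starve sample generation.
void generateSamples(SendPort out) {
  final osc = DartSineOscillator(44100, 440.0);
  // A real implementation needs flow control (e.g. the player requesting
  // the next block) rather than pushing blocks unconditionally.
  for (var n = 0; n < 100; n++) {
    out.send(List<double>.generate(1024, (_) => osc.next()));
  }
}

Future<void> main() async {
  final port = ReceivePort();
  await Isolate.spawn(generateSamples, port.sendPort);
  var received = 0;
  await for (final List<double> block in port) {
    // In the real app: convert the block to 16-bit PCM and ao.play() it.
    received += block.length;
    if (received >= 44100) break; // sketch only: stop after ~1s of audio
  }
  port.close();
}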

Dart Isolates effectively give us multiple threads of concurrent execution, and using them properly for real-time audio is something I will be covering in an upcoming article, so please subscribe if you would like to be notified when it is available.