Rust Audio

Example DSP app using druid for UI

I’ve made an app for mixing sound (https://github.com/derekdreery/jack-mixer). It’s not finished (the EQ doesn’t do anything yet), but the volume faders work and there is enough for it to be interesting to look at.

I’ve always wanted to get involved with audio and DSP in Rust, but for the past 1.5 years I’ve been doing a university course. I’ve just handed in a big report, so I thought I’d treat myself to a few days of Rust/audio/GUI to see how everything has progressed since I last looked.

I have to say that using druid has been a very positive experience. Although it’s new, it’s already quite full-featured, including things like localization. It also uses Flutter’s layout algorithm, which is performant because it involves only a single traversal of the widget graph. I’m currently using crossbeam channels to communicate with the realtime (RT) jack thread, but I’m not sure these are optimal since they still involve some synchronization. So far I haven’t had any buffer underruns, but I would welcome suggestions for better strategies for sharing data with the RT thread.
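For reference, the pattern looks roughly like this (the message type and fields below are simplified placeholders, not the actual jack-mixer code): the UI pushes small messages onto a bounded crossbeam channel, and the RT thread drains it at the start of each process callback with try_recv, so the audio side never blocks or allocates.

    use crossbeam_channel::{bounded, Receiver, Sender, TryRecvError};

    /// Messages from the UI to the realtime audio thread (hypothetical type).
    enum UiMsg {
        SetGain { channel: usize, gain: f32 },
        ToggleBypass { channel: usize },
    }

    /// A small bounded queue: the RT thread drains it every process cycle,
    /// so it stays short and never allocates after startup.
    fn make_channel() -> (Sender<UiMsg>, Receiver<UiMsg>) {
        bounded(64)
    }

    /// UI side: never block the GUI; if the queue is somehow full, drop the message.
    fn send_from_ui(tx: &Sender<UiMsg>, msg: UiMsg) {
        let _ = tx.try_send(msg);
    }

    /// RT side: drain any pending messages without blocking or allocating.
    fn drain_in_rt(rx: &Receiver<UiMsg>, gains: &mut [f32], bypass: &mut [bool]) {
        loop {
            match rx.try_recv() {
                Ok(UiMsg::SetGain { channel, gain }) => gains[channel] = gain,
                Ok(UiMsg::ToggleBypass { channel }) => bypass[channel] = !bypass[channel],
                Err(TryRecvError::Empty) | Err(TryRecvError::Disconnected) => break,
            }
        }
    }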

My next task is to understand convolution filters and build a 3-band equalizer that distorts the signal as little as possible when the controls are in the unmodified (flat) position. I hope that this project inspires others, since it demonstrates that DSP with a UI in Rust is certainly feasible.

Screenshot

[a screenshot of jack-mixer]


Awesome stuff @dodj, thanks so much for sharing!

I’m curious to hear more about your experience using druid too - I’ve taken a quick look at it in the past but have not yet had a chance to create something with it. Do you happen to know whether druid requires working with a specific graphics backend, or if it is agnostic? I’d like to experiment with using it alongside nannou, but one requirement would be that I can render it with wgpu and use it alongside the rest of nannou’s graphics stuff.

So I think I can answer this question. The druid project is roughly split into 3 parts:

  1. druid - the UI manager, including things like widget hierarchy, layout, event propagation, issuing paint commands

  2. piet - an abstraction layer (a sort of HAL) for 2D graphics. There is a RenderContext trait that is implemented for multiple backends: currently Direct2D on Windows and cairo on mac/linux. wasm support is in the works.

  3. kurbo - a library for describing shapes like arcs, lines, etc. and converting them into Bézier curves for consumption by a piet backend. For my app I basically just used lines and arcs, but there are other shapes, and the idea is that the collection will grow over time. (There’s a rough sketch of how piet and kurbo fit together just after this list.)
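To give a feel for how piet and kurbo fit together, here is a rough sketch of a paint routine (illustrative only - exact trait paths and method signatures have shifted between piet releases):

    use kurbo::{Circle, Line, Point};
    use piet::{Color, RenderContext};

    fn paint_fader<R: RenderContext>(rc: &mut R) {
        // kurbo describes the geometry...
        let track = Line::new(Point::new(10.0, 10.0), Point::new(10.0, 110.0));
        let knob = Circle::new(Point::new(10.0, 60.0), 6.0);

        // ...and piet turns it into paint commands for whichever backend
        // (Direct2D, cairo, ...) implements RenderContext.
        rc.stroke(track, &Color::rgb8(0x60, 0x60, 0x60), 2.0);
        rc.fill(knob, &Color::rgb8(0xe0, 0xe0, 0xe0));
    }

A custom druid widget’s paint method is essentially handed a context that exposes this trait, so the widget code stays independent of the concrete backend.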

There are also plans for a GPU backend, and they are hoping to experiment with a novel algorithm that makes use of modern GPU architecture to achieve very high performance (see a video of a talk on it). In that video they discuss the various choices for the implementation, including wgpu, vulkan, and gfx-hal. Experiments so far use vulkan, but there are no firm decisions yet about what the final version will use.

I think there is also a place for another piet backend targeting older GPU hardware (GLES2-class), using a “traditional” 2D GPU approach where shapes are tessellated into a mesh on the CPU with something like lyon and then cached/shipped to the GPU as triangles.

So to answer your question: it is renderer-agnostic, but you will probably have to write your own renderer. That is certainly doable, though - the RenderContext trait provides a very clean interface to implement. Either way, I’d encourage you to have a play with it. It’s still fairly unstable, so there isn’t much documentation yet, but I think my jack-mixer project is a good example of how to structure a druid app: it contains some custom widgets, but also uses some of the built-in layout widgets (from the Flex family), and it uses a top-level widget to intercept changes to the application state and relay them to the realtime thread. Communicating with the outside world is one of the things that feels like it may change to become more ergonomic at some point.

The Lens idea is really cool as well - it comes from Haskell. Given a way to get the U data out of a T, a lens can turn a Widget<U> into a Widget<T>. It means that if you have a list of data, you can use the same widget for every data point and just use different lenses to map to different places in the list.
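As a small illustration of how that looks in druid (the struct and field names here are made up, and details vary a bit between druid versions):

    use druid::widget::{Flex, Slider};
    use druid::{Data, Lens, Widget, WidgetExt};

    // Hypothetical app state; the derive generates AppState::volume and
    // AppState::pan lenses automatically.
    #[derive(Clone, Data, Lens)]
    struct AppState {
        volume: f64,
        pan: f64,
    }

    fn build_ui() -> impl Widget<AppState> {
        // Slider is a Widget<f64>; .lens(...) adapts each one to a
        // Widget<AppState> by pointing it at a particular field.
        Flex::column()
            .with_child(Slider::new().lens(AppState::volume))
            .with_child(Slider::new().lens(AppState::pan))
    }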

Hope this is helpful!

Right, I’m going to have to go back to my proper work now, but the mixer is now fully functional. The filter is implemented as an FIR of order ~20; it runs fast (1-2% of a single core per channel for me), and you can bypass it so it doesn’t waste the cycles. Enjoy! :)
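If you’re curious what a basic FIR looks like in code, here is a minimal direct-form sketch (not the jack-mixer implementation; the coefficients and buffer handling are simplified):

    /// Direct-form FIR: y[n] = sum over k of h[k] * x[n - k].
    struct Fir {
        coeffs: Vec<f32>,   // the impulse response, e.g. ~20 taps
        history: Vec<f32>,  // ring buffer of the most recent input samples
        pos: usize,         // write position in the ring buffer
    }

    impl Fir {
        fn new(coeffs: Vec<f32>) -> Self {
            let n = coeffs.len();
            Fir { coeffs, history: vec![0.0; n], pos: 0 }
        }

        /// Process a single sample.
        fn process(&mut self, x: f32) -> f32 {
            let n = self.coeffs.len();
            self.history[self.pos] = x;
            let mut acc = 0.0;
            for k in 0..n {
                // x[n - k] lives k slots behind the write position, wrapping around.
                acc += self.coeffs[k] * self.history[(self.pos + n - k) % n];
            }
            self.pos = (self.pos + 1) % n;
            acc
        }
    }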

It’s also on crates.io as cargo install jack-mixer.