Crate rsynth
Rsynth
An abstraction over APIs for audio plugins and applications. Use it to write real-time audio effects, software synthesizers, … and target different platforms (vst, jack, offline audio rendering, …). It is currently most suitable for real-time or “streaming” audio processing; e.g. you cannot use it to reverse audio in time.
Back-ends
rsynth currently supports the following back-ends:
- jack (behind the backend-jack feature)
- vst (behind the backend-vst feature)
- combined: combines different back-ends for audio input, audio output, midi input and midi output, mostly for offline rendering and testing (behind various features)
See the documentation of each back-end for more information.
Features and how to use them
rsynth puts common functionality of the different backends behind common traits.
Conversely, a plugin can be used with different backends by implementing those common traits.
A mix-and-match approach is used: if a backend doesn’t require a certain functionality,
you don’t need to implement the corresponding trait.
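As a sketch of this mix-and-match idea, a plugin that only processes audio implements only the audio-related traits and simply skips the midi-related ones. The trait below is a simplified stand-in written for illustration, not rsynth’s actual definition:

```rust
// Simplified stand-in for an audio meta-data trait; the real trait lives
// in rsynth and may have a different signature.
trait AudioHandlerMeta {
    fn max_number_of_audio_inputs(&self) -> usize;
    fn max_number_of_audio_outputs(&self) -> usize;
}

// A plugin that only processes audio: it implements the audio-related
// trait and does not implement any midi-related trait at all.
struct StereoPassthrough;

impl AudioHandlerMeta for StereoPassthrough {
    fn max_number_of_audio_inputs(&self) -> usize {
        2
    }
    fn max_number_of_audio_outputs(&self) -> usize {
        2
    }
}

fn main() {
    let plugin = StereoPassthrough;
    assert_eq!(plugin.max_number_of_audio_inputs(), 2);
    assert_eq!(plugin.max_number_of_audio_outputs(), 2);
}
```

A backend that never deals with midi can accept this plugin because it only asks for the traits it actually uses.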
Starting the backend/entry point for the host
Meta-data
There are a number of traits that an application or plugin needs to implement in order to
define meta-data. Every plugin or application should implement these, but doing so can be
tedious, so you can implement them in a more straightforward way by implementing the Meta
trait. However, you can also implement these traits “by hand”.
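The pattern behind this is a blanket implementation: implement one central trait and the individual meta traits come for free. The sketch below uses simplified, illustrative names and signatures, not rsynth’s actual API:

```rust
// Simplified meta-data carrier (illustrative, not rsynth's real type).
struct MetaData {
    general_name: &'static str,
}

// Implement this one trait...
trait Meta {
    fn meta(&self) -> &MetaData;
}

// ...and a blanket implementation provides the individual meta traits
// for every type that implements `Meta`.
trait CommonPluginMeta {
    fn name(&self) -> &str;
}

impl<T: Meta> CommonPluginMeta for T {
    fn name(&self) -> &str {
        self.meta().general_name
    }
}

struct MySynth;

const MY_SYNTH_META: MetaData = MetaData {
    general_name: "My Synth",
};

impl Meta for MySynth {
    fn meta(&self) -> &MetaData {
        &MY_SYNTH_META
    }
}

fn main() {
    // `MySynth` never implemented `CommonPluginMeta` directly;
    // it got it through the blanket implementation.
    assert_eq!(MySynth.name(), "My Synth");
}
```

Implementing the traits “by hand” instead means writing an `impl CommonPluginMeta for MySynth` (and the others) directly.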
Meta-data for Jack
Applications need to implement:
- CommonPluginMeta (name of the plugin etc.)
- AudioHandlerMeta (number of audio ports)
- CommonAudioPortMeta (names of the audio in and out ports)
- MidiHandlerMeta (number of midi ports)
- CommonMidiPortMeta (names of the midi in and out ports)
No meta-data for offline rendering
Applications do not need to implement special traits describing meta-data.
Meta-data for VST 2.4
Plugins need to implement:
- CommonPluginMeta (name of the plugin etc.)
- AudioHandlerMeta (number of audio ports)
- CommonAudioPortMeta (names of the audio in and out ports)
- VstPluginMeta (vst-specific meta-data)
Rendering audio
All backends require the plugin/application to implement the ContextualAudioRenderer trait.
ContextualAudioRenderer has two type parameters; the concrete types depend on the backend
that is used.
One type parameter is the data type used to represent a sample.
The other type parameter is called the “context” and can be used to access functionality of
the backend during audio rendering itself.
Common functionality of the context is defined in the HostInterface trait.
The application or plugin can have either a generic implementation of ContextualAudioRenderer
or different, specialized implementations if different behaviour is needed.
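A generic implementation of such a two-parameter trait might look like the following. This is a simplified sketch: rsynth’s real `ContextualAudioRenderer` passes its buffers differently, and `DummyHost` stands in for a backend-provided context:

```rust
use std::ops::Mul;

// Simplified sketch of a renderer trait with a sample type parameter `S`
// and a context parameter (not rsynth's actual definition).
trait ContextualAudioRenderer<S, Context> {
    fn render_buffer(&mut self, input: &[S], output: &mut [S], context: &mut Context);
}

// Stand-in context; a real backend would expose host functionality here.
struct DummyHost;

// A generic implementation: `Gain` works for any sample type that supports
// multiplication and for any context, so it can target any backend.
struct Gain<S> {
    factor: S,
}

impl<S: Copy + Mul<Output = S>, C> ContextualAudioRenderer<S, C> for Gain<S> {
    fn render_buffer(&mut self, input: &[S], output: &mut [S], _context: &mut C) {
        for (o, i) in output.iter_mut().zip(input.iter()) {
            *o = *i * self.factor;
        }
    }
}

fn main() {
    let mut gain = Gain { factor: 0.5_f32 };
    let input = [1.0_f32, -2.0, 4.0];
    let mut output = [0.0_f32; 3];
    gain.render_buffer(&input, &mut output, &mut DummyHost);
    assert_eq!(output, [0.5, -1.0, 2.0]);
}
```

Because both parameters are left generic, the same `Gain` type could be used with any backend’s sample type and context.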
Rendering audio with Jack
Applications need to implement ContextualAudioRenderer&lt;f32, JackHost&gt;.
Rendering audio offline
Applications need to implement
ContextualAudioRenderer&lt;S, MidiWriterWrapper&lt;Timed&lt;RawMidiEvent&gt;&gt;&gt;.
Note: the type parameter S, which represents the sample data type, is free.
Rendering audio with VST 2.4
Plugins need to implement ContextualAudioRenderer&lt;f32, HostCallback&gt; and ContextualAudioRenderer&lt;f64, HostCallback&gt;.
Note: HostCallback is re-exported from the vst crate, but implements rsynth’s HostInterface,
which defines functionality shared by all backends.
Handling (midi) events
A plugin or application can handle events (typically midi events) by implementing the
ContextualEventHandler
trait. This trait is generic over the event type. It also has
a second type parameter, the context, which typically corresponds to the host, so that
plugins or applications can have access to the host while handling events.
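The shape of such a trait, generic over both the event type and the context, can be sketched as follows. The types below are simplified stand-ins written for illustration; rsynth’s real `ContextualEventHandler`, `Timed` and `RawMidiEvent` may differ in their details:

```rust
// Simplified stand-in for a raw (up to 3-byte) midi event.
struct RawMidiEvent {
    data: [u8; 3],
}

// Wrapper that attaches a timestamp (in frames) to an event.
struct Timed<E> {
    time_in_frames: u32,
    event: E,
}

// Stand-in for a backend-provided context.
struct DummyHost;

// An event handler generic over the event type and the context.
trait ContextualEventHandler<Event, Context> {
    fn handle_event(&mut self, event: Event, context: &mut Context);
}

// A handler that counts midi note-on events.
struct NoteOnCounter {
    note_ons: usize,
}

impl ContextualEventHandler<Timed<RawMidiEvent>, DummyHost> for NoteOnCounter {
    fn handle_event(&mut self, event: Timed<RawMidiEvent>, _context: &mut DummyHost) {
        // 0x90..=0x9F is the midi note-on status range.
        if event.event.data[0] & 0xF0 == 0x90 {
            self.note_ons += 1;
        }
    }
}

fn main() {
    let mut counter = NoteOnCounter { note_ons: 0 };
    let note_on = Timed { time_in_frames: 0, event: RawMidiEvent { data: [0x90, 60, 100] } };
    let note_off = Timed { time_in_frames: 64, event: RawMidiEvent { data: [0x80, 60, 0] } };
    counter.handle_event(note_on, &mut DummyHost);
    counter.handle_event(note_off, &mut DummyHost);
    assert_eq!(counter.note_ons, 1);
}
```

One type can implement the trait several times, once per event type it wants to handle, which is exactly what the backends below require.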
Handling events with Jack
Applications need to implement
ContextualEventHandler&lt;Indexed&lt;Timed&lt;RawMidiEvent&gt;&gt;, JackHost&gt; and
ContextualEventHandler&lt;Indexed&lt;Timed&lt;SysExEvent&gt;&gt;, JackHost&gt;.
Handling events with the “offline” backend
Applications need to implement EventHandler&lt;Timed&lt;RawMidiEvent&gt;&gt;.
Note: EventHandler is similar to ContextualEventHandler, but without the context.
We would like to make this more uniform in a future version and also require
ContextualEventHandler here.
Handling events with VST 2.4
Plugins need to implement
ContextualEventHandler&lt;Timed&lt;RawMidiEvent&gt;, HostCallback&gt; and
ContextualEventHandler&lt;Timed&lt;SysExEvent&gt;, HostCallback&gt;.
Note: VST 2.4 does not support sample-accurate events; a dummy timestamp of 0 is always added.
Note: HostCallback is re-exported from the vst crate, but implements rsynth’s HostInterface,
which defines functionality shared by all backends.
Generating midi events
The “context” parameter passed to the methods of the ContextualAudioRenderer and
ContextualEventHandler traits gives access to features of the host/backend, such as
generating midi events.
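The idea can be illustrated with a context that accepts events written to it. The trait name `WriteEvent` and everything else below are invented for illustration and do not match rsynth’s actual API; see the traits listed per backend:

```rust
// Illustrative midi event type (not rsynth's real one).
#[derive(Debug, PartialEq)]
struct RawMidiEvent {
    data: [u8; 3],
}

// Hypothetical capability trait: a context that can emit events.
trait WriteEvent<E> {
    fn write_event(&mut self, event: E);
}

// A test context that simply collects the events a plugin generates.
struct CollectingHost {
    written: Vec<RawMidiEvent>,
}

impl WriteEvent<RawMidiEvent> for CollectingHost {
    fn write_event(&mut self, event: RawMidiEvent) {
        self.written.push(event);
    }
}

// A plugin method receives the context and generates events through it.
fn echo_note<C: WriteEvent<RawMidiEvent>>(context: &mut C) {
    // 0x90 = note-on on channel 1, note 60 (middle C), velocity 100.
    context.write_event(RawMidiEvent { data: [0x90, 60, 100] });
}

fn main() {
    let mut host = CollectingHost { written: Vec::new() };
    echo_note(&mut host);
    assert_eq!(host.written, vec![RawMidiEvent { data: [0x90, 60, 100] }]);
}
```

Because the plugin is generic over the context, the same code works with any backend whose context offers the capability.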
Generating midi events with Jack
JackHost
implements the following traits:
Generating midi events with offline rendering
MidiWriterWrapper
implements
Generating midi events with VST 2.4 is not possible
Stopping the backend
The “context” parameter passed to the methods of the ContextualAudioRenderer and
ContextualEventHandler traits gives access to features of the host/backend, such as
stopping the backend.
All backends implement the HostInterface trait, which defines a stop method.
The stop method only actually does something if the backend additionally implements
the Stop trait.
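The stop pattern described above can be sketched like this (a simplified sketch, not rsynth’s actual code; `OfflineHost` and `VstHost` are illustrative names):

```rust
// Every context implements this, but `stop` is only effective on
// backends that can actually stop.
trait HostInterface {
    // Request that the backend stops; backends that cannot stop ignore this.
    fn stop(&mut self);
}

// Marker-style trait for backends on which `stop` actually works.
trait Stop: HostInterface {}

// An offline-rendering-like host that can stop.
struct OfflineHost {
    stopped: bool,
}

impl HostInterface for OfflineHost {
    fn stop(&mut self) {
        self.stopped = true;
    }
}
impl Stop for OfflineHost {}

// A VST-2.4-like host: stopping is not possible, so `stop` is a no-op
// and the `Stop` trait is not implemented.
struct VstHost;

impl HostInterface for VstHost {
    fn stop(&mut self) {}
}

fn main() {
    let mut host = OfflineHost { stopped: false };
    host.stop();
    assert!(host.stopped);
}
```

Requiring `Context: Stop` in a plugin’s trait bounds then restricts that plugin to backends that really can be stopped.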
Stopping Jack
Stopping Jack is possible: JackHost implements the Stop trait.
Stopping offline rendering
Stopping offline rendering is possible: MidiWriterWrapper implements the Stop trait.
Additionally, offline rendering stops automatically when the fill_buffer method of the
AudioReader indicates that no more frames are to be expected.
Stopping VST 2.4 is not possible
Modules
backend | Backends. |
buffer | Audio buffers. |
envelope | Deprecated: this module has not been thoroughly tested, so expect some rough edges here and there. |
event | Event handling |
meta | Mechanisms for defining the meta-data of a plugin or application. |
rsor | Re-exports from the rsor crate. |
test_utilities | Utilities for testing. |
utilities | Deprecated |
vecstorage | Re-exports from the vecstorage crate. |
Macros
audio_chunk | Create an audio chunk. |
vst_init | A wrapper around the |
Traits
AudioHandler | Define how sample-rate changes are handled. |
AudioHandlerMeta | Define the maximum number of audio inputs and the maximum number of audio outputs. |
AudioRenderer | Defines how audio is rendered. |
CommonAudioPortMeta | Provides some meta-data of the audio ports used by the plugin or application to the host. This trait can be more conveniently implemented by implementing the Meta trait. |
CommonMidiPortMeta | Provides some meta-data of the midi ports used by the plugin or application to the host. This trait can be more conveniently implemented by implementing the Meta trait. |
CommonPluginMeta | Provides common meta-data of the plugin or application to the host. This trait is common for all backends that need this info. This trait can be more conveniently implemented by implementing the Meta trait. |
ContextualAudioRenderer | Defines how audio is rendered, similar to the AudioRenderer trait, but with an additional context parameter. |
MidiHandlerMeta | Define the maximum number of midi inputs and the maximum number of midi outputs. This trait can be more conveniently implemented by implementing the Meta trait. |