System Architecture
Understanding the relationship between Device, Source, Mixer, and Effect components in OwnAudioSharp
System Overview
OwnAudioSharp uses a two-layer architecture with clear separation between low-level platform integration and high-level audio processing:
- User Application - your application code using the OwnAudioSharp API
- API Layer (OwnaudioNET) - AudioMixer, source management, effects, features
- Engine Layer (Ownaudio.Core) - IAudioEngine, platform-specific engines, decoders
- Hardware Device - WASAPI / PulseAudio / Core Audio / AAudio
1. Device Layer (IAudioEngine)
🔊 Audio Device - Hardware Communication
The Device layer provides direct communication with hardware audio devices. OwnAudioSharp uses native C/C++ engines by default (PortAudio/Miniaudio) for optimal performance, with optional managed C# implementations available for specific scenarios.
AudioEngineFactory.CreateDefault() returns native PortAudio or Miniaudio engines for professional-grade, glitch-free audio. Managed engines (WASAPI/PulseAudio/CoreAudio) are available but optional.
Engine Implementations
| Engine Type | Platform | API/Library | Status |
|---|---|---|---|
| Native (Default) | All Platforms | PortAudio / Miniaudio | Recommended |
| Managed (Optional) | Windows | WASAPI (C# P/Invoke) | Optional |
| Managed (Optional) | Linux | PulseAudio (C# P/Invoke) | Optional |
| Managed (Optional) | macOS | Core Audio (C# P/Invoke) | Optional |
| Managed (Optional) | Android | AAudio (C# P/Invoke) | Optional |
Key Methods
```csharp
IAudioEngine engine = AudioEngineFactory.CreateDefault();

// Initialize device (50-5000 ms - DO NOT call on the UI thread!)
engine.Initialize(config);

// Start playback
engine.Start();

// Send audio samples (10-50 ms - BLOCKING!)
engine.Send(samples);

// Stop playback (up to 2000 ms - BLOCKING!)
engine.Stop();

// Device management
List<AudioDeviceInfo> outputs = engine.GetOutputDevices();
engine.SetOutputDeviceByName("Speakers");
engine.OutputDeviceChanged += OnDeviceChanged;
```
Initialize(), Send(), and Stop() are BLOCKING operations! Never call them from a UI thread.
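One way to keep these blocking calls off the UI thread is to offload them to the thread pool. This is a sketch, not a library-prescribed pattern; it only uses the factory, config, and engine members shown above:

```csharp
// Sketch: run blocking engine calls on a thread-pool thread.
IAudioEngine engine = AudioEngineFactory.CreateDefault();

await Task.Run(() =>
{
    engine.Initialize(new AudioConfig()); // may take 50-5000 ms
    engine.Start();
});

// ... later, also off the UI thread:
await Task.Run(() => engine.Stop());      // may block up to 2000 ms
```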
2. Source Layer (IAudioSource)
🎵 Audio Source - Data Provider
Sources provide audio data from various origins (files, microphone, generated samples).
Available Source Types
| Type | Description | Use Case |
|---|---|---|
| FileSource | Plays MP3/WAV/FLAC files | Music playback, audio files |
| InputSource | Records from microphone/line-in | Voice recording, audio capture |
| SampleSource | Plays pre-loaded samples | Sound effects, short clips |
| GhostTrackSource | Silent track for synchronization | Timeline sync, DAW-style alignment |
Core Interface
```csharp
interface IAudioSource
{
    // Identity & State
    Guid Id { get; }
    AudioState State { get; }        // Playing/Paused/Stopped

    // Configuration
    AudioConfig Config { get; }
    AudioStreamInfo StreamInfo { get; }

    // Playback Control
    float Volume { get; set; }       // 0.0 - 1.0
    double Position { get; }         // Current position (seconds)
    double Duration { get; }         // Total duration (seconds)

    // Tempo & Pitch (SoundTouch)
    float Tempo { get; set; }        // 1.0 = normal speed
    float PitchShift { get; set; }   // 0 = no shift

    // Playback Operations
    void Play();
    void Pause();
    void Stop();
    bool Seek(double seconds);

    // Audio Data (HOT PATH - zero allocation!)
    int ReadSamples(Span<float> buffer, int frameCount);
}
```
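To illustrate how the hot path might be consumed, here is a sketch of a pull loop. The 512-frame block size and stereo channel count are assumptions for the example; the key point is that the buffer is allocated once and reused, keeping the loop allocation-free:

```csharp
// Sketch: a pull loop feeding the engine from a source.
const int frameCount = 512;              // assumed block size
const int channels = 2;                  // assumed stereo
float[] buffer = new float[frameCount * channels]; // allocated once

while (source.State == AudioState.Playing)
{
    int framesRead = source.ReadSamples(buffer, frameCount);
    if (framesRead == 0)
        break;                           // end of stream
    engine.Send(buffer);                 // blocks until the device accepts the block
}
```

In normal use the AudioMixer runs a loop like this for you on its own thread; calling ReadSamples directly is only needed for custom pipelines.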
Usage Example
```csharp
var fileSource = new FileSource("music.mp3", engine);
fileSource.Volume = 0.8f;
fileSource.Tempo = 1.2f; // 120% speed
fileSource.Play();
```
3. Effect Layer (IEffectProcessor)
🎛️ Audio Effect - Real-Time Processing
Effects modify audio in real-time. They can be applied to individual sources or the master output.
Available Effects
- CompressorEffect
- LimiterEffect
- AutoGainEffect
- EqualizerEffect
- Equalizer30BandEffect
- ReverbEffect
- DelayEffect
- ChorusEffect
- FlangerEffect
- PhaserEffect
- RotaryEffect
- DistortionEffect
- OverdriveEffect
- DynamicAmpEffect
- SmartMasterEffect
- EnhancerEffect
- VST3EffectProcessor (VST3 plugin hosting)
Core Interface
```csharp
interface IEffectProcessor
{
    Guid Id { get; }
    string Name { get; }
    bool Enabled { get; set; }  // Enable/disable
    float Mix { get; set; }     // Wet/dry mix (0.0-1.0)

    void Initialize(AudioConfig config);
    void Process(Span<float> buffer, int frameCount); // HOT PATH!
    void Reset();               // Clear internal buffers
}
```
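To illustrate the contract, here is a hypothetical custom effect (not part of the library): a simple gain stage that honors Enabled and applies the usual wet/dry convention out = dry × (1 − Mix) + wet × Mix. The assumed stereo channel count and the GainDb parameter are inventions for this sketch:

```csharp
// Illustrative custom effect following the IEffectProcessor contract above.
class GainEffect : IEffectProcessor
{
    public Guid Id { get; } = Guid.NewGuid();
    public string Name => "Gain";
    public bool Enabled { get; set; } = true;
    public float Mix { get; set; } = 1.0f;      // fully wet by default
    public float GainDb { get; set; } = -6.0f;  // hypothetical parameter

    private int _channels;

    public void Initialize(AudioConfig config) => _channels = 2; // assumed stereo

    public void Process(Span<float> buffer, int frameCount)
    {
        if (!Enabled) return;
        float gain = MathF.Pow(10f, GainDb / 20f); // dB -> linear
        for (int i = 0; i < frameCount * _channels; i++)
        {
            float dry = buffer[i];
            float wet = dry * gain;
            buffer[i] = dry * (1f - Mix) + wet * Mix; // wet/dry blend
        }
    }

    public void Reset() { } // no internal state to clear
}
```

A class implementing the interface this way could then be passed to AddEffect or AddMasterEffect like the built-in effects.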
Two Application Methods
A) Source-Specific Effects (SourceWithEffects)
```csharp
var fileSource = new FileSource("guitar.mp3", engine);
var sourceWithEffects = new SourceWithEffects(fileSource);

// Add effects ONLY to this source
sourceWithEffects.AddEffect(new DistortionEffect(0.5f));
sourceWithEffects.AddEffect(new DelayEffect(300f, 0.3f));

mixer.AddSource(sourceWithEffects);
```
B) Master Effects (AudioMixer)
```csharp
var mixer = new AudioMixer(engine);

// Effects applied to the FINAL MIX of all sources
mixer.AddMasterEffect(new EqualizerEffect(...));
mixer.AddMasterEffect(new LimiterEffect(0.95f));
```
4. Mixer Layer (AudioMixer)
🎚️ Audio Mixer - Multi-Source Combining
The AudioMixer combines multiple audio sources and sends the mixed result to the device.
Architecture Diagram
Diagram: each source feeds the mixer at its own volume (e.g. 0.8, 1.0, 0.5), optionally through a per-source effect chain; the multi-threaded mixer combines them and applies optional master effects (EQ, compressor, ...) before output.
Usage Example
```csharp
var engine = AudioEngineFactory.CreateDefault();
var mixer = new AudioMixer(engine, bufferSizeInFrames: 512);

// Add sources
var track1 = new FileSource("drums.mp3", engine);
var track2 = new FileSource("bass.mp3", engine);
var track3 = new FileSource("vocals.mp3", engine);
mixer.AddSource(track1);
mixer.AddSource(track2);
mixer.AddSource(track3);

// Master effects
mixer.AddMasterEffect(new EqualizerEffect(...));
mixer.AddMasterEffect(new CompressorEffect(...));

// Control
mixer.MasterVolume = 0.9f;
mixer.Start();

// Synchronized playback using the master clock
track1.AttachToClock(mixer.MasterClock);
track2.AttachToClock(mixer.MasterClock);
track3.AttachToClock(mixer.MasterClock);
track1.Play();
track2.Play();
track3.Play();

// Level metering
Console.WriteLine($"L: {mixer.LeftPeak:F2}, R: {mixer.RightPeak:F2}");

// Recording
mixer.StartRecording("output.wav");
```
Thread Architecture
The mixer runs its processing loop on a dedicated high-priority thread, so mixing never competes with UI or I/O work.
Key Features
- Lock-free source management - ConcurrentDictionary usage
- Parallel mixing - Multi-core support
- Master clock sync - Timeline-based synchronization (v2.4.0+)
- Real-time level metering - L/R channel peak levels
- WAV recording - Final mix recording
Complete Data Flow
Simple Playback (1 source → 1 device): Source → IAudioEngine → Hardware Device
Multi-Track Mixing with Effects: Sources → per-source effects → AudioMixer → master effects (e.g. Equalizer, Limiter) → IAudioEngine → Hardware Device
Practical Examples
Example 1: Simple File Playback
```csharp
var engine = AudioEngineFactory.CreateDefault();

// Initialize off the UI thread - Initialize() is blocking
await Task.Run(() => engine.Initialize(new AudioConfig()));
engine.Start();

var source = new FileSource("music.mp3", engine);
source.Play();
```
Example 2: Multi-Track DAW-Style Mixing
```csharp
var engine = AudioEngineFactory.CreateDefault();
var mixer = new AudioMixer(engine);

// Load tracks
var drums = new FileSource("drums.wav", engine);
var bass = new FileSource("bass.wav", engine);
var vocals = new FileSource("vocals.wav", engine);

// Add effects to vocals
var vocalsWithFX = new SourceWithEffects(vocals);
vocalsWithFX.AddEffect(new ReverbEffect(roomSize: 0.7f));
vocalsWithFX.AddEffect(new DelayEffect(200f, 0.3f));

// Add to mixer
mixer.AddSource(drums);
mixer.AddSource(bass);
mixer.AddSource(vocalsWithFX);

// Master chain
mixer.AddMasterEffect(new Equalizer30BandEffect());
mixer.AddMasterEffect(new CompressorEffect(threshold: -10f));
mixer.AddMasterEffect(new LimiterEffect(ceiling: 0.95f));

// Start mixer
mixer.Start();

// Synchronized playback - Method 1: Master Clock (Recommended)
drums.AttachToClock(mixer.MasterClock);
bass.AttachToClock(mixer.MasterClock);
vocalsWithFX.AttachToClock(mixer.MasterClock);
drums.Play();
bass.Play();
vocalsWithFX.Play();

// OR Method 2: Sync Groups (Legacy, still supported)
// mixer.CreateSyncGroup("multitrack", drums, bass, vocalsWithFX);
// mixer.StartSyncGroup("multitrack");

// Recording
mixer.StartRecording("final_mix.wav");
```
Example 3: VST3 Plugin Usage
```csharp
var mixer = new AudioMixer(engine);
var source = new FileSource("guitar.wav", engine);

// Load a VST3 effect
var vstEffect = new VST3EffectProcessor("C:\\VST3\\Distortion.vst3");
vstEffect.Initialize(source.Config);

var sourceWithVST = new SourceWithEffects(source);
sourceWithVST.AddEffect(vstEffect);

mixer.AddSource(sourceWithVST);
mixer.Start();
```
Summary
| Component | Responsibility | Instances |
|---|---|---|
| Device (IAudioEngine) | Hardware communication | 1 per application |
| Source (IAudioSource) | Audio data provider | N (unlimited) |
| Effect (IEffectProcessor) | Audio processing | N (per source or master) |
| Mixer (AudioMixer) | Multi-source combining | Usually 1 (more than one is rare) |
Relationships
- N:1 - Multiple sources feed one mixer
- 1:N - One source can carry multiple effects (SourceWithEffects)
- 1:N - One mixer can carry multiple master effects
- 1:1 - One mixer uses one engine
🎯 Key Takeaways
- Device handles hardware I/O (blocking operations)
- Source provides audio data with playback control
- Effect processes audio in real-time (source or master)
- Mixer combines sources, applies master effects, and sends to device
- Effects can be applied at two levels: per-source or master
- Mixer uses dedicated high-priority thread for mixing
- System supports parallel mixing for multi-core performance