Ownaudio.Core API Reference

Low-level cross-platform audio engine providing real-time audio playback and recording with native performance.

🚀 Native Engine (Default) The native C++ engine is the default and provides GC-free, glitch-free audio processing. It uses PortAudio when installed, falling back to an embedded miniaudio backend. Managed C# engines remain available for development and testing.

Overview

Ownaudio.Core is the foundation for cross-platform audio applications:

Supported Platforms

| Platform | Native API | Assembly |
| --- | --- | --- |
| Windows | WASAPI | Ownaudio.Windows |
| macOS | Core Audio | Ownaudio.macOS |
| Linux | PulseAudio | Ownaudio.Linux |
| iOS | Core Audio | Ownaudio.iOS |
| Android | AAudio | Ownaudio.Android |

Getting Started

Quick example to get you started with Ownaudio.Core:

Basic Usage Example
using Ownaudio.Core;
using Ownaudio.Decoders;

// Create and initialize engine
using var engine = AudioEngineFactory.CreateDefault();

// Start the engine
engine.Start();

// Load and decode audio file
using var decoder = AudioDecoderFactory.Create(
    "music.mp3",
    targetSampleRate: 48000,
    targetChannels: 2
);

// Read and play audio
var buffer = new byte[4096 * 2 * sizeof(float)]; // 4096 frames, 2 channels, 4-byte float samples
while (true)
{
    var result = decoder.ReadFrames(buffer);
    if (result.IsEOF) break;
    
    if (result.FramesRead > 0)
    {
        int bytesRead = result.FramesRead * 2 * sizeof(float);
        var samples = System.Runtime.InteropServices.MemoryMarshal
            .Cast<byte, float>(buffer.AsSpan(0, bytesRead));
        engine.Send(samples);
    }
}

// Stop the engine
engine.Stop();

Interfaces

IAudioEngine

Platform-independent audio engine interface for real-time playback and recording.

IAudioEngine Interface
public interface IAudioEngine : IDisposable
{
    // Properties
    int FramesPerBuffer { get; }
    
    // Lifecycle
    int Initialize(AudioConfig config);
    int Start();
    int Stop();
    int OwnAudioEngineActivate();
    int OwnAudioEngineStopped();
    
    // Audio I/O
    void Send(Span<float> samples);
    int Receives(out float[] samples);
    IntPtr GetStream();
    
    // Device Management
    List<AudioDeviceInfo> GetOutputDevices();
    List<AudioDeviceInfo> GetInputDevices();
    int SetOutputDeviceByName(string deviceName);
    int SetOutputDeviceByIndex(int deviceIndex);
    int SetInputDeviceByName(string deviceName);
    int SetInputDeviceByIndex(int deviceIndex);
    
    // Events
    event EventHandler OutputDeviceChanged;
    event EventHandler InputDeviceChanged;
    event EventHandler DeviceStateChanged;
}

Key Methods

| Method | Description |
| --- | --- |
| Initialize(AudioConfig) | Initialize the engine with a configuration. Returns 0 on success. |
| Start() | Start the audio engine (thread-safe, idempotent). |
| Stop() | Stop the audio engine gracefully (thread-safe, idempotent). |
| Send(Span<float>) | Send audio samples to the output (blocking, zero-allocation). |
| Receives(out float[]) | Receive audio samples from the input (uses pooled buffers). |

IAudioDecoder

Audio file decoder interface for WAV, MP3, and FLAC formats.

IAudioDecoder Interface
public interface IAudioDecoder : IDisposable
{
    AudioStreamInfo StreamInfo { get; }
    
    // Zero-allocation reading (recommended)
    AudioDecoderResult ReadFrames(byte[] buffer);
    
    // Seeking
    bool TrySeek(TimeSpan position, out string error);
}

Usage Example

Zero-Allocation Decoder Usage
// Create decoder
using var decoder = AudioDecoderFactory.Create(
    "music.mp3",
    targetSampleRate: 48000,
    targetChannels: 2
);

var info = decoder.StreamInfo;
Console.WriteLine($"Duration: {info.Duration}");
Console.WriteLine($"Sample Rate: {info.SampleRate} Hz");
Console.WriteLine($"Channels: {info.Channels}");

// Create reusable buffer
int bufferSize = 4096 * info.Channels * sizeof(float);
var buffer = new byte[bufferSize];

// Decode frames efficiently
while (true)
{
    var result = decoder.ReadFrames(buffer);
    if (result.IsEOF) break;
    
    if (result.FramesRead > 0)
    {
        int bytesRead = result.FramesRead * info.Channels * sizeof(float);
        var samples = System.Runtime.InteropServices.MemoryMarshal
            .Cast<byte, float>(buffer.AsSpan(0, bytesRead));
        
        // Process audio samples
        ProcessAudio(samples);
    }
}

Factory Classes

AudioEngineFactory

Creates platform-specific audio engine instances with automatic platform detection.

| Method | Description |
| --- | --- |
| Create(AudioConfig) | Create an engine with a custom configuration |
| CreateDefault() | Create with default settings (48 kHz, stereo, 512 frames) |
| CreateLowLatency() | Create with low-latency settings (128 frames) |
| CreateHighLatency() | Create with high-latency settings (2048 frames) |
Factory Usage
// Default configuration
using var engine1 = AudioEngineFactory.CreateDefault();

// Low latency
using var engine2 = AudioEngineFactory.CreateLowLatency();

// Custom configuration
var config = new AudioConfig
{
    SampleRate = 44100,
    Channels = 2,
    BufferSize = 256,
    EnableInput = true
};
using var engine3 = AudioEngineFactory.Create(config);

AudioDecoderFactory

Creates audio decoders with automatic format detection from file extension or header.

| Method | Description |
| --- | --- |
| Create(string, int, int) | Create a decoder from a file path (auto-detects format) |
| Create(Stream, AudioFormat, int, int) | Create a decoder from a stream with a specified format |
| DetectFormat(Stream) | Detect the audio format from the stream header (magic bytes) |

Supported Formats

| Format | Types | Implementation |
| --- | --- | --- |
| WAV | PCM, IEEE Float, ADPCM | Pure C# implementation |
| MP3 | MPEG-1/2 Layer III | Platform-specific or managed fallback |
| FLAC | Free Lossless Audio Codec | Pure C# managed implementation |
Decoder Factory Usage
// From file path (auto-detect format)
using var decoder1 = AudioDecoderFactory.Create(
    "music.mp3",
    targetSampleRate: 48000,
    targetChannels: 2
);

// From stream with explicit format
using var stream = File.OpenRead("audio.wav");
using var decoder2 = AudioDecoderFactory.Create(
    stream,
    AudioFormat.Wav,
    targetSampleRate: 48000,
    targetChannels: 2
);

// Detect format
using var fileStream = File.OpenRead("unknown.audio");
var format = AudioDecoderFactory.DetectFormat(fileStream);
Console.WriteLine($"Detected format: {format}");
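
Header-based detection typically keys on magic bytes at the start of the stream. A minimal sketch of that idea (illustrative only, not the library's implementation — the real DetectFormat reads from a Stream and rewinds it):

```csharp
using System;

// Simplified magic-byte detection sketch (not the library's actual code).
// WAV files start with "RIFF", FLAC with "fLaC", and MP3 either with an
// "ID3" tag or directly with an MPEG frame-sync byte (0xFF + 3 set bits).
static string DetectByMagicBytes(byte[] header)
{
    if (header.Length >= 4)
    {
        if (header[0] == 'R' && header[1] == 'I' && header[2] == 'F' && header[3] == 'F')
            return "Wav";
        if (header[0] == 'f' && header[1] == 'L' && header[2] == 'a' && header[3] == 'C')
            return "Flac";
    }
    if (header.Length >= 3 && header[0] == 'I' && header[1] == 'D' && header[2] == '3')
        return "Mp3";                                 // ID3v2 tag before the first MPEG frame
    if (header.Length >= 2 && header[0] == 0xFF && (header[1] & 0xE0) == 0xE0)
        return "Mp3";                                 // raw MPEG frame sync (11 set bits)
    return "Unknown";
}

Console.WriteLine(DetectByMagicBytes(new byte[] { (byte)'R', (byte)'I', (byte)'F', (byte)'F' })); // Wav
```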

Configuration

AudioConfig

Audio engine configuration parameters.

| Property | Type | Default | Description |
| --- | --- | --- | --- |
| SampleRate | int | 48000 | Sample rate in Hz (typical: 44100, 48000, 96000) |
| Channels | int | 2 | Number of channels (1 = mono, 2 = stereo) |
| BufferSize | int | 512 | Buffer size in frames (affects latency) |
| EnableInput | bool | false | Enable audio input/recording |
| EnableOutput | bool | true | Enable audio output/playback |
| OutputDeviceId | string? | null | Specific output device ID (null for default) |
| InputDeviceId | string? | null | Specific input device ID (null for default) |
| HostType | EngineHostType | None | Host API type (PortAudio only, ignored by miniaudio) |
| InputChannelSelectors | int[]? | null | Physical input channels to record from. Works on all backends. See Channel Routing. |
| OutputChannelSelectors | int[]? | null | Physical output channels to play back on. Works on all backends. See Channel Routing. |

Predefined Configurations

Configuration Example
// Use predefined config
var config1 = AudioConfig.Default;

// Custom configuration
var config2 = new AudioConfig
{
    SampleRate = 44100,
    Channels = 1,           // Mono
    BufferSize = 256,       // Low latency
    EnableInput = true,
    EnableOutput = true
};

// Validate configuration
if (config2.Validate())
{
    Console.WriteLine("Configuration is valid");
}

AudioDeviceInfo

Information about an audio device.

| Property | Type | Description |
| --- | --- | --- |
| DeviceId | string | Unique device identifier |
| Name | string | Human-readable device name |
| IsInput | bool | True if the device supports input |
| IsOutput | bool | True if the device supports output |
| IsDefault | bool | True if this is the default device |
| State | AudioDeviceState | Current device state |
| MaxInputChannels | int | Maximum number of input channels supported (0 if unavailable) |
| MaxOutputChannels | int | Maximum number of output channels supported (0 if unavailable) |

Device Enumeration & Channel Routing
using var engine = AudioEngineFactory.CreateDefault();

// List output devices and their channel counts
var devices = engine.GetOutputDevices();
foreach (var device in devices)
{
    Console.WriteLine($"Device: {device.Name}");
    Console.WriteLine($"  ID: {device.DeviceId}");
    Console.WriteLine($"  Default: {device.IsDefault}");
    Console.WriteLine($"  Max Input Channels: {device.MaxInputChannels}");
    Console.WriteLine($"  Max Output Channels: {device.MaxOutputChannels}");
}

// Route stereo output to physical channels 4 and 5 (e.g. second pair of a multi-output interface)
var config = new AudioConfig
{
    Channels = 2,
    OutputChannelSelectors = new[] { 4, 5 }  // physical channel indices (0-based)
};
using var engine2 = AudioEngineFactory.Create(config);

// ASIO with native hardware channel selection
var asioConfig = new AudioConfig
{
    HostType = EngineHostType.ASIO,
    Channels = 2,
    OutputChannelSelectors = new[] { 4, 5 }  // native ASIO channel selectors
};
using var asioEngine = AudioEngineFactory.Create(asioConfig);

Channel Routing

Channel routing lets you direct audio to specific physical output channels on multi-output devices (e.g. USB audio interfaces, ASIO cards, DAW-style monitor routing). Set OutputChannelSelectors to an array of physical channel indices you want to use. The array length must match the Channels property.

🔀 Universal Support Channel routing works on all backends and platforms — not just ASIO. For PortAudio + ASIO, native hardware channel selection is used (zero overhead). For all other backends (WASAPI, CoreAudio, ALSA, MiniAudio), software routing is applied inside the audio callback with no additional latency.

Routing Method by Backend

| Engine / Backend | Platform | Routing Method |
| --- | --- | --- |
| PortAudio + ASIO | Windows | Native hardware (PaAsioStreamInfo) |
| PortAudio + WASAPI | Windows | Software routing in callback |
| MiniAudio | Windows / macOS / Linux / Android / iOS | Software routing in callback |

How It Works

When OutputChannelSelectors is set, the engine internally opens the device with max(selectors)+1 physical channels. The audio callback maps each logical channel to its designated physical channel, leaving others silent.
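
The software-routing step can be sketched roughly as follows (a hypothetical helper for illustration, not the engine's actual callback code, which works in place without allocation):

```csharp
using System;
using System.Linq;

// Illustrative sketch of software channel routing. Logical channel i in the
// interleaved input is copied to physical channel selectors[i]; unselected
// physical channels stay silent (zero).
static float[] RouteToPhysical(float[] logicalInterleaved, int[] selectors, int frames)
{
    int logicalChannels = selectors.Length;
    int physicalChannels = selectors.Max() + 1;           // device opened with max(selectors)+1 channels
    var physical = new float[frames * physicalChannels];  // zero-initialized => silence
    for (int f = 0; f < frames; f++)
        for (int ch = 0; ch < logicalChannels; ch++)
            physical[f * physicalChannels + selectors[ch]] =
                logicalInterleaved[f * logicalChannels + ch];
    return physical;
}

// Two frames of stereo routed to physical channels 2 and 3 of a 4-channel device
var routed = RouteToPhysical(new float[] { 0.1f, 0.2f, 0.3f, 0.4f }, new[] { 2, 3 }, frames: 2);
// routed layout: [0, 0, 0.1, 0.2,  0, 0, 0.3, 0.4] — four physical channels per frame
```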

Channel Routing – Stereo + Metronome on Separate Outputs
// 8-channel interface: stereo music on ch 0-1, click track on ch 4-5
// Logical channels:  0   1   2   3
// Physical channels: 0   1  (skip 2,3)  4   5
var config = new AudioConfig
{
    SampleRate = 48000,
    Channels   = 4,                             // 4 logical channels
    OutputChannelSelectors = new[] { 0, 1, 4, 5 } // mapped to physical 0,1,4,5
};

using var engine = AudioEngineFactory.Create(config);
engine.Start();

// Build an interleaved 4-channel buffer per frame:
// [frame0_ch0, frame0_ch1, frame0_ch2, frame0_ch3, frame1_ch0, ...]
var buffer = new float[frameCount * 4];
for (int i = 0; i < frameCount; i++)
{
    buffer[i * 4 + 0] = musicLeft[i];     // → physical ch 0
    buffer[i * 4 + 1] = musicRight[i];    // → physical ch 1
    buffer[i * 4 + 2] = clickLeft[i];     // → physical ch 4
    buffer[i * 4 + 3] = clickRight[i];    // → physical ch 5
}

engine.Send(buffer.AsSpan());
⚠️ Validation Rule The length of OutputChannelSelectors must equal the Channels property value. AudioConfig.Validate() returns false if they mismatch or if any selector index is negative.
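
The validation rule above can be expressed as a small check. This is an illustrative sketch only; the actual AudioConfig.Validate() covers more than selectors:

```csharp
using System;

// Sketch of the selector validation rule (hypothetical helper). Selectors are
// optional; when present, their count must equal Channels and none may be negative.
static bool SelectorsAreValid(int channels, int[]? selectors)
{
    if (selectors is null) return true;              // null means default routing
    if (selectors.Length != channels) return false;  // length must match Channels
    foreach (int s in selectors)
        if (s < 0) return false;                     // physical indices are 0-based
    return true;
}

Console.WriteLine(SelectorsAreValid(2, new[] { 4, 5 }));    // True
Console.WriteLine(SelectorsAreValid(2, new[] { 0, 1, 2 })); // False
```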

AudioStreamInfo

Information about an audio stream from a decoder.

| Property | Type | Description |
| --- | --- | --- |
| SampleRate | int | Sample rate in Hz |
| Channels | int | Number of channels |
| Duration | TimeSpan | Total duration of the stream |
| TotalFrames | long | Total number of frames |

Enumerations

AudioDeviceState

Possible states of an audio device.

AudioDeviceState Enum
[Flags]
public enum AudioDeviceState
{
    Active      = 0x00000001,  // Device is active and available
    Disabled    = 0x00000002,  // Device is disabled
    NotPresent  = 0x00000004,  // Device is not present
    Unplugged   = 0x00000008,  // Device is unplugged
    All         = 0x0000000F   // All device states
}

AudioFormat

Supported audio file formats.

AudioFormat Enum
public enum AudioFormat
{
    Unknown,  // Unknown format
    Wav,      // WAV format (PCM, IEEE Float, ADPCM)
    Mp3,      // MP3 format (MPEG-1/2/2.5 Layer III)
    Flac      // FLAC format (Free Lossless Audio Codec)
}

EngineHostType

Host API types for PortAudio backend (ignored by miniaudio).

EngineHostType Enum
public enum EngineHostType
{
    None = 0,           // Use platform default
    DirectSound = 1,    // Windows DirectSound
    MME = 2,            // Windows MME
    ASIO = 3,           // Windows ASIO
    WASAPI = 13,        // Windows WASAPI (recommended)
    CoreAudio = 5,      // macOS Core Audio
    ALSA = 8,           // Linux ALSA
    JACK = 12           // Linux JACK
}

Common Components

The Ownaudio.Core.Common namespace provides utility classes for audio processing:

| Class | Description |
| --- | --- |
| AudioBuffer | Efficient audio buffer management |
| AudioChannelConverter | Channel conversion (mono/stereo) |
| AudioResampler | Sample rate conversion |
| AudioFormatConverter | Format conversion utilities |
| LockFreeRingBuffer | Lock-free circular buffer for real-time audio |
| AudioFramePool | Object pool for audio frames |
| SimdAudioConverter | SIMD-optimized audio conversion |
| DecodedAudioCache | Cache for decoded audio data |

Performance Characteristics
  • Zero-allocation: Critical audio paths use zero-allocation techniques
  • Lock-free: Real-time safe data structures for thread synchronization
  • SIMD optimizations: Available on supported platforms
  • Thread-safe: All public APIs are thread-safe
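
The index arithmetic behind a single-producer/single-consumer ring buffer like LockFreeRingBuffer can be sketched as follows. This is a single-threaded illustration of the technique only; a real lock-free implementation additionally uses Volatile/Interlocked operations on the head and tail indices for cross-thread memory ordering:

```csharp
using System;

// Single-threaded sketch of SPSC ring-buffer index arithmetic (illustration
// only). Capacity is a power of two so "index & mask" replaces a modulo, and
// head/tail only ever increase, making full/empty checks a simple subtraction.
int capacity = 8;             // must be a power of two
var ring = new float[capacity];
int mask = capacity - 1;
int head = 0;                 // advanced by the producer
int tail = 0;                 // advanced by the consumer

bool TryWrite(float sample)
{
    if (head - tail == capacity) return false;  // full
    ring[head & mask] = sample;
    head++;                                     // real version: publish with a Volatile write
    return true;
}

bool TryRead(out float sample)
{
    if (head == tail) { sample = 0f; return false; }  // empty
    sample = ring[tail & mask];
    tail++;
    return true;
}

for (int i = 0; i < 8; i++) TryWrite(i);  // fill to capacity
bool overflowed = !TryWrite(99f);         // ninth write is rejected
TryRead(out float first);                 // first == 0, FIFO order
```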

Latency Reference

| Buffer Size | Latency @ 48 kHz | Use Case |
| --- | --- | --- |
| 128 frames | ~2.7 ms | Professional audio, live monitoring |
| 256 frames | ~5.3 ms | Low-latency applications |
| 512 frames | ~10.7 ms | Default, balanced performance |
| 1024 frames | ~21.3 ms | Standard applications |
| 2048 frames | ~42.7 ms | High reliability, lower CPU usage |
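
The figures above follow directly from one buffer taking bufferSize / sampleRate seconds to play out; a quick sketch:

```csharp
using System;

// One buffer of N frames at sample rate R takes N / R seconds to play,
// which is where the per-buffer latency figures in the table come from.
static double LatencyMs(int bufferFrames, int sampleRate)
    => bufferFrames * 1000.0 / sampleRate;

Console.WriteLine(LatencyMs(512, 48000)); // 512 frames @ 48 kHz ≈ 10.7 ms
```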

Related Documentation