Build a Plugin
Create custom plugins for TypeWhisper using Swift and the TypeWhisperPluginSDK.
Overview
TypeWhisper plugins are standard macOS bundles (.bundle) written in Swift. Each plugin links against the TypeWhisperPluginSDK package and exports a principal class conforming to one or more plugin protocols. The principal class must inherit from NSObject and use the @objc(ClassName) attribute so the bundle loader can instantiate it. Plugins are loaded at launch from ~/Library/Application Support/TypeWhisper/Plugins/.
Plugin Types
There are four plugin protocols you can adopt. A single plugin can conform to multiple protocols (e.g., both transcription and LLM).
TranscriptionEnginePlugin
Provides a speech-to-text engine. Receives audio data (16kHz mono Float samples + pre-encoded WAV) and returns transcribed text. Supports model selection, language detection, translation, and optional streaming via a progress callback.
LLMProviderPlugin
Provides an LLM for processing transcribed text via custom prompts. Receives a system prompt and user text, returns the model's response. Used for text correction, summarization, formatting, and more.
PostProcessorPlugin
Processes text after transcription in a priority-based pipeline. Receives the transcribed text and context (active app, URL, language). Runs alongside built-in processors like snippets and dictionary.
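For instance, a minimal post-processor might clean up whitespace before the text is inserted. The sketch below stubs the SDK types locally so it is self-contained; the real protocol and context are listed in the API reference further down.

```swift
import Foundation

// Stand-ins for the SDK types (assumption: the real PostProcessorPlugin
// protocol and PostProcessingContext come from TypeWhisperPluginSDK).
struct PostProcessingContext {
    let appName: String?
    let language: String?
}

// A toy processor: trim stray whitespace, then capitalize the first
// letter when the detected language is English.
struct TrimmingProcessor {
    let priority = 100

    func process(text: String, context: PostProcessingContext) -> String {
        var result = text.trimmingCharacters(in: .whitespacesAndNewlines)
        if context.language == "en", let first = result.first {
            result = first.uppercased() + String(result.dropFirst())
        }
        return result
    }
}
```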
ActionPlugin
Performs an action with LLM-processed text instead of inserting it. Receives the processed text and context, returns a result message shown in the notch indicator. Can include a URL to open and a custom display duration.
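As a sketch of the shape of an action, here is a toy "search the web" action with the SDK types stubbed locally (the real ActionContext and ActionResult carry more fields; the search URL is illustrative):

```swift
import Foundation

// Stand-ins for the SDK types (assumption: the real ActionContext and
// ActionResult come from TypeWhisperPluginSDK and carry more fields).
struct ActionContext { let originalText: String }
struct ActionResult {
    let success: Bool
    let message: String
    let url: String?
}

// A toy action: open a web search for the processed text instead of
// inserting it.
func searchAction(input: String, context: ActionContext) -> ActionResult {
    let query = input.addingPercentEncoding(
        withAllowedCharacters: .urlQueryAllowed
    ) ?? input
    return ActionResult(
        success: true,
        message: "Searching the web",
        url: "https://www.google.com/search?q=\(query)"
    )
}
```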
Getting Started
Prerequisites
- macOS 14.0+ (Sonoma) and Xcode 16+
- Swift 6.0
- Basic familiarity with macOS bundle targets
1. Create a Bundle Target
In Xcode, create a new macOS Bundle target. Set the principal class in your Info.plist to your plugin's main class name. The class must inherit from NSObject and be annotated with @objc(ClassName) so the runtime can find and instantiate it.
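For example, if the class is declared with @objc(MyPlugin), the relevant Info.plist entry is the standard NSPrincipalClass key (a minimal sketch; the other bundle keys are managed by Xcode):

```xml
<key>NSPrincipalClass</key>
<string>MyPlugin</string>
```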
2. Add the SDK Dependency
Add TypeWhisperPluginSDK as a Swift Package dependency:
// Package.swift
dependencies: [
    .package(
        url: "https://github.com/TypeWhisper/TypeWhisperPluginSDK.git",
        from: "1.0.0"
    )
]

// Target dependency
.product(name: "TypeWhisperPluginSDK", package: "TypeWhisperPluginSDK")

3. Create manifest.json
Add a manifest.json to your bundle's Resources directory:
{
    "id": "com.yourname.myplugin",
    "name": "My Plugin",
    "version": "1.0.0",
    "minHostVersion": "0.9.0",
    "minOSVersion": "14.0",
    "author": "Your Name",
    "principalClass": "MyPlugin"
}

principalClass must match the name in your @objc(...) annotation. minOSVersion is optional and defaults to 14.0.
4. Build & Install
Build your bundle and copy the .bundle file to ~/Library/Application Support/TypeWhisper/Plugins/. Restart TypeWhisper to load the plugin.
SDK API Reference
TypeWhisperPlugin
Base protocol all plugins must conform to.
public protocol TypeWhisperPlugin: AnyObject, Sendable {
    static var pluginId: String { get }
    static var pluginName: String { get }

    init()

    func activate(host: HostServices)
    func deactivate()

    var settingsView: AnyView? { get } // optional, default nil
}

HostServices
Provided to your plugin on activation. Gives access to keychain, preferences, file storage, app context, and the event bus.
public protocol HostServices: Sendable {
    // Keychain (plugin-scoped)
    func storeSecret(key: String, value: String) throws
    func loadSecret(key: String) -> String?

    // UserDefaults (plugin-scoped)
    func userDefault(forKey: String) -> Any?
    func setUserDefault(_ value: Any?, forKey: String)

    // File storage
    var pluginDataDirectory: URL { get }

    // App context
    var activeAppBundleId: String? { get }
    var activeAppName: String? { get }

    // Event bus
    var eventBus: EventBusProtocol { get }

    // Profiles
    var availableProfileNames: [String] { get }

    // Notify host that plugin capabilities changed
    func notifyCapabilitiesChanged()
}

Call notifyCapabilitiesChanged() when your plugin's available models or configuration state changes (e.g., after loading a model or receiving an API key).
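A typical flow: the user enters an API key in the plugin's settings view; the plugin persists it, then tells the host so it can re-query availability. The sketch below reduces HostServices to the members it uses and drives it with an in-memory test double (neither type is part of the SDK):

```swift
import Foundation

// A reduced stand-in for the host's keychain + capability-change surface
// (assumption: the real HostServices protocol comes from the SDK).
protocol HostSecrets {
    func storeSecret(key: String, value: String) throws
    func loadSecret(key: String) -> String?
    func notifyCapabilitiesChanged()
}

// In-memory test double, not part of the SDK.
final class InMemoryHost: HostSecrets {
    private var secrets: [String: String] = [:]
    private(set) var capabilityChanges = 0

    func storeSecret(key: String, value: String) throws { secrets[key] = value }
    func loadSecret(key: String) -> String? { secrets[key] }
    func notifyCapabilitiesChanged() { capabilityChanges += 1 }
}

// Persist the key, then tell the host so it can refresh isAvailable.
func saveAPIKey(_ key: String, host: HostSecrets) throws {
    try host.storeSecret(key: "apiKey", value: key)
    host.notifyCapabilitiesChanged()
}
```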
LLMProviderPlugin
public protocol LLMProviderPlugin: TypeWhisperPlugin {
    var providerName: String { get }
    var isAvailable: Bool { get }
    var supportedModels: [PluginModelInfo] { get }

    func process(
        systemPrompt: String,
        userText: String,
        model: String?
    ) async throws -> String
}

PluginModelInfo
public final class PluginModelInfo: @unchecked Sendable {
    public let id: String
    public let displayName: String
    public let sizeDescription: String // e.g. "1.5 GB"
    public let languageCount: Int // number of supported languages

    public init(
        id: String,
        displayName: String,
        sizeDescription: String = "",
        languageCount: Int = 0
    )
}

TranscriptionEnginePlugin
public protocol TranscriptionEnginePlugin: TypeWhisperPlugin {
    var providerId: String { get }
    var providerDisplayName: String { get }
    var isConfigured: Bool { get }

    var transcriptionModels: [PluginModelInfo] { get }
    var selectedModelId: String? { get }
    func selectModel(_ modelId: String)

    var supportsTranslation: Bool { get }
    var supportsStreaming: Bool { get } // default false
    var supportedLanguages: [String] { get } // default []

    // Standard transcription
    func transcribe(
        audio: AudioData,
        language: String?,
        translate: Bool,
        prompt: String?
    ) async throws -> PluginTranscriptionResult

    // Streaming variant - onProgress returns false to cancel
    func transcribe(
        audio: AudioData,
        language: String?,
        translate: Bool,
        prompt: String?,
        onProgress: @Sendable @escaping (String) -> Bool
    ) async throws -> PluginTranscriptionResult
}

The streaming variant has a default implementation that falls back to the standard transcribe method. Override it and set supportsStreaming to true if your engine supports partial results.
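The streaming contract (report accumulated text, stop when the callback returns false) can be sketched as follows, with the SDK types stubbed locally and decodeChunk standing in for a real engine's partial decoder:

```swift
import Foundation

// Stand-ins for the SDK types (assumption: the real AudioData and
// PluginTranscriptionResult come from TypeWhisperPluginSDK).
struct AudioData { let samples: [Float] }
struct PluginTranscriptionResult { let text: String }

// Decode fixed-size chunks, report the accumulated text after each one,
// and stop early when the progress callback returns false (cancellation).
func streamingTranscribe(
    audio: AudioData,
    chunkSize: Int,
    decodeChunk: ([Float]) -> String,
    onProgress: (String) -> Bool
) -> PluginTranscriptionResult {
    var text = ""
    var index = 0
    while index < audio.samples.count {
        let end = min(index + chunkSize, audio.samples.count)
        text += decodeChunk(Array(audio.samples[index..<end]))
        guard onProgress(text) else { break } // false -> cancel
        index = end
    }
    return PluginTranscriptionResult(text: text)
}
```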
PostProcessorPlugin
public protocol PostProcessorPlugin: TypeWhisperPlugin {
    var processorName: String { get }
    var priority: Int { get }

    @MainActor func process(
        text: String,
        context: PostProcessingContext
    ) async throws -> String
}

public struct PostProcessingContext: Sendable {
    public let appName: String?
    public let bundleIdentifier: String?
    public let url: String?
    public let language: String?
}

ActionPlugin
public protocol ActionPlugin: TypeWhisperPlugin {
    var actionName: String { get }
    var actionId: String { get }
    var actionIcon: String { get } // SF Symbol name

    func execute(
        input: String,
        context: ActionContext
    ) async throws -> ActionResult
}

public struct ActionContext: Sendable {
    public let appName: String?
    public let bundleIdentifier: String?
    public let url: String?
    public let language: String?
    public let originalText: String
}

public struct ActionResult: Sendable {
    public let success: Bool
    public let message: String
    public let url: String? // URL to open after action
    public let icon: String? // SF Symbol for result display
    public let displayDuration: TimeInterval? // custom display time
}

EventBus
Subscribe to app-wide events like recording start/stop, transcription completion, and text insertion.
public protocol EventBusProtocol: Sendable {
    @discardableResult
    func subscribe(
        handler: @escaping @Sendable (TypeWhisperEvent) async -> Void
    ) -> UUID

    func unsubscribe(id: UUID)
}

public enum TypeWhisperEvent: Sendable {
    case recordingStarted(RecordingStartedPayload)
    case recordingStopped(RecordingStoppedPayload)
    case transcriptionCompleted(TranscriptionCompletedPayload)
    case transcriptionFailed(TranscriptionFailedPayload)
    case textInserted(TextInsertedPayload)
    case actionCompleted(ActionCompletedPayload)
}

Helper Classes
The SDK includes helpers for common patterns:
- PluginOpenAITranscriptionHelper - OpenAI-compatible transcription API client
- PluginOpenAIChatHelper - OpenAI-compatible chat completion client
- PluginWavEncoder - encodes Float samples to WAV data
Example: Minimal LLM Plugin
A complete LLM provider plugin that wraps an OpenAI-compatible API:
import Foundation
import TypeWhisperPluginSDK
@objc(MyLLMPlugin)
final class MyLLMPlugin: NSObject, LLMProviderPlugin {
    static let pluginId = "com.example.my-llm"
    static let pluginName = "My LLM"

    private nonisolated(unsafe) var host: HostServices?
    private let chatHelper = PluginOpenAIChatHelper(
        baseURL: "https://api.example.com"
    )

    let providerName = "My LLM"
    let supportedModels = [
        PluginModelInfo(
            id: "model-v1",
            displayName: "Model V1",
            sizeDescription: "Cloud",
            languageCount: 50
        )
    ]

    var isAvailable: Bool {
        host?.loadSecret(key: "apiKey") != nil
    }

    override init() {
        super.init()
    }

    func activate(host: HostServices) {
        self.host = host
    }

    func deactivate() {
        host = nil
    }

    func process(
        systemPrompt: String,
        userText: String,
        model: String?
    ) async throws -> String {
        guard let apiKey = host?.loadSecret(key: "apiKey") else {
            throw PluginChatError.notConfigured
        }
        return try await chatHelper.process(
            apiKey: apiKey,
            model: model ?? "model-v1",
            systemPrompt: systemPrompt,
            userText: userText
        )
    }
}

Distribution
Build your plugin in Release configuration and distribute the resulting .bundle file. Users install it by copying to:
~/Library/Application Support/TypeWhisper/Plugins/MyPlugin.bundle

Submit to the Plugin Catalog
Share your plugin with the TypeWhisper community by submitting it to the official plugin catalog. Submitted plugins are built automatically and listed on this website for easy discovery.
How to Submit
1. Fork the TypeWhisper/typewhisper-plugins repository.
2. Create a directory under plugins/your-plugin-slug/ with your manifest.json, README.md, LICENSE, and a src/ directory containing your Swift source.
3. Open a pull request - CI will automatically validate your manifest, check required files, and compile your plugin.
4. The TypeWhisper team reviews your submission.
5. After merge, your plugin is automatically built, released, and listed in the catalog.
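The expected submission layout therefore looks like this (the Swift filename is illustrative):

```
plugins/
  your-plugin-slug/
    manifest.json
    README.md
    LICENSE
    src/
      MyPlugin.swift
```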
See the CONTRIBUTING.md for detailed guidelines on manifest format, directory structure, and review criteria.