OpenAI Compatible is a universal adapter that connects TypeWhisper to any service with an OpenAI-compatible API. Use it with Ollama, LM Studio, vLLM, or any other compatible server for both transcription and LLM tasks. Models are loaded dynamically from the server’s /v1/models endpoint or can be entered manually.
Features
Works with any OpenAI-compatible API (Ollama, LM Studio, vLLM, etc.)
Transcription and LLM support
Translation support
99+ languages (depending on the model)
Dynamic model discovery via /v1/models endpoint
Manual model entry supported
Test connection function
No vendor lock-in
Transcription & LLM Models
Models are loaded dynamically from the connected server. You can also enter model names manually.
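The dynamic model list comes from the server's /v1/models endpoint, so you can inspect it yourself before configuring the plugin. A minimal sketch, assuming Ollama running at its default local address (the URL and port are assumptions; adjust for your server):

```shell
# List the model IDs an OpenAI-compatible server exposes.
# BASE_URL is an assumption: Ollama's default local address.
BASE_URL="http://localhost:11434/v1"

# The response is JSON of the form:
#   {"object":"list","data":[{"id":"<model-name>", ...}, ...]}
# The "id" values are the names TypeWhisper shows (or that you can enter manually).
curl -s "$BASE_URL/models" || echo "server not reachable at $BASE_URL"
```

If this command returns a model list, the plugin's dynamic discovery should work against the same URL.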
Configuration
Server URL - The base URL of your OpenAI-compatible server (e.g. http://localhost:11434/v1 for Ollama)
API Key - Optional; required only if your server enforces authentication (many local servers, such as a default Ollama install, accept requests without one)
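When a key is set, OpenAI-compatible servers expect it as a standard Bearer token. A hedged sketch of the equivalent manual request (the URL is Ollama's default and the key is a placeholder, both assumptions):

```shell
BASE_URL="http://localhost:11434/v1"
API_KEY=""   # placeholder; leave empty for servers that need no auth

if [ -n "$API_KEY" ]; then
  # Authenticated request: the key travels in the Authorization header.
  curl -s -H "Authorization: Bearer $API_KEY" "$BASE_URL/models" \
    || echo "server not reachable at $BASE_URL"
else
  # No key configured: send the request unauthenticated.
  curl -s "$BASE_URL/models" || echo "server not reachable at $BASE_URL"
fi
```

A 401 or 403 response here usually means the server does require a key, so fill in the API Key field in the plugin configuration.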
Setup
Start your OpenAI-compatible server (e.g. Ollama, LM Studio)
Open TypeWhisper Settings > Plugins
Find the OpenAI Compatible plugin and click Configure
Enter your server URL
Use “Test Connection” to verify that TypeWhisper can reach the server
Select OpenAI Compatible as your transcription engine or LLM provider
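Once configured, transcription requests follow the OpenAI audio API shape. A sketch of what such a request looks like, assuming your server implements the /v1/audio/transcriptions endpoint (support varies by server) and using placeholder model and file names:

```shell
BASE_URL="http://localhost:11434/v1"
MODEL="whisper-1"           # placeholder; use a model your server actually lists
AUDIO_FILE="recording.wav"  # placeholder path to a local audio file

if [ -f "$AUDIO_FILE" ]; then
  # Multipart upload, per the OpenAI audio transcription API shape.
  curl -s "$BASE_URL/audio/transcriptions" \
    -F "file=@$AUDIO_FILE" \
    -F "model=$MODEL"
else
  echo "no audio file at $AUDIO_FILE; skipping request"
fi
```

If this request returns transcribed text, the same server will work as the plugin's transcription engine.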