Support
Need help with SRTWhisper? Find answers below or get in touch.
Contact Us
Email us at zhiii0x@gmail.com.
System Requirements
| Requirement | Details |
| --- | --- |
| Operating System | macOS 14.0 (Sonoma) or later |
| Processor | Apple Silicon (M1 or later) recommended |
| Memory | 8 GB RAM minimum, 16 GB recommended for larger models |
| Storage | 2 GB+ for AI models (varies by model size) |
| Network | Required for model downloads and optional API translation |
Frequently Asked Questions
Is my data sent to any server?
No. All speech recognition is performed locally on your Mac using WhisperKit. Your video and audio files never leave your device. The only exception is if you choose to use OpenAI GPT translation, which sends subtitle text (not audio/video) to OpenAI's servers.
Which Whisper models are available?
SRTWhisper supports 7 model sizes: Tiny, Base, Small, Medium, Large-v3 Turbo, Large-v3, and Distil-Large-v3. Larger models provide better accuracy but require more memory and processing time. For Apple Silicon Macs, we recommend Large-v3 Turbo for the best balance of speed and accuracy.
What subtitle formats are supported?
SRTWhisper supports three subtitle formats: SRT (SubRip), WebVTT (VTT), and ASS (Advanced SubStation Alpha). You can also burn subtitles directly into your video files.
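For reference, SRT is a plain-text format: each cue is a sequential number, a timestamp range (with comma-separated milliseconds), the subtitle text, and a blank line. A minimal example:

```
1
00:00:01,000 --> 00:00:03,500
Hello, world!

2
00:00:04,000 --> 00:00:06,250
This is the second cue.
```

WebVTT is similar but uses periods in timestamps, while ASS adds styling and positioning information.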
How do I set up translation?
SRTWhisper offers four translation methods:

- Local LLM: runs entirely on-device
- OpenAI GPT: requires an API key from OpenAI
- Local API: works with Ollama-compatible endpoints
- Ollama: connects to a local or remote Ollama server

Go to Settings to configure your preferred method.
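If you run your own Ollama server, translation requests are ordinary chat completions sent to its `/api/chat` endpoint (by default `http://localhost:11434/api/chat`). As a sketch only, a request body might look like the following; the model name and prompt are illustrative placeholders, not SRTWhisper's actual payload:

```json
{
  "model": "llama3",
  "messages": [
    {
      "role": "user",
      "content": "Translate this subtitle line to Spanish: Hello, world!"
    }
  ],
  "stream": false
}
```

Any server that speaks this API, local or remote, should work with the Local API and Ollama methods.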
The transcription is inaccurate. What can I do?
Try using a larger Whisper model for better accuracy. Also ensure the audio is clear and not too noisy. You can use the built-in subtitle editor to make corrections after transcription. Selecting the correct source language manually (instead of auto-detect) can also improve results.
Can I use SRTWhisper offline?
Yes, once you've downloaded a Whisper model, you can transcribe videos completely offline. Translation using the local LLM method also works offline. Only OpenAI GPT translation and model downloads require an internet connection.