# BasicTranscription

A Swift command-line application that demonstrates how to use Moonshine Voice for transcription, mirroring the functionality of the Python `basic_transcription.py` script.
## Building

```sh
cd examples/macos/BasicTranscription
swift build
```
## Xcode Project

An Xcode project is available for building and running from the Xcode IDE:

- Open `BasicTranscription.xcodeproj` in Xcode
- Build and run the `BasicTranscription` scheme

The Xcode project is generated from `project.yml` using `xcodegen`. To regenerate it:

```sh
cd examples/macos/BasicTranscription
xcodegen generate
```
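For orientation, a minimal `project.yml` for a macOS command-line tool like this might look as follows. This is a sketch only; the target type, source paths, and settings here are assumptions, so consult the actual `project.yml` in the repository:

```yaml
# Hypothetical xcodegen spec for a command-line executable target.
name: BasicTranscription
targets:
  BasicTranscription:
    type: tool        # command-line executable (assumed)
    platform: macOS
    sources:
      - Sources       # assumed source directory
```

Running `xcodegen generate` in the same directory as this file produces the `.xcodeproj`.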
## Usage

```sh
# Use default test file (test-assets/two_cities.wav)
swift run BasicTranscription

# Specify input files
swift run BasicTranscription path/to/audio1.wav path/to/audio2.wav

# Specify language
swift run BasicTranscription --language en path/to/audio.wav

# Specify model architecture
swift run BasicTranscription --model-arch tiny path/to/audio.wav

# Show help
swift run BasicTranscription --help
```
### Options

- `--language, -l LANGUAGE`: Language to use for transcription (default: `en`)
- `--model-arch, -m ARCH`: Model architecture: `tiny`, `base`, `tiny-streaming`, `small-streaming`, `medium-streaming`
- `--help, -h`: Show help message

### Supported Languages

- `en` / `english` - English
- `ja` / `japanese` - Japanese
- `es` / `spanish` - Spanish
- `ar` / `arabic` - Arabic
- `ko` / `korean` - Korean
- `vi` / `vietnamese` - Vietnamese
- `uk` / `ukrainian` - Ukrainian
- `zh` / `chinese` - Chinese

The application will:
- `test-assets/{model-name}/` directory
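Each supported language above is accepted either as a short code or as its full English name. How the CLI resolves those aliases internally is not shown in this README; a hypothetical Swift helper (not the actual implementation) sketching that normalization:

```swift
// Hypothetical helper: normalize a user-supplied language argument
// (short code or full English name) to its canonical short code.
// The table mirrors the supported-languages list above.
func canonicalLanguageCode(_ argument: String) -> String? {
    let aliases: [String: String] = [
        "en": "en", "english": "en",
        "ja": "ja", "japanese": "ja",
        "es": "es", "spanish": "es",
        "ar": "ar", "arabic": "ar",
        "ko": "ko", "korean": "ko",
        "vi": "vi", "vietnamese": "vi",
        "uk": "uk", "ukrainian": "uk",
        "zh": "zh", "chinese": "zh",
    ]
    // Lowercasing makes the lookup case-insensitive ("Japanese" -> "ja").
    return aliases[argument.lowercased()]
}
```

With this sketch, `canonicalLanguageCode("Japanese")` yields `"ja"`, while an unrecognized value yields `nil`, which a CLI could surface as a usage error.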