# MicTranscription
A Swift command-line application that demonstrates how to use Moonshine Voice for transcription, mirroring the functionality of the Python `basic_transcription.py` script.
## Building

```bash
cd examples/macos/MicTranscription
swift build
```
## Xcode Project

An Xcode project is also available for building and running from the Xcode IDE:

1. Open `MicTranscription.xcodeproj` in Xcode
2. Select the `MicTranscription` scheme

The Xcode project is generated from `project.yml` using `xcodegen`. To regenerate it:

```bash
cd examples/macos/MicTranscription
xcodegen generate
```
## Usage

```bash
# Use default test file (test-assets/two_cities.wav)
swift run MicTranscription

# Specify input files
swift run MicTranscription path/to/audio1.wav path/to/audio2.wav

# Specify language
swift run MicTranscription --language en path/to/audio.wav

# Specify model architecture
swift run MicTranscription --model-arch tiny path/to/audio.wav

# Show help
swift run MicTranscription --help
```
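For reference, here is one way the option surface above could be declared in Swift using Apple's swift-argument-parser package. This is a minimal illustrative sketch, not the actual MicTranscription source; the `run()` body is a placeholder and the `base` default for `--model-arch` is an assumption.

```swift
import ArgumentParser

// Minimal sketch of the CLI surface shown above; not the actual
// MicTranscription source. The transcription call is a placeholder.
@main
struct MicTranscription: ParsableCommand {
    @Option(name: .shortAndLong, help: "Language to use for transcription")
    var language: String = "en"

    // Assumed default; the real tool's default architecture may differ.
    @Option(name: [.customShort("m"), .customLong("model-arch")],
            help: "Model architecture (tiny, base, tiny-streaming, ...)")
    var modelArch: String = "base"

    @Argument(help: "Audio files to transcribe")
    var inputFiles: [String] = ["test-assets/two_cities.wav"]

    func run() throws {
        for file in inputFiles {
            // Placeholder: the real tool would call into Moonshine Voice here.
            print("Would transcribe \(file) (model: \(modelArch), language: \(language))")
        }
    }
}
```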
## Options

- `--language, -l LANGUAGE`: Language to use for transcription (default: `en`)
- `--model-arch, -m ARCH`: Model architecture: `tiny`, `base`, `tiny-streaming`, `base-streaming`, `small-streaming`, `medium-streaming`
- `--help, -h`: Show help message

## Supported Languages

- `en` / `english` - English
- `ja` / `japanese` - Japanese
- `es` / `spanish` - Spanish
- `ar` / `arabic` - Arabic
- `ko` / `korean` - Korean
- `vi` / `vietnamese` - Vietnamese
- `uk` / `ukrainian` - Ukrainian
- `zh` / `chinese` - Chinese
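Each language accepts either the short code or the full English name. A minimal sketch of how that aliasing could be normalized follows; the table and `canonicalLanguage` helper are illustrative, not the actual implementation.

```swift
// Illustrative alias table mapping accepted spellings to canonical codes.
let languageAliases: [String: String] = [
    "en": "en", "english": "en",
    "ja": "ja", "japanese": "ja",
    "es": "es", "spanish": "es",
    "ar": "ar", "arabic": "ar",
    "ko": "ko", "korean": "ko",
    "vi": "vi", "vietnamese": "vi",
    "uk": "uk", "ukrainian": "uk",
    "zh": "zh", "chinese": "zh",
]

/// Normalizes a user-supplied language argument to its canonical code,
/// returning nil for unsupported values.
func canonicalLanguage(_ raw: String) -> String? {
    languageAliases[raw.lowercased()]
}
```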
## Model Files

The application transcribes each input file using model files from the `test-assets/{model-name}/` directory.
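Assuming that layout, resolving the directory for a given model is plain path composition. The `modelDirectory` helper below is a hypothetical illustration, not part of the shipped tool:

```swift
import Foundation

// Illustrative helper: resolve the on-disk directory for a model,
// assuming model files live under test-assets/{model-name}/ relative
// to the current working directory.
func modelDirectory(for modelName: String) -> URL {
    URL(fileURLWithPath: "test-assets", isDirectory: true)
        .appendingPathComponent(modelName, isDirectory: true)
}

// Example: modelDirectory(for: "tiny") points at test-assets/tiny/
```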