
# Swift SDK


## Overview

An easy-to-install Swift package for connecting to Omi devices. Get started in seconds with local Whisper-based transcription - no cloud API required.

<CardGroup cols={3}>
  <Card title="Swift Package" icon="swift">
    Native iOS/macOS support
  </Card>
  <Card title="Local Transcription" icon="microphone">
    Whisper runs on-device
  </Card>
  <Card title="Simple API" icon="code">
    Connect in minutes
  </Card>
</CardGroup>

## Installation

<Steps>
<Step title="Create a New Project">
Open Xcode → **File → New → Project... → iOS → App**.

<Note>
Make sure to select **Storyboard** as the Interface option.
</Note>

</Step>
<Step title="Add the Swift Package">
1. Navigate to **File → Swift Packages → Add Package Dependency...**
2. Select your project
3. Paste the repository URL: `https://github.com/BasedHardware/omi`
4. Click **Next**

<Tip>
If you aren't prompted to add the package to your target, click "Add Package" again, then "Add to Target" and choose your project.
</Tip>

</Step>
<Step title="Add Bluetooth Permission">
Go to **Targets → Your Project → Info** and add this permission:

```xml
<key>NSBluetoothAlwaysUsageDescription</key>
<string>This app needs Bluetooth access to connect to BLE devices.</string>
```

</Step>
</Steps>

## Quick Start

Get transcription working in 2 minutes:

<Steps>
<Step title="Copy This Code">
Replace your `ViewController.swift` with:
```swift
import UIKit
import omi_lib

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        self.lookForDevice()
    }

    // Scan until an Omi device is found, then connect and stop scanning.
    func lookForDevice() {
        print("starting scan")
        OmiManager.startScan { device, error in
            if let device = device {
                print("got device", device)
                self.connectToOmiDevice(device: device)
                OmiManager.endScan()
            }
        }
    }

    // Connect, start streaming transcripts, and watch for disconnects.
    func connectToOmiDevice(device: Device) {
        OmiManager.connectToDevice(device: device)
        self.listenToLiveTranscript(device: device)
        self.reconnectIfDisconnects()
    }

    // If the connection drops, go back to scanning.
    func reconnectIfDisconnects() {
        OmiManager.connectionUpdated { connected in
            if connected == false {
                self.lookForDevice()
            }
        }
    }

    // Print each transcription segment as it arrives.
    func listenToLiveTranscript(device: Device) {
        OmiManager.getLiveTranscription(device: device) { transcription in
            print("transcription:", transcription ?? "no transcription")
        }
    }
}
```
</Step>
<Step title="Build and Run">
1. Select your development team
2. Connect your iPhone via cable (simulators don't support Bluetooth)
3. Run the project
</Step>
<Step title="Test It">
1. Turn on your Omi device
2. The app should connect automatically
3. Speak - you'll see transcription in the Xcode console

<Note>
There's no UI in this example - transcription appears in the Xcode logs.
</Note>

</Step>
</Steps>

## API Reference

The `OmiManager` class provides all device interaction methods:

### Device Scanning

```swift
import omi_lib

// Scan for any Omi device
func lookForDevice() {
    OmiManager.startScan { device, error in
        if let device = device {
            print("got device", device)
            self.connectToOmiDevice(device: device)
            OmiManager.endScan()
        }
    }
}

// Scan for a specific device by its identifier
func lookForSpecificDevice(device_id: String) {
    OmiManager.startScan { device, error in
        if let device = device, device.id == device_id {
            print("got device", device)
            self.connectToOmiDevice(device: device)
            OmiManager.endScan()
        }
    }
}
```

### Connection Management

```swift
func connectToOmiDevice(device: Device) {
    OmiManager.connectToDevice(device: device)
    self.reconnectIfDisconnects()
}

// Restart scanning whenever the connection drops
func reconnectIfDisconnects() {
    OmiManager.connectionUpdated { connected in
        if connected == false {
            self.lookForDevice()
        }
    }
}
```
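Reconnecting the instant the link drops can trigger back-to-back scans when a device is flickering at the edge of range. A hedged variant of `reconnectIfDisconnects` for your `ViewController` that waits briefly before rescanning (the delay and `DispatchQueue` wiring are this sketch's addition, not part of the SDK):

```swift
import Foundation

// Sketch: debounce reconnect attempts with a short delay.
func reconnectWithDelay(seconds: TimeInterval = 2.0) {
    OmiManager.connectionUpdated { connected in
        guard connected == false else { return }
        // Wait before kicking off a fresh scan.
        DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
            self.lookForDevice()
        }
    }
}
```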

### Live Data

<Tabs>
<Tab title="Live Transcription">
```swift
func listenToLiveTranscript(device: Device) {
    OmiManager.getLiveTranscription(device: device) { transcription in
        print("transcription:", transcription ?? "no transcription")
    }
}
```
</Tab>
<Tab title="Live Audio">
```swift
func listenToLiveAudio(device: Device) {
    OmiManager.getLiveAudio(device: device) { file_url in
        print("file_url:", file_url?.absoluteString ?? "no url")
    }
}
```
</Tab>
</Tabs>
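`getLiveAudio` delivers file URLs, so keeping the audio means copying each file somewhere permanent as it arrives. A sketch for your `ViewController`, assuming each callback URL points to a readable local file and that the chunks are WAV data (neither is confirmed by this doc):

```swift
import Foundation

// Sketch: persist incoming audio chunks to the app's Documents directory.
func saveLiveAudio(device: Device) {
    let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    OmiManager.getLiveAudio(device: device) { file_url in
        guard let file_url = file_url else { return }
        // Timestamped destination so chunks don't overwrite each other.
        let dest = docs.appendingPathComponent("omi-\(Date().timeIntervalSince1970).wav")
        do {
            try FileManager.default.copyItem(at: file_url, to: dest)
            print("saved audio chunk to", dest.path)
        } catch {
            print("copy failed:", error)
        }
    }
}
```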

### OmiManager Methods

| Method | Description |
|--------|-------------|
| `startScan(callback)` | Start scanning for Omi devices |
| `endScan()` | Stop scanning |
| `connectToDevice(device)` | Connect to a discovered device |
| `connectionUpdated(callback)` | Monitor connection state changes |
| `getLiveTranscription(device, callback)` | Receive real-time transcription |
| `getLiveAudio(device, callback)` | Receive audio file URLs |
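Putting the table together: a minimal end-to-end flow that uses only the six methods above. The method behavior is as documented; the wiring into a single view controller is this sketch's choice, not the SDK's canonical usage.

```swift
import UIKit
import omi_lib

// Sketch: scan → connect → stream transcripts and audio → rescan on drop.
class OmiSessionController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        startSession()
    }

    func startSession() {
        // 1. Scan until a device appears, then stop scanning.
        OmiManager.startScan { device, error in
            guard let device = device else { return }
            OmiManager.endScan()

            // 2. Connect and subscribe to both live streams.
            OmiManager.connectToDevice(device: device)
            OmiManager.getLiveTranscription(device: device) { transcription in
                print("transcript:", transcription ?? "")
            }
            OmiManager.getLiveAudio(device: device) { file_url in
                print("audio chunk:", file_url?.absoluteString ?? "")
            }

            // 3. Go back to scanning if the link drops.
            OmiManager.connectionUpdated { connected in
                if connected == false { self.startSession() }
            }
        }
    }
}
```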

## License

MIT License - Omi's Swift SDK is open source.


<CardGroup cols={2}>
  <Card title="SDK Overview" icon="cube" href="/doc/developer/sdk/sdk">
    Compare all available SDKs
  </Card>
  <Card title="GitHub Source" icon="github" href="https://github.com/BasedHardware/omi">
    View source code and contribute
  </Card>
</CardGroup>