Magic search lets you find photos using natural language descriptions of their content. Using on-device AI, you can search for objects, scenes, colors, and activities without your photos or search queries ever leaving your device.
Magic search uses AI models (specifically CLIP - Contrastive Language-Image Pre-training) to understand the content of your photos. This enables semantic search where you can describe what you're looking for in natural language, and the app will find matching photos.
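The idea can be sketched in a few lines: each photo and each query is turned into an embedding vector, and photos are ranked by how close their embeddings are to the query's. The tiny vectors, file names, and the `magic_search` helper below are purely illustrative stand-ins for what the real on-device CLIP encoders produce.

```python
import math

# Hypothetical, tiny stand-ins for CLIP image embeddings. In the real app,
# an on-device image encoder produces a vector per photo.
photo_index = {
    "IMG_001.jpg": [0.9, 0.1, 0.0],   # e.g. a beach scene
    "IMG_002.jpg": [0.1, 0.8, 0.2],   # e.g. a dog
    "IMG_003.jpg": [0.0, 0.2, 0.9],   # e.g. a city at night
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def magic_search(query_embedding, index, top_k=3):
    """Rank photos by similarity between the query embedding and each image embedding."""
    scored = [(name, cosine_similarity(query_embedding, emb))
              for name, emb in index.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# A made-up embedding for a query like "sandy beach":
results = magic_search([0.85, 0.15, 0.05], photo_index)
print(results[0][0])  # IMG_001.jpg ranks highest
```

Because both images and text land in the same embedding space, no keywords or tags need to be attached to the photos in advance.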
You can search for objects, scenes, colors, lighting, and activities (detailed examples below).
Magic search runs entirely on your device: the AI models, the photo indexes, and your queries all stay local.
Privacy guarantee: Your photos, search queries, and AI-generated indexes never leave your device in unencrypted form. Ente's servers cannot see what's in your photos or what you're searching for.
The AI model understands objects, scenes, colors, and activities, which allows for flexible, natural language searches that go beyond simple keyword matching.
Magic search is part of Ente's machine learning features. You must enable it manually (it's off by default).
On mobile:
Open Settings > Machine learning, enable Machine learning and/or Local indexing, and wait for indexing to complete.
On desktop:
Open Settings > Preferences > Machine learning, enable Machine learning and/or Local indexing, and monitor indexing progress.
Note: Magic search is not available on web.ente.io. You must use the mobile or desktop app.
After enabling magic search, wait for indexing to complete before searching; the initial pass downloads your photos, so it finishes faster over WiFi.
Learn more about Machine learning.
Type natural language descriptions into the search bar, such as "dog on the beach" or "sunset over mountains".
Magic search also understands more complex, descriptive queries that combine several elements, such as "red car in the city at night".
A few tips: be descriptive, combine terms, try different phrasings if a query misses, and use magic search alongside other search types.
Magic search works even better when combined with descriptions (captions) you've added to photos.
On both mobile and desktop, descriptions you add to photos become searchable, making it easy to find photos you've documented with specific details, memories, or context.
Example: Add "Sarah's graduation ceremony at the park" as a description, and you can later search for any of those terms to find the photo.
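Caption search of this kind is plain keyword matching rather than AI matching. A minimal sketch, with a hypothetical caption store and `search_captions` helper, might look like:

```python
# Hypothetical caption store: photo name -> user-added description.
captions = {
    "IMG_101.jpg": "Sarah's graduation ceremony at the park",
    "IMG_102.jpg": "Morning hike in the forest",
}

def search_captions(query, captions):
    """Return photos whose description contains every word of the query."""
    words = query.lower().split()
    return [name for name, text in captions.items()
            if all(w in text.lower() for w in words)]

print(search_captions("graduation park", captions))  # ['IMG_101.jpg']
```

Any single word from the description ("Sarah's", "graduation", "park") is enough to surface the photo.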
Learn more in Metadata and Editing FAQ.
**Objects:** Cars, dogs, cats, bicycles, books, phones, computers, furniture, plants, flowers, trees, food items, drinks, and many more everyday objects.

**Scenes:** Beach, mountain, forest, city, park, garden, indoor/outdoor settings, architectural features, natural landscapes.

**Colors and lighting:** Basic colors (red, blue, green, etc.), lighting conditions (night, day, sunset), weather conditions.

**Activities:** Eating, swimming, hiking, cooking, reading, celebrations, sports, and other common activities (detection varies by photo clarity).
Magic search works best with clear, well-lit photos of common subjects. It may struggle with blurry images, unusual angles, or abstract content.
Magic search maintains complete privacy: indexes are generated on-device and queries never leave your device, so your magic search data is as private and secure as your photos themselves.
Learn more in Security and Privacy FAQ.
Once your photos have been indexed, magic search works completely offline.
The initial indexing requires downloading your photos (which is faster over WiFi), but after that magic search works without an internet connection.
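This works because the heavy lifting is done once, up front: embeddings are computed during indexing and cached locally, so a later search only has to read the cache. A rough sketch, with made-up file names, vectors, and a `best_match` helper:

```python
import json
import os
import tempfile

# Hypothetical: embeddings are computed once during indexing
# (while photos are downloaded), then cached on disk.
index = {"IMG_001.jpg": [0.9, 0.1], "IMG_002.jpg": [0.1, 0.8]}

index_path = os.path.join(tempfile.gettempdir(), "magic_search_index.json")
with open(index_path, "w") as f:
    json.dump(index, f)          # one-time indexing step

# Later, fully offline: load the cached index and search it.
with open(index_path) as f:
    cached = json.load(f)

def best_match(query_embedding, idx):
    """Pick the photo whose embedding has the largest dot product with the query."""
    return max(idx, key=lambda name: sum(q * v for q, v in zip(query_embedding, idx[name])))

print(best_match([0.8, 0.2], cached))  # IMG_001.jpg
```

Only a new query needs to be embedded at search time; no network access is involved.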
Magic search understands queries in multiple languages. The CLIP model has been trained on diverse linguistic data, allowing it to understand common terms in many languages.
However, English queries typically work best due to the model's training data distribution.
Magic search uses CLIP (Contrastive Language-Image Pre-training), an AI model developed by OpenAI that understands both images and text by mapping them into a shared embedding space. This allows the model to match a written query directly against the content of your photos.
Magic search works well alongside Ente's other search capabilities.
Learn more in Search and Discovery overview.