
Chat with LLaVA

dotnet/website/articles/AutoGen.Ollama/Chat-with-llava.md


This sample shows how to use @AutoGen.Ollama.OllamaAgent to chat with the LLaVA model.

To run this example, you need an Ollama server running locally with the llava:latest model installed. For instructions on setting up an Ollama server, please refer to Ollama.

[!NOTE] You can find the complete sample code here

Step 1: Install AutoGen.Ollama

First, install the AutoGen.Ollama package using the following command:

```bash
dotnet add package AutoGen.Ollama
```

For instructions on installing from a nightly build, please refer to Installation.

Step 2: Add using statements

[!code-csharp]
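The snippet reference above pulls the usings from the sample project; as a rough sketch, they typically look like the following (the exact namespace names are assumptions based on the AutoGen.Ollama package layout):

```csharp
using AutoGen.Core;             // core agent and message abstractions
using AutoGen.Ollama;           // OllamaAgent
using AutoGen.Ollama.Extension; // message-connector extension methods (assumed namespace)
```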

Step 3: Create @AutoGen.Ollama.OllamaAgent

[!code-csharp]
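The referenced snippet constructs the agent; a minimal sketch is shown below, assuming the agent takes an `HttpClient` pointed at the local Ollama endpoint plus a model name, and that `RegisterMessageConnector`/`RegisterPrintMessage` helpers are available. The exact constructor signature may differ between AutoGen.Ollama versions, so treat this as illustrative only.

```csharp
// Point an HttpClient at the default local Ollama endpoint.
using var httpClient = new HttpClient
{
    BaseAddress = new Uri("http://localhost:11434"),
};

// Create the agent against the llava:latest model.
// Parameter names here are assumptions; check the package you installed.
var ollamaAgent = new OllamaAgent(
        httpClient: httpClient,
        name: "llava",
        modelName: "llava:latest")
    .RegisterMessageConnector() // translate AutoGen messages to Ollama's wire format
    .RegisterPrintMessage();    // echo replies to the console
```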

Step 4: Start MultiModal Chat

LLaVA is a multimodal model that supports both text and image inputs. In this step, we create an image message along with a question about the image.

[!code-csharp]
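The referenced snippet builds the multimodal request; the sketch below shows the general shape, assuming AutoGen.Core's `ImageMessage`, `TextMessage`, and `MultiModalMessage` types and a hypothetical image file name. Adjust the file path and MIME type to your own image.

```csharp
// Load an image from disk ("background.png" is a hypothetical file name).
var imageBytes = await File.ReadAllBytesAsync("background.png");
var imageMessage = new ImageMessage(
    Role.User,
    BinaryData.FromBytes(imageBytes, mediaType: "image/png"));

// Pair the image with a question about it.
var textMessage = new TextMessage(Role.User, "What's in this image?");
var multiModalMessage = new MultiModalMessage(
    Role.User,
    [textMessage, imageMessage]);

// Send both to the LLaVA-backed agent and await its reply.
var reply = await ollamaAgent.SendAsync(multiModalMessage);
```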