# Observability for Novita AI with Opik

<Note> In Opik 2.0, datasets and experiments are project-scoped. Make sure to specify a `project_name` when creating datasets and running experiments so they are associated with the correct project. </Note>

Novita AI is an AI cloud platform that helps developers easily deploy AI models through a simple API, backed by affordable and reliable GPU cloud infrastructure. It provides access to a wide range of models including DeepSeek, Qwen, Llama, and other popular LLMs.

This guide explains how to integrate Opik with Novita AI via LiteLLM. With Opik's LiteLLM integration, you can track and evaluate your Novita AI API calls within your Opik projects: Opik automatically logs the input prompt, the model used, token usage, and the generated response.

## Getting Started

### Configuring Opik

To get started, you need to configure Opik to send traces to your Comet project. You can do this by setting the `OPIK_PROJECT_NAME` and `OPIK_WORKSPACE` environment variables:

```bash
export OPIK_PROJECT_NAME="your-project-name"
export OPIK_WORKSPACE="your-workspace-name"
```

You can also call the `opik.configure` method:

```python
import opik

opik.configure(
    project_name="your-project-name",
    workspace="your-workspace-name",
)
```

### Configuring LiteLLM

Install the required packages:

```bash
pip install opik litellm
```

Create a LiteLLM configuration file (e.g., `litellm_config.yaml`):

```yaml
model_list:
  - model_name: deepseek-r1-turbo
    litellm_params:
      model: novita/deepseek/deepseek-r1-turbo
      api_key: os.environ/NOVITA_API_KEY
  - model_name: qwen-32b-fp8
    litellm_params:
      model: novita/qwen/qwen3-32b-fp8
      api_key: os.environ/NOVITA_API_KEY
  - model_name: llama-70b-instruct
    litellm_params:
      model: novita/meta-llama/llama-3.1-70b-instruct
      api_key: os.environ/NOVITA_API_KEY

litellm_settings:
  callbacks: ["opik"]
```

### Authentication

Set your Novita AI API key as an environment variable:

```bash
export NOVITA_API_KEY="your-novita-api-key"
```

You can obtain a Novita AI API key from the Novita AI dashboard.

## Usage

### Using the LiteLLM Proxy Server

Start the LiteLLM proxy server:

```bash
litellm --config litellm_config.yaml
```

Use the proxy server to make requests:

```python
import openai

client = openai.OpenAI(
    api_key="anything",  # the proxy handles authentication, so this can be anything
    base_url="http://0.0.0.0:4000",
)

response = client.chat.completions.create(
    model="deepseek-r1-turbo",
    messages=[
        {"role": "user", "content": "What are the advantages of using cloud-based AI platforms?"}
    ],
)

print(response.choices[0].message.content)
```

### Direct Integration

You can also use LiteLLM directly in your Python code:

```python
import os

import litellm
import opik
from litellm import completion

# Configure Opik (reads saved credentials or environment variables)
opik.configure()

# Register the Opik callback so LiteLLM logs every call to Opik
litellm.callbacks = ["opik"]

os.environ["NOVITA_API_KEY"] = "your-novita-api-key"

response = completion(
    model="novita/deepseek/deepseek-r1-turbo",
    messages=[
        {"role": "user", "content": "How can cloud AI platforms improve development efficiency?"}
    ],
)

print(response.choices[0].message.content)
```

## Supported Models

Novita AI provides access to a comprehensive catalog of models from leading providers. Some of the popular models available include:

- **DeepSeek Models**: `deepseek-r1-turbo`, `deepseek-v3-turbo`, `deepseek-v3-0324`
- **Qwen Models**: `qwen3-235b-a22b-fp8`, `qwen3-30b-a3b-fp8`, `qwen3-32b-fp8`
- **Llama Models**: `llama-4-maverick-17b-128e-instruct-fp8`, `llama-3.3-70b-instruct`, `llama-3.1-70b-instruct`
- **Mistral Models**: `mistral-nemo`
- **Google Models**: `gemma-3-27b-it`

For the complete list of available models, visit the Novita AI model catalog.
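
When calling any of these through LiteLLM directly, the model string follows the `novita/<provider>/<model>` convention used in the configuration above. A small helper (hypothetical, not part of LiteLLM or the Novita AI SDK) makes the convention explicit:

```python
def novita_model(provider: str, model: str) -> str:
    """Build the LiteLLM model string for a Novita AI hosted model."""
    return f"novita/{provider}/{model}"

# Catalog entries above map onto LiteLLM model strings like:
print(novita_model("deepseek", "deepseek-r1-turbo"))        # novita/deepseek/deepseek-r1-turbo
print(novita_model("qwen", "qwen3-32b-fp8"))                # novita/qwen/qwen3-32b-fp8
print(novita_model("meta-llama", "llama-3.1-70b-instruct")) # novita/meta-llama/llama-3.1-70b-instruct
```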

## Advanced Features

### Tool Calling

Novita AI supports function calling with compatible models:

```python
from litellm import completion

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]

response = completion(
    model="novita/deepseek/deepseek-r1-turbo",
    messages=[{"role": "user", "content": "What's the weather like in Boston today?"}],
    tools=tools,
)

print(response.choices[0].message.tool_calls)
```
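
When the model decides to call a tool, the response carries `tool_calls` with a JSON-encoded `arguments` string rather than plain text, and your code is responsible for executing the function. A minimal dispatch sketch, assuming the OpenAI-compatible response shape (the stub weather function and plain-dict tool call below are illustrative; a real response exposes the same fields as attributes):

```python
import json

def get_current_weather(location, unit="fahrenheit"):
    # Stub standing in for a real weather lookup
    return {"location": location, "unit": unit, "forecast": "sunny"}

# Registry mapping tool names (as declared to the model) to callables
TOOLS = {"get_current_weather": get_current_weather}

def dispatch_tool_call(tool_call):
    """Parse the JSON arguments and invoke the matching registered function."""
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"])
    return TOOLS[name](**args)

# A tool call shaped like the OpenAI-compatible response format
sample_call = {
    "function": {
        "name": "get_current_weather",
        "arguments": '{"location": "Boston, MA"}',
    }
}
print(dispatch_tool_call(sample_call))
```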

### JSON Mode

For structured outputs, you can enable JSON mode:

```python
response = completion(
    model="novita/deepseek/deepseek-r1-turbo",
    messages=[
        {"role": "user", "content": "List 5 popular cookie recipes."}
    ],
    response_format={"type": "json_object"},
)
```
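
In JSON mode the model returns a JSON string in `message.content`, which still needs to be parsed before use. A short sketch (the sample payload below is illustrative; in practice you would read `response.choices[0].message.content`):

```python
import json

# Stand-in for: raw = response.choices[0].message.content
raw = '{"recipes": ["chocolate chip", "oatmeal raisin", "snickerdoodle", "sugar", "gingerbread"]}'

data = json.loads(raw)
for name in data["recipes"]:
    print(name)
```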

## Feedback Scores and Evaluation

Once your Novita AI calls are logged with Opik, you can evaluate your LLM application using Opik's evaluation framework:

```python
from opik.evaluation import evaluate
from opik.evaluation.metrics import Hallucination

# Map each dataset item to the arguments the scoring metric expects
def evaluation_task(x):
    return {
        "input": x["input"],
        "output": x["output"],
        "context": x["context"],
    }

# Create the Hallucination metric
hallucination_metric = Hallucination()

# Run the evaluation
evaluation_results = evaluate(
    experiment_name="novita-ai-evaluation",
    dataset=your_dataset,
    task=evaluation_task,
    scoring_metrics=[hallucination_metric],
    project_name="my-project",
)
```

## Environment Variables

Make sure to set the following environment variables:

```bash
# Novita AI Configuration
export NOVITA_API_KEY="your-novita-api-key"

# Opik Configuration
export OPIK_PROJECT_NAME="your-project-name"
export OPIK_WORKSPACE="your-workspace-name"
```
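
A quick startup check (a hypothetical helper, not part of Opik or LiteLLM) can catch missing variables before any API calls are made:

```python
import os

def missing_env_vars(required):
    """Return names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

REQUIRED = ["NOVITA_API_KEY", "OPIK_PROJECT_NAME", "OPIK_WORKSPACE"]

missing = missing_env_vars(REQUIRED)
if missing:
    print(f"Missing environment variables: {', '.join(missing)}")
else:
    print("All required environment variables are set.")
```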