reflex-llamaindex-template

The following is an alternative UI for the LlamaIndex app.

Prerequisites

If you plan to deploy your agentic workflow to production, follow the LlamaDeploy tutorial to deploy your agentic workflow.

Setup

To run this app locally, install Reflex and run:

```bash
reflex init --template reflex-llamaindex-template
```

The following lines in the state.py file are where the app makes a request to your deployed agentic workflow. If you have not deployed your agentic workflow, you can edit this to call an API endpoint of your choice.

```python
import asyncio
import json
import os

import httpx

client = httpx.AsyncClient()

# Call the agentic workflow.
input_payload = {
    "chat_history_dicts": chat_history_dicts,
    "user_input": question,
}
deployment_name = os.environ.get("DEPLOYMENT_NAME", "MyDeployment")
apiserver_url = os.environ.get("APISERVER_URL", "http://localhost:4501")
response = await client.post(
    f"{apiserver_url}/deployments/{deployment_name}/tasks/create",
    json={"input": json.dumps(input_payload)},
    timeout=60,
)
answer = response.text

for i in range(len(answer)):
    # Pause to show the streaming effect.
    await asyncio.sleep(0.01)
    # Add one letter at a time to the output.
    self.chat_history[-1] = (
        self.chat_history[-1][0],
        answer[: i + 1],
    )
    yield
```
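The request above sends `chat_history_dicts` alongside the new user input. A minimal sketch of building that payload from the `(question, answer)` tuples the state keeps in `chat_history` (the helper name and the `role`/`content` keys are assumptions for illustration, not part of the template):

```python
import json


def build_payload(chat_history, question):
    """Hypothetical helper: flatten (question, answer) tuples into
    message dicts and attach the new user input."""
    chat_history_dicts = []
    for q, a in chat_history:
        chat_history_dicts.append({"role": "user", "content": q})
        chat_history_dicts.append({"role": "assistant", "content": a})
    return {
        "chat_history_dicts": chat_history_dicts,
        "user_input": question,
    }


payload = build_payload([("Hi", "Hello!")], "What is Reflex?")
print(json.dumps(payload, indent=2))
```

The workflow endpoint expects a JSON-encoded string under `"input"`, which is why the snippet wraps the payload in `json.dumps` before posting.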

Run the app

Once you have set up your environment, install the dependencies and run the app:

```bash
cd reflex-llamaindex-template
pip install -r requirements.txt
reflex run
```
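The request in state.py reads the deployment name and API server URL from environment variables, falling back to the defaults shown earlier. If your LlamaDeploy API server runs elsewhere, export these before `reflex run` (the values below mirror the snippet's defaults and are illustrative):

```shell
export DEPLOYMENT_NAME=MyDeployment
export APISERVER_URL=http://localhost:4501

# The app will POST to this URL:
echo "$APISERVER_URL/deployments/$DEPLOYMENT_NAME/tasks/create"
```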