Using Qiniu Cloud's Large Language Model in LobeHub

<Image alt={'Using Qiniu Cloud LLM in LobeHub'} cover src={'/blog/assets48b5c19e20fb870c7bdd34bd3aefbb21.webp'} />

Qiniu Cloud, a well-established cloud service provider, offers cost-effective and reliable real-time and batch AI inference services that are easy to use.

This guide will walk you through how to use Qiniu Cloud's large language models in LobeHub:

<Steps>

### Step 1: [Obtain Your AI Model API Key](https://developer.qiniu.com/aitokenapi/12884/how-to-get-api-key)

### Step 2: Configure the Qiniu Cloud LLM Service in LobeHub

  • Open the Settings panel in LobeHub
  • Under the "AI Providers" section, find the "Qiniu Cloud" configuration

<Image alt={'Enter API Key'} inStep src={'https://static.sufy.com/lobehub/439049319-6ae44f36-bf48-492a-a6aa-7be72f4a29d8.png'} />

  • Enable the Qiniu Cloud provider and paste in your API key
  • Choose a Qiniu Cloud large language model for your AI assistant and start chatting

<Image alt={'Select Qiniu Cloud LLM and start chatting'} inStep src={'https://static.sufy.com/lobehub/439048945-c608eb9e-6ee1-4611-9df7-2075e95d069b.png'} />

<Callout type={'warning'}>
  You may incur charges from the API service provider during usage. Please refer to Qiniu Cloud's pricing policy for details.
</Callout>

</Steps>
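If you want to verify your API key outside LobeHub, Qiniu Cloud's inference service can be called with an OpenAI-style chat-completions request. The sketch below builds such a request; the base URL and model name are assumptions for illustration only, so check Qiniu Cloud's API documentation for the current values before sending anything.

```python
import json
import urllib.request

# Assumed values -- confirm the current base URL and available model
# names in Qiniu Cloud's AI inference documentation before use.
BASE_URL = "https://api.qnaigc.com/v1"  # assumption, not verified
MODEL = "deepseek-v3"                   # assumption, not verified


def build_chat_request(api_key: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completions request object.

    The request is only constructed here, not sent, so you can inspect
    the URL, headers, and payload before making a real (billable) call.
    """
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    # Replace YOUR_API_KEY with the key obtained in Step 1, then send
    # the request with urllib.request.urlopen(req) to test it for real.
    req = build_chat_request("YOUR_API_KEY", "Hello")
    print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` and receiving a JSON response confirms the key works, which can help distinguish a key problem from a LobeHub configuration problem.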

That's it! You're now ready to use Qiniu Cloud's large language models in LobeHub for intelligent conversations.