<Image alt={'Using Qiniu Cloud LLM in LobeHub'} cover src={'/blog/assets48b5c19e20fb870c7bdd34bd3aefbb21.webp'} />
Qiniu Cloud is a well-established cloud service provider offering cost-effective, reliable, and easy-to-use real-time and batch AI inference services.
This guide will walk you through how to use Qiniu Cloud's large language models in LobeHub:
<Steps>

### Step 1: [Obtain Your AI Model API Key](https://developer.qiniu.com/aitokenapi/12884/how-to-get-api-key)

- Method 1: Get it via the Console
- Method 2: Get it via the Mini Program
### Step 2: Enter Your API Key in LobeHub

In LobeHub's model provider settings, find Qiniu Cloud and enter the API key you obtained.

<Image alt={'Enter API Key'} inStep src={'https://static.sufy.com/lobehub/439049319-6ae44f36-bf48-492a-a6aa-7be72f4a29d8.png'} />
### Step 3: Select a Qiniu Cloud Model and Start Chatting

<Image alt={'Select Qiniu Cloud LLM and start chatting'} inStep src={'https://static.sufy.com/lobehub/439048945-c608eb9e-6ee1-4611-9df7-2075e95d069b.png'} />
<Callout type={'warning'}> You may incur charges from the API service provider during usage. Please refer to Qiniu Cloud's pricing policy for details. </Callout> </Steps>
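If you also want to use the API key from a script rather than through LobeHub, the request typically follows the common OpenAI-style chat-completions shape. The sketch below only constructs such a request without sending it; the base URL and model name are placeholders, not Qiniu's actual values — consult Qiniu Cloud's API documentation for the real endpoint and model IDs.

```python
import json
import urllib.request

API_KEY = "your-qiniu-api-key"           # the key obtained in Step 1
BASE_URL = "https://api.example.com/v1"  # placeholder, not Qiniu's real endpoint

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat-completion request."""
    body = json.dumps({
        "model": "your-model-name",  # placeholder model ID
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("Hello")
print(req.full_url)                     # https://api.example.com/v1/chat/completions
print(req.get_header("Authorization"))  # Bearer your-qiniu-api-key
```

Sending the request (for example with `urllib.request.urlopen(req)`) would return a JSON response containing the model's reply, in the usual chat-completions format.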
That's it! You're now ready to use Qiniu Cloud's large language models in LobeHub for intelligent conversations.