docs/usage/providers/ppio.mdx
<Image alt={'Using PPIO in LobeHub'} cover src={'/blog/assets06d4e543cbaca9a2762923a23b2cae67.webp'} />
PPIO offers stable and cost-effective API services for open-source large language models, supporting industry-leading models such as DeepSeek, Llama, and Qwen.
This guide will walk you through how to use PPIO in LobeHub:
<Steps>
  ### Step 1: Register and Log In to PPIO

  <Image alt={'Register on PPIO'} height={457} inStep src={'/blog/assets3ca963d92475f34b0789cfa50071bc52.webp'} />
  ### Step 2: Create a PPIO API Key

  <Image alt={'Create a PPIO API Key'} inStep src={'/blog/assetsbd39adddc9a1cdb85ce4a0e37fa595c1.webp'} />
  ### Step 3: Enter the API Key in LobeHub

  In LobeHub's model provider settings, select PPIO and paste the API key you just created.

  <Image alt={'Enter PPIO API Key in LobeHub'} inStep src={'/blog/assetsbfe7d519c29884b6699e89866e1db7e2.webp'} />
  ### Step 4: Select and Use a PPIO Model

  <Image alt={'Select and use a PPIO model'} inStep src={'/blog/assets89d0dcbf5ffccd21086845cea3a514cc.webp'} />
  <Callout type={'warning'}>
    You may incur charges when using the API service. For pricing details, refer to the official PPIO pricing page.
  </Callout>
</Steps>
That's it! You're now ready to use PPIO's models for conversations in LobeHub.
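If you want to confirm your API key works outside of LobeHub, the sketch below builds an OpenAI-style chat-completions request using only the Python standard library. This is a minimal illustration, not PPIO's official client: the base URL is a placeholder (check PPIO's API documentation for the real endpoint), and the model name is only an example.

```python
import json
import urllib.request

# Placeholder -- replace with the base URL from PPIO's API documentation.
PPIO_BASE_URL = "https://api.example-ppio.com/v1"


def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request carrying the PPIO API key."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{PPIO_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",  # the key from Step 2
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Nothing is sent until urllib.request.urlopen(req) is called.
req = build_chat_request("sk-...", "deepseek/deepseek-r1", "Hello!")
print(req.full_url)  # → https://api.example-ppio.com/v1/chat/completions
```

A non-error JSON response from such a request confirms the key is valid; a 401 typically means the key was mistyped or revoked.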