docs/customize/model-providers/more/msty.mdx
Msty is an application for Windows, Mac, and Linux that makes it easy to run online as well as local open-source models, including Llama 2, DeepSeek Coder, and more. No need to fiddle with your terminal or run a command: just download the app from the website, click a button, and you are up and running. Continue can then be configured to use the `Msty` LLM class:
<Tabs>
<Tab title="YAML">
```yaml title="config.yaml"
models:
  - name: Msty
    provider: msty
    model: deepseek-coder:6.7b
```
</Tab>
<Tab title="JSON">
```json title="config.json"
{
  "models": [
    {
      "title": "Msty",
      "provider": "msty",
      "model": "deepseek-coder:6.7b"
    }
  ]
}
```
</Tab>
</Tabs>
In addition to the model itself, you can also configure some of the parameters that Msty uses to run it.
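For example, sampling parameters can be set through Continue's `completionOptions` model field. A sketch (the values shown are illustrative, and the exact set of supported options depends on your Continue version):

```json title="config.json"
{
  "models": [
    {
      "title": "Msty",
      "provider": "msty",
      "model": "deepseek-coder:6.7b",
      "completionOptions": {
        "temperature": 0.2,
        "maxTokens": 2048
      }
    }
  ]
}
```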
If you need to send custom headers for authentication, you can use the `requestOptions.headers` property like this:
<Tabs>
<Tab title="YAML">
```yaml title="config.yaml"
models:
  - name: Msty
    provider: msty
    model: deepseek-coder:6.7b
    requestOptions:
      headers:
        Authorization: Bearer xxx
```
</Tab>
<Tab title="JSON">
```json title="config.json"
{
  "models": [
    {
      "title": "Msty",
      "provider": "msty",
      "model": "deepseek-coder:6.7b",
      "requestOptions": {
        "headers": {
          "Authorization": "Bearer xxx"
        }
      }
    }
  ]
}
```
</Tab>
</Tabs>