candle-wasm-examples/phi/index.html
🕯️
The Phi-1.5 and Phi-2 models achieve state-of-the-art performance with only 1.3 billion and 2.7 billion parameters, compared to larger models with up to 13 billion parameters. Here you can try the quantized versions. Additional prompt examples are available in the technical report.
You can also try the quantized Puffin-Phi V2, a version of Phi-1.5 fine-tuned on the Puffin dataset.
Note: On first run, the app downloads and caches the model, which can take a few minutes. The models are ~800MB or ~1.57GB in size.
Model Options:
Prompt Templates (Instruct): "Write a detailed analogy between mathematics and a lighthouse."

Advanced Options:
- Maximum length: 200
- Temperature: 0.00
- Top-p: 1.00
- Repeat penalty: 1.10
- Seed: random
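The advanced options above are standard sampling parameters. Below is a minimal sketch of how a page like this might collect them before handing them to the inference web worker; the function name `buildGenerationRequest` and the field names are hypothetical illustrations, not the actual candle-wasm-examples API.

```javascript
// Defaults mirroring the UI above. Illustrative only: the real worker
// protocol in candle-wasm-examples may use different field names.
const defaults = {
  prompt: "Write a detailed analogy between mathematics and a lighthouse.",
  maxLength: 200,     // maximum number of tokens to generate
  temperature: 0.0,   // 0.0 => greedy (argmax) decoding
  topP: 1.0,          // nucleus sampling threshold; 1.0 disables it
  repeatPenalty: 1.1, // > 1.0 discourages repeating recent tokens
  seed: null,         // null => draw a random seed per run ("Rand" in the UI)
};

function buildGenerationRequest(overrides = {}) {
  const req = { ...defaults, ...overrides };
  if (req.seed === null) {
    // Pick a fresh 32-bit seed when none was supplied.
    req.seed = Math.floor(Math.random() * 2 ** 32);
  }
  return req;
}

// Example: reproducible greedy decoding with a fixed seed.
const req = buildGenerationRequest({ seed: 42, temperature: 0.0 });
```

Setting the temperature to 0.00, as in the defaults above, makes output deterministic for a given model and prompt, so the seed only matters once the temperature is raised.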