docs/examples/llm/palm.ipynb
<a href="https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/llm/palm.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/></a>
In this short notebook, we show how to use Google's PaLM LLM in LlamaIndex: https://ai.google/discover/palm2/.
We use the text-bison-001 model by default.
If you're opening this Notebook on colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-palm
!pip install llama-index
!pip install -q google-generativeai
import pprint
import google.generativeai as palm
palm_api_key = ""
palm.configure(api_key=palm_api_key)
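Rather than hardcoding the key in the notebook, you can read it from an environment variable. A minimal sketch (the variable name `PALM_API_KEY` is an assumption; use whatever name you store your key under):

```python
import os

# PALM_API_KEY is an assumed variable name; substitute the one
# you actually export your key under.
palm_api_key = os.environ.get("PALM_API_KEY", "")
```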
models = [
    m
    for m in palm.list_models()
    if "generateText" in m.supported_generation_methods
]
model = models[0].name
print(model)
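The comprehension above keeps only models whose `supported_generation_methods` includes `"generateText"`, then takes the first match. The filter can be sketched with stand-in records (the model names and methods below are illustrative, not live API results):

```python
from types import SimpleNamespace

# Stand-in records mimicking the shape of palm.list_models() output.
fake_models = [
    SimpleNamespace(
        name="models/chat-bison-001",
        supported_generation_methods=["generateMessage"],
    ),
    SimpleNamespace(
        name="models/text-bison-001",
        supported_generation_methods=["generateText"],
    ),
]

# Same filter as above: keep only models that support generateText.
text_models = [
    m
    for m in fake_models
    if "generateText" in m.supported_generation_methods
]
print(text_models[0].name)  # -> models/text-bison-001
```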
Start using our PaLM LLM abstraction!
from llama_index.llms.palm import PaLM
model = PaLM(api_key=palm_api_key)
prompt = "Why is the sky blue?"  # any text prompt works here
model.complete(prompt)