Officially Supported LLMs
These LLMs are officially supported by Predibase for fine-tuning and serving within the platform:
| Model Name | URI | Parameters | Instruction Tuned | Context Length | Available As Serverless LLM |
| --- | --- | --- | --- | --- | --- |
Non-instruction-tuned model variants are not fine-tuned to perform a specific task or respond in a particular style. They were trained for text completion, and as such are primarily intended to be fine-tuned for more specific tasks.
Instruction-tuned models, in contrast, are foundation models that have already been fine-tuned using instruction tuning to respond as a chatbot. These models can be productionized with little to no additional fine-tuning, but they are harder to adapt to entirely new tasks.
Predibase provides best-effort support for any Hugging Face pretrained LLM meeting the following criteria:
- Has the "Text Generation" and "Transformers" tags
- Does not have a "custom_code" tag
- Maximum of 13 billion parameters
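The criteria above can be sketched as a simple check. This is a minimal illustration, not a Predibase API: the function name and the exact tag spellings (`"text-generation"`, `"transformers"`, `"custom_code"`) are assumptions based on common Hugging Face Hub conventions.

```python
# Hypothetical helper illustrating the best-effort support criteria above.
# Tag spellings are assumed to follow Hugging Face Hub conventions.
def meets_best_effort_criteria(tags, pipeline_tag, num_parameters):
    """Return True if a model appears to satisfy the criteria listed above."""
    # "Text Generation" pipeline and "Transformers" library tags
    has_required_tags = pipeline_tag == "text-generation" and "transformers" in tags
    # Must not rely on custom modeling code
    no_custom_code = "custom_code" not in tags
    # Maximum of 13 billion parameters
    within_size_limit = num_parameters <= 13_000_000_000
    return has_required_tags and no_custom_code and within_size_limit


# Example: a hypothetical 7B text-generation model without custom code
print(meets_best_effort_criteria(["transformers"], "text-generation", 7_000_000_000))
```

In practice, a model's tags, pipeline tag, and parameter count can be inspected on its Hugging Face Hub model page (or programmatically via the `huggingface_hub` client) before attempting to fine-tune or serve it.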
We are continuing to build out support for larger models and a more diverse set of LLMs.