When fine-tuning a language model, choosing the right hyperparameters is crucial for achieving optimal performance. Hyperparameters control various aspects of the training process, including learning speed, model stability, and final quality. Each task type has its own set of tunable hyperparameters; you can find all of the available hyperparameters for each task type in the Fine-Tuning Configuration Reference:
- SFT: SFTConfig
- Continued Pretraining: ContinuedPretrainingConfig
- GRPO: GRPOConfig
- Classification: ClassificationConfig
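To make the per-task-type idea concrete, here is a minimal sketch of what such config objects might look like in Python. The field names, defaults, and classes below are illustrative assumptions only, not the SDK's actual signatures; consult the Fine-Tuning Configuration Reference for the real parameters of each config class.

```python
from dataclasses import dataclass

# Illustrative sketch: these fields are common fine-tuning hyperparameters,
# not the actual SDK definitions.

@dataclass
class SFTConfig:
    base_model: str
    epochs: int = 3               # passes over the training data
    learning_rate: float = 2e-4   # optimizer step size
    rank: int = 16                # adapter rank (capacity vs. memory trade-off)

@dataclass
class GRPOConfig:
    base_model: str
    learning_rate: float = 1e-5   # RL fine-tuning usually uses smaller steps
    beta: float = 0.04            # KL-penalty strength (illustrative)

# Example: a conservative SFT run that overrides only the learning rate,
# keeping the other defaults.
cfg = SFTConfig(base_model="my-base-model", learning_rate=1e-4)
print(cfg.learning_rate)  # → 0.0001
```

Each task type keeping its own config class means the set of valid hyperparameters is explicit per task, rather than one flat namespace where only some fields apply.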