The Adapters API provides methods for creating, retrieving, and managing adapters for fine-tuned models.

Fine-Tune Adapter

pb.adapters.create(
    config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig,       # Configuration for fine-tuning
    dataset: str,                                                      # The dataset to use for fine-tuning
    repo: str,                                                         # Name of the adapter repo
    continue_from_version: str = None,                                 # The adapter version to continue training from
    description: str = None,                                           # Description for the adapter
    show_tensorboard: bool = False                                     # If true, launch tensorboard instance
) -> Adapter
Create a new adapter by starting a new (blocking) fine-tuning job.
Parameters
  • config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig - Configuration for fine-tuning
  • dataset: str - The dataset to use for fine-tuning
  • repo: str - Name of the adapter repo to store the newly created adapter
  • continue_from_version: str, optional - The adapter version to continue training from
  • description: str, optional - Description for the adapter
  • show_tensorboard: bool, default False - If true, launch a tensorboard instance to view training logs
Returns
  • Adapter - The created adapter object
Example 1: Create an adapter with defaults
from predibase import Predibase, SFTConfig

pb = Predibase(api_token="<PREDIBASE_API_TOKEN>")

# Create an adapter repository
repo = pb.repos.create(name="news-summarizer-model", description="TLDR News Summarizer Experiments", exists_ok=True)

# Start a fine-tuning job, blocks until training is finished
adapter = pb.adapters.create(
    config=SFTConfig(
        base_model="qwen3-8b"
    ),
    dataset="tldr_news",
    repo=repo,
    description="initial model with defaults"
)
Example 2: Create a new adapter with customized parameters
# Create an adapter repository
repo = pb.repos.create(name="news-summarizer-model", description="TLDR News Summarizer Experiments", exists_ok=True)

# Start a fine-tuning job with custom parameters, blocks until training is finished
adapter = pb.adapters.create(
    config=SFTConfig(
        base_model="qwen3-8b",
        task="instruction_tuning",
        epochs=1, # default: 3
        rank=8, # default: 16
        learning_rate=0.0001, # default: 0.0002
        target_modules=["q_proj", "v_proj", "k_proj"], # default: None (infers [q_proj, v_proj] for qwen3-8b)
    ),
    dataset="tldr_news",
    repo=repo,
    description="changing epochs, rank, and learning rate"
)
Example 3: Continue training from an existing adapter
adapter = pb.adapters.create(
    # Note: only `epochs` and `enable_early_stopping` are available parameters in this case.
    config=SFTConfig(
        epochs=3,  # The maximum number of ADDITIONAL epochs to train for
        enable_early_stopping=False,
    ),
    continue_from_version="myrepo/3",  # The adapter version to resume training from
    dataset="mydataset",
    repo="myrepo"
)
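
The continued run is saved as a new version in the same repo. Assuming it lands at version 4 (a hypothetical version number), you can fetch it afterwards with pb.adapters.get, documented later in this section:
# Hypothetical: the continued run above produced version 4 in "myrepo"
adapter = pb.adapters.get("myrepo/4")
print(f"Continued adapter: {adapter.name} v{adapter.tag}")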

Fine-Tune Adapter (Async)

pb.finetuning.jobs.create(
    config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig,     # Configuration for fine-tuning
    dataset: str,                                                    # The dataset to use for fine-tuning
    repo: str,                                                       # Name of the adapter repo
    continue_from_version: str = None,                               # The adapter version to continue training from
    description: str = None,                                         # Description for the adapter
    watch: bool = False,                                             # Whether to block until job finishes
    show_tensorboard: bool = False                                   # If true, launch tensorboard instance
) -> FinetuningJob
Start a non-blocking fine-tuning job for creating an adapter.
Parameters
  • config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig - Configuration for fine-tuning
  • dataset: str - The dataset to use for fine-tuning
  • repo: str - Name of the adapter repo to store the newly created adapter
  • continue_from_version: str, optional - The adapter version to continue training from
  • description: str, optional - Description for the adapter
  • watch: bool, default False - Whether to block until the fine-tuning job finishes
  • show_tensorboard: bool, default False - Whether to launch a tensorboard instance
Returns
  • FinetuningJob - Object representing the fine-tuning job
Example
adapter_job = pb.finetuning.jobs.create(
    config=SFTConfig(
        base_model="qwen3-8b",
        task="instruction_tuning",
        epochs=1,
        rank=8,
        target_modules=["q_proj", "v_proj", "k_proj"],
    ),
    dataset="tldr_news",
    repo="news-summarizer-model",
    description="async job example",
    watch=False,
    show_tensorboard=False,
)
All the examples for the synchronous pb.adapters.create method are also valid for the asynchronous pb.finetuning.jobs.create method.
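For instance, a sketch of the continue-training call from Example 3 in its non-blocking form (parameter values are illustrative):
job = pb.finetuning.jobs.create(
    config=SFTConfig(
        epochs=3,  # maximum number of additional epochs
        enable_early_stopping=False,
    ),
    continue_from_version="myrepo/3",
    dataset="mydataset",
    repo="myrepo",
    watch=False,  # return immediately with a FinetuningJob handle
)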

Run fine-tuning in a specified region (Advanced, VPC only)

from predibase import TrainingComputeSpec, TrainingComputeRequests, ComputeRequest

adapter_job = pb.finetuning.jobs.create(
    config=SFTConfig(
        base_model="qwen3-8b",
    ),
    compute_spec=TrainingComputeSpec(
      region="us-west-2",
      requests=TrainingComputeRequests(
        predifine=ComputeRequest(
          sku="a100_80gb_100",  # Optional
        )
      ),
    ),
    # Other parameters are identical to the non-region-specific case.
    ...
)

Estimate Fine-Tuning Cost

pb.finetuning.jobs.estimate_cost(
    config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig,     # Configuration for fine-tuning
    dataset: str,                                                    # The dataset to use for fine-tuning
) -> None
Estimate how much it will cost to fine-tune an adapter based on its config and the dataset. This is automatically invoked whenever an adapter is created.
Parameters
  • config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig - Configuration for fine-tuning
  • dataset: str - The dataset to use for fine-tuning
Returns
  • None - The estimated cost is printed to the terminal
Example
pb.finetuning.jobs.estimate_cost(
    config=SFTConfig(
        base_model="qwen3-8b",
        epochs=1,
        rank=8,
        target_modules=["q_proj", "v_proj", "k_proj"],
    ),
    dataset="tldr_news"
)

# Output
# >>> Estimated Cost: $0.56
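
Because estimate_cost only prints the estimate, one way to compare candidate configurations before committing to a run is to call it once per config. A minimal sketch:
# Compare estimated costs across epoch counts before training
for epochs in (1, 3, 5):
    print(f"epochs={epochs}: ", end="")
    pb.finetuning.jobs.estimate_cost(
        config=SFTConfig(base_model="qwen3-8b", epochs=epochs),
        dataset="tldr_news",
    )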

Get Adapter

pb.adapters.get(
    adapter_id: str,     # Name of the adapter in the format "repo/version"
) -> Adapter
Fetch an adapter by name and version.
Parameters
  • adapter_id: str - ID of the adapter in the format "repo/version"
Returns
  • Adapter - The requested adapter object
Example
# Get an adapter by ID
adapter = pb.adapters.get("news-summarizer-model/1")

# Print adapter details
print(f"Adapter: {adapter.name} v{adapter.tag}")
print(f"Description: {adapter.description}")
print(f"Base Model: {adapter.base_model}")

Upload Adapter

pb.adapters.upload(
    local_dir: str,     # Path to the local directory containing the adapter files
    repo: str,          # Name of the adapter repo
    region: str = None, # Region to upload to (multi-region VPC only)
) -> Adapter
Upload existing adapter weights to create a new adapter version.
Parameters
  • local_dir: str - Path to the local directory containing the adapter files
  • repo: str - Name of the adapter repo
  • region: str, optional - The region to upload the adapter to. Only required for multi-region VPC users
Returns
  • Adapter - The uploaded adapter object
Example
# Upload adapter weights
adapter = pb.adapters.upload(
    local_dir="./my_adapter_weights",
    repo="news-summarizer-model",
)
print(f"Uploaded adapter: {adapter.name} v{adapter.tag}")

Cancel Adapter (Finetune)

pb.adapters.cancel(
    adapter_id: str,     # Name of the adapter in the format "repo/version"
) -> None
Cancel a running fine-tuning job for an adapter.
Parameters
  • adapter_id: str - ID of the adapter in the format "repo/version"
Returns
  • None - The fine-tuning job is cancelled
Example
pb.adapters.cancel("news-summarizer-model/1")

Download Adapter

pb.adapters.download(
    adapter_id: str,             # Name of the adapter in the format "repo/version"
    dest: os.PathLike = None,    # Local destination to download the weights to
) -> None
Download the adapter weights of an existing adapter.
Parameters
  • adapter_id: str - ID of the adapter in the format "repo/version"
  • dest: os.PathLike, optional - Local destination to download the weights to
Returns
  • None - The adapter weights are downloaded to the local destination
Example
# Download adapter weights
pb.adapters.download(
    adapter_id="news-summarizer-model/1",
    dest="./adapter_weights/"
)
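
If the downloaded directory contains Hugging Face PEFT-compatible LoRA weights (an assumption; this API reference does not specify the on-disk format), a sketch of loading them locally:
# Assumption: the downloaded weights are in Hugging Face PEFT format,
# and "Qwen/Qwen3-8B" is the matching Hugging Face Hub id for qwen3-8b
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-8B")
model = PeftModel.from_pretrained(base, "./adapter_weights/")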

Archive Adapter

pb.adapters.archive(
    adapter_id: str,     # Name of the adapter in the format "repo/version"
) -> None
Archive an adapter version to hide it in the UI.
Parameters
  • adapter_id: str - ID of the adapter in the format "repo/version"
Returns
  • None - The adapter version is archived
Example
# Archive an adapter
pb.adapters.archive("news-summarizer-model/1")

Unarchive Adapter

pb.adapters.unarchive(
    adapter_id: str,     # Name of the adapter in the format "repo/version"
) -> None
Unarchive an adapter version to make it visible in the UI.
Parameters
  • adapter_id: str - ID of the adapter in the format "repo/version"
Returns
  • None - The adapter version is unarchived
Example
# Unarchive an adapter
pb.adapters.unarchive("news-summarizer-model/1")

Delete Adapter

pb.adapters.delete(
    adapter_id: str,     # Name of the adapter in the format "repo/version"
) -> None
Delete an adapter version.
Parameters
  • adapter_id: str - ID of the adapter in the format "repo/version"
Returns
  • None - The adapter version is deleted
Example
# Delete an adapter
pb.adapters.delete("news-summarizer-model/1")
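
Putting the pieces together, a minimal end-to-end sketch that uses only the calls documented above (the version number "1" is hypothetical):
from predibase import Predibase, SFTConfig

pb = Predibase(api_token="<PREDIBASE_API_TOKEN>")

# Train a first adapter version, then walk it through the lifecycle
repo = pb.repos.create(name="news-summarizer-model", description="TLDR News Summarizer Experiments", exists_ok=True)
adapter = pb.adapters.create(
    config=SFTConfig(base_model="qwen3-8b"),
    dataset="tldr_news",
    repo=repo,
)

pb.adapters.download(adapter_id="news-summarizer-model/1", dest="./adapter_weights/")  # hypothetical version 1
pb.adapters.archive("news-summarizer-model/1")    # hide the version in the UI
pb.adapters.unarchive("news-summarizer-model/1")  # make it visible again
pb.adapters.delete("news-summarizer-model/1")     # remove it permanently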