pb.adapters.create(
    config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig,  # Configuration for fine-tuning
    dataset: str,                       # The dataset to use for fine-tuning
    repo: str,                          # Name of the adapter repo
    continue_from_version: str = None,  # The adapter version to continue training from
    description: str = None,            # Description for the adapter
    show_tensorboard: bool = False      # If true, launch tensorboard instance
) -> Adapter
Create a new adapter by starting a blocking fine-tuning job.

Parameters
config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig - Configuration for fine-tuning
dataset: str - The dataset to use for fine-tuning
repo: str - Name of the adapter repo to store the newly created adapter
continue_from_version: str, optional - The adapter version to continue training from
description: str, optional - Description for the adapter
show_tensorboard: bool, default False - If true, launch a tensorboard instance to view training logs
Returns
Adapter - The created adapter object
Example 1: Create an adapter with defaults
# Create an adapter repository
repo = pb.repos.create(
    name="news-summarizer-model",
    description="TLDR News Summarizer Experiments",
    exists_ok=True
)

# Start a fine-tuning job; blocks until training is finished
adapter = pb.adapters.create(
    config=SFTConfig(
        base_model="qwen3-8b"
    ),
    dataset="tldr_news",
    repo=repo,
    description="initial model with defaults"
)
Example 2: Create a new adapter with customized parameters
# Create an adapter repository
repo = pb.repos.create(
    name="news-summarizer-model",
    description="TLDR News Summarizer Experiments",
    exists_ok=True
)

# Start a fine-tuning job with custom parameters; blocks until training is finished
adapter = pb.adapters.create(
    config=SFTConfig(
        base_model="qwen3-8b",
        task="instruction_tuning",
        epochs=1,              # default: 3
        rank=8,                # default: 16
        learning_rate=0.0001,  # default: 0.0002
        target_modules=["q_proj", "v_proj", "k_proj"],  # default: None (infers [q_proj, v_proj] for qwen3-8b)
    ),
    dataset="tldr_news",
    repo=repo,
    description="changing epochs, rank, and learning rate"
)
Example 3: Continue training from an existing adapter
adapter = pb.adapters.create(
    # Note: only `epochs` and `enable_early_stopping` are available parameters in this case.
    config=SFTConfig(
        epochs=3,  # The maximum number of ADDITIONAL epochs to train for
        enable_early_stopping=False,
    ),
    continue_from_version="myrepo/3",  # The adapter version to resume training from
    dataset="mydataset",
    repo="myrepo"
)
pb.finetuning.jobs.create(
    config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig,  # Configuration for fine-tuning
    dataset: str,                       # The dataset to use for fine-tuning
    repo: str,                          # Name of the adapter repo
    continue_from_version: str = None,  # The adapter version to continue training from
    description: str = None,            # Description for the adapter
    watch: bool = False,                # Whether to block until job finishes
    show_tensorboard: bool = False      # If true, launch tensorboard instance
) -> FinetuningJob
Start a non-blocking fine-tuning job that creates an adapter.

Parameters
config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig - Configuration for fine-tuning
dataset: str - The dataset to use for fine-tuning
repo: str - Name of the adapter repo to store the newly created adapter
continue_from_version: str, optional - The adapter version to continue training from
description: str, optional - Description for the adapter
watch: bool, default False - Whether to block until the job finishes
show_tensorboard: bool, default False - If true, launch a tensorboard instance to view training logs
Returns
FinetuningJob - The created fine-tuning job object
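Example: Start a detached fine-tuning job

A minimal sketch reusing the dataset and repo names from the examples above; with watch=False the call returns immediately and training continues server-side. The print call is illustrative, since the fields of the returned FinetuningJob are not documented here.

# Start a fine-tuning job without blocking the current process
job = pb.finetuning.jobs.create(
    config=SFTConfig(
        base_model="qwen3-8b"
    ),
    dataset="tldr_news",
    repo="news-summarizer-model",
    watch=False,  # return the job handle immediately instead of streaming progress
)
print(job)  # inspect the returned FinetuningJob handle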
pb.finetuning.jobs.estimate_cost(
    config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig,  # Configuration for fine-tuning
    dataset: str,  # The dataset to use for fine-tuning
) -> FinetuningJob
Estimate how much it will cost to fine-tune an adapter given its config and dataset. It is invoked automatically whenever an adapter is created.

Parameters
config: SFTConfig | ContinuedPretrainingConfig | GRPOConfig - Configuration for fine-tuning
dataset: str - The dataset to use for fine-tuning
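Example: Estimate fine-tuning cost

A minimal sketch reusing the config and dataset from the examples above; the shape of the returned estimate object is not documented here, so the print call is illustrative.

# Request a cost estimate before committing to a training run
estimate = pb.finetuning.jobs.estimate_cost(
    config=SFTConfig(
        base_model="qwen3-8b"
    ),
    dataset="tldr_news",
)
print(estimate)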
pb.adapters.upload(
    local_dir: str,  # Path to the local directory containing the adapter files
    repo: str,       # Name of the adapter repo
    region: str,     # Region to upload to
) -> Adapter
Upload existing adapter weights to create a new adapter version. A usage sketch follows the parameter list below.

Parameters
local_dir: str - Path to the local directory containing the adapter files
repo: str - Name of the adapter repo
region: str - The region to upload the adapter to. Only required for multi-region VPC users
Returns
Adapter - The newly created adapter object
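Example: Upload locally trained adapter weights

A minimal sketch; the local directory path is a hypothetical placeholder, the repo name is reused from the examples above, and region is omitted since it is only required for multi-region VPC users.

# Upload adapter weights from disk as a new version of the repo
adapter = pb.adapters.upload(
    local_dir="./my-local-adapter",  # hypothetical directory containing the adapter files
    repo="news-summarizer-model",
)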
pb.adapters.download(
    adapter_id: str,           # ID of the adapter in the format "repo/version"
    dest: os.PathLike = None,  # Local destination to download the weights to
) -> None
Download the weights of an existing adapter. A usage sketch follows the returns section below.

Parameters
adapter_id: str - ID of the adapter in the format "repo/version"
dest: os.PathLike, optional - Local destination to download the weights to
Returns
None - The adapter weights are downloaded to the local destination
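Example: Download adapter weights

A minimal sketch; the adapter version and destination directory are hypothetical placeholders following the "repo/version" format described above.

# Fetch the weights of version 1 of the adapter repo to a local directory
pb.adapters.download(
    adapter_id="news-summarizer-model/1",  # hypothetical version in "repo/version" format
    dest="./downloaded-adapter",           # hypothetical local destination
)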