I fine-tuned a model using LoRA and saved the checkpoint. Is it possible to load that LoRA into a model and then fine-tune BOTH the LoRA and the DiT together? I set my options as follows, but it seems that whenever I specify LoRA-specific arguments, the DiT is never trained, even when I set trainable_models to dit:
--trainable_models "dit" \
--lora_base_model "dit" \
--lora_target_modules "to_q,to_k,to_v,to_out.0,add_q_proj,add_k_proj,add_v_proj,to_add_out,linear_in,linear_out,to_qkv_mlp_proj,single_transformer_blocks.0.attn.to_out,single_transformer_blocks.1.attn.to_out,single_transformer_blocks.2.attn.to_out,single_transformer_blocks.3.attn.to_out,single_transformer_blocks.4.attn.to_out,single_transformer_blocks.5.attn.to_out,single_transformer_blocks.6.attn.to_out,single_transformer_blocks.7.attn.to_out,single_transformer_blocks.8.attn.to_out,single_transformer_blocks.9.attn.to_out,single_transformer_blocks.10.attn.to_out,single_transformer_blocks.11.attn.to_out,single_transformer_blocks.12.attn.to_out,single_transformer_blocks.13.attn.to_out,single_transformer_blocks.14.attn.to_out,single_transformer_blocks.15.attn.to_out,single_transformer_blocks.16.attn.to_out,single_transformer_blocks.17.attn.to_out,single_transformer_blocks.18.attn.to_out,single_transformer_blocks.19.attn.to_out" \
--lora_rank 64 \
--lora_checkpoint ${lora_ckpt}
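
To clarify the behavior I'm after, here is a minimal sketch of "load a saved LoRA, then train both the LoRA and the base weights". It uses Hugging Face peft rather than this repository's trainer, and TinyBlock and the lora_ckpt directory are just placeholders, so this is only meant to illustrate the intent, not the actual API here:

```python
import torch
from torch import nn
from peft import LoraConfig, PeftModel, get_peft_model

# Stand-in for the DiT: a tiny module whose projection names mirror the
# "to_q,to_k,to_v" entries in lora_target_modules above.
class TinyBlock(nn.Module):
    def __init__(self, dim=32):
        super().__init__()
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)

    def forward(self, x):
        return self.to_q(x) + self.to_k(x) + self.to_v(x)

# 1) First run: train a LoRA on the base model and save the adapter
#    ("lora_ckpt" is a made-up directory name).
base = TinyBlock()
lora = get_peft_model(base, LoraConfig(r=64, target_modules=["to_q", "to_k", "to_v"]))
lora.save_pretrained("lora_ckpt")

# 2) Second run: load the saved LoRA onto a fresh base model with
#    is_trainable=True, then unfreeze the base weights as well, so the
#    optimizer updates BOTH the DiT weights and the LoRA matrices.
model = PeftModel.from_pretrained(TinyBlock(), "lora_ckpt", is_trainable=True)
for param in model.parameters():
    param.requires_grad = True

n_trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"trainable parameters: {n_trainable}")
```

With the command-line options above, though, it looks like only the LoRA parameters end up with requires_grad=True and the base DiT stays frozen.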