Hi Nick Matton,
Welcome to Microsoft Q&A Forum, thank you for posting your query here!
I understand that you are encountering an issue. LoRA (Low-Rank Adaptation) fine-tuning generally uses a rank between 4 and 32. This range provides effective performance while greatly reducing the number of trainable parameters and the overall memory requirements. Using LoRA, you can often attain performance levels similar to those achieved with full fine-tuning.
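As a minimal sketch of how the rank is set in practice, the snippet below assumes the Hugging Face peft and transformers libraries and uses an illustrative base model (gpt2) and target module name; adjust these for your own model.

```python
# Minimal LoRA configuration sketch (assumes `pip install peft transformers`).
# The model name and target_modules below are illustrative, not prescriptive.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model, TaskType

base_model = AutoModelForCausalLM.from_pretrained("gpt2")  # example base model

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                        # LoRA rank, typically chosen between 4 and 32
    lora_alpha=16,              # scaling factor applied to the LoRA updates
    lora_dropout=0.05,
    target_modules=["c_attn"],  # attention projection layer to adapt (model-specific)
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # shows how few parameters are actually trained
```

A smaller rank (for example 4 or 8) trains fewer parameters and uses less memory, while a larger rank (for example 16 or 32) gives the adapter more capacity at a higher cost.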
Kindly refer to the link below: huggingface
Hope this helps. If this answers your query, do click "Accept Answer" and "Yes" for "Was this answer helpful". If you have any further queries, do let us know.
Thank you.