Flux Block Lora Select + Lora Loader #110
It's quite experimental, but it seems to work. It simply sets the LoRA alpha value individually for each block; a value of 0 drops that block from the LoRA entirely. This can be used to test the influence of specific blocks. Because it affects the actual weights being loaded, you can also then save the model or extract a new LoRA from it. Note that the values are alpha values: with kohya-trained LoRAs the alpha is saved in the LoRA file itself, while some trainers don't save it, in which case the default is the LoRA rank.
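A minimal sketch of the idea described above, assuming kohya-style scaling where each low-rank pair contributes `(alpha / rank) * (up @ down)` to the base weight (the function name and shapes here are illustrative, not the node's actual implementation):

```python
import numpy as np

def merge_lora_weight(base, up, down, alpha):
    # Hypothetical helper: applies one block's LoRA delta to a base weight.
    # rank is the inner dimension shared by the lora_down/lora_up pair.
    rank = down.shape[0]
    scale = alpha / rank  # kohya-style scaling; alpha=0 zeroes the whole block
    return base + scale * (up @ down)

rng = np.random.default_rng(0)
base = rng.standard_normal((8, 8))
down = rng.standard_normal((4, 8))  # rank-4 "lora_down" matrix
up = rng.standard_normal((8, 4))    # matching "lora_up" matrix

merged = merge_lora_weight(base, up, down, alpha=4.0)
dropped = merge_lora_weight(base, up, down, alpha=0.0)
assert np.allclose(dropped, base)  # alpha 0: block contributes nothing
```

This also shows why the change survives saving the model: the delta is folded into the actual weights rather than applied at inference time.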
May I ask how to get a LoRA's rank? Some LoRAs don't seem to be standard sizes.
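One way to answer this, assuming a kohya-style LoRA where each `lora_down` weight has shape `(rank, in_features)`: the rank is simply the first dimension of any `lora_down` tensor. The sketch below uses a plain dict standing in for a loaded `.safetensors` state dict; the key names are illustrative:

```python
import numpy as np

def lora_ranks(state_dict):
    """Map each lora_down key to its rank (first dim of the down matrix)."""
    return {k: v.shape[0] for k, v in state_dict.items() if "lora_down" in k}

# Simulated state dict standing in for a real loaded LoRA file:
sd = {
    "lora_unet_double_blocks_0.lora_down.weight": np.zeros((16, 3072)),
    "lora_unet_double_blocks_0.lora_up.weight": np.zeros((3072, 16)),
    "lora_unet_double_blocks_0.alpha": np.array(16.0),
}
print(lora_ranks(sd))  # each lora_down's first dimension is that layer's rank
```

Different layers can have different ranks in some LoRAs, which is why sizes may look non-standard; checking each `lora_down` key individually, as above, handles that case.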
Hi,
I noticed these nodes under Experimental. Do you have a quick explanation as to how they're meant to be used? Thanks.