
Flux Block Lora Select + Lora Loader #110

Open
caniyabanci76 opened this issue Sep 10, 2024 · 5 comments

Comments

@caniyabanci76

Hi,

I noticed these nodes under Experimental. Do you have a quick explanation as to how they're meant to be used? Thanks.

@kijai
Owner

kijai commented Sep 17, 2024

It's quite experimental, but it seems to work. It simply sets the LoRA alpha value individually for each block; a value of 0 drops that block from the LoRA entirely. This can be used to test the influence of specific blocks. It affects the actual weights loaded, so you can also then save the model or extract a new LoRA.

Note that the values are alpha values. With kohya-trained LoRAs the alpha is saved in the LoRA itself; with some trainers it's not saved, and then the default is the LoRA rank.

[screenshot of the node]
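For reference, a minimal sketch of what per-block alpha scaling amounts to, assuming a kohya-style up/down factorization (the function and the block names in the example dict are hypothetical, not the node's actual code):

```python
import torch

def merge_lora_block(base_weight, lora_down, lora_up, rank, block_alpha):
    # Standard LoRA merge: W' = W + (alpha / rank) * (up @ down).
    # A block_alpha of 0 zeroes the scale, i.e. that block's LoRA
    # contribution is dropped from the merged weights.
    scale = block_alpha / rank
    delta = lora_up @ lora_down
    return base_weight + scale * delta

# Hypothetical per-block alphas for a rank-16 LoRA:
# drop double_blocks.5 entirely, run double_blocks.10 at half strength.
block_alphas = {"double_blocks.5": 0.0, "double_blocks.10": 8.0}
```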

@vigee88

vigee88 commented Sep 23, 2024

May I ask how to find a LoRA's rank? Some LoRAs don't seem to be standard sizes.
For example, a 299 MB LoRA has rank 32, but what is the rank of an 82 MB one? Is there any tool available for checking this?

@kijai
Owner

kijai commented Sep 23, 2024

There's an output from the loader node that should detect it; you'll need some node to display it, like "Display Anything".
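If you want to check outside ComfyUI, a small script like the sketch below also works. It assumes a kohya-style LoRA saved as safetensors (the filename is a placeholder): the rank is the first dimension of any lora_down weight, and the stored alpha, if present, is in the ".alpha" tensors.

```python
from safetensors import safe_open

with safe_open("my_flux_lora.safetensors", framework="pt") as f:
    for key in f.keys():
        if key.endswith("lora_down.weight"):
            # lora_down has shape (rank, in_dim), so rows = rank
            print(key, "rank:", f.get_tensor(key).shape[0])
            break
    for key in f.keys():
        if key.endswith(".alpha"):
            print(key, "alpha:", f.get_tensor(key).item())
            break
```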

@vigee88

vigee88 commented Sep 23, 2024

Sorry to bother you again, but I am unable to use the Display Anything node to obtain information about the LoRA. Is there a problem with my settings?
[screenshots of the workflow]

@kijai
Owner

kijai commented Sep 23, 2024

I meant my node that's used to select the blocks:

[screenshot of the node]
