rahul7star committed
Commit 3d08364 · verified · 1 Parent(s): 3a55b0c

Update app1.py

Files changed (1): app1.py +0 -9
app1.py CHANGED
@@ -1,7 +1,4 @@
-Got it—here's a cleaned-up version of your script with the extra LoRA wired **only into the low-noise stage (`transformer_2`)** of Wan 2.2. I've kept everything else intact and added the adapter with a clear name so you can tweak weights later if you want.
-
-```python
-# PyTorch 2.8 (temporary hack)
 import os
 os.system('pip install --upgrade --pre --extra-index-url https://download.pytorch.org/whl/nightly/cu126 "torch<2.9" spaces')
 
@@ -199,10 +196,4 @@ with gr.Blocks() as demo:
 
 if __name__ == "__main__":
     demo.queue().launch(mcp_server=True)
-```
-
-### Notes
-
-* The key lines are the `pipe.load_lora_weights(..., components=["transformer_2"])` and `pipe.set_adapters(..., components=["transformer_2"])`, which ensure the **orbit-shot LoRA only affects the low-noise stage**.
-* If you want to dial the LoRA's strength, change `adapter_weights=[1.0]` to something like `0.6–1.2`.
-* If you also plan to stack a "Lightning" LoRA, you can load it with a different `adapter_name` (e.g., `"lightning"`) and then call a combined `set_adapters` where `components=["transformer", "transformer_2"]` for lightning and only `["transformer_2"]` for orbit.
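The removed notes describe a component-scoped stacking scheme: a "Lightning" LoRA applied to both transformer stages and an orbit LoRA restricted to the low-noise stage. A minimal sketch of the argument shape such a combined `set_adapters` call would take, assuming the `components=` keyword and adapter names from the commit text (not verified against any diffusers release):

```python
# Sketch of the stacking arguments the removed notes describe.
# Assumptions (from the commit text, not a verified API): adapter names
# "lightning" and "orbit", and a components= keyword scoping each LoRA.

def set_adapters_kwargs(orbit_weight: float = 1.0,
                        lightning_weight: float = 1.0) -> dict:
    """Build keyword arguments for a combined set_adapters call:
    lightning affects both stages, orbit only the low-noise stage."""
    return {
        "adapter_names": ["lightning", "orbit"],
        "adapter_weights": [lightning_weight, orbit_weight],
        # lightning -> high- and low-noise stages; orbit -> low-noise only
        "components": [["transformer", "transformer_2"], ["transformer_2"]],
    }

# Example: dial the orbit LoRA down to 0.6, as the notes suggest.
kwargs = set_adapters_kwargs(orbit_weight=0.6)
```

Keeping the orbit entry scoped to `["transformer_2"]` is what the notes call the key point: the orbit-shot LoRA never touches the high-noise stage.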
 
 