davidberenstein1957 committed
Commit 18e0bee · verified · 1 Parent(s): bd290ee

Update README.md

Files changed (1): README.md (+27 −1)
README.md CHANGED

@@ -2,6 +2,8 @@
 library_name: diffusers
 tags:
 - pruna-ai
+base_model:
+- black-forest-labs/FLUX.1-Depth-dev
 ---
 
 # Model Card for PrunaAI/FLUX.1-Fill-dev-smashed
@@ -14,6 +16,8 @@ First things first, you need to install the pruna library:
 
 ```bash
 pip install pruna
+pip install git+https://github.com/asomoza/image_gen_aux.git
+
 ```
 
 You can [use the diffusers library to load the model](https://huggingface.co/PrunaAI/FLUX.1-Fill-dev-smashed?library=diffusers) but this might not include all optimizations by default.
@@ -23,9 +27,31 @@ To ensure that all optimizations are applied, use the pruna library to load the
 ```python
 from pruna import PrunaModel
 
-loaded_model = PrunaModel.from_hub(
+import torch
+from diffusers import FluxControlPipeline, FluxTransformer2DModel
+from diffusers.utils import load_image
+from image_gen_aux import DepthPreprocessor
+
+pipe = PrunaModel.from_hub(
     "PrunaAI/FLUX.1-Fill-dev-smashed"
 )
+prompt = "A robot made of exotic candies and chocolates of different kinds. The background is filled with confetti and celebratory gifts."
+control_image = load_image("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/robot.png")
+
+processor = DepthPreprocessor.from_pretrained("LiheYoung/depth-anything-large-hf")
+control_image = processor(control_image)[0].convert("RGB")
+
+image = pipe(
+    prompt=prompt,
+    control_image=control_image,
+    height=1024,
+    width=1024,
+    num_inference_steps=30,
+    guidance_scale=10.0,
+    generator=torch.Generator().manual_seed(42),
+).images[0]
+image.save("output.png")
+
 ```
 
 After loading the model, you can use the inference methods of the original model. Take a look at the [documentation](https://pruna.readthedocs.io/en/latest/index.html) for more usage information.
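The example added in this commit passes `generator=torch.Generator().manual_seed(42)` to the pipeline call to make the sampled noise, and therefore the generated image, reproducible across runs. As a minimal standalone sketch of why that works (this snippet is not part of the commit; it only demonstrates seeded generators, not the pipeline itself):

```python
import torch

# Two generators seeded with the same value produce identical random draws,
# which is how the pipeline call above yields the same image on every run.
g1 = torch.Generator().manual_seed(42)
g2 = torch.Generator().manual_seed(42)

a = torch.randn(4, generator=g1)
b = torch.randn(4, generator=g2)

assert torch.equal(a, b)  # same seed -> same noise tensor
```

Changing the seed (or omitting the generator) gives a different sample each time, which is useful when exploring prompt variations.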