├── assets
│   ├── demo.txt
│   ├── at.png
│   ├── t4.gif
│   ├── cpugpu.gif
│   ├── cpugpu.png
│   ├── image1.png
│   ├── image2.png
│   ├── image3.webp
│   ├── image4.webp
│   └── accesstokengpu.png
├── choose
│   ├── demo.txt
│   ├── 6.png
│   └── 7.png
├── assets-cpu
│   ├── demo.txt
│   ├── cpu.png
│   ├── cpu0.png
│   ├── cpu1.png
│   ├── cpu2.png
│   ├── cpu3.png
│   ├── cpu4.png
│   ├── cpu5.png
│   ├── image3.webp
│   ├── image4.webp
│   └── accesstokengpu.png
├── assets-gpu
│   ├── demo.txt
│   ├── gpu0.png
│   ├── gpu1.png
│   ├── gpu3.png
│   ├── gpu4.png
│   ├── gpu5.png
│   └── gpu6.png
├── requirements.txt
├── run-as-cpu.py
└── README.md

/assets/demo.txt:
--------------------------------------------------------------------------------


--------------------------------------------------------------------------------
/choose/demo.txt:
--------------------------------------------------------------------------------


--------------------------------------------------------------------------------
/assets-cpu/demo.txt:
--------------------------------------------------------------------------------


--------------------------------------------------------------------------------
/assets-gpu/demo.txt:
--------------------------------------------------------------------------------


--------------------------------------------------------------------------------
/choose/6.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/choose/6.png
--------------------------------------------------------------------------------
/choose/7.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/choose/7.png
--------------------------------------------------------------------------------
/assets/at.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets/at.png
--------------------------------------------------------------------------------
/assets/t4.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets/t4.gif
--------------------------------------------------------------------------------
/assets/cpugpu.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets/cpugpu.gif
--------------------------------------------------------------------------------
/assets/cpugpu.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets/cpugpu.png
--------------------------------------------------------------------------------
/assets/image1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets/image1.png
--------------------------------------------------------------------------------
/assets/image2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets/image2.png
--------------------------------------------------------------------------------
/assets-cpu/cpu.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-cpu/cpu.png
--------------------------------------------------------------------------------
/assets-cpu/cpu0.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-cpu/cpu0.png
--------------------------------------------------------------------------------
/assets-cpu/cpu1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-cpu/cpu1.png
--------------------------------------------------------------------------------
/assets-cpu/cpu2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-cpu/cpu2.png
--------------------------------------------------------------------------------
/assets-cpu/cpu3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-cpu/cpu3.png
--------------------------------------------------------------------------------
/assets-cpu/cpu4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-cpu/cpu4.png
--------------------------------------------------------------------------------
/assets-cpu/cpu5.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-cpu/cpu5.png
--------------------------------------------------------------------------------
/assets-gpu/gpu0.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-gpu/gpu0.png
--------------------------------------------------------------------------------
/assets-gpu/gpu1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-gpu/gpu1.png
--------------------------------------------------------------------------------
/assets-gpu/gpu3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-gpu/gpu3.png
--------------------------------------------------------------------------------
/assets-gpu/gpu4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-gpu/gpu4.png
--------------------------------------------------------------------------------
/assets-gpu/gpu5.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-gpu/gpu5.png
--------------------------------------------------------------------------------
/assets-gpu/gpu6.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-gpu/gpu6.png
--------------------------------------------------------------------------------
/assets/image3.webp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets/image3.webp
--------------------------------------------------------------------------------
/assets/image4.webp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets/image4.webp
--------------------------------------------------------------------------------
/assets-cpu/image3.webp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-cpu/image3.webp
--------------------------------------------------------------------------------
/assets-cpu/image4.webp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-cpu/image4.webp
--------------------------------------------------------------------------------
/assets/accesstokengpu.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets/accesstokengpu.png
--------------------------------------------------------------------------------
/assets-cpu/accesstokengpu.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/PRITHIVSAKTHIUR/How-to-run-huggingface-spaces-on-local-machine-demo/HEAD/assets-cpu/accesstokengpu.png
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
torch==2.0.0+cu118
torchvision==0.15.0+cu118
torchaudio==2.0.0+cu118
spaces
diffusers
gradio
safetensors
xformers
invisible_watermark
accelerate
transformers
peft
pillow
huggingface_hub
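
Note: the `+cu118` suffixes pin CUDA 11.8 builds of torch/torchvision/torchaudio, which are normally served from the PyTorch extra index rather than plain PyPI. A quick, optional way to confirm which build actually got installed — a minimal sketch, not a file from this repo — is:

```python
# check_env.py (hypothetical helper, not part of the repo):
# confirm the installed PyTorch build and whether a CUDA device is visible.
import torch

print("torch:", torch.__version__)                    # e.g. 2.0.0+cu118 or 2.0.0+cpu
print("CUDA available:", torch.cuda.is_available())   # False on a CPU-only machine
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```
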
--------------------------------------------------------------------------------
/run-as-cpu.py:
--------------------------------------------------------------------------------
# Authenticate with Hugging Face
from huggingface_hub import login

# Log in to Hugging Face using the provided token
hf_token = '-------------your-access-token----goes---here--------'
login(hf_token)


import gradio as gr
import numpy as np
import random
from diffusers import DiffusionPipeline
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

if torch.cuda.is_available():
    torch.cuda.max_memory_allocated(device=device)
    pipe = DiffusionPipeline.from_pretrained("stabilityai/sdxl-turbo", torch_dtype=torch.float16, variant="fp16", use_safetensors=True)
    pipe.enable_xformers_memory_efficient_attention()
    pipe = pipe.to(device)
else:
    pipe = DiffusionPipeline.from_pretrained("stabilityai/sdxl-turbo", use_safetensors=True)
    pipe = pipe.to(device)

MAX_SEED = np.iinfo(np.int32).max
MAX_IMAGE_SIZE = 1024

def infer(prompt, negative_prompt, seed, randomize_seed, width, height, guidance_scale, num_inference_steps):

    if randomize_seed:
        seed = random.randint(0, MAX_SEED)

    generator = torch.Generator().manual_seed(seed)

    image = pipe(
        prompt = prompt,
        negative_prompt = negative_prompt,
        guidance_scale = guidance_scale,
        num_inference_steps = num_inference_steps,
        width = width,
        height = height,
        generator = generator
    ).images[0]

    return image

examples = [
    "Astronaut in a jungle, cold color palette, muted colors, detailed, 8k",
    "An astronaut riding a green horse",
    "A delicious ceviche cheesecake slice",
]

css = """
#col-container {
    margin: 0 auto;
    max-width: 520px;
}
"""

if torch.cuda.is_available():
    power_device = "GPU"
else:
    power_device = "CPU"

with gr.Blocks(css=css) as demo:

    with gr.Column(elem_id="col-container"):
        gr.Markdown(f"""
        # Text-to-Image Gradio Template
        Currently running on {power_device}.
        """)

        with gr.Row():

            prompt = gr.Text(
                label="Prompt",
                show_label=False,
                max_lines=1,
                placeholder="Enter your prompt",
                container=False,
            )

            run_button = gr.Button("Run", scale=0)

        result = gr.Image(label="Result", show_label=False)

        with gr.Accordion("Advanced Settings", open=False):

            negative_prompt = gr.Text(
                label="Negative prompt",
                max_lines=1,
                placeholder="Enter a negative prompt",
                visible=False,
            )

            seed = gr.Slider(
                label="Seed",
                minimum=0,
                maximum=MAX_SEED,
                step=1,
                value=0,
            )

            randomize_seed = gr.Checkbox(label="Randomize seed", value=True)

            with gr.Row():

                width = gr.Slider(
                    label="Width",
                    minimum=256,
                    maximum=MAX_IMAGE_SIZE,
                    step=32,
                    value=512,
                )

                height = gr.Slider(
                    label="Height",
                    minimum=256,
                    maximum=MAX_IMAGE_SIZE,
                    step=32,
                    value=512,
                )

            with gr.Row():

                guidance_scale = gr.Slider(
                    label="Guidance scale",
                    minimum=0.0,
                    maximum=10.0,
                    step=0.1,
                    value=0.0,
                )

                num_inference_steps = gr.Slider(
                    label="Number of inference steps",
                    minimum=1,
                    maximum=12,
                    step=1,
                    value=2,
                )

        gr.Examples(
            examples = examples,
            inputs = [prompt]
        )

    run_button.click(
        fn = infer,
        inputs = [prompt, negative_prompt, seed, randomize_seed, width, height, guidance_scale, num_inference_steps],
        outputs = [result]
    )

demo.queue().launch()
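
In `run-as-cpu.py` above, the access token is pasted straight into the `hf_token` string. If you prefer not to keep a token in the source file, the login block could instead read it from an environment variable — an illustrative alternative, assuming a variable named `HF_TOKEN` that you export yourself; the rest of the script stays unchanged:

```python
# Alternative to the hard-coded hf_token string: read the access token from
# an environment variable (export HF_TOKEN=hf_xxx before running the script).
import os
from huggingface_hub import login

hf_token = os.environ.get("HF_TOKEN")
if not hf_token:
    raise RuntimeError("Set HF_TOKEN to your Hugging Face access token.")
login(hf_token)
```
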
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
---
title: HF_SPACE DEMO
emoji: 🐹
colorFrom: blue
colorTo: pink
sdk: gradio
sdk_version: 4.36.1
app_file: app.py
base_model: stabilityai/sdxl-turbo
model: SG161222/RealVisXL_V4.0 / other models based on the conditions
type: base_model, model
pinned: true
header: mini
theme: bethecloud/storj_theme
get_hamster_from: https://prithivhamster.vercel.app/
license: creativeml-openrail-m
short_description: Fast as Hamster | Stable Hamster | Stable Diffusion
---

## How to run HF Spaces on a local CPU (e.g. Intel i5 / AMD Ryzen 7) or on Google Colab with a T4 GPU ❓

![alt text](assets/cpugpu.gif)

# Before getting into the demo, let's first understand how Hugging Face access tokens are passed from the settings in your profile ⭐

You can find your HF token here in your profile settings: 👇🏻

https://huggingface.co/settings/tokens

![alt text](assets/at.png)

Pass the access token to log in to Hugging Face locally.

![alt text](assets/accesstokengpu.png)

Here we use a T4 GPU instead of an NVIDIA A100; the A100 is only available in Colab if you are a premium user. The T4 is free for a certain amount of computation, although it is not as powerful as the A100 or V100.

![alt text](assets/t4.gif)

## 1. Running on a T4 GPU in Google Colab: hardware accelerator

Choose the run-as-gpu.ipynb file from the repo & open it in a Colab notebook.

![alt text](choose/6.png)

In Colab, choose the T4 GPU as the runtime hardware accelerator ✅ (Google Compute Engine backend)!!

![alt text](assets-gpu/gpu0.png)

Run the cells one by one: first the requirements, second the hf_access_token (you should see "Login successful!"), third the main code block. After that, the components of the model you have linked via its model id will be loaded.

![alt text](assets-gpu/gpu4.png)

👇🏻👇🏻 After the code runs successfully, the Gradio live server prints a link like this ...

https://7770379da2bab84efe.gradio.live/

🚀 Progress

After the gradio.live page loads, the Gradio interface looks like this; enter a prompt and process it.


| ![alt text](assets-gpu/gpu3.png) | ![alt text](assets-gpu/gpu1.png) |
|---------------------------|--------------------------|


Sample results 1 & 2 from the Colab space

| ![alt text](assets-gpu/gpu5.png) | ![alt text](assets-gpu/gpu6.png) |
|---------------------------|--------------------------|

The original resulting images from the space // gradio.live

| ![alt text](assets/image1.png) | ![alt text](assets/image2.png) |
|---------------------------|--------------------------|

🚀 Working link for the Colab:

https://colab.research.google.com/drive/1rpL-UPkVpJgj5U2WXOupV0GWbBGqJ5-p
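
Before running the three cells above, you can confirm which accelerator Colab actually assigned — a minimal check (assuming the requirements cell has already installed torch), not part of the notebook itself:

```python
# Confirm the Colab runtime: with the T4 hardware accelerator selected,
# this should print something like "GPU: Tesla T4".
import torch

if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
else:
    print("No GPU assigned -- change it under Runtime > Change runtime type.")
```
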

----------------------------------------------------------------------------------------------------------------------------------------------------------------

## 2. Running on CPU, local system: hardware accelerator [ 0 ]

👇🏻 The same Hugging Face login procedure applies to this method as well!

You can find your HF token here in your profile settings: 👇🏻

https://huggingface.co/settings/tokens

![alt text](assets/at.png)

Pass the access token to log in to Hugging Face locally.

![alt text](assets/accesstokengpu.png)

Choose the run-as-cpu.py file from the repo & open it in your local code editor (e.g. VS Code).
Satisfy all of requirements.txt: `pip install -r requirements.txt`

![alt text](choose/7.png)

Install all the requirements, including:

    accelerate
    diffusers
    invisible_watermark
    torch
    transformers
    xformers

🚀 Run run-as-cpu.py with `python run-as-cpu.py`

![alt text](assets-cpu/cpu.png)

✅ After a successful run, you will see the components loading in your local code editor:

    ===== Application Startup at 2024-06-17 16:51:58 =====
    The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
    0it [00:00, ?it/s]
    0it [00:00, ?it/s]
    /usr/local/lib/python3.10/site-packages/diffusers/models/transformers/transformer_2d.py:34: FutureWarning: `Transformer2DModelOutput` is deprecated and will be removed in version 1.0.0. Importing `Transformer2DModelOutput` from `diffusers.models.transformer_2d` is deprecated and this will be removed in a future version. Please use `from diffusers.models.modeling_outputs import Transformer2DModelOutput`, instead.
    deprecate("Transformer2DModelOutput", "1.0.0", deprecation_message)
    Loading pipeline components...:   0%|          | 0/7 [00:00