Stable Diffusion Proxy Issue | Generated by AI


Based on the error and your environment, here’s how to fix the “Unknown scheme for proxy URL” issue:

Solution:

The error occurs because httpx doesn’t support socks:// proxies by default. You have two options:

Option 1: Remove the SOCKS Proxy Reference

  1. Check your environment variables for stray proxy settings:
    env | grep -i proxy
    

    Look for socks:// references (especially ALL_PROXY or http_proxy).

  2. Unset the problematic variable (replace VAR_NAME with the actual name found above):
    unset VAR_NAME  # e.g., unset ALL_PROXY
    

Option 2: Add SOCKS Support to httpx

  1. Activate your Python virtual environment:
    source /home/lzw/Projects/stable-diffusion-webui/venv/bin/activate
    
  2. Install required dependencies:
    pip install -U "httpx[socks]"
    
  3. Deactivate the environment:
    deactivate
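
Afterwards, you can confirm the extra actually installed by calling the venv’s interpreter directly (a quick sketch; socksio is the package that httpx[socks] pulls in for SOCKS support):

    /home/lzw/Projects/stable-diffusion-webui/venv/bin/python -c "import socksio, httpx; print('httpx', httpx.__version__, 'with SOCKS support')"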
    

Why this happens:

The WebUI’s network code uses httpx, which picks up proxy settings from environment variables (ALL_PROXY, http_proxy, etc.). When one of them points to a socks:// URL, httpx raises “Unknown scheme for proxy URL” because SOCKS support is an optional extra rather than part of the default installation.

Verification:

After fixing, run:

./webui.sh

The SOCKS proxy error should disappear.

💡 Note: If you didn’t intentionally set a SOCKS proxy, investigate applications that might auto-configure proxies (e.g., VPNs, network managers, or shell config files like ~/.bashrc).
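
To track down where such a proxy gets set, a quick search of the usual startup files helps (a sketch; adjust the file list to whatever your shell actually reads):

    grep -niE "socks|all_proxy" ~/.bashrc ~/.profile ~/.zshrc /etc/environment 2>/dev/null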


The error “Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check” indicates that PyTorch, used by Stable Diffusion WebUI, cannot detect or access a compatible GPU. This is likely because of missing CUDA libraries, an incompatible GPU, driver issues, or a configuration problem. Below are steps to fix this issue, tailored to your environment (glibc 2.35, Python 3.10.12, GCC 11.4.0, likely Ubuntu 22.04 based on the glibc version).

Steps to Fix the Issue

1. Verify GPU and CUDA Compatibility
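
Before changing anything, confirm the system can see the card at all. Assuming an NVIDIA GPU (CUDA implies one), the usual checks are:

   nvidia-smi        # driver version, supported CUDA version, and whether the GPU is visible
   nvcc --version    # CUDA toolkit version, if the toolkit is installed

If nvidia-smi fails or lists no device, fix the driver installation first; no PyTorch setting will help until the GPU is visible at the system level.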

2. Verify PyTorch Installation

The error suggests PyTorch is installed but cannot use the GPU. Ensure you have the correct PyTorch version with CUDA support.
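
A minimal check from inside the WebUI’s venv (a sketch; the cu121 index URL is an example and should match the CUDA version reported by nvidia-smi):

   source /home/lzw/Projects/stable-diffusion-webui/venv/bin/activate
   python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
   # If is_available() prints False, reinstall a CUDA-enabled build, for example:
   # pip install --force-reinstall torch torchvision --index-url https://download.pytorch.org/whl/cu121
   deactivate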

3. Bypass the CUDA Check (Temporary Workaround)

If you want to run Stable Diffusion without GPU support (e.g., for testing on CPU), bypass the CUDA check by adding --skip-torch-cuda-test to the command-line arguments.
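
In webui-user.sh (the settings file that webui.sh sources), this typically looks like:

   export COMMANDLINE_ARGS="--skip-torch-cuda-test"

Keep in mind that CPU-only generation is much slower; treat this as a diagnostic workaround rather than a fix.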

4. Ensure TCMalloc is Properly Configured

Your output shows TCMalloc (libtcmalloc_minimal.so.4) is detected and preloaded via LD_PRELOAD. Confirm the variable is set:

   echo $LD_PRELOAD

If it outputs /lib/x86_64-linux-gnu/libtcmalloc_minimal.so.4, you’re set. If not, set it manually:

   export LD_PRELOAD=/lib/x86_64-linux-gnu/libtcmalloc_minimal.so.4

Or add it to webui-user.sh:

   export LD_PRELOAD=/lib/x86_64-linux-gnu/libtcmalloc_minimal.so.4
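
To confirm the allocator is actually mapped into the running WebUI, one option after launch is to inspect the process’s memory maps (a sketch; assumes the launcher process matches launch.py):

   pgrep -f launch.py | head -n 1 | xargs -I{} grep -m 1 tcmalloc /proc/{}/maps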

5. Check Environment Variables and Paths

Ensure your environment is correctly set up:
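
Typical things to check (a sketch; not every variable will be set on your system):

   echo $PATH | tr ':' '\n' | grep -i cuda              # is a CUDA bin directory on PATH?
   echo $LD_LIBRARY_PATH | tr ':' '\n' | grep -i cuda   # are CUDA libraries findable?
   echo $CUDA_VISIBLE_DEVICES                           # an empty string or -1 here hides the GPU from PyTorch

CUDA_VISIBLE_DEVICES in particular can silently hide the GPU if a previous session exported it.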

6. Update Stable Diffusion WebUI

Your version (v1.10.1, commit 82a973c) may have compatibility issues. Update to the latest version:

   cd /home/lzw/Projects/stable-diffusion-webui
   git pull

Then relaunch; webui.sh will reinstall or update dependencies as needed:

   ./webui.sh

7. Troubleshooting
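
If the GPU is still not detected, PyTorch ships a helper that prints a full report of the driver, CUDA, and torch build, which is useful when comparing versions or asking for help:

   source /home/lzw/Projects/stable-diffusion-webui/venv/bin/activate
   python -m torch.utils.collect_env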

8. Run the Application

After applying the fixes, run the WebUI:

   cd /home/lzw/Projects/stable-diffusion-webui
   ./webui.sh

If you fixed the GPU issue, the error should disappear. If using --skip-torch-cuda-test, it will run on CPU.

Notes

If you still face issues or need help with specific error messages, please provide additional details (e.g., GPU model, CUDA version, or full logs), and I can refine the solution!

