Using hf-mirror.com in China | Generated by AI


Understanding hf-mirror.com

hf-mirror.com is a community-maintained mirror of the official Hugging Face Hub (huggingface.co), designed to help users in China (including Guangzhou) get around network restrictions and download AI models and datasets faster. It replicates the full repository structure, so you can access models like GPT-2 or datasets like WikiText without hitting the original site directly, which is often slow or blocked. Since you often use a Clash proxy, note that hf-mirror.com is hosted domestically and typically doesn't require a proxy; it's optimized for direct use in China. If you're already proxying traffic via Clash, you can either route hf-mirror.com traffic directly (to avoid unnecessary hops) or keep it proxied if you prefer.

Basic Setup: Using the Mirror

The key is setting the HF_ENDPOINT environment variable to point to the mirror. This works globally for Hugging Face tools like the transformers library, huggingface-cli, or hfd (a faster downloader). Do this before importing libraries or running downloads.

1. Set the Environment Variable

This redirects all Hugging Face downloads to the mirror without changing your code.
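
A minimal sketch, assuming a Python workflow (the shell equivalent is simply exporting HF_ENDPOINT before running anything):

```python
import os

# Set this before importing huggingface_hub / transformers: the endpoint is
# read when the library is first imported and defaults to huggingface.co.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"
```

Setting it persistently in your shell profile (export HF_ENDPOINT=https://hf-mirror.com) has the same effect for command-line tools such as huggingface-cli.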

2. Install Required Tools
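
The core Python packages can be installed or upgraded with pip install -U huggingface_hub transformers; hfd, mentioned above, is a separate shell script distributed by the mirror project and is set up on its own. As a small hedged check that the Python tooling is in place:

```python
import importlib.metadata as md

# Verify the Hugging Face libraries are installed and report their versions.
# If missing, install them with: pip install -U huggingface_hub transformers
for pkg in ("huggingface_hub", "transformers"):
    try:
        print(f"{pkg} {md.version(pkg)}")
    except md.PackageNotFoundError:
        print(f"{pkg} is not installed")
```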

3. Downloading Models or Datasets
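
A sketch using huggingface_hub's snapshot_download; the repo IDs below (gpt2 and wikitext, the examples mentioned earlier) are placeholders to swap for whatever you actually need:

```python
import os
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"  # as in step 1, before the import

from huggingface_hub import snapshot_download

# Download a full model repository into the local cache and return its path.
model_path = snapshot_download(repo_id="gpt2")

# Datasets live in a separate namespace, so pass repo_type="dataset".
dataset_path = snapshot_download(repo_id="wikitext", repo_type="dataset")

print(model_path)
print(dataset_path)
```

The same applies to command-line downloads with huggingface-cli download once HF_ENDPOINT is exported in your shell.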

4. Handling Gated/Logged-In Models

Some models (e.g., Llama-2) require a Hugging Face account and token:
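
A minimal sketch, assuming you have already created an access token on the official huggingface.co site and your account has been approved for the gated repo; the token string and repo ID below are placeholders:

```python
import os
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

from huggingface_hub import login, snapshot_download

# Authenticate with your Hugging Face access token (placeholder value below).
# Setting the HF_TOKEN environment variable works as an alternative to login().
login(token="hf_xxxxxxxxxxxxxxxxxxxx")

# Example gated repository - access must already have been granted to your account.
snapshot_download(repo_id="meta-llama/Llama-2-7b-hf")
```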

Integrating with Clash Proxy

Since hf-mirror.com is a Chinese mirror, it should be reachable without Clash (a direct connection is usually faster). However, if you want to proxy it (e.g., for consistency, or if you hit issues), configure Clash to route hf-mirror.com traffic through your preferred proxy group. Clash doesn't need any Hugging Face-specific configuration; its proxying applies system-wide.
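
If Clash stays on with system-wide proxy variables but you want hf-mirror.com to go direct, one hedged option is the standard NO_PROXY convention, which requests-based Python tooling generally honors:

```python
import os

# Keep the Clash proxy for everything else, but bypass it for the mirror.
# Assumes your HTTP stack respects NO_PROXY, as the requests library does.
existing = os.environ.get("NO_PROXY", "")
os.environ["NO_PROXY"] = ",".join(filter(None, [existing, "hf-mirror.com"]))
os.environ["no_proxy"] = os.environ["NO_PROXY"]  # some tools only read the lowercase form
```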

Quick Clash Setup Tips

Troubleshooting
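
If downloads stall or time out, a quick hedged first check is whether the mirror is reachable at all, both directly and through whatever proxy Clash exposes to the environment:

```python
import requests

URL = "https://hf-mirror.com"

# Direct connection: ignore any HTTP(S)_PROXY variables Clash may have set.
direct = requests.Session()
direct.trust_env = False
print("direct:", direct.get(URL, timeout=10).status_code)

# Through the system proxy (if HTTP(S)_PROXY is set by Clash, requests uses it).
print("via system proxy:", requests.get(URL, timeout=10).status_code)
```

A 200 from one path but not the other points at routing rather than the mirror itself.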

This setup should get you downloading reliably. If you run into specific errors, share the output for more help!
