Coming Soon

The Open Source Hub for AI Models

MatrixHub is an open-source, self-hosted AI model registry engineered for large-scale enterprise inference. It serves as a drop-in private replacement for Hugging Face, purpose-built to accelerate vLLM and SGLang workloads.

matrixhub-ai/matrixhub

MatrixHub is to Hugging Face what Harbor is to Docker Hub.

Stop relying on the public internet for mission-critical AI. Control your assets, accelerate your pipelines.

Core Features

Infrastructure Designed for Scale

Built for the specific needs of SREs and Algorithm Engineers managing massive model weights.

Transparent HF Proxy

Drop-in replacement for Hugging Face. Point your HF_ENDPOINT to MatrixHub and keep all training/inference code unchanged.
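In practice, switching a client over is a one-line environment change. The sketch below assumes a hypothetical internal hostname (`matrixhub.internal.example.com`); any Hugging Face tooling that honors `HF_ENDPOINT` (such as `huggingface_hub` and `huggingface-cli`) would then resolve downloads through the proxy instead of the public hub.

```shell
# Point Hugging Face clients at a private MatrixHub instance.
# The hostname below is a placeholder, not a real deployment.
export HF_ENDPOINT=https://matrixhub.internal.example.com

# Existing training/inference code keeps working unchanged, e.g.:
#   huggingface-cli download meta-llama/Llama-3.1-8B-Instruct
#   python train.py  # any script using huggingface_hub / transformers

echo "HF downloads now route through: $HF_ENDPOINT"
```

Because only the endpoint changes, no model IDs, scripts, or cache paths need to be edited.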

On-Demand Caching

Pull once, cache forever. Automatically localizes public models to slash redundant traffic and accelerate cluster-wide distribution.

RBAC & Audit Logs

Fine-grained permissions, project-based isolation, and comprehensive audit trails for every upload and download.

Storage Agnostic

Compatible with local filesystems, NFS, and S3-compatible backends (MinIO, AWS). Scale to unlimited model capacity.
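As an illustration of what an S3-compatible backend setup might look like, here is a hedged sketch using environment variables. The variable names and values are assumptions for this example, not MatrixHub's documented settings; consult the project docs for the actual configuration keys.

```shell
# Hypothetical storage configuration for an S3-compatible backend.
# Variable names are illustrative, not MatrixHub's real settings.
export STORAGE_BACKEND=s3
export S3_ENDPOINT=http://minio.internal:9000   # MinIO, or an AWS S3 endpoint
export S3_BUCKET=matrixhub-models
export S3_ACCESS_KEY=minioadmin                 # use real credentials in production
export S3_SECRET_KEY=minioadmin
```

The same shape applies to any S3-compatible store; a local filesystem or NFS backend would instead point at a mounted path.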

Key Use Cases

How organizations use MatrixHub in production.

View Architecture

Zero-Wait Distribution

Eliminate bandwidth bottlenecks with a "pull once, serve all" cache. Achieve 10 Gbps+ throughput across 100+ GPU nodes simultaneously.

Air-Gapped Delivery

Securely ferry models into isolated networks with integrity protection, malware scanning, and comprehensive audit trails.

Private Registry

Centralize fine-tuned weights with tag locking and CI/CD integration. Guarantee consistency from development to production.

Global Multi-Region Sync

Automate asynchronous, resumable replication between data centers for high availability and low-latency local access.

SEAMLESSLY INTEGRATED WITH

vLLM
SGLang
Kubernetes

Ready to take control of your AI models?

Deploy MatrixHub in minutes using Docker Compose or Helm. Open source and free for the community.

Read the Docs
curl -fsSL https://bit.ly/4qqSZIG | docker compose -f - up -d