Baidu: ERNIE 4.5 21B A3B

baidu/ernie-4.5-21b-a3b

Released: Aug 12, 2025
Knowledge cutoff: Mar 31, 2025
Context length: 120,000 tokens
Pricing: $0.07/M input tokens · $0.28/M output tokens
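The listed per-million-token rates translate directly into a per-request cost estimate. A minimal sketch, using the $0.07/M input and $0.28/M output prices shown above (the function name and token counts are illustrative):

```python
def estimate_cost_usd(input_tokens: int, output_tokens: int,
                      input_price_per_m: float = 0.07,
                      output_price_per_m: float = 0.28) -> float:
    """Estimate a request's cost from per-million-token rates."""
    return (input_tokens / 1_000_000) * input_price_per_m \
         + (output_tokens / 1_000_000) * output_price_per_m

# Example: 2,000 input tokens and 500 output tokens.
print(round(estimate_cost_usd(2_000, 500), 6))  # → 0.00028
```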

A text-based Mixture-of-Experts (MoE) model with 21B total parameters, of which 3B are activated per token. It delivers multimodal understanding and generation through heterogeneous MoE structures and modality-isolated routing, and supports a 131K-token context length. Efficient inference comes from multi-expert parallel collaboration and quantization, while post-training techniques including SFT, DPO, and UPO, combined with specialized routing and balancing losses, optimize performance across diverse tasks.
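The model is addressable on OpenRouter by the slug shown above. A minimal sketch of building a chat-completion request against OpenRouter's OpenAI-compatible endpoint, using only the standard library; the `OPENROUTER_API_KEY` environment variable and the prompt text are assumptions, not part of this page:

```python
import json
import os
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"  # OpenRouter chat-completions endpoint
MODEL = "baidu/ernie-4.5-21b-a3b"  # model slug from this page


def build_request(prompt: str) -> urllib.request.Request:
    """Build a chat-completion request; assumes OPENROUTER_API_KEY is set."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(API_URL, data=json.dumps(payload).encode(), headers=headers)


req = build_request("Say hello in one sentence.")
# To send: response = urllib.request.urlopen(req); print(json.load(response))
```

Sending the built request returns the usual chat-completion JSON, with the generated text under `choices[0].message.content`.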


Recent activity on ERNIE 4.5 21B A3B

Total usage per day on OpenRouter: not enough data to display yet.