# AI Apps
Dli.li AI exposes an OpenAI-compatible relay layer, so it works with many clients and tools that support OpenAI-style interfaces. This section collects the integration guides currently available on the docs site.
## Prepare these first
- A usable API endpoint
- An API key
- A valid model name
Most apps only need those three fields to connect.
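As a quick sanity check before touching any app, the three fields map directly onto a standard OpenAI-style chat request. This is a minimal sketch assuming the common `/chat/completions` path; the endpoint, key, and model name below are placeholders, so substitute the values from your own account:

```python
import json
import urllib.request

# All three values are placeholders; replace them with your own.
BASE_URL = "https://dlili.04s.net/v1"   # or your preferred endpoint
API_KEY = "sk-your-key-here"
MODEL = "gpt-4o"  # must match a model name exposed by your account

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": "Say hello."}],
    }).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Uncomment to actually send the request:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If this request returns a normal completion, the same three values should work in any of the apps below.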
## Before configuring any app
- Use the main site https://dli.li for login, payment, and account linking.
- For API requests from mainland China, prefer https://dlili.04s.net/v1.
- Make sure the model name matches the exact name exposed by your account.
- If an app offers both Anthropic-native and OpenAI-compatible modes, prefer the OpenAI-compatible mode unless you have confirmed `/messages` support for the target model.
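For reference, the two modes expect differently shaped request bodies. The field names below follow the public OpenAI and Anthropic APIs, and the model name is a placeholder:

```python
# OpenAI-compatible mode: POST <base>/chat/completions
openai_body = {
    "model": "claude-sonnet-4",  # placeholder; use your account's model name
    "messages": [
        {"role": "system", "content": "Be brief."},
        {"role": "user", "content": "Hello"},
    ],
}

# Anthropic-native mode: POST <base>/messages
anthropic_body = {
    "model": "claude-sonnet-4",
    "max_tokens": 256,       # required by the native Messages API
    "system": "Be brief.",   # system prompt is a top-level field, not a message
    "messages": [{"role": "user", "content": "Hello"}],
}
```

Apps in OpenAI-compatible mode send the first shape; Anthropic-native mode sends the second, which only works if `/messages` is supported for your model.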
## Available guides
| App | Description |
|---|---|
| AionUi | Free open-source cowork tool with multi-model support, file management, and live preview |
| CCR | Claude Code Router setup notes for routing Claude Code traffic to the compatible interface on this site |
| CC Switch | Configuration assistant for Claude, Codex, Gemini, and related CLI tools |
| Cherry Studio | Desktop AI client suitable for daily multi-model chat |
| Memoh | Containerized AI agent platform |
| OpenClaw | Self-hosted AI assistant platform with multi-channel integrations |
| Fluent Read | AI reading and translation assistant |
| LangBot | Large-language-model chatbot framework |
| Luna Translator | Real-time translator for games and documents |
| AstrBot | Open-source Agent chatbot platform |
| Claude Code | Anthropic terminal coding assistant |
| Codex CLI | OpenAI terminal coding assistant |
| Factory Droid CLI | Workflow automation and software engineering agent |
## Notes
Some pages still reuse upstream New API screenshots or UI labels because Dli.li AI keeps compatibility at the interface layer. If you add more apps later, follow the organization pattern already used in this directory.
If you mainly use Claude Code-related tooling, also check the Claude Code, CCR, and CC Switch guides listed above.
If you want to verify your endpoint and model first, send a minimal API request before configuring a full app.
