The Kimi K2.5 GitHub ecosystem is rapidly growing as developers build tools, integrations, and applications around Moonshot AI's flagship model. This comprehensive guide covers official repositories, community projects, SDKs, and resources to accelerate your Kimi K2.5 development.
Official Kimi K2.5 GitHub Resources
Moonshot AI Official Repositories
| Repository | Description | URL |
|---|---|---|
| kimi-cli | Official Kimi Code CLI | github.com/MoonshotAI/kimi-cli |
| Kimi-K2.5 | Model weights & docs | huggingface.co/moonshotai/Kimi-K2.5 |
| MoonshotAI-Cookbook | Official API examples | github.com/MoonshotAI/MoonshotAI-Cookbook |
Kimi Code CLI Repository
The Kimi Code CLI is Moonshot AI's official terminal-based coding agent:
# Clone the repository
git clone https://github.com/MoonshotAI/kimi-cli.git
# Install from source
cd kimi-cli
pip install -e .
Key Features:
- Terminal-based AI coding assistant
- ACP (Agent Client Protocol) support
- MCP (Model Context Protocol) integration
- Multi-provider support
Repository Structure:
kimi-cli/
├── kimi/ # Main package
├── docs/ # Documentation
├── tests/ # Test suite
├── examples/ # Usage examples
└── README.md       # Getting started guide
Community Kimi K2.5 Projects
Public Repositories to Watch
The following repositories are currently verifiable and active:
1. zsh-kimi-cli
- Official Zsh plugin for Kimi Code CLI
- URL: github.com/MoonshotAI/zsh-kimi-cli
2. kimi-code-zed-extension
- Kimi Code extension for Zed editor
- URL: github.com/MoonshotAI/kimi-code-zed-extension
3. kimi-agent-sdk
- Agent SDK resources from Moonshot AI
- URL: github.com/MoonshotAI/kimi-agent-sdk
4. kimi-agent-rs
- Rust ecosystem repository for Kimi agent tooling
- URL: github.com/MoonshotAI/kimi-agent-rs
Starter Templates
No single official starter-template repo is listed for Kimi K2.5. A safe approach is to scaffold your app first, then wire an OpenAI-compatible client.
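That wiring can be reduced to a few lines. The sketch below centralizes the endpoint configuration so the same code can target Moonshot's hosted API or a self-hosted server; the environment variable names `MOONSHOT_API_KEY` and `KIMI_BASE_URL` are illustrative conventions, not official settings.

```python
import os

def kimi_client_config(env=os.environ):
    """Return kwargs for an OpenAI-compatible client.

    Reads illustrative (not official) environment variables so the
    endpoint can be swapped without code changes.
    """
    return {
        "api_key": env.get("MOONSHOT_API_KEY", ""),
        "base_url": env.get("KIMI_BASE_URL", "https://api.moonshot.cn/v1"),
    }

# Usage with the official openai package:
#   from openai import OpenAI
#   client = OpenAI(**kimi_client_config())
```

Keeping the base URL configurable means the starter projects in this section work unchanged against a local vLLM deployment.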
Next.js + Kimi K2.5 Starter (manual scaffold)
npx create-next-app@latest my-kimi-app
cd my-kimi-app
npm install openai
Python FastAPI + Kimi K2.5 Starter (manual scaffold)
mkdir kimi-fastapi-starter && cd kimi-fastapi-starter
python -m venv .venv && source .venv/bin/activate
pip install fastapi uvicorn openai
Kimi K2.5 SDKs and Libraries
Python (OpenAI-Compatible SDK)
pip install openai
from openai import OpenAI
client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.moonshot.cn/v1"
)
# Simple completion
response = client.chat.completions.create(
    model="kimi-k2-5",
    messages=[{"role": "user", "content": "Hello!"}]
)
# Streaming (delta.content can be None on some chunks, so fall back to "")
for chunk in client.chat.completions.create(
    model="kimi-k2-5",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
):
    print(chunk.choices[0].delta.content or "", end="")
JavaScript/TypeScript (OpenAI-Compatible SDK)
npm install openai
import OpenAI from 'openai';
const client = new OpenAI({
  apiKey: 'your-api-key',
  baseURL: 'https://api.moonshot.cn/v1',
});
const response = await client.chat.completions.create({
  model: 'kimi-k2-5',
  messages: [{ role: 'user', content: 'Explain TypeScript' }],
});
Go (OpenAI-Compatible via HTTP)
package main

import (
    "bytes"
    "fmt"
    "io"
    "net/http"
)

func main() {
    payload := []byte(`{
        "model":"kimi-k2-5",
        "messages":[{"role":"user","content":"Hello!"}]
    }`)
    req, _ := http.NewRequest("POST", "https://api.moonshot.cn/v1/chat/completions", bytes.NewBuffer(payload))
    req.Header.Set("Authorization", "Bearer YOUR_API_KEY")
    req.Header.Set("Content-Type", "application/json")
    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    body, _ := io.ReadAll(resp.Body)
    fmt.Println(string(body))
}
Integration Examples
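Whichever client you use, production calls should tolerate transient failures such as rate limits and timeouts. Below is a minimal retry-with-exponential-backoff sketch (stdlib only; `with_retries` is not part of any official SDK, and `call` stands in for any of the requests shown in this guide):

```python
import time

def with_retries(call, retries=3, base_delay=1.0, retryable=(Exception,)):
    """Invoke call() with exponential backoff: wait 1s, 2s, 4s, ... between tries."""
    for attempt in range(retries + 1):
        try:
            return call()
        except retryable:
            if attempt == retries:
                raise  # out of attempts: surface the last error
            time.sleep(base_delay * (2 ** attempt))

# Usage, e.g. with the Python SDK examples above:
#   response = with_retries(lambda: client.chat.completions.create(
#       model="kimi-k2-5",
#       messages=[{"role": "user", "content": "Hello!"}]))
```

In real code, narrow `retryable` to the SDK's rate-limit and timeout exception types rather than catching everything.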
OpenAI SDK Compatible
import openai
# Drop-in replacement
client = openai.OpenAI(
    api_key="your-kimi-api-key",
    base_url="https://api.moonshot.cn/v1"
)
# Works with existing OpenAI code
response = client.chat.completions.create(
    model="kimi-k2-5",
    messages=[{"role": "user", "content": "Hello"}]
)
LangChain Integration
from langchain_openai import ChatOpenAI
from langchain_core.prompts import PromptTemplate

# Use with LangChain
llm = ChatOpenAI(
    model="kimi-k2-5",
    api_key="your-kimi-api-key",
    base_url="https://api.moonshot.cn/v1"
)

template = """Answer the following question:
Question: {question}
Answer: """
prompt = PromptTemplate.from_template(template)

# Compose with the runnables API (LLMChain and chain.run are deprecated)
chain = prompt | llm
response = chain.invoke({"question": "What is machine learning?"})
LlamaIndex Integration
from llama_index.llms.openai import OpenAI
from llama_index.core import Settings

Settings.llm = OpenAI(
    model="kimi-k2-5",
    api_key="your-kimi-api-key",
    api_base="https://api.moonshot.cn/v1"
)

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("Summarize the documents")
Docker and Deployment
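Self-hosted deployments expose the same OpenAI-compatible HTTP surface, so a readiness check can simply list the served models. Here is a stdlib-only probe sketch; the default URL assumes a local vLLM server on port 8000, matching the configurations in this section:

```python
import json
import urllib.request

def probe_models(base_url="http://localhost:8000/v1", timeout=5):
    """Return the /models listing from an OpenAI-compatible server."""
    with urllib.request.urlopen(f"{base_url}/models", timeout=timeout) as resp:
        return json.load(resp)

# A healthy vLLM deployment should list the served model id, e.g.
# "moonshotai/Kimi-K2.5", under probe_models()["data"].
```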
Docker Compose Setup
version: '3.8'
services:
  kimi-api:
    image: vllm/vllm-openai:latest
    command: >
      --model moonshotai/Kimi-K2.5
      --tensor-parallel-size 4
    ports:
      - '8000:8000'
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 4
              capabilities: [gpu]
  kimi-web:
    build: ./web
    ports:
      - '3000:3000'
    environment:
      - KIMI_API_URL=http://kimi-api:8000
Kubernetes Manifest
apiVersion: v1
kind: ConfigMap
metadata:
  name: kimi-config
data:
  MODEL_NAME: 'moonshotai/Kimi-K2.5'
  TENSOR_PARALLEL: '4'
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: kimi-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: kimi
  template:
    metadata:
      labels:
        app: kimi
    spec:
      containers:
        - name: vllm
          image: vllm/vllm-openai:latest
          envFrom:
            - configMapRef:
                name: kimi-config
          resources:
            limits:
              nvidia.com/gpu: '4'
GitHub Actions CI/CD
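The testing workflow below runs `pytest tests/` with a `KIMI_API_KEY` secret. A minimal smoke test it might execute is sketched here, written so the API call is skipped (not failed) when the secret is absent, e.g. on pull requests from forks; the file name and test shape are illustrative, not prescribed by any Moonshot repository:

```python
# tests/test_smoke.py (illustrative)
import os
import unittest

class KimiSmokeTest(unittest.TestCase):
    def test_chat_completion_roundtrip(self):
        key = os.environ.get("KIMI_API_KEY")
        if not key:
            self.skipTest("KIMI_API_KEY not set")  # e.g. forked PRs
        from openai import OpenAI  # lazy import: only needed with a key
        client = OpenAI(api_key=key, base_url="https://api.moonshot.cn/v1")
        resp = client.chat.completions.create(
            model="kimi-k2-5",
            messages=[{"role": "user", "content": "ping"}],
        )
        self.assertTrue(resp.choices[0].message.content)
```

pytest collects `unittest.TestCase` classes automatically, so this runs under the workflow's `pytest tests/` step unchanged.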
Automated Testing
name: Kimi K2.5 Tests
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'
      - name: Install dependencies
        run: |
          pip install -r requirements.txt
          pip install pytest
      - name: Run tests
        env:
          KIMI_API_KEY: ${{ secrets.KIMI_API_KEY }}
        run: pytest tests/
Model Deployment Pipeline
name: Deploy Kimi K2.5
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Deploy to Kubernetes
        run: |
          kubectl apply -f k8s/
          kubectl rollout status deployment/kimi-deployment
Community Contributions
Contributing Guidelines
When contributing to Kimi K2.5 projects:
- Fork the repository
- Create a feature branch: git checkout -b feature/amazing-feature
- Commit changes: git commit -m 'Add amazing feature'
- Push to branch: git push origin feature/amazing-feature
- Open a Pull Request
Popular Contribution Areas
| Area | Description | Skills Needed |
|---|---|---|
| SDK Development | Language bindings | Python, JS, Go, Rust |
| Integrations | Framework plugins | Framework APIs |
| Documentation | Tutorials, guides | Technical writing |
| Examples | Demo applications | Full-stack dev |
| Testing | Bug reports, QA | Testing methodologies |
Code Examples Repository
Complete Project Examples
# Clone official cookbook
git clone https://github.com/MoonshotAI/MoonshotAI-Cookbook.git
cd MoonshotAI-Cookbook
# Browse language/framework examples
find examples -maxdepth 2 -type f | head
Running Examples
# Pick an example folder and run according to its README
cd examples
ls
Issue Tracking and Support
GitHub Issues
Report issues on official repositories:
- Bugs: Include reproduction steps, environment details
- Features: Describe use case and expected behavior
- Documentation: Point to unclear sections
Community Support
- GitHub Discussions: Q&A and feature requests
- Discord: Real-time community support
- Stack Overflow: Tag questions with kimi-k2-5
FAQ
Where can I find official Kimi K2.5 code examples?
Use the MoonshotAI-Cookbook and the examples directory in kimi-cli.
Is Kimi K2.5 open source?
The Kimi Code CLI is open source (Apache 2.0). The model weights are available under a Modified MIT License with some commercial restrictions.
How do I contribute to Kimi K2.5 projects?
Fork the relevant repository, make your changes, and submit a pull request. Check each repo's CONTRIBUTING.md for specific guidelines.
Are there starter templates for Kimi K2.5?
Community templates for frameworks like Next.js and FastAPI appear on GitHub, but coverage varies; search for "kimi-k2-5 starter" or "kimi-k2-5 template", or scaffold manually with the commands in this guide.
Can I self-host Kimi K2.5 using GitHub resources?
Yes, deployment configs for Docker and Kubernetes are available in community repositories and the official documentation.