OpenAI-compatible proxy gateway for Kiro IDE API (AWS CodeWhisperer)
Use Claude models through any tools that support the OpenAI API
Features • Quick Start • Configuration • API Reference • License
| Feature | Description |
|---|---|
| OpenAI-compatible API | Works with any OpenAI client out of the box |
| Extended Thinking | See how the model reasons before answering |
| Full message history | Passes complete conversation context |
| Tool Calling | Supports function calling in OpenAI format |
| Streaming | Full SSE streaming support |
| Retry Logic | Automatic retries on errors (403, 429, 5xx) |
| Extended model list | Including versioned models |
| Smart token management | Automatic refresh before expiration |
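The retry behavior can be pictured with a short sketch. This is illustrative only: `with_retries`, the backoff schedule, and the exact set of retryable codes are assumptions, not the gateway's actual code.

```python
import time

# Status codes treated as retryable (per the feature table above)
RETRYABLE = {403, 429, 500, 502, 503, 504}

def with_retries(call, max_attempts=3, base_delay=1.0):
    """Invoke `call()` until it returns a non-retryable status or attempts run out.

    `call` must return a (status, body) tuple. Backoff doubles each attempt.
    """
    for attempt in range(max_attempts):
        status, body = call()
        if status not in RETRYABLE or attempt == max_attempts - 1:
            return status, body
        time.sleep(base_delay * (2 ** attempt))  # wait before retrying
```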
- Python 3.10+
- One of the following: Kiro IDE credentials, a Kiro refresh token, AWS SSO (Builder ID) credentials, or a kiro-cli database (see Configuration)
```shell
# Clone the repository
git clone https://github.com/Jwadow/kiro-openai-gateway.git
cd kiro-openai-gateway

# Install dependencies
pip install -r requirements.txt

# Configure (see Configuration section)
cp .env.example .env
# Edit .env with your credentials

# Start the server
python main.py
```

The server will be available at http://localhost:8000
Specify the path to the credentials file:

```shell
KIRO_CREDS_FILE="~/.aws/sso/cache/kiro-auth-token.json"

# Password to protect YOUR proxy server (make up any secure string)
# You'll use this as api_key when connecting to your gateway
PROXY_API_KEY="my-super-secret-password-123"
```

#### JSON file format
```json
{
  "accessToken": "eyJ...",
  "refreshToken": "eyJ...",
  "expiresAt": "2025-01-12T23:00:00.000Z",
  "profileArn": "arn:aws:codewhisperer:us-east-1:...",
  "region": "us-east-1"
}
```

Create a `.env` file in the project root:
```shell
# Required
REFRESH_TOKEN="your_kiro_refresh_token"

# Password to protect YOUR proxy server (make up any secure string)
PROXY_API_KEY="my-super-secret-password-123"

# Optional
PROFILE_ARN="arn:aws:codewhisperer:us-east-1:..."
KIRO_REGION="us-east-1"
```

If you use kiro-cli with AWS IAM Identity Center (SSO), the gateway will automatically detect and use AWS SSO OIDC authentication.
```shell
KIRO_CREDS_FILE="~/.aws/sso/cache/your-sso-cache-file.json"

# Password to protect YOUR proxy server
PROXY_API_KEY="my-super-secret-password-123"

# Note: PROFILE_ARN is NOT needed for AWS SSO OIDC (Builder ID) users
# The gateway will work without it
```

#### AWS SSO JSON file format
AWS SSO credentials files (from `~/.aws/sso/cache/`) contain:

```json
{
  "accessToken": "eyJ...",
  "refreshToken": "eyJ...",
  "expiresAt": "2025-01-12T23:00:00.000Z",
  "region": "us-east-1",
  "clientId": "...",
  "clientSecret": "..."
}
```

Note: AWS SSO OIDC (Builder ID) users do NOT need `profileArn`. The gateway will work without it (if specified, it will be ignored).
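Both file formats carry an `expiresAt` timestamp, which is what proactive token refresh keys off. A minimal sketch of that check; the function name and the five-minute safety margin are illustrative, not the gateway's actual values:

```python
from datetime import datetime, timedelta, timezone

def needs_refresh(expires_at: str, margin_seconds: int = 300) -> bool:
    """Return True if the token expires within `margin_seconds` from now."""
    # expiresAt is ISO 8601 with a trailing 'Z'; fromisoformat wants a numeric offset
    expiry = datetime.fromisoformat(expires_at.replace("Z", "+00:00"))
    return datetime.now(timezone.utc) >= expiry - timedelta(seconds=margin_seconds)
```

A refresh loop would call this before each upstream request and renew the token whenever it returns `True`.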
#### How it works

The gateway automatically detects the authentication type based on the credentials file:

- **Kiro Desktop Auth** (default): used when `clientId` and `clientSecret` are NOT present
  - Endpoint: `https://prod.{region}.auth.desktop.kiro.dev/refreshToken`
- **AWS SSO OIDC**: used when `clientId` and `clientSecret` ARE present
  - Endpoint: `https://oidc.{region}.amazonaws.com/token`

No additional configuration is needed; just point to your credentials file!
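The detection rule above fits in a few lines. A sketch, assuming the helper name and return labels (they are not part of the gateway's API):

```python
import json

def detect_auth_type(creds_path: str) -> str:
    """Classify a credentials file the way the gateway does:
    AWS SSO OIDC if clientId/clientSecret are present, Kiro Desktop Auth otherwise."""
    with open(creds_path) as f:
        creds = json.load(f)
    if "clientId" in creds and "clientSecret" in creds:
        return "aws-sso-oidc"  # refreshed via oidc.{region}.amazonaws.com/token
    return "kiro-desktop"      # refreshed via prod.{region}.auth.desktop.kiro.dev/refreshToken
```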
If you use kiro-cli and prefer to use its SQLite database directly:

```shell
KIRO_CLI_DB_FILE="~/.local/share/kiro-cli/data.sqlite3"

# Password to protect YOUR proxy server
PROXY_API_KEY="my-super-secret-password-123"

# Note: PROFILE_ARN is NOT needed for AWS SSO OIDC (Builder ID) users
# The gateway will work without it
```

#### Database locations
| CLI Tool | Database Path |
|---|---|
| kiro-cli | ~/.local/share/kiro-cli/data.sqlite3 |
| amazon-q-developer-cli | ~/.local/share/amazon-q/data.sqlite3 |
The gateway reads credentials from the `auth_kv` table, which stores:

- `kirocli:odic:token` or `codewhisperer:odic:token`: access token, refresh token, expiration
- `kirocli:odic:device-registration` or `codewhisperer:odic:device-registration`: client ID and secret

Both key formats are supported for compatibility with different kiro-cli versions.
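Reading that table takes only the standard library. A sketch that assumes `auth_kv` is a simple key/value table with JSON-encoded values; the helper name and exact schema handling are assumptions, not the gateway's code:

```python
import json
import sqlite3

def read_cli_token(db_path: str):
    """Fetch the cached token blob from kiro-cli's auth_kv table.

    Tries both key formats; assumes auth_kv(key TEXT, value TEXT)
    with JSON-encoded values. Returns None if no token is stored.
    """
    conn = sqlite3.connect(db_path)
    try:
        for key in ("kirocli:odic:token", "codewhisperer:odic:token"):
            row = conn.execute(
                "SELECT value FROM auth_kv WHERE key = ?", (key,)
            ).fetchone()
            if row:
                return json.loads(row[0])
        return None  # not logged in
    finally:
        conn.close()
```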
For Kiro IDE users:
- Log in to Kiro IDE and use Option 1 above (JSON credentials file)
- The credentials file is created automatically after login
For Kiro CLI users:
- Log in with `kiro-cli login` and use Option 3 or Option 4 above
- No manual token extraction needed!
#### Advanced: Manual token extraction
If you need to manually extract the refresh token (e.g., for debugging), you can intercept Kiro IDE traffic:
- Look for requests to: `prod.us-east-1.auth.desktop.kiro.dev/refreshToken`
| Endpoint | Method | Description |
|---|---|---|
| `/` | GET | Health check |
| `/health` | GET | Detailed health check |
| `/v1/models` | GET | List available models |
| `/v1/chat/completions` | POST | Chat completions |
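For a quick smoke test without any SDK, these endpoints can be hit from the standard library. `build_request` is a hypothetical helper, and the API key must be your own `PROXY_API_KEY`:

```python
import json
import urllib.request

def build_request(base_url, path, api_key, payload=None):
    """Build an authenticated gateway request: GET without a payload, POST with one."""
    data = json.dumps(payload).encode() if payload is not None else None
    return urllib.request.Request(
        base_url.rstrip("/") + path,
        data=data,
        headers={
            "Authorization": f"Bearer {api_key}",  # your PROXY_API_KEY
            "Content-Type": "application/json",
        },
        method="POST" if data is not None else "GET",
    )

# With the server running:
# with urllib.request.urlopen(
#     build_request("http://localhost:8000", "/v1/models", "my-super-secret-password-123")
# ) as resp:
#     print(json.load(resp))
```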
| Model | Description |
|---|---|
| `claude-opus-4-5` | Top-tier model |
| `claude-opus-4-5-20251101` | Top-tier model (versioned) |
| `claude-sonnet-4-5` | Enhanced model |
| `claude-sonnet-4-5-20250929` | Enhanced model (versioned) |
| `claude-sonnet-4` | Balanced model |
| `claude-sonnet-4-20250514` | Balanced model (versioned) |
| `claude-haiku-4-5` | Fast model |
| `claude-3-7-sonnet-20250219` | Legacy model |
#### Simple cURL Request

```shell
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer my-super-secret-password-123" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

Note: Replace `my-super-secret-password-123` with the `PROXY_API_KEY` you set in your `.env` file.
#### Streaming Request

```shell
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer my-super-secret-password-123" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What is 2+2?"}
    ],
    "stream": true
  }'
```

#### With Tool Calling
```shell
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer my-super-secret-password-123" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "What is the weather in London?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get weather for a location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {"type": "string", "description": "City name"}
          },
          "required": ["location"]
        }
      }
    }]
  }'
```
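When the model decides to call a tool, the response carries `tool_calls` in OpenAI format with JSON-encoded arguments. A sketch of dispatching one locally; `get_weather` and `dispatch_tool_call` are stand-ins for your own code, not part of the gateway:

```python
import json

def get_weather(location: str) -> dict:
    """Stand-in for a real weather lookup."""
    return {"location": location, "forecast": "sunny"}

TOOLS = {"get_weather": get_weather}

def dispatch_tool_call(tool_call: dict):
    """Run the local function named by an OpenAI-format tool call."""
    fn = TOOLS[tool_call["function"]["name"]]
    args = json.loads(tool_call["function"]["arguments"])  # arguments arrive as a JSON string
    return fn(**args)

# Shape of a tool call in a non-streaming assistant message:
call = {
    "id": "call_1",
    "type": "function",
    "function": {"name": "get_weather", "arguments": "{\"location\": \"London\"}"},
}
print(dispatch_tool_call(call))  # {'location': 'London', 'forecast': 'sunny'}
```

The result would then be sent back as a `{"role": "tool", ...}` message to continue the conversation.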
#### Python OpenAI SDK

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="my-super-secret-password-123",  # Your PROXY_API_KEY from .env
)

response = client.chat.completions.create(
    model="claude-sonnet-4-5",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    stream=True,
)

for chunk in response:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```

#### LangChain
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:8000/v1",
    api_key="my-super-secret-password-123",  # Your PROXY_API_KEY from .env
    model="claude-sonnet-4-5",
)

response = llm.invoke("Hello, how are you?")
print(response.content)
```

Debug logging is disabled by default. To enable it, add to your `.env`:
```shell
# Debug logging mode:
# - off: disabled (default)
# - errors: save logs only for failed requests (4xx, 5xx) - recommended for troubleshooting
# - all: save logs for every request (overwrites on each request)
DEBUG_MODE=errors
```

| Mode | Description | Use Case |
|---|---|---|
| `off` | Disabled (default) | Production |
| `errors` | Save logs only for failed requests (4xx, 5xx) | Troubleshooting (recommended) |
| `all` | Save logs for every request | Development/debugging |
When enabled, requests are logged to the `debug_logs/` folder:

| File | Description |
|---|---|
| `request_body.json` | Incoming request from client (OpenAI format) |
| `kiro_request_body.json` | Request sent to Kiro API |
| `response_stream_raw.txt` | Raw stream from Kiro |
| `response_stream_modified.txt` | Transformed stream (OpenAI format) |
| `app_logs.txt` | Application logs for the request |
| `error_info.json` | Error details (only on errors) |
This project is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0).
This means:
- ✅ You can use, modify, and distribute this software
- ✅ You can use it for commercial purposes
- ⚠️ You must disclose source code when you distribute the software
- ⚠️ Network use is distribution: if you run a modified version on a server and let others interact with it, you must make the source code available to them
- ⚠️ Modifications must be released under the same license
See the LICENSE file for the full license text.
AGPL-3.0 ensures that improvements to this software benefit the entire community. If you modify this gateway and deploy it as a service, you must share your improvements with your users.
By submitting a contribution to this project, you agree to the terms of our Contributor License Agreement (CLA). This ensures that:
- You have the right to submit the contribution
- You grant the maintainer rights to use and relicense your contribution
- The project remains legally protected
Jwadow (@Jwadow)
This project is not affiliated with, endorsed by, or sponsored by Amazon Web Services (AWS), Anthropic, or Kiro IDE. Use at your own risk and in compliance with the terms of service of the underlying APIs.