API quick start — curl, Python, TypeScript
JoinGonka Gateway provides an OpenAI- and Anthropic-compatible API to the decentralized Gonka network. Any code written for the OpenAI API (/v1/chat/completions) works with Gonka: just change base_url and api_key. Tools that use the Anthropic API (such as Claude Code) connect via /v1/messages directly, without a proxy.
This article provides ready-to-use code examples for the three most popular tools: curl (command line), Python, and TypeScript/Node.js (OpenAI format). For Anthropic format, see the Claude Code instructions.
What you need: a JoinGonka API key (jg-xxx format). Get it for free at gate.joingonka.ai/register along with a bonus of 10M tokens.
curl — request from terminal
The fastest way to test the API is curl:
Normal request:
```bash
curl https://gate.joingonka.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer jg-your-key" \
  -d '{
    "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
    "messages": [
      {"role": "user", "content": "What is Gonka?"}
    ]
  }'
```
Streaming (response in parts):
```bash
curl https://gate.joingonka.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer jg-your-key" \
  -d '{
    "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
    "messages": [
      {"role": "user", "content": "Write hello world in Python"}
    ],
    "stream": true
  }'
```
The response comes as JSON (normal) or Server-Sent Events (streaming) — fully compatible with the OpenAI API.
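With stream: true, each chunk arrives as an SSE `data:` line carrying a JSON delta in the OpenAI streaming chunk format. A minimal parser sketch (field names follow the OpenAI chunk schema; non-data lines such as keep-alives simply yield nothing):

```python
import json

def parse_sse_line(line: str):
    """Extract the delta text from one OpenAI-style SSE line, or None."""
    if not line.startswith("data: "):
        return None                      # comment / keep-alive / blank line
    body = line[len("data: "):]
    if body.strip() == "[DONE]":
        return None                      # end-of-stream sentinel
    chunk = json.loads(body)
    return chunk["choices"][0]["delta"].get("content")
```

In practice the SDK examples below do this parsing for you; the sketch only shows what is on the wire.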
Python — openai SDK
The official OpenAI Python SDK works with JoinGonka Gateway without changes:
```bash
pip install openai
```
Normal request:
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://gate.joingonka.ai/v1",
    api_key="jg-your-key",
)

response = client.chat.completions.create(
    model="Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
    messages=[
        {"role": "user", "content": "Explain blockchain in simple terms"}
    ],
)
print(response.choices[0].message.content)
```
Streaming:
```python
stream = client.chat.completions.create(
    model="Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
    messages=[{"role": "user", "content": "Write sorting in Python"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
Tool calling:
```python
import json

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get weather in a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"}
            },
            "required": ["city"]
        }
    }
}]

response = client.chat.completions.create(
    model="Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
    messages=[{"role": "user", "content": "What is the weather in Moscow?"}],
    tools=tools,
)

tool_call = response.choices[0].message.tool_calls[0]
print(f"Function: {tool_call.function.name}")
print(f"Arguments: {tool_call.function.arguments}")
```
Qwen3-235B supports native tool calling — functions are called correctly, without parsing text responses.
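The snippet above stops at the model's function request. In a real flow you execute the function locally and return its result in a `tool` message so the model can produce a final answer. A sketch of that round trip, following the standard OpenAI tool-calling pattern (`get_weather` here is a hypothetical local stub):

```python
import json

def get_weather(city: str) -> str:
    # Hypothetical stub -- replace with a real weather lookup.
    return f"Weather in {city}: -5 C, snowing"

def complete_tool_call(client, model, messages, tools, response):
    """Execute the requested tool and send its result back to the model."""
    tool_call = response.choices[0].message.tool_calls[0]
    args = json.loads(tool_call.function.arguments)
    result = get_weather(**args)  # only one tool defined here
    followup = messages + [
        response.choices[0].message,      # assistant turn with tool_calls
        {
            "role": "tool",
            "tool_call_id": tool_call.id,  # ties the result to the request
            "content": result,
        },
    ]
    # Second request: the model turns the tool output into a final answer.
    return client.chat.completions.create(
        model=model, messages=followup, tools=tools
    )
```

The second response then contains a plain-text answer grounded in the tool result.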
TypeScript/Node.js — openai SDK
Installation:
```bash
npm install openai
```
Normal request:
```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://gate.joingonka.ai/v1',
  apiKey: 'jg-your-key',
});

async function main() {
  const response = await client.chat.completions.create({
    model: 'Qwen/Qwen3-235B-A22B-Instruct-2507-FP8',
    messages: [
      { role: 'user', content: 'Write an Express.js server' },
    ],
  });
  console.log(response.choices[0].message.content);
}

main();
```
Streaming:
```typescript
const stream = await client.chat.completions.create({
  model: 'Qwen/Qwen3-235B-A22B-Instruct-2507-FP8',
  messages: [{ role: 'user', content: 'Explain async/await' }],
  stream: true,
});
for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content || '';
  process.stdout.write(content);
}
```
Tool calling:
```typescript
const response = await client.chat.completions.create({
  model: 'Qwen/Qwen3-235B-A22B-Instruct-2507-FP8',
  messages: [{ role: 'user', content: 'Convert 100 USD to EUR' }],
  tools: [{
    type: 'function',
    function: {
      name: 'convert_currency',
      description: 'Currency conversion',
      parameters: {
        type: 'object',
        properties: {
          amount: { type: 'number' },
          from: { type: 'string' },
          to: { type: 'string' },
        },
        required: ['amount', 'from', 'to'],
      },
    },
  }],
});

const toolCall = response.choices[0].message.tool_calls?.[0];
console.log(`Function: ${toolCall?.function.name}`);
console.log(`Arguments: ${toolCall?.function.arguments}`);
```
All examples use the official OpenAI SDK — no additional libraries are required. Simply replace base_url and api_key.
Supported API parameters
JoinGonka Gateway supports all standard OpenAI Chat Completions API parameters:
| Parameter | Type | Description |
|---|---|---|
| `model` | string | Model: `Qwen/Qwen3-235B-A22B-Instruct-2507-FP8` |
| `messages` | array | Message history (system, user, assistant) |
| `stream` | boolean | Streaming generation (SSE). Default: `false` |
| `temperature` | number | Response creativity (0.0–2.0) |
| `max_tokens` | integer | Maximum response length (max: 2048, default: 1024) |
| `tools` | array | Function definitions for tool calling |
| `tool_choice` | string/object | Function-calling strategy |
Qwen3-235B model parameters: context window — 128K tokens, maximum response — 2048 tokens. Full specifications: HuggingFace. The list of models is available via GET /v1/models.
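As a sketch, the table's generation parameters can be combined in a single request. The helper name `build_request` and the specific values are illustrative; the ranges and the 2048-token cap come from the table above:

```python
def build_request(prompt: str, temperature: float = 0.7,
                  max_tokens: int = 512) -> dict:
    """Assemble Chat Completions kwargs within the gateway's documented limits."""
    assert 0.0 <= temperature <= 2.0, "temperature outside documented range"
    assert 1 <= max_tokens <= 2048, "max_tokens above the gateway cap"
    return {
        "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
        "messages": [
            {"role": "system", "content": "You are a concise assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": temperature,  # higher = more varied output
        "max_tokens": max_tokens,    # response cap (gateway maximum: 2048)
    }

# Usage: response = client.chat.completions.create(**build_request("Explain SSE"))
```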
Two endpoints:
- OpenAI format: POST https://gate.joingonka.ai/v1/chat/completions
- Anthropic format: POST https://gate.joingonka.ai/v1/messages

Authentication: Authorization: Bearer jg-your-key (OpenAI) or x-api-key: jg-your-key (Anthropic)
The response format is fully compatible with OpenAI and Anthropic—any SDK, library, or framework that supports OpenAI or Anthropic API works with JoinGonka Gateway without modifications. Claude Code connects directly via Anthropic format.
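For the Anthropic-format endpoint, a minimal stdlib-only sketch (assuming the gateway mirrors Anthropic's Messages schema, where max_tokens is required and the key is sent in the x-api-key header):

```python
import json
import urllib.request

API_KEY = "jg-your-key"

payload = {
    "model": "Qwen/Qwen3-235B-A22B-Instruct-2507-FP8",
    "max_tokens": 512,  # required by the Anthropic Messages schema
    "messages": [{"role": "user", "content": "What is Gonka?"}],
}
headers = {
    "Content-Type": "application/json",
    "x-api-key": API_KEY,  # Anthropic-style auth header
}

def send_message() -> dict:
    """POST to the Anthropic-format endpoint and return the parsed reply."""
    req = urllib.request.Request(
        "https://gate.joingonka.ai/v1/messages",
        data=json.dumps(payload).encode(),
        headers=headers,
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())
```

In Anthropic's schema the reply text lives under content blocks rather than choices; any Anthropic SDK will handle that for you.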