$112B Hole — Big Tech's Hidden Bankruptcy
Data Center Race
Project Stargate plans to pour hundreds of billions of dollars into giant data centers. This is not a typo: the amounts are comparable to the GDP of small countries. Microsoft, Google, and Meta spend tens of billions annually on GPU infrastructure; Microsoft alone invested over $50 billion in capital expenditures in 2025, most of it for AI.
The problem is hidden in the accounting. H100-generation GPUs become obsolete in about 2 years as the H200, B100, and B200 ship; each new generation is 50-100% faster than the previous one. Yet corporations book depreciation over 5-6 years, creating an accounting illusion. Example: a company buys GPUs for $20 billion. On the books, after 2 years they still "cost" roughly $13 billion (straight-line depreciation over 6 years). In reality they are worth about $5 billion, because the new generation does the same work twice as fast and at lower cost.
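The gap in this example can be sketched numerically. The straight-line schedule comes from the article; the market-value model (resale value halving each year) is an illustrative assumption, calibrated so that a $20B purchase is worth about $5B after 2 years, matching the article's figure:

```python
# Book value vs. estimated market value of a hypothetical $20B GPU purchase.
# Straight-line depreciation follows the article's example; the market-value
# decay rate is an illustrative assumption, not audited data.

PURCHASE = 20_000_000_000   # $20B initial GPU spend
BOOK_LIFE_YEARS = 6         # straight-line depreciation period on the books
VALUE_HALF_LIFE = 1.0       # assumed: resale value halves every year

def book_value(years: float) -> float:
    """Straight-line book value after `years` of depreciation."""
    return max(PURCHASE * (1 - years / BOOK_LIFE_YEARS), 0.0)

def market_value(years: float) -> float:
    """Assumed market value: halves every VALUE_HALF_LIFE years."""
    return PURCHASE * 0.5 ** (years / VALUE_HALF_LIFE)

for t in (0, 2, 4):
    gap = book_value(t) - market_value(t)
    print(f"year {t}: book ${book_value(t)/1e9:.1f}B, "
          f"market ${market_value(t)/1e9:.1f}B, hidden gap ${gap/1e9:.1f}B")
```

At year 2 the book still shows ~$13.3B while the assumed market value is $5.0B: an $8B-plus "hole" on a single purchase.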
This creates a hidden deficit: the difference between the book value of assets and their real market value — trillions of dollars across the industry. When (not “if,” but “when”) auditors demand a revaluation — this could lead to massive write-offs, collapse AI company stocks, and provoke a crisis of confidence in the entire industry.
$112 Billion in OpenAI Losses
According to analysts' forecasts, OpenAI will accumulate approximately $112 billion in losses by 2030. This figure is not arbitrary: it reflects a fundamental problem with the business model of centralized AI.
On one hand, revenues are growing impressively: billions of dollars annually from ChatGPT subscriptions and API. On the other hand, expenses are growing even faster. Each new generation of models requires exponentially more resources:
- GPT-3 → GPT-4: training cost increased by approximately 10 times.
- GPT-4 → GPT-5: another steep increase, continuing the exponential cost curve.
- Inference: millions of users = billions of tokens per day = billions of dollars per year in GPU power.
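The inference-cost claim above can be illustrated with a back-of-envelope calculation. Every input here (user count, tokens per user per day, internal cost per million tokens) is an assumed placeholder chosen for illustration, not a figure from the article:

```python
# Back-of-envelope inference bill: "millions of users = billions of tokens
# per day = billions of dollars per year". All inputs are illustrative
# assumptions.
users = 500_000_000          # assumed active users
tokens_per_user_day = 2_000  # assumed tokens generated per user per day
cost_per_1m_tokens = 5.00    # assumed internal GPU cost, $ per 1M tokens

tokens_per_day = users * tokens_per_user_day   # 1 trillion tokens/day
yearly_cost = tokens_per_day / 1_000_000 * cost_per_1m_tokens * 365

print(f"{tokens_per_day/1e12:.1f}T tokens/day -> ${yearly_cost/1e9:.2f}B/year")
```

Even under these rough assumptions, serving a ChatGPT-scale audience runs into billions of dollars of GPU spend per year.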
This model only works with an endless influx of venture capital. OpenAI has raised tens of billions in investments, including rounds from Microsoft and SoftBank. But investors are not philanthropists. Sooner or later, they will demand profit. The question is not “if,” but “when” — and what will happen to the millions of businesses built on OpenAI's API at that moment?
For comparison: Gonka has raised $80M and is already processing real AI requests through a network of ~4,648 GPUs. The cost of inference is $0.0009/1M tokens. This is possible because the decentralized model doesn't need to recoup trillion-dollar investments in data centers.
Why Gonka Is Not a Bubble
Gonka doesn't build data centers — it unites existing GPUs worldwide. This is not just an alternative business model — it's a fundamentally different economic architecture that eliminates the root cause of the bubble.
No capital expenditures: the Gonka network does not raise hundreds of billions for construction. The protocol, blockchain, software — that's all the team creates. GPUs are provided by independent hosts worldwide — each at its own expense.
No depreciation stretched over 6 years: when an H100 becomes obsolete — the host simply replaces it with an H200 or the next generation. The decision is made by the equipment owner based on market conditions, not by a corporate CFO trying to hide write-offs.
No accounting tricks: all transactions on the Gonka blockchain are transparent. Rewards are distributed according to a protocol audited by CertiK. There are no “hidden” costs that will be discovered 5 years later upon asset revaluation.
Distributed risk: each host bears its own risk. If one host goes bankrupt on a bad GPU investment, that's its problem, not the network's. In a centralized model, a single $10 billion mistake can bring down the entire company. In Gonka such a mistake is impossible by design, because no single participant can make a $10 billion decision.
The result: the cost of inference through Gonka is $0.0009 per million tokens. This is ~2,800 times cheaper than OpenAI. And this price is sustainable — because there is no trillion-dollar infrastructure behind it that needs to be recouped.
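The ~2,800× figure can be checked directly from the prices in the article, taking the low end of the centralized pricing range from the comparison table below as the baseline:

```python
# Price comparison using the article's figures.
GONKA_PER_1M = 0.0009      # $ per 1M tokens via Gonka
CENTRALIZED_PER_1M = 2.50  # low end of the $2.50-$15 per 1M tokens range

ratio = CENTRALIZED_PER_1M / GONKA_PER_1M
print(f"centralized is ~{ratio:,.0f}x more expensive")  # ~2,778x, i.e. ~2,800x

# Monthly bill for an app consuming 1B tokens/month:
tokens = 1_000_000_000
print(f"Gonka:       ${tokens / 1e6 * GONKA_PER_1M:,.2f}/month")
print(f"Centralized: ${tokens / 1e6 * CENTRALIZED_PER_1M:,.2f}/month")
```

At the $15/1M high end of centralized pricing, the ratio climbs to roughly 16,700×.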
Contrast: Centralization vs Decentralization
Let's compare two AI infrastructure models:
| Parameter | Centralized AI | Decentralized AI (Gonka) |
|---|---|---|
| Capital Expenditures | Tens to hundreds of billions of $ | $0 for the protocol (hosts own the GPUs) |
| GPU Depreciation | 6 years (accounting) vs 2 years (real) | Risk on host |
| Debt | Trillions (loans, bonds) | No protocol debt |
| Scaling | Building a data center = years + billions | Organic growth (hosts connect) |
| Inference Price | $2.50–$15 / 1M tokens | $0.0009 / 1M tokens |
| Single Point of Failure | Yes (data center, company) | No (thousands of nodes) |
Gonka operates with around 4,648 GPUs across ~113 participants (~582 MLNodes). The project raised $80M, thousands of times less than a single Stargate-scale buildout. Yet the network does the same job: it processes AI requests with the Qwen3-235B model, accessible through an OpenAI-compatible API.
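Because the API is OpenAI-compatible, a request looks like a standard chat-completions call. A minimal sketch using only the Python standard library is shown below; the base URL, API key, and model identifier are placeholders, not real Gonka endpoints — substitute the actual gateway address and credentials from your provider:

```python
# Sketch of a request to an OpenAI-compatible chat-completions endpoint.
# BASE_URL, API_KEY, and the model name are placeholder assumptions.
import json
import urllib.request

BASE_URL = "https://example-gonka-gateway/v1"  # placeholder, not a real endpoint
API_KEY = "YOUR_API_KEY"                       # placeholder credential

payload = {
    "model": "Qwen3-235B",  # model identifier format is an assumption
    "messages": [
        {"role": "user", "content": "Explain GPU depreciation in one sentence."}
    ],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# With a real endpoint, uncomment to send the request and read the reply:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])

print(req.full_url)  # https://example-gonka-gateway/v1/chat/completions
```

Any client built for the OpenAI API (official SDKs included) should work the same way by pointing its base URL at the gateway.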
Analogy: imagine in the 2000s someone suggested: “Instead of building gigantic server farms for the internet, let's have every homeowner set up a mini-server and get rewarded for participation.” Sounds utopian – but that's exactly how Airbnb works for housing, Uber for transportation, and how Gonka works for AI computations. Decentralization is not a utopia – it's the next stage in infrastructure evolution.