United States - Ekhbary News Agency
Claude Code's High Costs Spark Developer Uprising; Goose Emerges as Free Alternative
Artificial intelligence promises to transform how software is written, debugged, and deployed. That advancement, however, comes with a significant price tag, and friction within the developer community is growing. Anthropic's Claude Code, a sophisticated AI agent designed for terminal-based operations, has garnered considerable attention for its autonomous coding capabilities. Yet its tiered pricing, ranging from $20 to $200 per month based on usage, has ignited a rebellion among the very programmers it aims to empower.
Amidst this growing frustration, a powerful open-source alternative, Goose, has rapidly gained traction. Developed by Block, the financial technology company formerly known as Square, Goose offers functionality nearly identical to Claude Code but runs entirely on a user's local machine. This model eliminates subscription fees, cloud dependency, and the restrictive rate limits that often hamper productivity. "Your data stays with you, period," emphasized Parth Sareen, a software engineer who demonstrated the tool. That core appeal resonates deeply with developers concerned about data privacy and operational freedom: complete control over an AI-powered workflow that keeps working offline, even on an airplane.
The project's popularity has surged dramatically. Goose has amassed over 26,100 stars on GitHub, a testament to its growing adoption and utility. With 362 contributors and 102 releases since its inception, including version 1.20.1 shipped on January 19, 2026, its development pace rivals that of commercial products. For developers weary of Claude Code's restrictive pricing and usage caps, Goose represents a rare commodity in the AI landscape: a truly free, no-strings-attached solution for professional-grade development work.
Anthropic's Pricing Controversy and Developer Revolt
To fully appreciate the significance of Goose, understanding the controversy surrounding Claude Code's pricing is essential. Anthropic, a San Francisco-based AI firm founded by former OpenAI executives, integrates Claude Code into its subscription tiers. The free plan offers no access, while the "Pro" plan, at $17/month (billed annually) or $20/month, imposes a strict limit of 10 to 40 prompts every five hours—a constraint that many developers find insufficient for even brief, intensive coding sessions.
The higher "Max" plans, priced at $100 and $200 per month, offer increased capacity, providing 50-200 and 200-800 prompts respectively, along with access to Anthropic's most advanced model, Claude Opus 4.5. Even these premium offerings, however, are subject to restrictions that have further inflamed the developer community. In late July, Anthropic introduced new weekly rate limits: Pro users are allocated 40-80 hours of Sonnet 4 usage per week, while $200 Max tier users receive 240-480 hours of Sonnet 4 and 24-40 hours of Opus 4. Months later, the developer frustration remains palpable.
The core issue lies in the interpretation of these "hours." They are not literal time-based units but rather token-based limits that fluctuate based on factors like codebase size, conversation length, and code complexity. Independent analyses suggest that these limits translate to approximately 44,000 tokens for Pro users and 220,000 tokens for the $200 Max plan per session. "It's confusing and vague," one developer noted in a widely circulated analysis, criticizing the lack of practical utility in the provided metrics. The backlash has been fierce across platforms like Reddit and developer forums, with users reporting hitting daily limits within 30 minutes of intensive work and others canceling subscriptions outright, deeming the restrictions "a joke" and "unusable for real work." Anthropic has defended the changes, stating they affect fewer than five percent of users and target continuous, 24/7 usage. However, the company has not clarified whether this percentage applies to all users or only Max subscribers, a distinction with significant implications.
Block's Innovative Approach: The Offline AI Coding Agent
Goose offers a fundamentally different paradigm. Built by Block, the company led by Jack Dorsey, Goose operates as an "on-machine AI agent." Unlike cloud-based solutions like Claude Code, Goose processes queries locally, leveraging open-source language models that users download and manage on their own hardware. Its documentation emphasizes its ability to "install, execute, edit, and test with any LLM," highlighting its model-agnostic design.
This flexibility allows developers to connect Goose to various AI models, including Anthropic's Claude (via API), OpenAI's GPT-5, Google's Gemini, or route through services like Groq or OpenRouter. Crucially, it enables fully local operation using tools like Ollama. The practical benefits are substantial: no subscription fees, no usage caps, no rate limits, and the assurance that sensitive code and conversations remain on the user's machine. "I use Ollama all the time on planes — it's a lot of fun!" shared Sareen, underscoring the freedom from internet connectivity constraints.
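In practice, pointing Goose at a local Ollama model typically comes down to a couple of configuration values. The fragment below is an illustrative sketch only: the file location and key names reflect common Goose setups but may differ across Goose releases, so treat them as assumptions and consult the project's documentation for your version.

```yaml
# Illustrative Goose configuration (often ~/.config/goose/config.yaml).
# Key names are assumptions and may vary by Goose release.
GOOSE_PROVIDER: ollama
GOOSE_MODEL: qwen2.5
```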
Goose's Advanced Capabilities Beyond Traditional Assistants
Goose functions as a command-line tool or desktop application capable of autonomous, complex development tasks. It can initiate projects, write and execute code, debug issues, manage multi-file workflows, and interact with external APIs without constant human intervention. Its architecture relies on advanced "tool calling" or "function calling" capabilities, allowing the language model to trigger specific actions within external systems. When a user requests Goose to create a file, run tests, or check a GitHub pull request status, it executes these operations directly.
The effectiveness of these operations depends on the underlying language model. While Anthropic's Claude 4 models currently lead in tool-calling performance according to the Berkeley Function-Calling Leaderboard, open-source models are rapidly improving. Goose's documentation recommends models like Meta's Llama series, Alibaba's Qwen, Google's Gemma variants, and DeepSeek's reasoning architectures for strong tool-calling support. Furthermore, Goose integrates with the emerging Model Context Protocol (MCP), expanding its reach to databases, search engines, file systems, and third-party APIs, significantly enhancing its utility beyond the base LLM's capabilities.
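As a rough illustration of the tool-calling loop described above (not Goose's actual implementation), the sketch below shows the core idea in Python: the model emits a structured request naming a tool and its arguments, and the agent dispatches that request to a real function. The tool names and JSON shape here are hypothetical, chosen for clarity.

```python
import json

# Hypothetical tool registry: functions the agent exposes to the model.
def create_file(path: str, content: str) -> str:
    # A real agent would write to disk; this sketch just reports the action.
    return f"created {path} ({len(content)} bytes)"

def run_tests(target: str) -> str:
    return f"ran tests in {target}: all passed"

TOOLS = {"create_file": create_file, "run_tests": run_tests}

def dispatch(tool_call_json: str) -> str:
    """Parse a model-emitted tool call and execute the matching function."""
    call = json.loads(tool_call_json)
    fn = TOOLS.get(call["name"])
    if fn is None:
        return f"unknown tool: {call['name']}"
    return fn(**call["arguments"])

# A tool-calling model would emit something like this:
reply = '{"name": "create_file", "arguments": {"path": "hello.py", "content": "print(1)"}}'
print(dispatch(reply))  # prints: created hello.py (8 bytes)
```

Models that score well on tool calling are simply better at producing this kind of well-formed, correctly-argued request, which is why the underlying model matters so much to Goose's effectiveness.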
Setting Up Goose with a Local Model
For developers seeking a completely free and privacy-preserving setup, the process involves three key components: Goose, Ollama, and a compatible language model.
Step 1: Install Ollama
Ollama simplifies running large language models locally. Download and install it from ollama.com, then pull and run a model with a single command. For coding tasks, Qwen 2.5 is a solid choice:
ollama run qwen2.5
The model downloads and runs on your machine.
Step 2: Install Goose
Goose is available as a desktop application and a command-line interface. The desktop application...