<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Goose on RockB</title><link>https://baeseokjae.github.io/tags/goose/</link><description>Recent content in Goose on RockB</description><image><title>RockB</title><url>https://baeseokjae.github.io/images/og-default.png</url><link>https://baeseokjae.github.io/images/og-default.png</link></image><generator>Hugo</generator><language>en-us</language><lastBuildDate>Thu, 07 May 2026 12:00:00 +0000</lastBuildDate><atom:link href="https://baeseokjae.github.io/tags/goose/index.xml" rel="self" type="application/rss+xml"/><item><title>Goose AI Agent Review 2026: Block's Open-Source Local Coding Agent</title><link>https://baeseokjae.github.io/posts/goose-ai-agent-review-2026/</link><pubDate>Thu, 07 May 2026 12:00:00 +0000</pubDate><guid>https://baeseokjae.github.io/posts/goose-ai-agent-review-2026/</guid><description>Goose AI agent review 2026: Apache 2.0 open-source coding agent, 15+ LLM providers, 70+ MCP extensions, local Ollama support, and comparison to Claude Code and Aider.</description><content:encoded><![CDATA[<p>Goose moved to the Linux Foundation&rsquo;s Agentic AI Foundation (AAIF) in 2026, transitioning from Block&rsquo;s internal open-source project to a foundation-governed community project. With 70+ MCP extensions, support for 15+ AI providers including local Ollama models, and an Apache 2.0 license that allows commercial use without restrictions, Goose occupies the same space as Claude Code and Aider — terminal-first AI coding agents — but with a distinct emphasis on extensibility and provider flexibility. Built in Rust for native performance and low resource usage, Goose runs on macOS, Linux, and Windows. Here is an honest technical assessment of what Goose delivers in 2026 and when to use it over its alternatives.</p>
<h2 id="what-is-goose-ai-agent-blocks-open-source-contribution-explained">What Is Goose AI Agent? Block&rsquo;s Open-Source Contribution Explained</h2>
<p>Goose is an open-source AI coding agent originally created at Block (the parent company of Square and Cash App) and donated to the Linux Foundation&rsquo;s Agentic AI Foundation in 2026. It&rsquo;s a terminal-first agent that can read and write files, execute shell commands, search the web, and integrate with external services via MCP — all driven by natural language instructions. The Apache 2.0 license is a meaningful differentiator: unlike Claude Code&rsquo;s proprietary terms or n8n&rsquo;s source-available Sustainable Use License, Apache 2.0 allows commercial use, modification, and embedding without source disclosure obligations. This makes Goose viable for companies building their own internal tooling on top of the agent framework. The provider flexibility is the other key differentiator: Goose supports Anthropic, OpenAI, Google Gemini, Groq, Mistral, Cohere, and Ollama among 15+ providers, configurable per-session. A developer can use Claude Opus 4.7 for complex reasoning tasks and switch to a local Qwen3 model for simple code generation — all within the same agent framework without tool switching. This flexibility, combined with the desktop app available on all major platforms, positions Goose as a general-purpose agent runtime rather than a tool tied to a specific LLM vendor.</p>
<h2 id="key-features-mcp-extensions-multi-provider-support-and-local-mode">Key Features: MCP Extensions, Multi-Provider Support, and Local Mode</h2>
<p><strong>MCP ecosystem</strong> is Goose&rsquo;s strongest technical capability. The 70+ available MCP extensions connect Goose to databases, APIs, project management tools, communication platforms, and development infrastructure. When an extension provides MCP tools, Goose can call them as part of autonomous task execution — fetching Jira tickets, querying databases, posting to Slack, or running CI/CD workflows.</p>
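<p>Under the hood, an MCP tool invocation is a JSON-RPC 2.0 message. Here is a minimal sketch of what an MCP client like Goose sends when calling an extension tool; the <code>tools/call</code> method name comes from the MCP specification, while the tool name and arguments below are hypothetical:</p>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-bash" data-lang="bash"># Sketch of the JSON-RPC 2.0 request behind an MCP tool call. The
# "tools/call" method is from the MCP spec; the tool name and arguments
# below are hypothetical, for illustration only.
cat &lt;&lt;'EOF' | python3 -m json.tool
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "jira_get_issue", "arguments": {"issue_key": "PROJ-123"}}}
EOF
</code></pre></div><p>Every extension in the catalog speaks this same protocol, which is why Goose can treat a Jira fetch, a database query, and a Slack post as interchangeable tool calls.</p>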
<p><strong>Multi-provider support</strong> means developers aren&rsquo;t locked to a single LLM. Provider switching is configured in Goose&rsquo;s settings file:</p>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-yaml" data-lang="yaml"><span style="display:flex;"><span><span style="color:#75715e"># ~/.config/goose/providers.yaml</span>
</span></span><span style="display:flex;"><span><span style="color:#f92672">default_provider</span>: <span style="color:#ae81ff">anthropic</span>
</span></span><span style="display:flex;"><span><span style="color:#f92672">providers</span>:
</span></span><span style="display:flex;"><span>  <span style="color:#f92672">anthropic</span>:
</span></span><span style="display:flex;"><span>    <span style="color:#f92672">model</span>: <span style="color:#ae81ff">claude-opus-4-7-20261101</span>
</span></span><span style="display:flex;"><span>  <span style="color:#f92672">openai</span>:
</span></span><span style="display:flex;"><span>    <span style="color:#f92672">model</span>: <span style="color:#ae81ff">gpt-5.5</span>
</span></span><span style="display:flex;"><span>  <span style="color:#f92672">ollama</span>:
</span></span><span style="display:flex;"><span>    <span style="color:#f92672">model</span>: <span style="color:#ae81ff">qwen3-coder:32b</span>
</span></span><span style="display:flex;"><span>    <span style="color:#f92672">base_url</span>: <span style="color:#ae81ff">http://localhost:11434</span>
</span></span></code></pre></div><p><strong>Local mode via Ollama</strong> enables zero-API-cost operation. Running Goose with a local Qwen3 Coder or DeepSeek Coder model on appropriate hardware costs nothing beyond electricity. For privacy-sensitive codebases or teams with compliance requirements prohibiting external API calls, local mode makes Goose viable where cloud-only agents cannot operate.</p>
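<p>The zero-cost claim is easy to sanity-check with back-of-envelope math. Both inputs below are illustrative assumptions, not actual vendor pricing:</p>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-bash" data-lang="bash"># Rough monthly API spend a local model displaces. Token volume and price
# are assumptions for illustration, not real vendor rates.
awk 'BEGIN {
  tokens_per_day = 2000000     # ~2M tokens/day for an active team
  usd_per_mtok   = 10.0        # assumed blended price per 1M tokens
  printf "~$%.0f/month\n", tokens_per_day / 1e6 * usd_per_mtok * 30
}'
</code></pre></div><p>At those assumed numbers, local inference displaces roughly $600 per month in API spend; substitute your own token volume and rates.</p>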
<p><strong>Desktop app</strong> provides a GUI for users who prefer not to work in a terminal, while maintaining the same agent capabilities. The GUI is useful for session management — viewing history, switching providers, and managing extension configuration.</p>
<h2 id="installation-and-getting-started-in-5-minutes">Installation and Getting Started in 5 Minutes</h2>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-bash" data-lang="bash"><span style="display:flex;"><span><span style="color:#75715e"># macOS (Homebrew)</span>
</span></span><span style="display:flex;"><span>brew install block/tap/goose
</span></span><span style="display:flex;"><span>
</span></span><span style="display:flex;"><span><span style="color:#75715e"># Linux</span>
</span></span><span style="display:flex;"><span>curl -fsSL https://getgoose.ai/install.sh | bash
</span></span><span style="display:flex;"><span>
</span></span><span style="display:flex;"><span><span style="color:#75715e"># Configure your first provider</span>
</span></span><span style="display:flex;"><span>goose configure
</span></span><span style="display:flex;"><span>
</span></span><span style="display:flex;"><span><span style="color:#75715e"># Start a session</span>
</span></span><span style="display:flex;"><span>goose session
</span></span></code></pre></div><p>After installation, <code>goose configure</code> prompts for your preferred LLM provider and API key. For local-only usage with Ollama, no API key is needed:</p>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-bash" data-lang="bash"><span style="display:flex;"><span><span style="color:#75715e"># Configure for local Ollama</span>
</span></span><span style="display:flex;"><span>goose configure --provider ollama --model qwen3-coder:32b
</span></span><span style="display:flex;"><span>
</span></span><span style="display:flex;"><span><span style="color:#75715e"># Verify Ollama is running</span>
</span></span><span style="display:flex;"><span>ollama serve  <span style="color:#75715e"># in another terminal</span>
</span></span><span style="display:flex;"><span>
</span></span><span style="display:flex;"><span><span style="color:#75715e"># Start session</span>
</span></span><span style="display:flex;"><span>goose session
</span></span></code></pre></div><p>Once in a session, Goose accepts natural language task descriptions: &ldquo;review the authentication module for security issues and generate a list of findings&rdquo; or &ldquo;add comprehensive error handling to all database calls in the user service.&rdquo; Goose reads relevant files, executes shell commands as needed, and produces output or file changes based on the task.</p>
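<p>When using the Ollama provider, it&rsquo;s worth confirming the local server is actually reachable before starting a session. Ollama&rsquo;s <code>/api/tags</code> endpoint lists locally pulled models and doubles as a health check:</p>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-bash" data-lang="bash"># Confirm a local Ollama server is up before pointing Goose at it.
# Any successful response from /api/tags means the server is running.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&amp;1; then
  echo "Ollama is reachable"
else
  echo "Ollama is not running; start it with: ollama serve"
fi
</code></pre></div>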
<h2 id="mcp-ecosystem-70-extensions-for-real-workflow-power">MCP Ecosystem: 70+ Extensions for Real Workflow Power</h2>
<p>The MCP extension catalog is where Goose&rsquo;s practical capabilities expand beyond basic file operations. Notable categories:</p>
<p><strong>Development infrastructure:</strong> GitHub, GitLab, Jira, Linear — Goose can read issues, create PRs, and comment on tickets as part of automated workflows.</p>
<p><strong>Databases:</strong> PostgreSQL, MySQL, SQLite — Goose can query databases directly, making it useful for data exploration and schema analysis tasks without separate database tooling.</p>
<p><strong>Communication:</strong> Slack, email — Goose can send notifications and messages as part of workflow completion.</p>
<p><strong>Web:</strong> Search, scraping — Goose can research documentation, find library examples, and gather context from the web.</p>
<p>Installing extensions adds them to the available tool set for the current session or globally:</p>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-bash" data-lang="bash"><span style="display:flex;"><span><span style="color:#75715e"># Install the GitHub MCP extension</span>
</span></span><span style="display:flex;"><span>goose extension add github --token YOUR_GITHUB_TOKEN
</span></span><span style="display:flex;"><span>
</span></span><span style="display:flex;"><span><span style="color:#75715e"># Install database extension</span>
</span></span><span style="display:flex;"><span>goose extension add postgres --connection-string postgresql://user:pass@localhost/db
</span></span></code></pre></div><p>The MCP architecture means any new MCP server can be integrated — the 70+ officially supported extensions are a baseline, not a ceiling.</p>
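<p>Registering a custom MCP server is a configuration entry rather than a code change. The sketch below is hypothetical: the key names and server command are illustrative assumptions, not Goose&rsquo;s documented schema:</p>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-yaml" data-lang="yaml"># Hypothetical entry for an in-house MCP server. Key names and the
# command path are illustrative, not Goose's documented configuration.
extensions:
  internal-metrics:
    type: stdio
    command: /usr/local/bin/metrics-mcp-server
    args: ["--read-only"]
</code></pre></div>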
<h2 id="running-goose-with-local-models-for-zero-api-cost">Running Goose with Local Models for Zero API Cost</h2>
<p>Goose&rsquo;s Ollama integration is production-quality for code-focused tasks. These are the best local models for coding-agent work in 2026:</p>
<table>
  <thead>
      <tr>
          <th>Model</th>
          <th>VRAM Required</th>
          <th>Coding Quality</th>
          <th>Speed</th>
      </tr>
  </thead>
  <tbody>
      <tr>
          <td>Qwen3 Coder 32B</td>
          <td>20GB</td>
          <td>Excellent</td>
          <td>Moderate</td>
      </tr>
      <tr>
          <td>DeepSeek Coder V2</td>
          <td>16GB</td>
          <td>Very Good</td>
          <td>Fast</td>
      </tr>
      <tr>
          <td>Llama 4 Scout</td>
          <td>14GB</td>
          <td>Good</td>
          <td>Fast</td>
      </tr>
      <tr>
          <td>Qwen2.5 7B</td>
          <td>6GB</td>
          <td>Adequate</td>
          <td>Very Fast</td>
      </tr>
  </tbody>
</table>
<p>For teams with hardware constraints, the 7B models are capable enough for routine code generation and review. For complex reasoning tasks, 32B models at Q4 quantization deliver meaningfully better results on code that requires understanding of multi-file dependencies.</p>
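<p>The VRAM column is consistent with a simple estimate: Q4 quantization stores roughly half a byte per parameter, plus overhead for the KV cache and runtime buffers. The 20% overhead factor below is a rule of thumb, not a measured figure:</p>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-bash" data-lang="bash"># Approximate Q4 footprint: ~0.5 bytes per parameter, plus ~20% overhead
# for KV cache and runtime buffers (rule of thumb, not a measurement).
awk -v params_b=32 'BEGIN { printf "%.1f GB\n", params_b * 0.5 * 1.2 }'
</code></pre></div><p>That works out to about 19 GB for a 32B model, in line with the 20GB figure in the table above.</p>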
<p>The privacy advantage: all code, context, and conversation history stays on-premise. For healthcare organizations, financial services firms, or government agencies where code cannot leave controlled infrastructure, local Goose is often the only viable AI coding agent option.</p>
<h2 id="goose-vs-claude-code-vs-aider-vs-cline-honest-comparison">Goose vs Claude Code vs Aider vs Cline: Honest Comparison</h2>
<table>
  <thead>
      <tr>
          <th></th>
          <th>Goose</th>
          <th>Claude Code</th>
          <th>Aider</th>
          <th>Cline</th>
      </tr>
  </thead>
  <tbody>
      <tr>
          <td><strong>License</strong></td>
          <td>Apache 2.0</td>
          <td>Proprietary</td>
          <td>Apache 2.0</td>
          <td>MIT</td>
      </tr>
      <tr>
          <td><strong>Provider lock-in</strong></td>
          <td>None (15+)</td>
          <td>Anthropic</td>
          <td>Multiple</td>
          <td>Multiple</td>
      </tr>
      <tr>
          <td><strong>Local models</strong></td>
          <td>Yes (Ollama)</td>
          <td>No</td>
          <td>Yes</td>
          <td>Yes</td>
      </tr>
      <tr>
          <td><strong>MCP extensions</strong></td>
          <td>70+</td>
          <td>Growing</td>
          <td>No</td>
          <td>Yes</td>
      </tr>
      <tr>
          <td><strong>IDE integration</strong></td>
          <td>Desktop app</td>
          <td>Terminal/IDE</td>
          <td>Terminal</td>
          <td>VS Code</td>
      </tr>
      <tr>
          <td><strong>SWE-bench (best model)</strong></td>
          <td>~75%</td>
          <td>80.9%</td>
          <td>~72%</td>
          <td>~75%</td>
      </tr>
      <tr>
          <td><strong>Git integration</strong></td>
          <td>Good</td>
          <td>Excellent</td>
          <td>Excellent</td>
          <td>Good</td>
      </tr>
  </tbody>
</table>
<p>Claude Code leads on benchmark performance — the 80.9% SWE-bench score with Anthropic&rsquo;s tooling optimizations is the current ceiling for autonomous coding agents. Aider has excellent git-native workflow integration and strong commit message generation. Cline&rsquo;s VS Code integration makes it the smoothest experience for developers who don&rsquo;t want to leave the IDE. Goose wins on three dimensions: provider flexibility (switch models without switching tools), MCP ecosystem breadth, and Apache 2.0 licensing for commercial embedding.</p>
<h2 id="real-world-use-cases-and-developer-workflows">Real-World Use Cases and Developer Workflows</h2>
<p><strong>Code review automation:</strong> Goose reviews entire pull requests when given a GitHub MCP extension and PR URL. &ldquo;Review this PR for security issues, performance problems, and adherence to our coding standards&rdquo; triggers a multi-file analysis with specific findings.</p>
<p><strong>Database exploration:</strong> Combined with a database MCP extension, Goose can answer &ldquo;which tables have no foreign key constraints?&rdquo; or &ldquo;show me all queries that don&rsquo;t use parameterized input&rdquo; by querying the schema and analyzing query patterns.</p>
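<p>Under the hood, a question like the first one reduces to a schema query. This sketch shows the kind of SQL a PostgreSQL MCP extension might issue (printed rather than executed here; the <code>public</code> schema is an assumption):</p>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-bash" data-lang="bash"># The kind of anti-join a database MCP extension runs to find tables
# with no foreign keys (PostgreSQL information_schema; printed, not run).
cat &lt;&lt;'SQL'
SELECT t.table_name
FROM information_schema.tables t
LEFT JOIN information_schema.table_constraints c
  ON c.table_name = t.table_name
 AND c.table_schema = t.table_schema
 AND c.constraint_type = 'FOREIGN KEY'
WHERE t.table_schema = 'public'
  AND t.table_type = 'BASE TABLE'
  AND c.constraint_name IS NULL;
SQL
</code></pre></div>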
<p><strong>Documentation generation:</strong> Goose reads a codebase and generates or updates documentation: &ldquo;generate JSDoc comments for all exported functions in the src/ directory that currently lack them.&rdquo;</p>
<p><strong>Local private agent:</strong> For teams processing sensitive data, running Goose with Ollama on on-premise hardware creates a fully air-gapped AI coding assistant. No external API calls, no data leaving the network.</p>
<h2 id="who-should-use-goose-in-2026">Who Should Use Goose in 2026?</h2>
<p>Goose fits a specific developer profile: teams that need to switch between AI providers without switching tools, organizations with data sovereignty requirements that necessitate on-premise local model deployment, and companies building internal AI agent tooling under a permissive license. With 15+ supported providers and no vendor lock-in, Goose is the right choice when the LLM landscape is changing faster than you want to renegotiate tool dependencies. The Apache 2.0 license matters for teams embedding AI agent capabilities into commercial products — no AGPLv3 source disclosure, no proprietary restrictions.</p>
<p><strong>Use Goose if:</strong> You need provider flexibility to switch between LLM models without changing tools. Your team has compliance requirements that prevent external API calls (local Ollama mode). You want to build internal tooling on top of an AI agent framework under Apache 2.0. The MCP ecosystem covers your required integrations.</p>
<p><strong>Skip Goose if:</strong> Maximum SWE-bench performance matters (use Claude Code). You want the tightest git workflow integration (Aider). You primarily work in VS Code and want IDE integration (Cline). You&rsquo;re not comfortable with CLI-first tooling.</p>
<hr>
<h2 id="faq">FAQ</h2>
<p><strong>What is Goose AI agent and who made it?</strong></p>
<p>Goose is an open-source AI coding agent originally created at Block (parent company of Square and Cash App) and donated to the Linux Foundation&rsquo;s Agentic AI Foundation in 2026. Built in Rust, it&rsquo;s Apache 2.0 licensed and supports 15+ AI providers including local Ollama models. It runs on macOS, Linux, and Windows with both CLI and desktop GUI interfaces.</p>
<p><strong>Does Goose work without internet access?</strong></p>
<p>Yes. Configure Goose with an Ollama provider and a locally-hosted model (Qwen3 Coder, DeepSeek Coder, etc.) to operate completely offline. No API keys required, no external network calls. The Ollama server runs locally; Goose connects to it via localhost. Suitable for air-gapped environments and privacy-sensitive codebases.</p>
<p><strong>How does Goose compare to Claude Code?</strong></p>
<p>Claude Code scores 80.9% on SWE-bench Verified and has the deepest Anthropic tooling integration. Goose supports 15+ providers, 70+ MCP extensions, Apache 2.0 licensing, and local model support — making it more flexible but with lower peak performance. Teams needing maximum coding capability choose Claude Code; teams needing provider flexibility, local mode, or commercial embedding choose Goose.</p>
<p><strong>What are MCP extensions in Goose?</strong></p>
<p>MCP (Model Context Protocol) extensions are integrations that give Goose access to external tools and services — GitHub, databases, Slack, Jira, and 70+ others. Extensions provide MCP tools that Goose can call as part of autonomous task execution. Any MCP-compatible server can be added, making the 70+ official extensions a baseline rather than a hard limit.</p>
<p><strong>Is Goose free to use commercially?</strong></p>
<p>Yes. The Apache 2.0 license allows commercial use, modification, and embedding without restrictions or source disclosure requirements. You can build internal tools on top of Goose, include it in commercial products, or deploy it in enterprise environments without licensing fees or obligations to open-source your modifications.</p>
]]></content:encoded></item></channel></rss>