Make Antigravity Crawl the Web in Real Time Without Getting Blocked, for Free

1. Open MCP Server Settings
From the Agent Panel in Antigravity IDE:
- Click Additional options (the three dots in the top‑right corner)
- Click MCP Servers
- Click Manage MCP Servers from the top right
A new tab called Manage MCPs will appear in the editor.
2. Open Raw MCP Configuration
Inside the Manage MCPs tab:
- Click View raw config
This opens the raw MCP JSON configuration file.
3. Add Crawleo MCP Configuration
Paste the following JSON into the configuration file:
{
  "mcpServers": {
    "crawleo": {
      "serverUrl": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer API_KEY"
      }
    }
  }
}
Replace API_KEY with your actual Crawleo API key.
Example:
"Authorization": "Bearer ck_live_xxxxxxxxxxxxxxxxx"
Save the file after editing.
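Before saving, it can help to confirm the JSON is well-formed and that the placeholder key was actually replaced, since a malformed file or a leftover placeholder is the most common cause of a failed refresh. Below is a minimal, hypothetical Python sketch for that check; the `validate_mcp_config` helper is not part of Antigravity or Crawleo, just an illustration you could run against the pasted text.

```python
import json

def validate_mcp_config(text: str) -> dict:
    """Parse a raw MCP config string and sanity-check the crawleo entry.

    Raises json.JSONDecodeError on malformed JSON, KeyError on missing
    fields, and ValueError if the placeholder key was never replaced.
    """
    config = json.loads(text)
    crawleo = config["mcpServers"]["crawleo"]
    auth = crawleo["headers"]["Authorization"]
    if not auth.startswith("Bearer ") or auth == "Bearer API_KEY":
        raise ValueError("Authorization header still contains the placeholder key")
    return crawleo

# Example: paste your edited config between the triple quotes.
raw = """
{
  "mcpServers": {
    "crawleo": {
      "serverUrl": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer ck_live_xxxxxxxxxxxxxxxxx"
      }
    }
  }
}
"""
entry = validate_mcp_config(raw)
print("crawleo entry OK, transport =", entry["transport"])
```

If this raises an exception, fix the reported issue in the raw config before moving on to the refresh step.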
4. Refresh MCP Servers
- Go back to the Manage MCPs tab
- Click Refresh
If the configuration is valid, Antigravity IDE will register the Crawleo MCP server.
What You Get After Setup
After completing this setup, Antigravity IDE will expose two Crawleo MCP tools:
- search_web for real‑time web search
- crawl_web for direct page crawling
Troubleshooting
Error: Unauthorized or Stream Failure
If you see the following error:
Error: streamable http connection failed: calling "initialize": sending "initialize": Unauthorized, sse fallback failed: missing endpoint: first event is "", want "endpoint".
Cause:
- An invalid or incorrect Crawleo API key was used
Fix:
- Double‑check the API key
- Ensure there are no extra spaces or missing characters
- Confirm the key is active in your Crawleo dashboard
After correcting the key, refresh MCP servers again.
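Extra whitespace from copy/paste is easy to miss by eye. As a rough illustration (not a Crawleo or Antigravity utility), a small hypothetical helper like the one below can flag the most common paste problems before you retry the refresh:

```python
def check_api_key(key: str) -> list[str]:
    """Return a list of likely problems with a pasted API key string."""
    problems = []
    if key != key.strip():
        problems.append("leading or trailing whitespace")
    if key.strip() == "API_KEY":
        problems.append("placeholder was never replaced")
    if any(c.isspace() for c in key.strip()):
        problems.append("whitespace inside the key (possible line wrap from copy/paste)")
    return problems

# Example: an accidental leading space is reported.
print(check_api_key(" ck_live_xxxxxxxxxxxxxxxxx"))
```

An empty list means none of these surface-level issues were found; the key may still be revoked or inactive, which only the Crawleo dashboard can tell you.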