mcp-omnisearch and google-ai-mode-mcp
These tools are complements: mcp-omnisearch aggregates multiple premium search APIs (Tavily, Brave, Kagi, Perplexity), while google-ai-mode-mcp provides free Google AI Mode search with citations. Users can combine them for cost-optimized, multi-source search coverage.
About mcp-omnisearch
spences10/mcp-omnisearch
🔍 A Model Context Protocol (MCP) server providing unified access to multiple search engines (Tavily, Brave, Kagi), AI tools (Perplexity, FastGPT), and content processing services (Jina AI, Kagi). Combines search, AI responses, content processing, and enhancement features through a single interface.
This tool helps researchers, analysts, and content creators gather information efficiently by combining multiple search engines, AI answer tools, and web content processors in a single interface. Given a search query or a URL, it returns web search results, AI-generated answers with citations, GitHub content, or extracted and summarized page content. It suits anyone who needs comprehensive, multi-source online research.
About google-ai-mode-mcp
PleasePrompto/google-ai-mode-mcp
MCP server for free Google AI Mode search with citations. Query optimization, CAPTCHA handling, multi-agent support. Works with Claude Code, Cursor, Cline, Windsurf.
This tool helps anyone using an LLM get high-quality, cited web research. Instead of having your LLM sift through many links, it connects the LLM to Google's AI Mode, which synthesizes answers from dozens of sources. You pose a research question to your LLM, and it returns a concise, sourced answer. It suits professionals, researchers, or anyone who relies on an AI agent for information gathering.
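Combining the two in practice means registering both servers in your MCP client's configuration. A minimal sketch, assuming a Claude Desktop-style `mcpServers` config and npx-runnable packages; the package names, environment variable names, and placeholder keys here are illustrative assumptions, so check each repo's README for the actual install command and required credentials:

```json
{
  "mcpServers": {
    "omnisearch": {
      "command": "npx",
      "args": ["-y", "mcp-omnisearch"],
      "env": {
        "TAVILY_API_KEY": "your-tavily-key",
        "PERPLEXITY_API_KEY": "your-perplexity-key"
      }
    },
    "google-ai-mode": {
      "command": "npx",
      "args": ["-y", "google-ai-mode-mcp"]
    }
  }
}
```

With both registered, an agent can route routine queries to the free Google AI Mode server and reserve the paid omnisearch providers for queries that need deeper or specialized coverage, which is the cost-optimization pattern described above. Only configure the API keys for the premium providers you actually use.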