mcp-gemini-server and mcp-gemini-google-search
The first provides a dedicated server exposing Google's Gemini model capabilities as standard MCP tools; the second is an MCP server focused specifically on Google Search integration via Gemini's built-in search. Together they form complementary components of the Gemini MCP ecosystem.
About mcp-gemini-server
bsmi021/mcp-gemini-server
This project provides a dedicated MCP (Model Context Protocol) server that wraps the @google/genai SDK. It exposes Google's Gemini model capabilities as standard MCP tools, allowing other LLMs (like Cline) or MCP-compatible systems to leverage Gemini's features as a backend workhorse.
This server lets other AI systems and large language models (LLMs), such as Claude, use Google's Gemini for complex tasks. It accepts publicly available web content, such as image URLs, YouTube videos, or web pages, and processes it using Gemini's analysis and generation features. It is aimed at AI developers and system integrators who want to add Gemini's capabilities to their existing AI applications through a standardized interface.
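As a concrete illustration, MCP clients typically register a stdio server through a JSON configuration block under an `mcpServers` key. The sketch below is hypothetical: the `command`, the package name, and the `GOOGLE_GEMINI_API_KEY` environment variable name are assumptions, not taken from the project's README, so consult the repository for the actual invocation and required variables.

```json
{
  "mcpServers": {
    "gemini-server": {
      // hypothetical invocation; check the repo README for the real command and env vars
      "command": "npx",
      "args": ["mcp-gemini-server"],
      "env": {
        "GOOGLE_GEMINI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```

Once registered, the client can list and invoke the Gemini-backed tools the server exposes, just like any other MCP tool.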
About mcp-gemini-google-search
yukukotani/mcp-gemini-google-search
MCP server for Google Search integration using Gemini's built-in search capabilities
This project integrates real-time Google Search into large language models (LLMs) like Claude, providing up-to-date information directly within your AI applications. It takes a search query as input and returns relevant web search results with source citations. This is for developers building AI agents and applications that need accurate, real-time web knowledge.
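Under the MCP protocol, a client invokes such a server with a `tools/call` JSON-RPC request. The message below is a sketch: the tool name `google_search` and the `query` argument are assumptions about this server's interface, so use the standard `tools/list` method to discover the actual tool names and input schemas.

```json
// hypothetical request; discover the real tool name and schema via tools/list
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "google_search",
    "arguments": { "query": "latest MCP specification release" }
  }
}
```

The server's response would carry the search results, with source citations, in the `content` field of the JSON-RPC result.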