Hello Gemini Team & Community,
As the MCP (Model Context Protocol) ecosystem rapidly matures on desktop, a significant “Context Gap” has opened up in the mobile experience. Mobile AI is still largely an “Isolated Brain,” unable to access the private engineering standards or local data that we rely on in our professional workflows.
I’ve published a formal proposal to address this: mcp-mobile-bridge.
The Core Idea:
Instead of a centralized cloud-to-cloud proxy, Gemini Mobile should support Native SSE (Server-Sent Events) to connect directly to MCP servers.
Why Decentralized Direct-Connect?
Data Sovereignty: Private context (like our project standards at @liangshanli/mcp-server-project-standards) stays within the user’s infrastructure.
Efficiency: Direct client-side requests minimize round-trip time (RTT), allowing for smoother tool-calling in multimodal scenarios like Gemini Live.
Optimized Costs: Structured MCP tools allow Gemini to utilize Context Caching more effectively, potentially slashing input costs by 90%.
I believe making Gemini the first mobile AI to natively connect to decentralized MCP servers would be a game-changer for professional developers on the go.
I’d love to hear from the Google DevRel and Engineering teams on the feasibility of implementing a native EventSource/SSE client in the Gemini App.
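To make the ask concrete, the core of such a client is just a parser for the SSE wire format (blank-line-delimited events with `event:` and `data:` fields, per the WHATWG spec). The sketch below is illustrative only — written in Python for readability rather than Kotlin/Swift, with a simplified field-parsing rule and a hypothetical MCP-style payload — not a proposal for the actual app implementation:

```python
def parse_sse(stream: str):
    """Parse a raw Server-Sent Events stream into (event, data) pairs.

    Simplified sketch of the SSE wire format: events are separated by
    blank lines, `event:` sets the event type (default "message"), and
    multiple `data:` lines are joined with newlines. Comment lines and
    other fields (id:, retry:) are ignored here for brevity.
    """
    events = []
    event_type, data_lines = "message", []
    for line in stream.split("\n"):
        if line == "":
            # A blank line dispatches the buffered event, if any.
            if data_lines:
                events.append((event_type, "\n".join(data_lines)))
            event_type, data_lines = "message", []
        elif line.startswith("event:"):
            event_type = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
    return events


# Hypothetical fragment of an MCP-style SSE stream (payload invented
# for illustration, not taken from the MCP spec verbatim):
raw = (
    "event: endpoint\n"
    "data: /messages?session=abc123\n"
    "\n"
    "data: {\"jsonrpc\": \"2.0\", \"method\": \"tools/list\"}\n"
    "\n"
)
for name, payload in parse_sse(raw):
    print(name, "->", payload)
```

A production client would additionally need reconnection with `Last-Event-ID`, `retry:` handling, and auth headers, but the parsing loop above is the whole of the “hard part” the feature request hinges on.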
Best regards,