This is a public Streamable HTTP implementation of the Model Context Protocol (MCP) for https://rtsak.com.
Endpoint: https://mcp.rtsak.com/mcp
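If you want to poke at the endpoint without a full MCP client, here is a minimal sketch (assuming `curl` is available; the handshake payload and `Accept` header follow the MCP Streamable HTTP spec, not this document):

```shell
# Sketch only: MCP Streamable HTTP speaks JSON-RPC 2.0 over POST.
# The payload below is a hypothetical "initialize" handshake request;
# the protocolVersion value and Accept-header requirement come from the MCP spec.
cat > /tmp/initialize.json <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "curl-demo", "version": "0.0.1" }
  }
}
EOF

# To actually send it (requires network access):
# curl -s -X POST https://mcp.rtsak.com/mcp \
#   -H 'Content-Type: application/json' \
#   -H 'Accept: application/json, text/event-stream' \
#   -d @/tmp/initialize.json
```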
You can try it out directly in ChatGPT as a custom GPT.
Add to your Claude Desktop config file:
```json
{
  "mcpServers": {
    "rtsak": {
      "url": "https://mcp.rtsak.com/mcp"
    }
  }
}
```
The config file lives at:

- Windows: `%APPDATA%\Claude\config.json`
- macOS: `~/Library/Application Support/Claude/config.json`
- Linux: `~/.config/Claude/config.json`

Or, using the Claude Code CLI, run this command:

```shell
claude mcp add --transport http rtsak https://mcp.rtsak.com/mcp
```
If your MCP client only speaks stdio and can't connect to a remote HTTP endpoint directly, use the @robtex/mcp-stdio proxy to bridge stdio ↔ https://mcp.rtsak.com/mcp.
Install options:
```shell
# npm (no local install needed)
npx -y @robtex/mcp-stdio

# Docker
docker run -i --rm robtex/mcp

# Homebrew (once the tap is live; pending)
brew install robtex/tap/mcp
```
Claude Desktop / Cursor / Zed / Windsurf config snippet:
```json
{
  "mcpServers": {
    "rtsak": {
      "command": "npx",
      "args": ["-y", "@robtex/mcp-stdio"]
    }
  }
}
```
Config file locations:

- Cursor: `~/.cursor/mcp.json`
- Zed: `~/.config/zed/settings.json` (under `"context_servers"`)
- Windsurf: `~/.codeium/windsurf/mcp_config.json`

Override the upstream: set `ROBTEX_MCP_URL=https://mcp.hashxp.org/mcp` (or any sister site, or `http://127.0.0.1:3443/mcp` for local WSH) to point the proxy at a different upstream.
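One way to set the override is in the client config itself. This is a sketch assuming your MCP client supports a per-server `env` key (Claude Desktop does); the upstream URL shown is one of the sister sites mentioned above:

```json
{
  "mcpServers": {
    "rtsak": {
      "command": "npx",
      "args": ["-y", "@robtex/mcp-stdio"],
      "env": {
        "ROBTEX_MCP_URL": "https://mcp.hashxp.org/mcp"
      }
    }
  }
}
```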
Every outbound request from this proxy sets `User-Agent: robtex-mcp-stdio/<version> (+https://github.com/robtex/skills)`, so server logs can be grepped by that prefix.
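For example, against a fabricated access log (the log format here is invented; only the User-Agent prefix comes from the proxy):

```shell
# Fabricated sample log; only the robtex-mcp-stdio/ UA prefix is real.
cat > /tmp/access.log <<'EOF'
1.2.3.4 - [01/Jan/2025] "POST /mcp" 200 "robtex-mcp-stdio/1.2.3 (+https://github.com/robtex/skills)"
5.6.7.8 - [01/Jan/2025] "POST /mcp" 200 "Mozilla/5.0"
EOF

# Count requests that came through the stdio proxy.
grep -c 'robtex-mcp-stdio/' /tmp/access.log   # → 1
```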
Quick setup:
Import the OpenAPI spec from https://mcp.rtsak.com/mcp/openapi.yaml, or manually add the endpoint https://mcp.rtsak.com/mcp as an action.
Install curated skills with one command:
```shell
npx skills add robtex/skills
```
Install the plugin with 6 curated skills:
```
/plugin marketplace add https://github.com/robtex/skills
/plugin install robtex@robtex-plugins
```