# Share markdown with Claude, ChatGPT, and every MCP client
mdshare is a free, no-login markdown sharing service. It ships with an MCP server so AI clients can upload, read, edit, comment on, and share markdown documents without any account setup, API keys, or copy-paste.
If you've ever burned Claude context pasting a 2000-line markdown file, or asked the model to regenerate a long document from memory, this is the fix. The LLM uploads once, gets a link, and from then on edits happen by patching — not by re-sending the whole file.
## Install the MCP server
One command, any OS:
```shell
npx mdshare-mcp
```

The package is published on the npm registry and the official MCP Registry. It runs as a local stdio process. No service to host, no credentials to rotate.
## Connect to your AI client
Pick your client and add the config block. Every client uses the same command — only the config file path differs.
### Claude Desktop & Claude Code

Add to `~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, or `%APPDATA%\Claude\claude_desktop_config.json` on Windows. For Claude Code, add the same block to a `.mcp.json` in your project root.

```json
{
  "mcpServers": {
    "mdshare": {
      "command": "npx",
      "args": ["mdshare-mcp"]
    }
  }
}
```

Restart Claude, then ask it to share a markdown file: it will call the `upload_markdown` tool and return a share link.
### Cursor

Add to `~/.cursor/mcp.json`:

```json
{
  "mcpServers": {
    "mdshare": {
      "command": "npx",
      "args": ["mdshare-mcp"]
    }
  }
}
```

### Windsurf
Add to `~/.codeium/windsurf/mcp_config.json`:

```json
{
  "mcpServers": {
    "mdshare": {
      "command": "npx",
      "args": ["mdshare-mcp"]
    }
  }
}
```

### OpenAI Codex CLI (ChatGPT)
Add to `~/.codex/config.toml` (Codex CLI uses TOML, not JSON):

```toml
[mcp_servers.mdshare]
command = "npx"
args = ["mdshare-mcp"]
```

### Google Gemini CLI
Add to `~/.gemini/settings.json`:

```json
{
  "mcpServers": {
    "mdshare": {
      "command": "npx",
      "args": ["mdshare-mcp"]
    }
  }
}
```

### Any other MCP client
Same pattern: `"command": "npx"` with `"args": ["mdshare-mcp"]`. Transport is stdio. No environment variables required.
## What the LLM can do
The MCP server exposes 14 tools across four groups. Each is callable by the model as needed, with no extra prompting required.
### Documents
- `upload_markdown` — create a document from a file path or inline content
- `read_document` — fetch the current content (text or JSON)
- `update_document` — replace content (`file_path` preferred for large files)
- `patch_document` — apply find/replace operations without retransmitting the whole doc
- `get_versions` — inspect the edit history
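Patching is what keeps context usage low: the model sends targeted find/replace operations instead of re-sending the document. As a sketch, a `patch_document` call might carry arguments like the following — the field names and values here are illustrative, not the published schema; the authoritative tool schemas are in the REST API docs:

```json
{
  "document_id": "abc123",
  "patches": [
    { "find": "status: draft", "replace": "status: published" }
  ]
}
```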
### Sharing & permissions
- `generate_link` — create a new view/comment/edit/admin link, optionally with expiry
- `revoke_link` — invalidate a specific share link
- `list_links` — see all active links for a document
- `get_admin_url` — retrieve the admin URL (for direct user request only)
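Each link carries its own permission tier and optional expiry. A hypothetical `generate_link` call might look like this (argument names are illustrative; check the REST API docs for the real schema):

```json
{
  "document_id": "abc123",
  "permission": "comment",
  "expires_in_days": 7
}
```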
### Comments
- `post_comment` — add a comment (optionally anchored to text)
- `list_comments` — read the comment thread
- `resolve_comment` — mark a thread as resolved
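Anchoring ties a comment to a specific span of text rather than the document as a whole. A hypothetical `post_comment` call (field names illustrative, not the published schema):

```json
{
  "document_id": "abc123",
  "body": "Can we add a Windows path here?",
  "anchor_text": "Install the MCP server"
}
```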
### Admin
- `list_my_documents` — all documents uploaded via this MCP server (local credential store)
- `register_document` — import an admin URL from elsewhere into the local store
Full tool schemas, arguments, and example responses live in the REST API docs — mdshare's MCP server is a thin wrapper around the same REST endpoints.
## FAQ
### What is MCP?
MCP (Model Context Protocol) is an open standard from Anthropic that lets AI clients talk to external tools. Think of it as "USB for LLMs" — one plug shape, any tool. mdshare ships an MCP server so any compatible AI can share markdown without custom integration work.
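Concretely, client and server exchange JSON-RPC 2.0 messages (over stdio, in mdshare's case). The session opens with an `initialize` request from the client, roughly:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-06-18",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```

After the handshake, the client calls `tools/list` to discover the available tools and `tools/call` to invoke them — which is why no mdshare-specific integration work is needed.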
### Do I need an API key?
No. mdshare has no account, no auth system, no keys to manage. Each document the MCP server creates is its own sharable artifact with its own link.
### Which AI clients support this?
Claude Desktop, Claude Code, Cursor, Windsurf, OpenAI Codex CLI (ChatGPT), Google Gemini CLI, and anything else that implements the MCP spec. Configuration is identical across clients — the only difference is which JSON file you paste into.
### Where does the content live?
On Cloudflare D1 (SQLite), served through a Cloudflare Worker. No third-party analytics, no tracking, no auth database. Content expires after 90 days.
### Is this an alternative to copy-pasting into the chat?
That's the main use case. Pasting a long markdown file into Claude burns context tokens and forces the model to work against a snapshot. With mdshare, the model uploads once, works against a live URL, and applies patches instead of rewriting. Your context stays clean and the document is shareable with any teammate.
### What about privacy?
Content is sanitized server-side but otherwise stored as-is. Anyone with a share link can access the document at that permission tier. Don't upload anything sensitive — this is a paste-and-share service, not an encrypted vault.