
Introducing the LLMS.txt Explorer MCP Server



By Zach · March 25, 2026 · 5 min read

We are excited to announce the LLMS.txt Explorer MCP Server — a new way for AI assistants to directly search and access our index of over 50,000 llms.txt files. If you use Claude Desktop, Claude Code, Cursor, or any tool that supports the Model Context Protocol, you can now give your AI real-time access to the world’s largest llms.txt database.

What Is MCP?

The Model Context Protocol (MCP) is an open standard that allows AI applications to connect to external data sources and tools through a consistent interface. Think of it as a universal adapter between your AI assistant and the services it needs to access. Instead of copying and pasting data into a chat window, MCP lets your assistant reach out and grab exactly the information it needs.
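To make that concrete, here is a minimal sketch of the JSON-RPC 2.0 message an MCP client sends when the assistant decides to invoke a tool. The envelope and `tools/call` method come from the MCP specification; the tool name and arguments shown are illustrative.

```javascript
// Sketch of an MCP tool invocation: a JSON-RPC 2.0 request the client
// writes to the server. The tool name and arguments are examples, not
// the only valid values.
const toolCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search_llms_txt",
    arguments: { query: "e-commerce" },
  },
};

console.log(JSON.stringify(toolCall));
```

The server replies with a JSON-RPC response carrying the tool result, which the assistant then folds into its answer.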

What Our MCP Server Does

The LLMS.txt Explorer MCP server exposes two powerful tools:

1. search_llms_txt — Search the Index

Search across our entire database of 50,000+ indexed llms.txt and llms-full.txt files. You can search by domain, keyword, topic, or title. Results include:

  • Domain and URL of each llms.txt file
  • Quality score (High, Medium, or Low) based on spec compliance
  • AI-classified topics for each entry
  • Token count so you know how much context it will consume
  • Last updated timestamp for freshness

You can filter by file type (llms.txt, llms-full.txt, or both) and paginate through results with up to 50 entries per page.
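A search call might carry arguments along these lines. The parameter names here (`query`, `fileType`, `page`, `pageSize`) are assumptions for illustration, since the exact schema lives in the server's tool definition.

```javascript
// Hypothetical argument object for search_llms_txt. The field names are
// assumed, not confirmed API; the 50-entry page cap comes from the post.
const searchArgs = {
  query: "payments API",       // keyword, topic, title, or domain
  fileType: "llms-full.txt",   // or "llms.txt"; omit to search both
  page: 1,                     // results are paginated
  pageSize: 50,                // up to 50 entries per page
};
```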

2. get_llms_txt_entry — Look Up a Domain

Already know which domain you are interested in? Use this tool to look up the llms.txt details for any specific domain. Just pass a domain name like openai.com or anthropic.com and get back the full metadata entry.
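Since the tool expects a bare domain, a client-side sketch might normalize user input before calling it. The `domain` argument name and the helper below are assumptions for illustration.

```javascript
// Hypothetical call shape for get_llms_txt_entry. normalizeDomain is an
// illustrative helper that strips the scheme and path so a bare domain
// like "openai.com" is passed.
function normalizeDomain(input) {
  return input
    .replace(/^https?:\/\//, "") // drop "http://" or "https://"
    .replace(/\/.*$/, "")        // drop any path after the host
    .toLowerCase();
}

const lookupArgs = { domain: normalizeDomain("https://OpenAI.com/docs") };
console.log(lookupArgs.domain); // → openai.com
```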

Getting Started

Setup takes under two minutes.

Step 1: Install Dependencies

cd mcp-server
npm install

Step 2: Add to Your MCP Configuration

For Claude Desktop or Claude Code, add this to your MCP config file:

{
  "mcpServers": {
    "llms-txt-explorer": {
      "command": "node",
      "args": ["/absolute/path/to/llms/mcp-server/index.js"]
    }
  }
}

For Claude Desktop, the config file is typically at ~/Library/Application Support/Claude/claude_desktop_config.json on macOS (or %APPDATA%\Claude\claude_desktop_config.json on Windows). For Claude Code, you can add it to your project’s .mcp.json.

For Cursor or other MCP clients, point your client at the server:

node mcp-server/index.js

The server communicates over stdio using the standard MCP protocol, so it works with any compliant client.
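The stdio handshake starts with the client writing an `initialize` request as newline-delimited JSON-RPC to the server's stdin. A minimal sketch, assuming an illustrative client name and an MCP spec revision date:

```javascript
// Sketch of the first message in the MCP stdio handshake. A client
// spawns `node mcp-server/index.js` and writes this to its stdin; the
// protocolVersion and clientInfo values here are illustrative.
const initialize = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05", // an MCP spec revision date
    capabilities: {},
    clientInfo: { name: "example-client", version: "0.1.0" },
  },
};

// Messages are serialized one per line on the wire.
const wire = JSON.stringify(initialize) + "\n";
console.log(wire.trimEnd());
```

After the server's `initialize` response and the client's `initialized` notification, the client can list and call tools.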

Step 3: Start Asking Questions

Once connected, you can ask your AI assistant things like:

  • “Search for llms.txt files related to e-commerce”
  • “Find the llms.txt entry for stripe.com”
  • “Which AI companies have high-quality llms.txt files?”
  • “Show me documentation-focused llms.txt files with low token counts”

Your assistant will call the MCP tools behind the scenes and return structured, relevant results.

Why This Matters

The llms.txt standard is growing rapidly. Thousands of companies now publish structured information specifically for AI consumption. But until now, discovering and accessing that data required manual browsing or API integration work.

With our MCP server, that data is one question away. Your AI assistant can:

  • Research competitors by searching their llms.txt files for product information
  • Discover APIs and documentation across thousands of domains
  • Audit llms.txt quality to understand how well sites follow the specification
  • Build knowledge bases by pulling structured data from multiple sources

Configuration Options

The server supports one optional environment variable:

  • LLMS_API_BASE — Override the default API endpoint (defaults to https://llms-text.ai)

This is useful for local development or if you are running your own instance of the LLMS.txt Explorer.
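The override can be resolved with a one-line fallback, as in this sketch (the /api/search path is illustrative, not the server's actual route):

```javascript
// Resolve the API base: use LLMS_API_BASE when set, otherwise fall back
// to the public endpoint. The "/api/search" path is an assumed example.
const API_BASE = process.env.LLMS_API_BASE || "https://llms-text.ai";
const searchUrl = new URL("/api/search", API_BASE).toString();
console.log(searchUrl);
```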

Open Source and Extensible

The MCP server is included in the LLMS.txt Explorer repository and is fully open source. We welcome contributions — whether that is adding new tools, improving error handling, or integrating with additional data sources.

Get started today by cloning the repository and following the setup instructions above. If you run into issues or have feature requests, open an issue on GitHub.


The MCP server is available now. Give your AI the ability to explore the llms.txt ecosystem and see what it discovers.


About LLMS.txt Team

LLMS.txt Team is a contributor to the LLMS.txt Explorer blog.