Steed
Generate llms.txt, actions.json, and govern AI crawler access.
Why this matters: The next wave of digital traffic will be autonomous agents, not just browsers. Steed deploys the governance layer (llms.txt, actions.json) required to monetize agentic traffic while strictly protecting your data sovereignty and access rights.
Log in to continue
You need an account to access this module.
Brand Profile Required
You need to define your brand identity in Mare before generating assets.
Go to Mare
Keywords Required
You need to discover and select keywords in Colt before generating assets.
Go to Colt
llms.txt and sitemap.xml optimized for AI search. This ensures models index your most important pages with the correct context and permissions.
The root URL to crawl for content discovery.
Prioritized URLs
Explicitly include these pages in llms.txt.
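As a concrete illustration, a generated llms.txt typically follows the proposed llmstxt.org convention: an H1 title, a one-line blockquote summary, then sections of prioritized links with short descriptions. The site name, URLs, and descriptions below are placeholders; the actual output depends on your crawl and prioritized URLs.

```markdown
# Example Corp

> Example Corp builds inventory-management software for small retailers.

## Key Pages
- [Product Overview](https://example.com/product): What the platform does and who it is for
- [Pricing](https://example.com/pricing): Current plans and limits
- [API Docs](https://example.com/docs/api): Reference for the public REST API
```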
Context
Strategy
Matches keywords to pages based on meaning and context. Best for accuracy.
Matches keywords based on exact text occurrence. Best for specific terms.
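The difference between the two strategies can be sketched with naive stand-ins. The functions below are illustrative only: the exact strategy is a literal substring check, while a real semantic matcher would compare embedding vectors rather than the simple word-overlap score used here.

```python
def exact_match(keyword: str, page_text: str) -> bool:
    """Exact strategy: the keyword must occur verbatim in the page."""
    return keyword.lower() in page_text.lower()

def semantic_score(keyword: str, page_text: str) -> float:
    """Semantic stand-in: word overlap (Jaccard) between keyword and page.
    A production semantic matcher would use embeddings instead."""
    kw = set(keyword.lower().split())
    pg = set(page_text.lower().split())
    return len(kw & pg) / len(kw | pg) if kw | pg else 0.0

page = "Our platform helps retailers track stock levels in real time"
print(exact_match("stock levels", page))      # exact phrase is present
print(semantic_score("track stock", page))    # partial word overlap
```

Exact matching is cheap and precise for specific terms; semantic matching tolerates paraphrase ("inventory" vs. "stock") at the cost of occasional false positives.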
AI API Gateway Configuration
Don't build a chatbot; make your data agent-readable.
How it works: RockHorse acts as a secure middleware proxy. Map your existing internal APIs (REST/GraphQL) here. We automatically generate the schemas (actions.json for OpenAI, agent.json for Google, mcp.json for Anthropic) and handle the traffic governance.
API Capability Map
Defined Endpoints
These are the tools you are exposing to AI agents.
The function name the AI calls. Use snake_case.
Groups actions in the documentation.
The internal API URL RockHorse will proxy requests to.
This tells the AI when and how to use this tool. Be specific.
Define data the AI should extract from the user's prompt.
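Putting the fields above together, a single entry in the capability map might look like the following. The field names and values here are hypothetical, chosen only to illustrate the snake_case function name, category, upstream URL, usage description, and extracted parameters; the actual schema emitted (actions.json, agent.json, mcp.json) is generated for you.

```json
{
  "name": "check_order_status",
  "category": "Orders",
  "upstream_url": "https://api.internal.example.com/v1/orders/status",
  "description": "Use this when the user asks about the shipping or delivery status of an existing order. Requires an order ID.",
  "parameters": [
    {
      "name": "order_id",
      "type": "string",
      "description": "The order identifier extracted from the user's prompt, e.g. ORD-1234"
    }
  ]
}
```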
AI Gateway & Governance
Security Shield: RockHorse acts as a middleware proxy to rate-limit AI agents and strip sensitive data before it leaves your server.
Max requests per minute per agent to prevent DoS.
Comma-separated JSON keys to expose. All others are stripped.
Authentication & Schema
How RockHorse authenticates with your upstream API.
Optional: Define the structure of the API response.
Generate ai.txt to declare data-mining permissions and robots.txt to explicitly grant or deny access to AI scrapers.
Crawler Governance
Manage data licensing via ai.txt and bot access via robots.txt.
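As an illustration, a generated robots.txt might block a specific AI crawler from private paths while leaving the rest of the site open. GPTBot is OpenAI's documented crawler user-agent; the paths below are placeholders, and ai.txt is an emerging convention whose exact format varies.

```
# robots.txt — per-bot access control
User-agent: GPTBot
Disallow: /internal/

User-agent: *
Allow: /
```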