Self-Service with Automated Validation: You control which operations to expose through the admin portal. Automated validation ensures security and compliance standards are met.
Tool deployment process
Governance and quality assurance
Grand Central ensures all MCP tools meet quality and security standards through automated validation and approval workflows. Security assessments run for every tool request: automated scanners evaluate data exposure risks, authentication requirements, and potential compliance violations against policy rules. Documentation quality checks use automated linters to confirm that OpenAPI descriptions are clear enough for accurate MCP mapping; vague descriptions trigger warnings. Performance testing runs automated load tests to validate response times and reliability before production deployment. Compliance reviews use rule-based validation to verify that regulatory and audit requirements (PII handling, data retention, access logging) are met.

Enabling new tools
Need a banking operation exposed as an MCP tool? Enable it yourself through the admin portal.

For pre-configured operations: browse the API catalog in the MCP Tools section, select the operations you need, configure rate limits and authentication requirements, and enable them. Pre-configured operations are available immediately after automated validation.

For custom operations: upload or link your OpenAPI specification through the portal and select which operations to expose; the system then runs automated security validation. Low-risk operations (read-only, public data) are enabled immediately. Higher-risk operations (write access, PII exposure) go through automated compliance scanning, which typically completes within 1 to 3 business days.

Automated validation evaluates several dimensions. Security scanners check data exposure risks, authentication requirements, and potential compliance violations against policy rules. Documentation linters confirm that OpenAPI descriptions are clear enough for accurate MCP mapping; vague descriptions trigger warnings. Performance validators check response time expectations and reliability requirements. Compliance validators verify that regulatory and audit requirements such as PII handling, data retention policies, and access logging are properly configured.

Platform support is available if you need help with complex specifications or have questions about validation results.

Tool updates
Grand Central tools are updated through automated processes. Security policies are configured through the admin portal, with automated validation running on every change. API changes are made by updating your OpenAPI spec in the portal; automated validation reruns to confirm the changes still meet standards. Rate limits can be adjusted through the admin portal as your needs change, taking effect immediately. Bug fixes applied to underlying APIs are automatically reflected in MCP tools without requiring manual updates.

Backward compatibility: tool names and schemas remain stable, and breaking changes require new tool versions.
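To make the enablement flow above concrete, the risk tiering (low-risk read-only operations enable immediately; write access or PII exposure triggers compliance scanning) could be sketched as a rule-based check. All names and fields here are illustrative assumptions, not Grand Central's actual API:

```python
# Illustrative sketch of rule-based risk tiering for tool enablement.
# Field names ("method", "exposes_pii") are hypothetical, not Grand Central's schema.

LOW_RISK_METHODS = {"get", "head", "options"}  # read-only HTTP verbs

def classify_operation(operation: dict) -> str:
    """Return "enable_immediately" or "compliance_scan" for an OpenAPI operation."""
    is_write = operation.get("method", "").lower() not in LOW_RISK_METHODS
    exposes_pii = operation.get("exposes_pii", False)
    if is_write or exposes_pii:
        # Higher-risk: write access or PII exposure -> automated compliance scanning
        return "compliance_scan"
    # Low-risk: read-only, public data -> enabled immediately after validation
    return "enable_immediately"

print(classify_operation({"method": "GET", "exposes_pii": False}))   # enable_immediately
print(classify_operation({"method": "POST", "exposes_pii": False}))  # compliance_scan
```

A real validator would also consult policy rules for authentication requirements and data classification, but the tiering decision follows the same shape.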
Quality standards
Every MCP tool must meet these standards.

Clear documentation
Tools need descriptive names and summaries that clearly explain their purpose. Parameter descriptions must detail what each input expects, including format requirements and validation rules. Example requests and responses help developers understand proper usage patterns. Error handling guidance explains what can go wrong and how agents should respond to different failure scenarios.
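As a hypothetical illustration (not Grand Central's actual schema), a tool definition that satisfies these documentation standards might look like:

```python
# Hypothetical example of a well-documented MCP tool definition.
# All names, fields, and values are illustrative, not Grand Central's schema.
account_balance_tool = {
    "name": "get_account_balance",
    "description": "Returns the current available balance for a customer account.",
    "parameters": {
        "account_id": {
            "type": "string",
            # Format requirements and validation rules stated alongside the input.
            "description": 'Ten-digit account number, digits only (e.g. "0123456789").',
            "pattern": r"^\d{10}$",
        },
        "currency": {
            "type": "string",
            "description": 'ISO 4217 currency code for the response, e.g. "USD".',
        },
    },
    # Example response helps developers understand proper usage.
    "example_response": {"balance": "1024.50", "currency": "USD"},
    # Error guidance tells agents how to respond to failure scenarios.
    "errors": {
        "ACCOUNT_NOT_FOUND": "Verify the account_id with the customer; do not retry.",
        "RATE_LIMITED": "Back off and retry after the interval indicated by the server.",
    },
}
```

Note how each parameter carries its own format requirement, and each error names the action an agent should take.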
Security requirements
Proper authentication and authorization must be configured for every tool, ensuring only authorized users can access sensitive operations. Rate limiting prevents abuse and ensures fair resource allocation. Audit logging captures every invocation with request parameters and response data for compliance and security investigations. Tools must not expose sensitive data without proper authorization controls in place.
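The audit-logging requirement can be sketched as a wrapper that records every invocation with its request parameters and response data. This is a minimal sketch under assumed names; it is not Grand Central's actual logging implementation:

```python
import functools
import json
import logging
import time

# Hypothetical audit logger name; a real deployment would ship these
# records to a tamper-evident store for compliance investigations.
audit_log = logging.getLogger("mcp.audit")

def audited(tool_fn):
    """Capture every invocation with request parameters and response data."""
    @functools.wraps(tool_fn)
    def wrapper(**params):
        started = time.time()
        response = tool_fn(**params)
        audit_log.info(json.dumps({
            "tool": tool_fn.__name__,
            "params": params,        # request parameters
            "response": response,    # response data
            "duration_ms": round((time.time() - started) * 1000, 1),
        }))
        return response
    return wrapper

@audited
def get_account_status(account_id: str) -> dict:
    # Stand-in for a real banking API call.
    return {"account_id": account_id, "status": "active"}
```

Because the wrapper sits between the agent and the API call, no tool invocation can bypass the audit trail.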
Performance standards
Response times must stay under 2 seconds for typical operations to maintain acceptable user experience. Error rates below 1% ensure reliability and minimize frustration for users and agents. Availability targets above 99.9% mean less than 9 hours of downtime per year. Every tool must be load tested at expected scale before production deployment to validate it can handle realistic usage patterns.
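The downtime budget implied by a 99.9% availability target is straightforward arithmetic, shown here in Python:

```python
# Downtime budget implied by a 99.9% availability target.
HOURS_PER_YEAR = 365 * 24  # 8760 hours in a non-leap year
availability = 0.999
downtime_hours = HOURS_PER_YEAR * (1 - availability)
print(f"{downtime_hours:.2f} hours of downtime per year")  # 8.76 hours
```

That is where the "less than 9 hours of downtime per year" figure comes from: 0.1% of 8,760 hours is about 8.76 hours.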