AI Microservice DEMO

Examples and test tools for our AI services. The front-end code is intended as reference material you can copy into your own projects.

🚀 Recommended: Normalized LLM API

Normalized LLM Respond API

The unified, vendor-agnostic endpoint for all LLM interactions. This is the recommended way to interact with AI models through our platform.

  • Provider-agnostic: Works with OpenRouter and future providers
  • Tools/Function Calling: Full JSON Schema-based tool support
  • Structured Streaming: Real-time SSE streaming with token, tool_call, and done events
  • Multi-part Content: Supports text and image inputs
  • Normalized Responses: Consistent format regardless of provider
  • Error Normalization: Standardized error types and messages
  • Usage Tracking: Automatic usage recording for billing
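
Below is a minimal streaming client sketch in TypeScript. The endpoint path (`/api/llm/respond`), request body shape, model name, and event payload fields used here are assumptions for illustration only; check the API reference for the exact contract. It shows the general pattern: POST a message list with streaming enabled, then parse the SSE stream and handle `token`, `tool_call`, and `done` events.

```typescript
// Hypothetical message shape supporting text and multi-part (text + image) content.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content:
    | string
    | Array<{ type: "text"; text: string } | { type: "image_url"; image_url: string }>;
}

async function streamRespond(messages: ChatMessage[]): Promise<string> {
  // Endpoint path, body fields, and model identifier are placeholders.
  const res = await fetch("/api/llm/respond", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "openrouter/some-model", messages, stream: true }),
  });
  if (!res.ok || !res.body) throw new Error(`Respond API error: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let text = "";

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE events are separated by a blank line; each carries a "data:" line of JSON.
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? ""; // keep any incomplete trailing event in the buffer
    for (const evt of events) {
      const dataLine = evt.split("\n").find((l) => l.startsWith("data:"));
      if (!dataLine) continue;
      const payload = JSON.parse(dataLine.slice(5).trim());
      if (payload.type === "token") {
        text += payload.token;                 // incremental text chunk (field name assumed)
      } else if (payload.type === "tool_call") {
        console.log("tool call requested:", payload); // run the tool, then continue the conversation
      } else if (payload.type === "done") {
        return text;                           // stream finished
      }
    }
  }
  return text;
}
```

Called as `streamRespond([{ role: "user", content: "Hello" }])`, this accumulates `token` events into the final reply and logs any `tool_call` events as they arrive.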

Legacy Endpoints

Note: The endpoints below are legacy, provider-specific interfaces. For new integrations, use the Normalized LLM Respond API above; it provides a single consistent interface, a richer feature set, and insulation from provider-specific changes.