This is a proxy server that converts the Tencent Hunyuan LLM API to an OpenAI-compatible API.
You can use this proxy to access the Hunyuan LLM with any OpenAI-compatible client.
The endpoint below mimics the OpenAI Chat Completions API.
POST /v1/chat/completions
  Authorization: Bearer YOUR_API_KEY
  Content-Type: application/json

  {
    "messages": [
      {
        "role": "user",
        "content": "Hello, who are you?"
      }
    ],
    "model": "hunyuan-t1-latest",
    "stream": true
  }
Supported models: hunyuan-t1-latest, hunyuan-turbos-latest. To make a non-streaming request, set "stream": false in the request body.
Returns a stream of Server-Sent Events (SSE) in the OpenAI format for streaming requests, or a JSON object for non-streaming requests.
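Because the endpoint is OpenAI-compatible, any standard client can talk to it. The sketch below uses the official OpenAI Python SDK; the base URL (http://localhost:3000/v1) is only an assumed placeholder for wherever this proxy is actually deployed.

  # Minimal sketch: streaming chat completion through the proxy with the
  # OpenAI Python SDK. The base URL is an assumption -- substitute the
  # address the proxy is really listening on.
  from openai import OpenAI

  client = OpenAI(
      api_key="YOUR_API_KEY",               # the key the proxy expects
      base_url="http://localhost:3000/v1",  # assumed proxy address
  )

  stream = client.chat.completions.create(
      model="hunyuan-t1-latest",
      messages=[{"role": "user", "content": "Hello, who are you?"}],
      stream=True,
  )

  for chunk in stream:
      delta = chunk.choices[0].delta
      if delta.content:
          print(delta.content, end="", flush=True)

With stream=False the same call returns a single chat completion object, and the reply is available in choices[0].message.content instead of arriving as SSE chunks.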
Get a list of available models.

GET /v1/models

  {
    "object": "list",
    "data": [
      {
        "id": "hunyuan-t1-latest",
        "object": "model",
        "created": 1678886400,
        "owned_by": "tencent"
      },
      {
        "id": "hunyuan-turbos-latest",
        "object": "model",
        "created": 1700000000,
        "owned_by": "tencent"
      }
    ]
  }
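A client can discover these models programmatically as well. The sketch below reuses the same assumed base URL and API key as the chat example above.

  # Minimal sketch: listing the models the proxy exposes.
  # Base URL and key are the same assumptions as in the chat example.
  from openai import OpenAI

  client = OpenAI(api_key="YOUR_API_KEY", base_url="http://localhost:3000/v1")

  for model in client.models.list().data:
      print(model.id, "owned by", model.owned_by)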
Retrieves the API key.

GET /getkey

  Your API Key is: ${API_KEY}
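Since /getkey returns a plain-text line rather than JSON, a bare HTTP GET is enough. The host and port below are assumptions, as in the earlier examples.

  # Minimal sketch: fetching the key from /getkey with the requests library.
  # The host/port are assumptions; the body is the plain-text line shown above.
  import requests

  resp = requests.get("http://localhost:3000/getkey")
  resp.raise_for_status()
  print(resp.text)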