For more details, refer to the official Triton Inference Server documentation.
Integrating Custom Models with Portkey SDK
Expose your Triton Server
Expose your Triton server by using a tunneling service like ngrok or any other way you prefer. You can skip this step if you’re self-hosting the Gateway.
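Before wiring the tunneled URL into Portkey, it can help to confirm that the exposed endpoint actually reaches your Triton server. The sketch below is only illustrative: the tunnel URL is a placeholder (for example, the URL printed by an ngrok session in front of Triton's default HTTP port 8000), and it relies on Triton's standard HTTP readiness endpoint, `/v2/health/ready`.

```typescript
// Sketch: verify that a publicly-exposed Triton server is reachable.
// TRITON_URL is a placeholder; replace it with your own tunnel or host URL.
const TRITON_URL = "https://your-tunnel.ngrok.app";

async function checkTritonReady(): Promise<void> {
  // Triton's HTTP/REST API reports readiness at /v2/health/ready (HTTP 200 = ready).
  const res = await fetch(`${TRITON_URL}/v2/health/ready`);
  console.log(res.ok ? "Triton server is ready" : `Triton not ready: HTTP ${res.status}`);
}

checkTritonReady().catch(console.error);
```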
Initialize Portkey with Triton custom URL
- Pass your publicly-exposed Triton server URL to Portkey with `customHost`
- Set the target `provider` as `triton`
- NodeJS SDK: pass the URL as `customHost` when creating the client (see the sketch after this list)
- Python SDK: pass the URL as `custom_host` when creating the client
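Here is a minimal initialization sketch for the NodeJS SDK. The API key, host URL, and model name are placeholders, and the request at the end is only illustrative of the SDK's call shape; adapt it to the model you have deployed on Triton. The Python SDK follows the same pattern with `custom_host` instead of `customHost`.

```typescript
import Portkey from 'portkey-ai';

// Minimal sketch: point Portkey at a publicly-exposed Triton server.
// The API key and URL below are placeholders; substitute your own values.
const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY",                    // your Portkey API key
  provider: "triton",                           // route requests to the Triton provider
  customHost: "https://your-tunnel.ngrok.app",  // publicly-exposed Triton server URL
});

async function main() {
  // Illustrative request; "your-triton-model" stands in for a model served by your Triton instance.
  const completion = await portkey.chat.completions.create({
    model: "your-triton-model",
    messages: [{ role: "user", content: "Hello from Portkey and Triton" }],
  });
  console.log(completion.choices[0]?.message?.content);
}

main().catch(console.error);
```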
Next Steps
Explore the complete list of features supported in the SDK.