Use unified endpoints in either OpenAI or Anthropic format to run inference on deployed models.
https://api.fundamental-photon.xyz/api/inference/v1/chat/completions
https://api.fundamental-photon.xyz/api/inference/v1/messages
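
For example, a chat completion in the OpenAI format can be sent with any HTTP client. The sketch below uses Python's requests library; the model name is a placeholder and the Authorization header is only needed if your deployment requires a key.

import requests

BASE_URL = "https://api.fundamental-photon.xyz/api/inference/v1"

# Omit the Authorization header if authentication is not enabled for your deployment.
headers = {"Authorization": "Bearer YOUR_API_KEY"}

# OpenAI-format chat completion against a deployed model.
# "my-deployed-model" is a placeholder; substitute one of your deployed model names.
resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers=headers,
    json={
        "model": "my-deployed-model",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(resp.json())
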
Deploy, manage, and deactivate checkpoint models from Modal Storage with GPU resources.
https://api.fundamental-photon.xyz/api/checkpoints
https://api.fundamental-photon.xyz/api/checkpoints/deactivate
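
A deployment call might look like the sketch below. The JSON field names (checkpoint_path, gpu) are illustrative assumptions, not a confirmed request schema, so verify the expected parameters against the checkpoint endpoints before relying on them.

import requests

BASE_URL = "https://api.fundamental-photon.xyz/api"
headers = {"Authorization": "Bearer YOUR_API_KEY"}  # omit if the API does not require a key

# Deploy a checkpoint from Modal Storage onto a GPU.
# The JSON fields below (checkpoint_path, gpu) are illustrative guesses,
# not a schema confirmed by this documentation.
deploy = requests.post(
    f"{BASE_URL}/checkpoints",
    headers=headers,
    json={"checkpoint_path": "checkpoints/run-42/step-1000", "gpu": "A100"},
)
print(deploy.status_code, deploy.text)

# Deactivate the same checkpoint when it is no longer needed (body again illustrative).
requests.post(
    f"{BASE_URL}/checkpoints/deactivate",
    headers=headers,
    json={"checkpoint_path": "checkpoints/run-42/step-1000"},
)
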
https://api.fundamental-photon.xyz/api/models
https://api.fundamental-photon.xyz/api/models/{id}
https://api.fundamental-photon.xyz/api/models/{id}/deploy-smart
https://api.fundamental-photon.xyz/api/models/{id}/deployment-progress
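
These endpoints can be combined into a simple deploy-and-poll flow. The sketch below assumes deployment-progress returns a JSON body with a status field; that field name and its values are assumptions used only for illustration.

import time
import requests

BASE_URL = "https://api.fundamental-photon.xyz/api"
headers = {"Authorization": "Bearer YOUR_API_KEY"}  # omit if the API does not require a key

# List registered models, then pick one to deploy.
models = requests.get(f"{BASE_URL}/models", headers=headers).json()
print(models)

model_id = "MODEL_ID"  # placeholder; use an id from the listing above

# Trigger a deployment and poll its progress until it settles.
requests.post(f"{BASE_URL}/models/{model_id}/deploy-smart", headers=headers)

while True:
    progress = requests.get(
        f"{BASE_URL}/models/{model_id}/deployment-progress", headers=headers
    ).json()
    print(progress)
    # "status" and its values are assumed field names used only for this sketch.
    if progress.get("status") in ("ready", "failed"):
        break
    time.sleep(5)
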
Check the health and availability of the Photon API.
https://api.fundamental-photon.xyz/api/health
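
A minimal availability check, assuming no request body or API key is needed for the health probe:

import requests

# Simple liveness probe for the Photon API.
resp = requests.get("https://api.fundamental-photon.xyz/api/health")
print(resp.status_code, resp.text)
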
API keys are optional for the current endpoints. If authentication is required, include your API key in the appropriate header based on the format you're using.
OpenAI Format: the key goes in the Authorization header as a bearer token (Authorization: Bearer YOUR_API_KEY).
Anthropic Format: the key goes in the x-api-key header (x-api-key: YOUR_API_KEY).
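
For the Anthropic format, the same idea applies to the /messages endpoint. The sketch below assumes the usual Anthropic-style request body (model, max_tokens, messages) with the key passed in x-api-key; the model name is again a placeholder.

import requests

# Anthropic-format request to the /messages endpoint.
# The body mirrors the standard Anthropic Messages shape; adjust if this API differs.
resp = requests.post(
    "https://api.fundamental-photon.xyz/api/inference/v1/messages",
    headers={"x-api-key": "YOUR_API_KEY"},  # omit if authentication is not enabled
    json={
        "model": "my-deployed-model",  # placeholder model name
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(resp.json())
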