Use unified endpoints in either OpenAI or Anthropic format to run inference on deployed models.

OpenAI format: https://api.fundamental-photon.xyz/api/inference/v1/chat/completions
Anthropic format: https://api.fundamental-photon.xyz/api/inference/v1/messages
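For illustration, a minimal sketch of calling the OpenAI-format endpoint with Python's requests library. The model name is a placeholder, and the request and response fields are assumed to follow the standard OpenAI chat completions schema; the Anthropic-format endpoint at /api/inference/v1/messages would be called the same way with an Anthropic-style payload.

```python
import requests

BASE_URL = "https://api.fundamental-photon.xyz"

# Minimal OpenAI-format chat completion request. The model name is a
# placeholder, and the payload/response fields are assumed to mirror the
# standard OpenAI chat completions schema.
resp = requests.post(
    f"{BASE_URL}/api/inference/v1/chat/completions",
    json={
        "model": "my-deployed-model",  # hypothetical deployed model name
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    # headers={"Authorization": "Bearer <API_KEY>"},  # only if auth is required
)
resp.raise_for_status()
# Assumes an OpenAI-style response body.
print(resp.json()["choices"][0]["message"]["content"])
```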
Deploy, manage, and deactivate checkpoint models from Modal Storage with GPU resources.

https://api.fundamental-photon.xyz/api/checkpoints
https://api.fundamental-photon.xyz/api/checkpoints/deactivate
https://api.fundamental-photon.xyz/api/models
https://api.fundamental-photon.xyz/api/models/{id}
https://api.fundamental-photon.xyz/api/models/{id}/deploy-smart
https://api.fundamental-photon.xyz/api/models/{id}/deployment-progress
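A sketch of one possible deployment workflow against these endpoints: trigger deploy-smart, then poll deployment-progress. The HTTP methods, request bodies, and the status field shown here are assumptions, since this overview does not document them.

```python
import time
import requests

BASE_URL = "https://api.fundamental-photon.xyz"
model_id = "example-model-id"  # hypothetical model id

# Trigger a smart deployment. POST with an empty body is an assumption;
# the overview does not specify the method or payload.
deploy = requests.post(f"{BASE_URL}/api/models/{model_id}/deploy-smart")
deploy.raise_for_status()

# Poll deployment progress until it settles. The "status" field and its
# values are illustrative assumptions.
while True:
    progress = requests.get(f"{BASE_URL}/api/models/{model_id}/deployment-progress")
    progress.raise_for_status()
    status = progress.json().get("status")
    print("deployment status:", status)
    if status in ("deployed", "failed"):
        break
    time.sleep(5)
```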
Check the health and availability of the Photon API.

https://api.fundamental-photon.xyz/api/health
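A minimal availability check, assuming the endpoint answers a plain GET; the shape of the response body is not specified here.

```python
import requests

# Simple availability check against the health endpoint; treat any 2xx
# response as healthy. The response body format is not specified here.
resp = requests.get("https://api.fundamental-photon.xyz/api/health")
print("healthy" if resp.ok else f"unhealthy: {resp.status_code}")
```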
API keys are optional for the current endpoints. If authentication is required, include your API key in the appropriate header based on the format you're using.

OpenAI Format: Authorization: Bearer <API_KEY>
Anthropic Format: x-api-key: <API_KEY>
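A sketch of attaching a key in each format, assuming the headers mirror the upstream OpenAI (Authorization: Bearer) and Anthropic (x-api-key) conventions; since keys are optional on the current endpoints, whether the server enforces them is not specified here.

```python
import requests

API_KEY = "your-api-key"  # only needed if authentication is enabled

# Header conventions mirror the upstream OpenAI and Anthropic APIs; whether
# this server checks them is an assumption, since keys are currently optional.
openai_headers = {"Authorization": f"Bearer {API_KEY}"}
anthropic_headers = {"x-api-key": API_KEY}

resp = requests.post(
    "https://api.fundamental-photon.xyz/api/inference/v1/chat/completions",
    headers=openai_headers,  # use anthropic_headers with the /v1/messages endpoint
    json={
        "model": "my-deployed-model",  # hypothetical model name
        "messages": [{"role": "user", "content": "ping"}],
    },
)
print(resp.status_code)
```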