Private Inference

Build with powerful AI models while keeping all data private.

Tinfoil SDKs

The Tinfoil SDKs provide secure, automatically verified access to our inference API:

  • Requests are automatically blocked if enclave verification fails
  • Compatible with the OpenAI API standard, making the SDKs a drop-in replacement for most existing deployments and workflows
-from openai import OpenAI
-client = OpenAI()
+from tinfoil import TinfoilAI
+client = TinfoilAI()
TinfoilAI supports the same interface as the OpenAI Python client.

Inference with Tinfoil SDKs
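
Because TinfoilAI mirrors the OpenAI Python client, an inference request looks the same as it would against OpenAI. The sketch below assumes the constructor accepts an api_key keyword (as the OpenAI client does) and uses a placeholder model name:

from tinfoil import TinfoilAI

# Assumption: api_key is passed the same way as in the OpenAI client
client = TinfoilAI(api_key="<your-api-key>")

# Standard OpenAI-style chat completion; "<model-name>" is a placeholder
chat_completion = client.chat.completions.create(
    model="<model-name>",
    messages=[
        {"role": "user", "content": "Summarize this document."},
    ],
)
print(chat_completion.choices[0].message.content)

The request is sent only after the enclave has been verified; if verification fails, the client refuses to send it.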

Supported AI Models

Access state-of-the-art open-source language models, all running in secure hardware enclaves with verifiable privacy guarantees.

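Since the SDK follows the OpenAI API standard, the currently served models can typically be listed through the standard models endpoint. This is a sketch and assumes Tinfoil exposes that endpoint through the same interface:

from tinfoil import TinfoilAI

client = TinfoilAI(api_key="<your-api-key>")  # assumption: api_key keyword as in the OpenAI client

# Assumes the OpenAI-compatible models endpoint is available
for model in client.models.list():
    print(model.id)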

Model and API Availability

Available models and capabilities are subject to change. If you require SLA guarantees, specific model availability, or long-term production usage, please contact us to discuss your needs. We're also happy to work with you to add support for your desired models.