Private Inference API

Secure. Verified.

Interact with Tinfoil using the standard OpenAI Chat Completions API:

-from openai import OpenAI
+from tinfoil import TinfoilAI
TinfoilAI supports the same interface as the OpenAI Python client.

Tinfoil client

The Tinfoil client provides secure, verified access to our API:

  • Requests are automatically blocked if verification of the enclave fails
  • Tinfoil is compatible with the OpenAI API, making it a drop-in replacement for existing OpenAI integrations, as the sketch below shows
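
A minimal sketch of what changes when an existing OpenAI integration is moved over. The enclave and repo values are taken from the example below; the comment about blocked requests restates the bullet above rather than a documented exception type.

# from openai import OpenAI     # existing integration
from tinfoil import TinfoilAI   # drop-in replacement

# Same API key and the same chat.completions interface; the extra
# parameters tell the client which enclave to talk to and which repo
# to verify it against.
client = TinfoilAI(
    enclave="deepseek-r1-70b-p.model.tinfoil.sh",
    repo="tinfoilsh/confidential-deepseek-r1-70b-prod",
    api_key="YOUR_API_KEY",
)

# From here on, calls look exactly like OpenAI client calls. If enclave
# verification fails, the request is blocked rather than sent (the exact
# error surfaced is not specified on this page).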

Model inference with the Tinfoil client

This example runs DeepSeek R1 70B, a high-performance reasoning model with exceptional benchmark results.

Installation:

pip install tinfoil

Inference:

from tinfoil import TinfoilAI

client = TinfoilAI(
    enclave="deepseek-r1-70b-p.model.tinfoil.sh",  # enclave hostname serving the model
    repo="tinfoilsh/confidential-deepseek-r1-70b-prod",  # source repo checked during enclave verification
    api_key="YOUR_API_KEY",
)

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Hello!",
        }
    ],
    model="deepseek-r1-70b",
)
print(chat_completion.choices[0].message.content)
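
Because TinfoilAI mirrors the OpenAI Chat Completions interface, standard request options should carry over unchanged. A sketch of streaming output, assuming stream=True behaves as it does in the OpenAI Python client (an assumption, not confirmed on this page):

from tinfoil import TinfoilAI

client = TinfoilAI(
    enclave="deepseek-r1-70b-p.model.tinfoil.sh",
    repo="tinfoilsh/confidential-deepseek-r1-70b-prod",
    api_key="YOUR_API_KEY",
)

# stream=True yields incremental chunks instead of one final message,
# mirroring the OpenAI Python client's streaming behavior (assumed here).
stream = client.chat.completions.create(
    model="deepseek-r1-70b",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()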

Available Models