Private Inference API
Secure. Verified.
Interact with Tinfoil using the same OpenAI Chat Completions API standard:
-from openai import OpenAI
+from tinfoil import TinfoilAI
TinfoilAI supports the same interface as the OpenAI Python client.
Tinfoil client
The Tinfoil client provides secure, verified access to our API:
- Requests are automatically blocked if verification of the enclave fails
- Tinfoil is compatible with the OpenAI API endpoint, making it a drop-in replacement for existing OpenAI integrations
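Because the interfaces match, code written against the OpenAI client runs unchanged with TinfoilAI; only the client constructor differs. A minimal sketch of this property — `ask` and `DummyClient` are hypothetical names introduced here, and the stand-in client lets the snippet run offline without the tinfoil package or an API key:

```python
from types import SimpleNamespace

def ask(client, model: str, prompt: str) -> str:
    """Works against any OpenAI-compatible client (openai.OpenAI or TinfoilAI)."""
    completion = client.chat.completions.create(
        messages=[{"role": "user", "content": prompt}],
        model=model,
    )
    return completion.choices[0].message.content

# Hypothetical offline stand-in mirroring the OpenAI client surface.
class DummyCompletions:
    def create(self, messages, model):
        reply = SimpleNamespace(content=f"[{model}] echo: {messages[0]['content']}")
        return SimpleNamespace(choices=[SimpleNamespace(message=reply)])

class DummyClient:
    def __init__(self):
        self.chat = SimpleNamespace(completions=DummyCompletions())

print(ask(DummyClient(), "deepseek-r1-70b", "Hello!"))
# [deepseek-r1-70b] echo: Hello!
```

In real use you would pass a `TinfoilAI(...)` instance (or an `openai.OpenAI(...)` instance) as `client` and leave `ask` untouched.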
Model inference with the Tinfoil client
The example below uses DeepSeek-R1-Distill-Llama-70B, a high-performance reasoning model with exceptional benchmarks.
Installation:
pip install tinfoil
Inference:
from tinfoil import TinfoilAI

# The client verifies the enclave before any request is sent;
# requests are blocked automatically if verification fails.
client = TinfoilAI(
    enclave="deepseek-r1-70b-p.model.tinfoil.sh",        # attested enclave hostname
    repo="tinfoilsh/confidential-deepseek-r1-70b-prod",  # repository to verify against
    api_key="YOUR_API_KEY",
)

# Same call shape as the OpenAI Chat Completions API.
chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Hello!",
        }
    ],
    model="deepseek-r1-70b",
)
print(chat_completion.choices[0].message.content)
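Streaming follows the same OpenAI pattern (an assumption here: TinfoilAI forwards `stream=True` unchanged, as the OpenAI client does). Streamed chunks carry partial text in `choices[0].delta.content`; the sketch below shows how to reassemble them, using stand-in chunks shaped like OpenAI streaming responses so it runs offline:

```python
from types import SimpleNamespace

def collect_stream(chunks) -> str:
    """Concatenate the partial text carried by OpenAI-style streaming chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # the final chunk may carry None
            parts.append(delta)
    return "".join(parts)

# Stand-in chunks; with a live client these would come from
# client.chat.completions.create(..., stream=True).
fake_chunks = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=t))])
    for t in ["Hel", "lo!", None]
]
print(collect_stream(fake_chunks))  # Hello!
```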
Available Models

DeepSeek-R1-Distill-Llama-70B
High-performance reasoning model with exceptional benchmarks

Mistral-Small-3.1-24B
Advanced multimodal model with enhanced vision capabilities and extended context window

Llama 3.3 70B
High-performance multilingual language model for chat and reasoning

Llama 3.2 1B
Multilingual large language model optimized for dialogue

Llama Guard 3 1B
Safety-focused model for content filtering and moderation

Qwen 2.5 Coder 0.5B
Compact code-specialized model for lightweight applications

Nomic Embed Text
Open-source text embedding model that outperforms OpenAI models on key benchmarks
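Embedding models such as Nomic Embed Text return vectors, and a common next step is ranking texts by cosine similarity. A minimal, self-contained sketch of that computation (the vectors below are toy values, not real embeddings):

```python
import math

def cosine_similarity(a, b) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy vectors: parallel directions score ~1.0, orthogonal score ~0.0.
print(round(cosine_similarity([1.0, 2.0], [2.0, 4.0]), 6))  # 1.0
print(round(cosine_similarity([1.0, 0.0], [0.0, 1.0]), 6))  # 0.0
```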