Private Inference API
Simple. Secure. Verified.
Interact with Tinfoil using the same OpenAI Chat Completions API standard:
- from openai import OpenAI
+ from tinfoil import TinfoilAI

TinfoilAI supports the same interface as the OpenAI Python client.

Tinfoil client
The Tinfoil client provides secure, verified access to our API:
- Requests are automatically blocked if verification of the enclave fails
- Tinfoil is compatible with the OpenAI API endpoint, making it a drop-in replacement for existing OpenAI integrations
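Because the API follows the OpenAI Chat Completions standard, an existing request body works against a Tinfoil endpoint unchanged. A minimal sketch of that request shape, using only the standard library (the `build_chat_request` helper is ours; the `deepseek-70b` model ID comes from the model list on this page):

```python
import json


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style Chat Completions request body.

    Because Tinfoil exposes the same API shape, this exact payload
    can be sent to a Tinfoil enclave endpoint just as it would be
    sent to api.openai.com.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


body = build_chat_request("deepseek-70b", "Why is tinfoil now called aluminium foil?")
print(json.dumps(body))
```

Only the client import and the endpoint change when switching from OpenAI to Tinfoil; the payload itself stays identical, which is what makes the swap drop-in.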
Example

DeepSeek-R1-Distill-Llama-70B inference with the Tinfoil CLI
Installation:
curl -fsSL https://github.com/tinfoilsh/tinfoil-cli/raw/main/install.sh | sh
Inference:
$ tinfoil chat "Why is tinfoil now called aluminium foil?"
Available Models

DeepSeek-R1-Distill-Llama-70B
High-performance reasoning model with exceptional benchmarks
Model ID:
deepseek-70b

Llama 3.2 1B
Multilingual large language model optimized for dialogue and edge deployment
Model ID:
llama3.2-1b

Llama Guard 3 1B
Safety-focused model for content filtering and moderation
Model ID:
llama-guard3-1b

Nomic Embed Text
Open-source text embedding model that outperforms OpenAI models on key benchmarks
Model ID:
nomic-embed-text

Qwen 2.5 Coder 0.5B
Compact code-specialized model for lightweight applications
Model ID:
qwen2.5-coder-0.5b
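The embedding model above is served through the same OpenAI-compatible surface, so the standard embeddings request shape should apply. A sketch under that assumption (the `build_embedding_request` helper is hypothetical; `nomic-embed-text` is the model ID listed above):

```python
import json


def build_embedding_request(model: str, texts: list[str]) -> dict:
    # Standard OpenAI-style embeddings body; assumed to carry over
    # unchanged because Tinfoil is a drop-in replacement for the
    # OpenAI API endpoint.
    return {"model": model, "input": texts}


req = build_embedding_request("nomic-embed-text", ["tinfoil", "aluminium foil"])
print(json.dumps(req))
```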
More models coming soon!
Join the private preview for early access.