Our Mission

January 12, 2026

We founded Tinfoil because we wanted a private garden for thought.

We believe that AI is the most intimate technology ever built. We see AI as a space to explore, to make mistakes, to think out loud, to reflect with a beautiful intelligence.

In order to be helpful, it needs to know us in ways we perhaps do not even know ourselves. In turn, we want to provide it the context and autonomy it needs, entrust it with a perfect articulation of our inner and outer worlds so it can predict our needs, so it can watch over us with a vigilance that never tires, so it can be a “bicycle for the mind”. We let it record our screen so it can see what we see, watch how we think so it can teach us in a way no classroom could, help us design a protocol when the standard treatment fails—because the world only builds for averages.

And the only reason we hesitate is that we understand, on some level, the terms of the bargain. The entity on the other side is not alone. You lay yourself bare while someone else records.

AI is becoming the infrastructure that is always on, always helpful, yet always watching. Whoever owns the infrastructure of intelligence has power over the people who engage with it.

We know all this, but use it anyway. We use it because it's genuinely helpful, because we can't imagine living without it, because the alternative is to be left behind while the world moves on. And so we accept the trade, but feel unclean.

Through this process, the infrastructure of AI is slowly becoming a panopticon.

We think differently under the panopticon. We stop communicating freely, even with ourselves. We feel like everything can be observed, examined, deconstructed. We stop exploring and start performing. We edit our thoughts. We lose the magic and space we had for exploration.

What we really want is the omniscience of AI, full context with no filter, but free from the voyeurism of the platform. The ability to simply think, think useless thoughts, terrible thoughts, fun thoughts, because we want to, because it pleases us, because we are in the sanctity of our own minds. That is the last bastion of personal freedom.

We built Tinfoil to be a sanctuary for thought. A private garden where my omniscient friend and I cultivate flowers that court the divine.

Founding Principles

Before TLS encryption, anyone could read your Internet traffic. Any WiFi network could be used to steal your data. Encryption didn't make eavesdropping illegal; it made eavesdropping technologically impossible. This bedrock brought commerce to the Internet. It's why we can safely make credit card transactions online.

We believe that AI needs similar infrastructure, where privacy is enforced by design, not through policies and legal frameworks. We don't believe in pinky-promises.

Here's how we think about Tinfoil:

1. Our users don't have to trust us; they can verify

Even though we operate the AI infrastructure, we cannot see our users' data, even if we were legally compelled. We open source our code and run everything inside secure hardware enclaves. Anyone can verify that we are doing the right thing. All cryptographic evidence is checked automatically, by every client, on every connection.
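As a rough illustration of what “checked automatically, by every client, on every connection” can look like, here is a minimal Python sketch of the general pattern: fetch the enclave's attestation evidence, check that the attested code measurement matches the published open-source build, and check that the TLS key is the one bound inside the attestation before sending anything. The host name, attestation route, JSON fields, and pinned measurement are hypothetical placeholders, not Tinfoil's actual API, and a production client would also validate the hardware vendor's attestation certificate chain.

```python
# Minimal sketch (illustrative, not Tinfoil's actual client): verify the enclave
# before sending anything. The host, attestation route, JSON field names, and
# pinned measurement are hypothetical; a real client must also validate the
# hardware vendor's attestation certificate chain, omitted here for brevity.
import hashlib
import json
import socket
import ssl
import urllib.request

ENCLAVE_HOST = "inference.example.com"  # hypothetical enclave endpoint
EXPECTED_MEASUREMENT = "<hash of the open-source build we expect to be running>"


def fetch_attestation(host: str) -> dict:
    """Fetch the enclave's attestation evidence (hypothetical route)."""
    with urllib.request.urlopen(f"https://{host}/.well-known/attestation") as resp:
        return json.load(resp)


def verify_and_connect(host: str = ENCLAVE_HOST) -> ssl.SSLSocket:
    evidence = fetch_attestation(host)

    # 1. The attested code measurement must match the published open-source build.
    if evidence["measurement"] != EXPECTED_MEASUREMENT:
        raise RuntimeError("enclave is not running the expected code")

    # 2. The TLS key we are about to talk to must be the one bound inside the
    #    attestation, so the connection terminates inside the enclave and nowhere else.
    ctx = ssl.create_default_context()
    conn = ctx.wrap_socket(socket.create_connection((host, 443)), server_hostname=host)
    cert_sha256 = hashlib.sha256(conn.getpeercert(binary_form=True)).hexdigest()
    if cert_sha256 != evidence["tls_cert_sha256"]:
        conn.close()
        raise RuntimeError("TLS certificate is not bound to the attested enclave")

    return conn  # only now is it safe to send a prompt over this connection
```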

2. We believe in a good future with AI

We're not against AI. We use AI and truly believe in its transformative potential. What we are against is trading away personal control in exchange for access to intelligence.

3. Privacy elevates the user experience

If you had the choice between two equally good technologies, where one is private, why would you ever choose the non-private one? We believe that privacy must elevate the quality of the user experience without impeding functionality. Privacy should not impact the intelligence, performance, or capabilities of AI. We go to great lengths to prove this by crafting a user experience that is enhanced—not degraded—by being private.

4. Commitment to transparency

We are committed to transparency: every assumption, every security boundary, and every tradeoff is drawn in public view. We will not get everything right, but we strive for a gold standard that we feel proud to use every day to communicate with artificial intelligence.

Tinfoil Today

An early bet on secure hardware

We believe that secure hardware primitives will take us very far in accomplishing the goal of building a verifiably private infrastructure for AI.

Some believe that side-channel or physical attacks on existing secure hardware make confidential computing a doomed approach. We strongly disagree. Consider that early RSA implementations kept being broken for over 25 years, but cryptographers persisted, and that persistence is what gave us public-key encryption. Confidential computing is less than a decade old.

For example, many recent physical attacks stem from using weak memory encryption schemes (chosen to reduce performance overhead). This is an engineering limitation, not something fundamental to the premise of confidential computing itself, and given that we are still in the early days of applying it to real problems, we want to re-examine these tradeoffs.

Therefore, in the short term, we cannot protect against targeted physical attacks, but we design systems where the compromise of one machine does not affect the security of the rest of the infrastructure.

However, we don't plan to sit on the sidelines and hope for the best. We're building at the frontier of AI and secure hardware, and we're taking what we learn back to the confidential computing community to help shape where this technology goes next.

Tinfoil is about private AI, not a specific technology

With that said, people ask if we're a “confidential computing company”. We're not. We're a private AI company. It just so happens that today we use confidential computing technology to build private AI, because it's currently the best tool for the job. If fully homomorphic encryption matures to the point of being practical, or local compute becomes sufficiently powerful, then we will evaluate how these technologies help with our mission.