Multilingual large language model optimized for dialogue
Meta's Llama 3.2 1B model is optimized for multilingual dialogue use cases, including personal information management, knowledge retrieval, and rewriting tasks, all running locally on edge devices. It is competitive with other models in the 1B-3B parameter range.
Parameters: 1.24 billion
Context window: 128k tokens
Best suited for: personal information management, multilingual knowledge retrieval, and rewriting tasks running locally on edge devices.
Supported languages: English, French, German, Hindi, Italian, Portuguese, Spanish, Thai
Installation:
```shell
pip install tinfoil
```
Inference:
```python
from tinfoil import TinfoilAI

# Create a client pinned to the Tinfoil enclave and model repository.
client = TinfoilAI(
    enclave="models.default.tinfoil.sh",
    repo="tinfoilsh/default-models-nitro",
    api_key="YOUR_API_KEY",
)

# Send a chat request using the familiar OpenAI-style completions interface.
chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Hello!",
        }
    ],
    model="llama3.2-1b",
)

# Print the assistant's reply.
print(chat_completion.choices[0].message.content)
```