API
The API is exposed through the window.ai object.
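Before calling any of the methods below, pages that may load without the provider present can feature-detect the object first (a minimal sketch in plain JavaScript; the check itself is not part of the API):
if (!('ai' in window)) {
  console.warn('window.ai is not available in this browser.');
}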
Permissions
The permissions object provides methods to interact with the permissions system.
models
The models method returns a list of available models.
await window.ai.permissions.models();
Output
[
  {
    "model": "llama3.2",
    "available": true
  },
  {
    "model": "llama3.1",
    "available": true
  },
  {
    "model": "gemma2",
    "available": true
  }
]
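The returned list can be used to pick a model before requesting access to it; for example, a sketch that selects the first entry reported as available (field names taken from the output above):
const models = await window.ai.permissions.models();
const firstAvailable = models.find((m) => m.available);
if (firstAvailable) {
  console.log('Using ' + firstAvailable.model);
}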
request
The request method requests access to a specific model.
await window.ai.permissions.request({ model: 'llama3.1' });
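The Tools example further down treats the resolved value as a boolean-style grant flag and also passes a silent: true option, so a guarded call might look like this (a sketch; the exact return type and the meaning of silent are assumptions):
const granted = await window.ai.permissions.request({ model: 'llama3.1', silent: true });
if (granted) {
  // permission appears to be granted; connect to the model here
}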
Model
The model object provides methods to interact with a specific model.
connect
The connect method connects to a specific model.
const session = await window.ai.model.connect({ model: 'llama3.1' });
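Putting the calls above together, one possible flow lists the models, requests access to the first available one, and then connects (a sketch with error handling omitted; nothing here beyond the calls already shown is part of the API):
const models = await window.ai.permissions.models();
const target = models.find((m) => m.available)?.model;
if (target && await window.ai.permissions.request({ model: target })) {
  const session = await window.ai.model.connect({ model: target });
  // the session can now be used for chat and embed calls
}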
Session
The session object provides methods to interact with the model session.
chat
The chat method sends a list of messages to the model and returns its response.
await session.chat({ messages: [ { role: 'user', content: 'hello!' } ]});
Output
{
  "id": "fn6cq1zciy",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "Hello! How are you today? Is there something I can help you with or would you like to chat?"
      },
      "finish_reason": "stop"
    }
  ],
  "created": "2024-10-11T01:14:37.204701Z",
  "model": "llama3.1",
  "usage": {
    "total_duration": 6854656244,
    "load_duration": 41462875,
    "prompt_eval_count": 12,
    "prompt_eval_duration": 955155000,
    "eval_count": 23,
    "eval_duration": 5856371000
  }
}
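The response follows a chat-completions-style shape, so a multi-turn conversation can be carried by appending each reply to the messages array before the next call (a sketch assuming the session itself keeps no conversation history):
const messages = [{ role: 'user', content: 'hello!' }];
const first = await session.chat({ messages });
messages.push(first.choices[0].message);
messages.push({ role: 'user', content: 'What can you help me with?' });
const second = await session.chat({ messages });
console.log(second.choices[0].message.content);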
Tools
The chat method also supports tool calls, where the model can request that a locally defined function be run.
const addFunc = (a, b) => a + b;

// Declare the function to the model as a tool, with a JSON Schema for its parameters.
const addTool = {
  type: "function",
  function: {
    name: "addFunc",
    description: "Add two numbers together.",
    parameters: {
      type: "object",
      required: ["a", "b"],
      properties: {
        a: {
          type: "number",
          description: "The first number to add.",
        },
        b: {
          type: "number",
          description: "The second number to add.",
        },
      },
    },
  },
};

if (await window.ai.permissions.request({ model: 'llama3.2', silent: true })) {
  const session = await window.ai.model.connect({ model: 'llama3.2' });
  const response = await session.chat({
    messages: [{ role: 'user', content: 'add 1 and 2' }],
    tools: [addTool],
  });
  // If the model chose to call the tool, run the matching local function.
  if (response.choices[0].message.tool_calls) {
    const { name, arguments: args } = response.choices[0].message.tool_calls[0].function;
    const result = addFunc(args.a, args.b);
    console.log("Function " + name + " called with result: " + result);
  }
  console.log(response);
}
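With more than one tool registered, the same check generalizes to a small dispatch table keyed by function name (a sketch reusing addFunc from the example above; toolImpls and runToolCalls are illustrative helpers, not part of the API):
const toolImpls = { addFunc: ({ a, b }) => addFunc(a, b) };
function runToolCalls(response) {
  for (const call of response.choices[0].message.tool_calls ?? []) {
    const impl = toolImpls[call.function.name];
    if (impl) {
      console.log(call.function.name + ' -> ' + impl(call.function.arguments));
    }
  }
}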
embed
The embed method generates embeddings for the provided input.
await session.embed({ input: "Hello world" });
Output
{
  "model": "llama3.1",
  "embeddings": [
    [
      -0.0070958203,
      -0.019203912,
      ...
    ]
  ]
}
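Embeddings are usually compared rather than read directly; for instance, cosine similarity between two inputs can be computed from the arrays shown above (a sketch assuming embed takes one input per call, as in the example):
const a = (await session.embed({ input: 'Hello world' })).embeddings[0];
const b = (await session.embed({ input: 'Goodbye world' })).embeddings[0];
const dot = a.reduce((sum, v, i) => sum + v * b[i], 0);
const norm = (v) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
console.log('cosine similarity:', dot / (norm(a) * norm(b)));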
info
The info method provides details about the model.
const info = await window.ai.model.info({ model: 'llama3.1' });
Output
{
  "model": "llama3.1",
  "license": "[Model License]",
  "details": {
    "parent_model": "",
    "format": "gguf",
    "family": "llama",
    "families": [
      "llama"
    ],
    "parameter_size": "8.0B",
    "quantization_level": "Q4_0"
  }
}
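The details block can be used to display or compare models before connecting; for example (a sketch using only the fields shown above):
const { details } = await window.ai.model.info({ model: 'llama3.1' });
console.log('llama3.1: ' + details.parameter_size + ' parameters, ' + details.quantization_level + ' quantization');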