Testing Local Gemini in Chrome Canary
Want to run AI locally? Google’s got your back with Gemini Nano in Chrome Canary. It’s like having a tiny AI butler right in your browser – one that doesn’t gossip about your data to the cloud (but still reports back to Google HQ for “quality assurance” 😉).
Quick Setup
- Download Chrome Canary
- Navigate to chrome://flags/ (where all the fun experimental stuff lives)
- Enable “Prompt API for Gemini Nano” and “Enables optimization guide on device” with BypassPerfRequirement
- Restart Chrome
- Visit chrome://components/ and update “Optimization Guide On Device Model”
- Check the console for window.ai
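To make sure that last step stuck, you can poke at the API straight from the DevTools console – these are the same window.ai calls the app below relies on:

// Run in the DevTools console on any page.
// "readily" means the model is downloaded and good to go;
// anything else means it's still downloading or unavailable.
console.log(window.ai != null);
console.log(await window.ai?.canCreateTextSession());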
The Cool Part: Building a Chat Interface
Here’s a simple web app that lets you chat with your browser’s built-in AI. It’s like Siri, but it actually lives on your computer instead of phoning home every five seconds.
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <script type="module" src="https://cdn.jsdelivr.net/npm/@ionic/core/dist/ionic/ionic.esm.js"></script>
  <script nomodule src="https://cdn.jsdelivr.net/npm/@ionic/core/dist/ionic/ionic.js"></script>
  <link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/@ionic/core/css/ionic.bundle.css" />
  <title>Hello Gemini</title>
  <style>
    #canary {
      position: relative;
      height: 100vh;
      padding-bottom: 50px;
    }
    #answerItem {
      position: absolute;
      bottom: 0;
      width: 100%;
    }
  </style>
</head>
<body>
  <div id="canary">
    <ion-list>
      <ion-item>
        <ion-icon slot="start" name="happy-outline" aria-hidden="true"></ion-icon>
        <ion-input placeholder="Ask me anything" clear-input="true" id="user"></ion-input>
        <ion-button id="askbtn">Ask</ion-button>
      </ion-item>
      <ion-item id="answerItem">
        <ion-icon slot="start" name="logo-chrome" aria-hidden="true"></ion-icon>
        <ion-label id="answer">...</ion-label>
      </ion-item>
    </ion-list>
  </div>
  <script>
    /**
     * Checks if the AI is available and ready for use.
     * @returns {Promise<boolean>} - True if the AI is available and ready, otherwise false.
     */
    async function isAIAvailable() {
      const hasAI = window.ai != null;
      const readiness = await window.ai?.canCreateTextSession();
      return hasAI && readiness === "readily";
    }
    /**
     * Handles the question and answer process using the AI session.
     */
    async function handleQA() {
      try {
        if (!(await isAIAvailable())) {
          console.log("AI is not available or not ready");
          return;
        }
        const session = await window.ai.createTextSession();
        const inputText = document.getElementById('user').value;
        const stream = session.promptStreaming(inputText);
        await updateDom(stream);
      } catch (err) {
        console.error("Error in handleQA:", err);
      }
    }
    /**
     * Updates the DOM with streamed responses from the AI session.
     * @param {AsyncIterable<string>} stream - The stream of AI responses.
     */
    async function updateDom(stream) {
      const outputItem = document.getElementById('answer');
      outputItem.textContent = ''; // Clear previous content
      try {
        for await (const chunk of stream) {
          // In current Canary builds each chunk contains the entire response
          // so far, so we assign rather than append (appending would duplicate text).
          outputItem.textContent = chunk;
        }
      } catch (err) {
        console.error("Error in updateDom:", err);
      }
    }
    /**
     * Initializes event listeners for the page.
     */
    function initializeEventListeners() {
      document.addEventListener('keydown', async function(event) {
        if (event.ctrlKey && event.key === 'Enter') {
          event.preventDefault(); // Prevent default behavior
          // Forward the shortcut to the Ask button's handler
          document.getElementById('askbtn').dispatchEvent(new MouseEvent('mousedown'));
        }
      });
      document.getElementById('askbtn').addEventListener('mousedown', async function(event) {
        event.preventDefault(); // Prevent default mousedown behavior
        await handleQA();
      });
    }
    // Initialize event listeners when the page is loaded
    document.addEventListener('DOMContentLoaded', initializeEventListeners);
  </script>
</body>
</html>
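One tweak worth considering: handleQA spins up a brand-new session for every question, which throws away any conversational context. A hypothetical variant that caches the session (assuming the session object keeps conversation state between prompts) would let follow-up questions build on earlier answers:

// Sketch only: cachedSession and getSession are my own names, not part of the API.
// Assumes a session carries conversation state across prompts.
let cachedSession = null;
async function getSession() {
  if (cachedSession == null) {
    cachedSession = await window.ai.createTextSession();
  }
  return cachedSession;
}
// In handleQA you'd then swap createTextSession() for getSession().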
What’s Going On Here?
- We’ve got a sleek UI thanks to Ionic (because nobody likes ugly apps)
- The AI checks if it’s ready faster than a caffeinated squirrel
- Responses stream in real-time (no more watching loading spinners do their hypnotic dance) – though there’s a one-shot variant too, sketched after this list
- Hit Ctrl+Enter to send messages (because clicking buttons is so 2023)
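If you don’t need the streaming effect, the session also exposes a one-shot call – a minimal sketch, assuming session.prompt() behaves as in current Canary builds:

// One-shot variant: await the full answer instead of streaming chunks.
const session = await window.ai.createTextSession();
const reply = await session.prompt("Explain Chrome Canary in one sentence.");
console.log(reply);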
The Future is Local (and Open!)
I like that we can use a local model that seems to be pretty quick and competent. I can’t wait for a more mature API for running less demanding models; the use cases for frontend apps are plentiful.
Future Chrome users will get this built-in, which is pretty neat. Local AI is trending faster than cat videos in 2005, and I’m here for it! 🚀