Gemini Nano Prompt & Translation/Detection API Demo

A starting point for a client-side LLM running in the browser, for online and offline JavaScript use.

NOT YET FOR MOBILE. DO NOT TRY THIS ON MOBILE YET!

By Jeremy Ellis. Use at your own risk.
GitHub repository for these GitHub Pages: https://github.com/hpssjellis/my-examples-of-web-llm
Main Demo index at https://hpssjellis.github.io/my-examples-of-web-llm/public/index.html

IMPORTANT: If you see "Access denied because the Permission Policy is not enabled" errors, you need to serve this page using a local web server (e.g., Python's `http.server`).

How to run a simple Python local server:
1. Open your terminal or command prompt.
2. Navigate to the directory where this HTML file is saved (e.g., `cd /path/to/your/folder`).
3. Run the command: `python -m http.server 8000`
4. Open your Chrome Canary browser and go to: `http://localhost:8000/this_file_name.html` (replace `this_file_name.html` with your actual file name).

This page demonstrates the core features of the Gemini Nano Prompt API (the `LanguageModel` API) available in Chrome 138+. Ensure you have enabled the necessary flags: open `chrome://flags` in your Chrome address bar,
then search for "optimization-guide-on-device-model" and then search for "gemini-nano", and enable the matching flags.

The Gemini Nano model downloads the first time you use it. The download is about 4.0 GB, and the model needs roughly 20 GB of free disk space for its final folders.



Input Text:

Output:

No output yet. Click a button to begin.