Static functionality

This functionality turns a set of HTML elements into a chat interface for the LLAMA model served by the CodBi. It enables interactive, multi-turn conversations about uploaded images and PDF documents, gives the model internet query access via the Brave Search API, and supplies the client's location via the Geolocation API. Voice input is supported via the Media.Input.Speech.Whisper-Functionality.

If no model is specified, QWEN3-VL 2B is downloaded and used.
Required Elements (found by CSS class within the nearest common ancestor):
| CSS Class | Element | Purpose |
|---|---|---|
| The class tagged with this functionality | textarea | Chat display (read-only conversation history) |
| AI_LLAMA_CHAT_Input | input type="text" or textarea | Text input where the user types messages |
| AI_LLAMA_CHAT_Send | button | Send button (triggers inference) |
| AI_LLAMA_CHAT_Stop | button | Stop button (aborts running inference) |
| AI_LLAMA_CHAT_Upload (Optional) | input type="file" | File upload for images/PDFs to chat about |
| AI_LLAMA_CHAT_Thinking (Optional) | input type="checkbox" | Toggles thinking mode (chain-of-thought) on/off |
| AI_LLAMA_CHAT_Internet (Optional) | input type="checkbox" | Toggles internet search availability on/off |
| AI_LLAMA_CHAT_Location (Optional) | input type="checkbox" | Toggles geolocation (get_current_location) on/off |
| AI_LLAMA_CHAT_MailForward (Optional) | input type="checkbox" | Toggles auto-forward of every AI response via email |
| AI_LLAMA_CHAT_MailAddress (Optional) | input type="text" or input type="email" | Email address for auto-forwarding (shown when checkbox is checked) |
| AI_LLAMA_CHAT_AlertOnFinish (Optional) | input type="checkbox" | Toggles an alert when inference finishes |
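As an illustration, the required elements above could be laid out as follows. The wrapper div and the `AI_LLAMA_CHAT` class on the textarea are assumptions for this sketch: the source only says the chat display carries the class tagged with this functionality, and that all other elements are found by CSS class within the nearest common ancestor.

```javascript
// Hypothetical markup sketch for the required elements. Class names are
// taken from the table above; the wrapper div and the "AI_LLAMA_CHAT"
// functionality class are assumptions, not specified by the source.
const chatMarkup = `
<div>
  <textarea class="AI_LLAMA_CHAT" readonly></textarea>
  <input class="AI_LLAMA_CHAT_Input" type="text" placeholder="Type a message">
  <button class="AI_LLAMA_CHAT_Send">Send</button>
  <button class="AI_LLAMA_CHAT_Stop">Stop</button>
  <input class="AI_LLAMA_CHAT_Upload" type="file">
  <label><input class="AI_LLAMA_CHAT_Thinking" type="checkbox"> Thinking</label>
  <label><input class="AI_LLAMA_CHAT_Internet" type="checkbox"> Internet</label>
  <label><input class="AI_LLAMA_CHAT_Location" type="checkbox"> Location</label>
</div>`;

// Elements are located by CSS class within the nearest common ancestor,
// so the wrapper only needs to enclose the whole group.
const requiredClasses = ['AI_LLAMA_CHAT_Input', 'AI_LLAMA_CHAT_Send', 'AI_LLAMA_CHAT_Stop'];
const hasAllRequired = requiredClasses.every(c => chatMarkup.includes(c));
```

Because lookup is class-based, the exact nesting inside the ancestor is free; only the class names matter.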
Generated CSS Classes (injected at runtime):
| CSS Class | Element | Purpose |
|---|---|---|
| LLAMA_Chat_Container | div | Scrollable chat wrapper replacing the hidden textarea |
| LLAMA_Chat_Row | div | Flex row holding a single bubble |
| LLAMA_Chat_Row--user | div | Row modifier: right-aligned (user message) |
| LLAMA_Chat_Row--llama | div | Row modifier: left-aligned (Llama response) |
| LLAMA_Chat_Row--system | div | Row modifier: centered (system/info messages) |
| LLAMA_Chat_Bubble | div | Base speech-bubble styling (padding, border-radius, shadow) |
| LLAMA_Chat_Bubble--user | div | User bubble colors (background via --user-bubble-bg) |
| LLAMA_Chat_Bubble--llama | div | Llama bubble colors (background via --llama-bubble-bg) |
| LLAMA_Chat_Bubble--system | div | System bubble: transparent, italic, muted |
| LLAMA_Chat_Bubble--thinking | div | Temporary "thinking" indicator (dimmed, italic) |
| LLAMA_Chat_Bubble--error | div | Error bubble: red-tinted background |
| LLAMA_Chat_AiHint | span | Small "AI-Generated" label inside an AI bubble |
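To show how the generated classes compose, here is a minimal sketch of the row/bubble structure one message might produce. The source does not specify the exact DOM the plugin injects, so the nesting and the `renderRow` helper are assumptions based on the class table above.

```javascript
// Sketch of one chat row built from the runtime classes in the table
// above. role is 'user', 'llama', or 'system'; the "AI-Generated" hint
// span is only added to llama (AI) bubbles. Structure is illustrative.
function renderRow(role, text) {
  const hint = role === 'llama'
    ? '<span class="LLAMA_Chat_AiHint">AI-Generated</span>'
    : '';
  return (
    `<div class="LLAMA_Chat_Row LLAMA_Chat_Row--${role}">` +
      `<div class="LLAMA_Chat_Bubble LLAMA_Chat_Bubble--${role}">` +
        `${hint}${text}` +
      `</div>` +
    `</div>`
  );
}
```

Each row carries the base class plus a role modifier, so alignment (right, left, centered) and bubble colors are selected purely in CSS.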
Behavior:
The following settings are configurable per instance:

- Bubble background colors (defaults: #e5e5ea and #0b93f6; applied via the --user-bubble-bg and --llama-bubble-bg custom properties).
- Ready message shown when the chat initializes (default: "Chat ready. Attach file(s) and type your question.").
- Voice hotkey that toggles voice input (default: "Alt+A"). Format: modifier(s) + key separated by +. Recognized modifiers: Alt, Ctrl, Shift, Meta. The key part is case-insensitive.
- Hotkey hint shown in the interface (default: "Alt+A = 🎙 on/off | Alt+Q = 🎙 off + send"; reflects the configured hotkeys).
- Voice send hotkey that turns voice input off and sends the message (default: "Alt+Q"). Same modifier format as VoiceHotkey.
- Language code (e.g. "de", "en"). Empty or unset means auto-detect.
- Waiting message (default: "Waiting for AI server…").
- Labels for low-confidence output and rethinking (defaults: "Low Confidence", "Rethink", "Low confidence"); set the corresponding option to "false" to disable highlighting (default: "true").
- Response language (e.g. "de", "fr"). When set, the AI is forced to respond in this language; no auto-detection is performed. Overrides the AI_LLAMA_STD_Language plugin property for this instance. The chat interface reflects this language for labels where available.
- Specialist name matching an AI_LLAMA_STD_SPECIALIST_XXX plugin property. When set, requests are routed to that specialist's dedicated server instance (case-insensitive match).
- Queue badge: when "true", shows a badge with the current queue position while waiting for inference. Overrides the AI_QueueBadge plugin property for this instance. Default: determined by plugin property.
- Queue badge text (e.g. "in queue" → the badge shows "3 in queue"). Default: empty.
- Brave Search PII filtering: when "true", enables PII filtering on Brave Search queries for this instance, overriding the global AI_BraveSearch_FilterResults plugin property. Default: determined by plugin property.
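The hotkey format described above (modifiers joined to a key with +; Alt, Ctrl, Shift, Meta; case-insensitive key) can be matched against a KeyboardEvent in a few lines. This is a sketch of the matching logic under those rules, not the plugin's actual implementation; the function names are hypothetical.

```javascript
// Parse a hotkey spec like "Alt+A" or "Ctrl+Shift+Q" into its parts.
// Recognized modifiers: Alt, Ctrl, Shift, Meta; the key is case-insensitive.
function parseHotkey(spec) {
  const parts = spec.split('+').map(p => p.trim());
  const key = parts.pop().toLowerCase();           // last segment is the key
  const mods = { alt: false, ctrl: false, shift: false, meta: false };
  for (const m of parts) {
    const name = m.toLowerCase();
    if (name in mods) mods[name] = true;
    else throw new Error(`Unknown modifier: ${m}`);
  }
  return { key, ...mods };
}

// Compare a (real or stubbed) KeyboardEvent against a hotkey spec.
function matchesHotkey(event, spec) {
  const h = parseHotkey(spec);
  return event.key.toLowerCase() === h.key &&
    event.altKey === h.alt && event.ctrlKey === h.ctrl &&
    event.shiftKey === h.shift && event.metaKey === h.meta;
}
```

Requiring exact equality on every modifier flag means "Alt+A" does not fire on Ctrl+Alt+A, which keeps the two voice hotkeys unambiguous.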
Provided by the CodBi.
Provides the AI_LLAMA_CHAT functionality.
Initial Author: Callari, Salvatore (Callari@WaXCode.net)
Maintainer: Callari, Salvatore (Callari@WaXCode.net)