BCI-aware agent · v0.0.1

NeuroLoop™ 🔄

Your live brain state, woven into every conversation.

A coding agent that reads live EEG data from neuroskill and injects it as context before every AI turn — 41 domain signals, 100+ guided protocols, emotional depth, and persistent memory.

NeuroLoop™ — claude-sonnet-4-5
$ npx neuroloop
🔄 NeuroLoop™ v0.0.1
─────────────────────────────────
🧠 Current State
Focus 0.68
Cognitive Load 0.52
Relaxation 0.35
Drowsiness 0.09
Mood 0.61
RMSSD 52 ms
δ ▓▓▓▓▓ θ ▓▓▓▓ α ▓▓▓▓▓▓ β ▓▓▓▓ γ
> I've been going in circles on this function for two hours
Thinking…
Your EEG shows solid focus (0.68) but elevated cognitive load (0.52) — the kind of loop that builds when your working memory is saturated.
Before we look at the code, want to try a 2-minute focus reset? It tends to dissolve exactly this kind of circular thinking.
>
41 Domain Signals · 100+ Protocols · 8 Tools · Persistent Memory · Auto-Labelling · Emotional Depth · Philosophical Presence · Coding Agent
Install
npm install -g neuroloop
npx neuroloop

Getting Started

Quick Start

Install via npm or build from source. Requires a locally running neuroskill server connected to a Muse EEG headset.

01
Install
npm install -g neuroloop — or run directly with npx neuroloop
02
Connect
Start the neuroskill server and pair your Muse headset. NeuroLoop™ bridges to neuroskill automatically.
03
Chat
Type anything. Your live EEG state is injected before every reply — silently, automatically.
terminal
# Install globally
npm install -g neuroloop

# Or run without installing
npx neuroloop

# Start with an initial message
npx neuroloop "how am I feeling right now?"

# Build from source
git clone https://github.com/NeuroSkill-com/neuroloop && cd neuroloop
npm install
npx tsx src/main.ts

Model Selection Priority

  1. Model saved in session history
  2. Default in ~/.neuroloop/settings.json
  3. First built-in provider with a valid API key (Claude, GPT, Gemini…)
  4. First Ollama model — gpt-oss:20b always available as default

Storage Paths

~/.neuroloop/ sessions, auth, settings, models
~/.neuroskill/memory.md persistent agent memory
~/.neuroloop/sessions/ session history
~/.neuroskill/ neuroskill data, labels, EEG embeddings

See It In Action

Live EEG, Live Empathy

The user types "tired". NeuroLoop™ detects the sleep signal, fetches the full sleep staging report from neuroskill, and responds with genuine empathy — then silently labels the moment in EEG history.

Skill — neuroskill dashboard + NeuroLoop™ agent
Split view: NeuroSkill EEG dashboard showing band powers and waveforms on the left; neuroloop agent showing sleep staging output and an empathetic response on the right
Left panel

The neuroskill dashboard streams live EEG band powers from all four Muse electrodes. The user has tagged the moment with the label tired.

Right panel — data

NeuroLoop™ ran neuroskill sleep and injected the full staging report — N1/N2/N3/REM breakdown, efficiency, bout analysis — before the AI replied.

Right panel — response

The agent responds with empathy and offers concrete options, then silently calls neuroskill_label to stamp the moment in EEG history.

Architecture

How It Works

Before every AI turn, NeuroLoop™ hooks into the agent lifecycle to fetch live EEG data, detect domain signals, and inject rich context into the system prompt.

architecture.txt
User message
        │
        ▼
  NeuroLoop™ before_agent_start hook
        │
        ├── runNeuroSkill(["status"])          live EEG snapshot
        │         │
        │         └── detectSignals()     41 domain signal flags from prompt
        │                   │
        │                   └── selectContextualData()
        │                             │
        │                             ├── neuroskill session 0    (if focus/stress/etc.)
        │                             ├── neuroskill neurological  (if neuro/mood/etc.)
        │                             ├── neuroskill sleep         (if sleep/travel)
        │                             ├── neuroskill search-labels (domain label search)
        │                             └── protocol SKILL.md   (if protocol intent)
        │
        ├── readMemory()               ~/.neuroskill/memory.md (persistent notes)
        │
        └── NEUROLOOP.md               skill index + capabilities overview
                    │
                    ▼
        System prompt = STATUS_PROMPT + EEG context + memory + skills
                    │
                    ▼
        AI model  (Claude / GPT / Gemini / Ollama — auto-selected)
                    │
                    ▼
        Response  + silent tool calls (neuroskill_label, run_protocol, prewarm…)
⚙️

before_agent_start Hook

Fires on every user message, before the AI sees anything.

neuroskill status Live EEG snapshot — focus, mood, HR, bands, indices
selectContextualData() Runs domain-specific neuroskill commands based on detected signals
readMemory() Injects persistent agent memory from ~/.neuroskill/memory.md
NEUROLOOP.md Full skill index and capability overview — always visible to LLM
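Conceptually, the hook's job is just to gather these four payloads and concatenate them into the system prompt. A minimal sketch, assuming illustrative names (`ContextParts`, `buildSystemPrompt`) rather than the actual NeuroLoop™ internals:

```typescript
// Sketch: what the before_agent_start hook assembles each turn.
// Names here are illustrative, not the real internal API.
interface ContextParts {
  statusPrompt: string; // STATUS_PROMPT guidance pillars
  eegContext: string;   // live snapshot + contextual data
  memory: string;       // contents of ~/.neuroskill/memory.md
  skillIndex: string;   // NEUROLOOP.md capability overview
}

function buildSystemPrompt(p: ContextParts): string {
  // Order mirrors the architecture diagram:
  // STATUS_PROMPT + EEG context + memory + skills.
  return [p.statusPrompt, p.eegContext, p.memory, p.skillIndex]
    .filter((s) => s.trim().length > 0)
    .join("\n\n");
}
```

Empty sections (for example, a blank memory file) are simply skipped rather than injected as empty blocks.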
📋

Context Injection

Two channels — chat bubble and system prompt — carry different payloads.

Chat bubble visible to user

Clean EEG snapshot + contextual data. No instruction prose — just the live data.

System prompt LLM only

STATUS_PROMPT guidance + skill index + EEG data + memory. The LLM sees everything; the user sees nothing.

🔍

Signal Detection

detectSignals() scans the user's lowercased prompt with regex patterns across 41 domains. Pure function — no I/O.

sleep "tired", "woke up", "sleep quality"
stress "overwhelmed", "burnout", "on edge"
protocols "help me relax", "breathing exercise"
consciousness "awareness", "ego dissolution", "LZC"
existential "meaning of life", "mortality", "void"
… and 36 more domains
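The detection logic itself can be sketched in a few lines — regex patterns over the lowercased prompt, no I/O. The patterns below are illustrative examples, not the real 41-domain set:

```typescript
// Sketch of signal detection: pure function, regex per domain.
// Patterns shown are examples only; the real set covers 41 domains.
const SIGNAL_PATTERNS: Record<string, RegExp> = {
  sleep: /\b(tired|woke up|sleep quality|insomnia)\b/,
  stress: /\b(overwhelmed|burnout|on edge)\b/,
  protocols: /\b(help me relax|breathing exercise)\b/,
};

function detectSignals(prompt: string): string[] {
  const text = prompt.toLowerCase();
  return Object.entries(SIGNAL_PATTERNS)
    .filter(([, re]) => re.test(text))
    .map(([name]) => name);
}
```

Because it is pure and synchronous, it runs on every message with negligible cost.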
🧠

neuroskill Bridge

NeuroLoop™ calls neuroskill via subprocess. All commands return JSON. Timeout: 10 s per call.

// Inside NeuroLoop™ — the neuroskill_run tool wraps these:

// Full EEG snapshot
await runNeuroSkill(["status"])

// Neurological correlate indices
await runNeuroSkill(["neurological"])

// Detailed session metrics
await runNeuroSkill(["session", "0"])

// Sleep staging for last session
await runNeuroSkill(["sleep"])

// Semantic label search
await runNeuroSkill(["search-labels", "deep focus", "--k", "5"])

// Timestamped EEG annotation
await runNeuroSkill(["label", "entering deep flow", "--context", "..."])

// OS notification
await runNeuroSkill(["notify", "Focus dropping", "Current: 0.31"])
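A minimal sketch of what the bridge wrapper might look like under the hood — `runCli` is an illustrative helper, not the internal API, but it shows the pattern: subprocess, enforced timeout, JSON when possible, raw text otherwise:

```typescript
import { execFile } from "node:child_process";

// Sketch: run a CLI with a timeout, parse JSON output, fall back to text.
// runCli is a hypothetical helper name.
function runCli(cmd: string, args: string[], timeoutMs = 10_000): Promise<unknown> {
  return new Promise((resolve, reject) => {
    execFile(cmd, args, { timeout: timeoutMs }, (err, stdout) => {
      if (err) return reject(err);
      try {
        resolve(JSON.parse(stdout)); // neuroskill commands return JSON
      } catch {
        resolve(stdout.trim());      // fall back to raw text
      }
    });
  });
}

// The actual bridge would then be a thin partial application:
const runNeuroSkill = (args: string[]) => runCli("neuroskill", args);
```

The 10 s default matches the per-call timeout stated above.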

STATUS_PROMPT — Guidance Pillars

Injected alongside every EEG context block. Shapes how the AI interprets the data and responds.

💙

Emotional Presence

Meet the user with full empathy and depth. Enter philosophical, existential, and emotional spaces genuinely. Never reduce profound states to productivity metrics.

🏷️

Auto-Labelling

Silently call neuroskill_label whenever the user enters a notable state — grief, awe, breakthrough, clarity, deep focus. Labels are permanent and searchable.

🧘

Guided Protocols

Propose first, execute only after explicit agreement. One protocol at a time. Never chain or repeat the same modality. Calibrate duration to the EEG state.

📝

Persistent Memory

memory_read/memory_write stores long-term context across all sessions. Injected into every turn alongside the live EEG snapshot.

🔔

OS Notifications

Use neuroskill_run with command "notify" for important state changes — high drowsiness, end of focus period, or events the user asked to be alerted about.

Prewarm Cache

Call prewarm silently when the user mentions trends or comparisons. neuroskill compare takes ~60 s; the cache warms in background so results arrive instantly.

Tool Reference

Tools

8 tools registered with the agent. The AI calls them silently — you never need to invoke neuroskill commands yourself.

tool             description
neuroskill_run   Run any neuroskill EEG command and return its JSON output. Covers status, neurological, session, sessions, sleep, search-labels, interactive, compare, umap, listen, raw, and more.
neuroskill_label Create a timestamped EEG annotation for the current moment. Called automatically whenever the user enters a notable mental, emotional, or philosophical state.
memory_read      Read the agent's persistent memory file (~/.neuroskill/memory.md). Returns all accumulated notes across sessions.
memory_write     Write or append to the agent's persistent memory file (~/.neuroskill/memory.md). Mode: "append" or "overwrite".
prewarm          Kick off a background neuroskill compare run so the result is ready when the user asks to compare sessions. neuroskill compare takes ~60 s; calling this early warms the cache.
run_protocol     Execute a multi-step guided protocol with OS notifications, per-step timing, and EEG labelling at every step. 100+ protocols across 15 categories.
web_fetch        Fetch the text content of any URL. HTML is stripped to readable plain text. JSON is pretty-printed. Useful for reading documentation, articles, or GitHub issues.
web_search       Search the web via DuckDuckGo. Returns titles, URLs, and snippets for the top results. No API key required.
🧠

neuroskill_run

core

Full access to every neuroskill command. Returns parsed JSON when available, otherwise raw text.

status Full device / session / scores snapshot
neurological 11 EEG correlates + 3 consciousness metrics
session 0 Detailed session metrics + trends
sleep Sleep staging summary
search-labels "query" Semantic label search
interactive "keyword" 4-layer cross-modal graph search
compare ⚠ Expensive (~60s) — use prewarm first
notify "title" Send OS notification
🧘

run_protocol

protocol

Executes step-by-step guided exercises. Every step fires an OS notification, waits the specified duration, and creates an EEG label.

title Protocol name shown in notification titles
intro Opening message for the first notification
steps[].name Step label (use ▶ prefix for announcements)
steps[].instruction Full instruction shown in body + chat
steps[].duration_secs 0 = announcement; >0 = timed action

run_protocol — step structure example

box-breathing-protocol.json
// run_protocol executes steps sequentially with:
//   • OS notifications at every step
//   • Per-step timing (respects AbortSignal)
//   • EEG label created at every step
//   • Streamed progress via onUpdate

// Example: Box Breathing protocol
{
  title: "Box Breathing",
  intro: "4-4-4-4 breath cycle to calm the nervous system.",
  steps: [
    { name: "▶ Coming up: Inhale",  instruction: "Breathe in through your nose for 4 counts.", duration_secs: 0 },
    { name: "Inhale…",              instruction: "In… 1… 2… 3… 4",                            duration_secs: 4 },
    { name: "▶ Coming up: Hold",    instruction: "Hold your breath for 4 counts.",             duration_secs: 0 },
    { name: "Hold…",                instruction: "Hold… 1… 2… 3… 4",                          duration_secs: 4 },
    { name: "▶ Coming up: Exhale",  instruction: "Exhale slowly through your mouth for 4.",   duration_secs: 0 },
    { name: "Exhale…",             instruction: "Out… 1… 2… 3… 4",                            duration_secs: 4 },
    { name: "▶ Coming up: Hold",   instruction: "Hold empty for 4 counts.",                   duration_secs: 0 },
    { name: "Hold empty…",         instruction: "Empty… 1… 2… 3… 4",                         duration_secs: 4 },
    // ... repeat for N cycles (expanded as individual steps)
  ]
}
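The execution loop behind a structure like this can be sketched simply: for each step, notify, label, then wait. The callback names (`notify`, `label`) are illustrative stand-ins for the OS-notification and EEG-labelling calls:

```typescript
// Sketch of the per-step protocol loop: notify, label, wait.
// notify/label are hypothetical stand-ins for the real calls.
interface Step {
  name: string;
  instruction: string;
  duration_secs: number;
}

async function runSteps(
  steps: Step[],
  notify: (title: string, body: string) => void,
  label: (text: string) => void,
): Promise<void> {
  for (const step of steps) {
    notify(step.name, step.instruction); // OS notification at every step
    label(step.name);                    // EEG label at every step
    if (step.duration_secs > 0) {
      // 0 s = announcement step, no wait; >0 = timed action
      await new Promise((r) => setTimeout(r, step.duration_secs * 1000));
    }
  }
}
```

The real implementation also respects an AbortSignal and streams progress via onUpdate, as noted in the comments above.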

memory_read / memory_write — persistent context

memory.md
# Memory is stored in ~/.neuroskill/memory.md

# NeuroLoop™ reads it every turn and injects it into the system prompt.
# The AI writes to it automatically when asked, or when it learns something
# important about the user.

# Read memory manually
cat ~/.neuroskill/memory.md

# Example memory contents:
# - User prefers box breathing over 4-7-8
# - Typically focused 9–11am, dips 2–4pm
# - Anxiety spikes before client presentations
# - Running 3x / week — notable EEG focus post-run
# - Responds well to Loving-Kindness for loneliness
# - Values stoic philosophy; enjoys Aurelius references
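The mechanics are deliberately simple: one Markdown file, read every turn, appended to over time. A sketch, with the path parameterised for clarity (NeuroLoop™ itself uses ~/.neuroskill/memory.md):

```typescript
import { appendFileSync, existsSync, readFileSync } from "node:fs";

// Sketch of persistent memory: a single Markdown file.
// Path is a parameter here for illustration only.
function readMemory(path: string): string {
  return existsSync(path) ? readFileSync(path, "utf8") : "";
}

function appendMemory(path: string, note: string): void {
  appendFileSync(path, `- ${note}\n`); // one bullet per remembered fact
}
```

Append-only writes keep the file human-readable and safe to edit by hand between sessions.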

Domain Intelligence

Signal Detection

detectSignals() analyses the user's prompt with regex patterns across 41 domains. Each detected signal triggers specific neuroskill commands to fetch the most relevant EEG context — all in parallel, before the AI replies.

Core EEG Data

5 signals
sleep Sleep staging, fatigue, drowsiness, insomnia, sleep cycles
neuro ADHD, anxiety, depression, PTSD, neurological disorders, brain health
session Current session metrics, HRV, cognitive load, brain state snapshot
compare A/B session comparison, trends over time, before/after analysis
sessions Full session history list, recording timeline, past sessions

Lifestyle & Productivity

4 signals
focus Deep work, flow state, concentration, productivity, distraction
stress Overwhelm, burnout, pressure, fight-or-flight, cortisol
meditation Mindfulness, breathwork, calm, relaxation, yoga, zen
mood Emotional state, happiness, sadness, valence, affect

Social & Relational

7 signals
social Conversations, meetings, team collaboration, networking
dating Romance, relationships, attraction, intimacy, heartbreak
family Family life, parenting, household, caregiving, kids
loneliness Isolation, feeling alone, left out, belonging, disconnected
grief Loss, bereavement, mourning, sorrow, death of loved ones
anger Rage, frustration, irritability, emotional dysregulation
confidence Self-esteem, imposter syndrome, self-worth, self-doubt

Health & Body

6 signals
sport Exercise, workout, training, fitness, athletics, gym
recovery Rest days, recharging, restoration, downtime, rejuvenation
nutrition Eating, caffeine, fasting, food and brain state, glucose
pain Chronic pain, headaches, physical discomfort, muscle tension
travel Jet lag, circadian rhythm disruption, time zones
addiction Cravings, compulsions, substance use, doom scrolling

Cardiac & Somatic

2 signals
hrv Heart rate variability, rmssd, palpitations, autonomic nervous system
somatic Body sensations, interoception, embodied awareness, gut feelings

Mind & Growth

6 signals
learning Studying, memorisation, exams, education, memory retention
creative Art, music, writing, design, inspiration, creative block
leadership Management, decision-making, strategy, team leading
therapy Counselling, self-reflection, journaling, emotional processing
goals Habits, routines, intention-setting, progress tracking, streaks
performance Public speaking, presentations, performance anxiety, interviews

Daily Rhythms

2 signals
morning Morning routines, waking state, start-of-day rituals
evening Wind-down, end-of-day routine, pre-sleep preparation

Inner Life & Depth

8 signals
consciousness Self-awareness, altered states, ego dissolution, presence, LZC
philosophy Meaning, wisdom, stoicism, existentialism, truth-seeking
existential Mortality, legacy, impermanence, purpose of life, the void
depth Profound feeling, contemplation, inward reflection, soul-searching
morals Ethics, integrity, conscience, guilt, shame, right and wrong
symbiosis Interconnectedness, oneness, unity with nature, interdependence
awe Wonder, transcendence, peak experiences, the sublime, cosmic awe
identity Self-concept, authenticity, who am I, self-discovery, true self

Protocol Intent

1 signal
protocols Guided exercises, breathing, meditation, grounding, stretching, music therapy, dietary guidance

What gets fetched per signal

sleep / travel neuroskill sleep, search-labels "sleep restoration…"
neuro / mood / pain neuroskill neurological (depression, anxiety, headache indices)
focus / stress / meditation neuroskill session 0, search-labels (domain-matched)
compare / goals cached compare text (if warm), else warmCompareInBackground()
sessions neuroskill sessions (full session history list)
consciousness / depth / awe neuroskill neurological + session 0, label search
protocols inject full neuroskill-protocols SKILL.md into context window
hrv / somatic neuroskill session 0 + label search (heart, body, autonomic)

Max 5 label searches per turn. All tasks run in parallel with Promise.all(). Compare cache TTL: 10 minutes.
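The signal-to-command fan-out above amounts to a pure mapping. A sketch of that mapping, assuming a hypothetical `commandsFor` helper (the real function additionally runs the commands in parallel with Promise.all() and caps label searches at 5):

```typescript
// Sketch: map detected signals to neuroskill command argument lists.
// commandsFor is an illustrative name; rules shown are a subset.
function commandsFor(signals: string[]): string[][] {
  const cmds: string[][] = [];
  if (signals.includes("sleep") || signals.includes("travel")) cmds.push(["sleep"]);
  if (signals.some((s) => ["neuro", "mood", "pain"].includes(s))) cmds.push(["neurological"]);
  if (signals.some((s) => ["focus", "stress", "meditation"].includes(s))) cmds.push(["session", "0"]);
  if (signals.includes("sessions")) cmds.push(["sessions"]);
  return cmds;
}
```

Each returned argument list would then be handed to the subprocess bridge, all in the same turn.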

Live EEG context injected every turn — sample payload

neuroskill-status.json
{
  "command": "status",
  "ok": true,
  "device": { "state": "connected", "name": "Muse-A1B2", "battery": 81 },
  "session": { "start_utc": 1740412800, "duration_secs": 2340, "n_epochs": 468 },
  "scores": {
    "focus": 0.68,
    "relaxation": 0.35,
    "meditation": 0.44,
    "mood": 0.61,
    "cognitive_load": 0.52,
    "drowsiness": 0.09,
    "hr": 71.4,
    "rmssd": 52.1,
    "faa": 0.038,
    "tar": 0.61,
    "bar": 0.49,
    "tbr": 1.24,
    "coherence": 0.581,
    "bands": {
      "rel_delta": 0.26, "rel_theta": 0.19,
      "rel_alpha": 0.31, "rel_beta":  0.19, "rel_gamma": 0.05
    }
  }
}
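A typed view of this payload makes downstream handling safer. The interface below follows the sample's field names; the sanity check simply verifies that the relative band powers form a distribution (they should sum to roughly 1):

```typescript
// Sketch: typed subset of the status payload, plus a sanity check.
// Field names follow the sample above; this is not an official schema.
interface StatusPayload {
  ok: boolean;
  scores: { focus: number; bands: Record<string, number> };
}

function bandsLookValid(p: StatusPayload): boolean {
  const sum = Object.values(p.scores.bands).reduce((a, b) => a + b, 0);
  return Math.abs(sum - 1) < 0.05; // rel_* powers should sum to ~1
}
```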

Guided Practice

Protocol Catalog

100+ step-by-step guided protocols across 15 categories, each matched to specific EEG trigger conditions. The AI proposes the most appropriate one based on your current brain state — and only executes after your agreement.

Propose first

Describe the exercise and ask if the user wants to do it. Never execute without agreement.

One at a time

Never chain or queue multiple protocols back-to-back in a session.

No repetition

Avoid offering the same modality twice unless the user explicitly asks.

Calibrate duration

Set step duration from the current EEG state and pacing the user can sustain.

Every step fires an OS notification

When run_protocol runs, macOS (or Linux) receives a notification at the start of each step — announcement steps preview what's coming, timed steps count down with the instruction in the body.

macOS notifications — Energising Breath protocol
macOS notification stack showing a breathing protocol: Connected device notification, Energising Breath intro, Coming up: Inhale, Inhale 4s, Coming up: Exhale, Exhale 4s
Connected

Device pairing notification — fired once when the Muse headset starts streaming EEG data.

▶ Coming up: …

Zero-duration announcement step — prepares the user for what's next before the timer begins.

Inhale… — 4s

Timed action step — the notification stays visible for the full duration with the instruction in the body.

🧠

Attention & Focus

6 protocols
Theta-Beta Neurofeedback Anchor
Focus Reset
Cognitive Load Offload
Working Memory Primer
Pre-Performance Activation
Creativity Unlock
💨

Stress & Autonomic

4 protocols
Box Breathing (4-4-4-4)
Extended Exhale (4-7-8)
Cardiac Coherence (~6 breaths/min)
Physiological Sigh
💛

Emotional Regulation

4 protocols
FAA Rebalancing
Mood Activation
Loving-Kindness (Metta)
Emotional Discharge
🌊

Relaxation & Alpha

3 protocols
Alpha Induction (open focus)
Open Monitoring
Relaxation Scan
🌙

Sleep & Circadian

3 protocols
Sleep Onset Wind-Down
Ultradian Reset (20-min rest)
Wake Reset / Alertness Boost
🏃

Body & Somatic

4 protocols
Progressive Muscle Relaxation
Somatic Body Scan
Grounding (5-4-3-2-1)
Tension Release Exercise
🔮

Consciousness

3 protocols
Coherence Building
Flow State Induction
Complexity Expansion (LZC boost)

Energy & Alertness

4 protocols
Kapalabhati Energiser
4-Count Energising Breath
Wim Hof Breathwork
Cold Exposure Micro-Protocol
🎵

Music Protocols

11 protocols
Mood-Match & Lift (ISO Principle)
Focus Music Protocol
Binaural Beat Entrainment
Singing / Vocal Toning
💪

Workout & Gym

6 protocols
Pre-Workout Neural Primer
Intra-Workout Micro-Set
Post-Workout Cool-Down
Mind-Muscle Connection Primer
👁️

Eye & Vision

3 protocols
20-20-20 Vision Reset
Full Eye Exercise Sequence
Palming & Blink Recovery
📱

Digital Wellness

10 protocols
Craving Surf (Urge Surfing)
Post-Scroll Brain Reset
Dopamine Palette Reset
Digital Sunset Protocol
🥗

Dietary Protocols

14 protocols
Pre-Meal Pause
Blood Sugar Stability Guide
Mood-Food Connection
Intermittent Fasting Support
❤️

Emotional Processing

14 protocols
Gratitude Cascade
Peak State Anchor
Freeze Response Completion
Anger & Frustration Processing
🧘

Deep Meditation

3 protocols
Alpha-Theta Drift
Mantra / Single-Point Focus
Gamma Entrainment (40 Hz)

Step Structure Contract

Every physical action is preceded by a 0-duration announcement step. The user reads what is coming before the timer starts.

Announcement 0 s ▶ Coming up: Slow inhale
Inhale 3–5 s Breathe in… 1… 2… 3… 4
Hold 2–4 s Hold… 1… 2… 3… 4
Exhale 4–8 s Out… 1… 2… 3… 4… 5… 6

EEG labelling runs at every step — the protocol IS the labelling run. Repeated cycles are expanded as individual steps in the array.

Muscle tense 5 s
Muscle release/relax 8–10 s
Body-scan region 10–15 s
Opening / closing 3–5 s
Announcement 0 s (always)

Developer Reference

Skills & Customisation

NeuroLoop™ loads individual skills from ./skills/. Each skill is a SKILL.md file with YAML frontmatter. The protocol skill is injected on-demand when protocol intent is detected.

skills/
# ./skills/ directory — each subdirectory contains a SKILL.md file

skills/
├── neuroskill-data-reference/   SKILL.md   # Full EEG metrics reference
├── neuroskill-labels/           SKILL.md   # Label annotation guide
├── neuroskill-protocols/        SKILL.md   # 100+ protocol repertoire
├── neuroskill-recipes/          SKILL.md   # Shell automation recipes
├── neuroskill-search/           SKILL.md   # Search commands reference
├── neuroskill-sessions/         SKILL.md   # Session management guide
├── neuroskill-sleep/            SKILL.md   # Sleep staging reference
├── neuroskill-status/           SKILL.md   # Status command deep dive
├── neuroskill-streaming/        SKILL.md   # WebSocket event stream guide
└── neuroskill-transport/        SKILL.md   # HTTP + WebSocket transport guide

# METRICS.md is also loaded as a reference skill:
METRICS.md                              # All EEG indices and scientific basis

# Skills are loaded on demand — the protocol skill is only injected
# when protocol intent is detected in the user's prompt.

Extension Hooks

before_agent_start
Every user message — before AI sees anything
EEG injection, signal detection, memory read
session_start
Agent session opens
Set "NeuroLoop™ ready" status bar indicator
session_shutdown
Agent session closes
Clear status bar indicator

Custom Message Renderer

NeuroLoop™ registers a custom neuroskill-status message type so EEG snapshots render as plain Markdown — same unstyled look as assistant replies, no box or label.

customType "neuroskill-status"
display true (shown in chat)
renderer Container + Spacer + Markdown
theme getMarkdownTheme() (assistant palette)

Ollama Integration

NeuroLoop™ auto-discovers all locally available Ollama models on startup. gpt-oss:20b is always registered as the first model — fully offline, no API key required.

Default model gpt-oss:20b
Discovery http://localhost:11434/api/tags
Context window (big) 65 536 tokens
Cost $0 / token
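Discovery boils down to parsing the `/api/tags` response and pinning the default first. The `{ models: [{ name }] }` shape matches Ollama's documented `/api/tags` output; `orderModels` is an illustrative helper:

```typescript
// Sketch: order discovered Ollama models with the default pinned first.
// orderModels is a hypothetical name; response shape matches /api/tags.
interface TagsResponse {
  models: { name: string }[];
}

function orderModels(tags: TagsResponse, pinned = "gpt-oss:20b"): string[] {
  const names = tags.models.map((m) => m.name).filter((n) => n !== pinned);
  return [pinned, ...names];
}
```

The pinned entry appears even when the tag list is empty, matching "always registered as the first model".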

Calibration Nudge

NeuroLoop™ tracks the last time a calibration nudge was sent in ~/.neuroskill/last_calibration_prompt.json. If ≥ 24 hours have elapsed, a reminder is injected into the system prompt — at most once per day, once per session.

Interval 24 hours
State file ~/.neuroskill/last_calibration_prompt.json
Execution neuroskill_run → calibrate
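The once-per-day gate is a single timestamp comparison against the state file. A sketch, with `shouldNudge` as an illustrative name and times in epoch milliseconds:

```typescript
// Sketch: nudge only if no prompt was recorded, or ≥24 h have elapsed.
// shouldNudge is a hypothetical helper name.
const DAY_MS = 24 * 60 * 60 * 1000;

function shouldNudge(lastPromptMs: number | null, nowMs: number): boolean {
  return lastPromptMs === null || nowMs - lastPromptMs >= DAY_MS;
}
```

After a nudge fires, the current time is written back to the state file so the next check starts the 24-hour clock again.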
🔄

Let your brain lead the conversation

Install NeuroLoop™, connect your Muse headset via neuroskill, and start a conversation where your EEG state shapes every reply.

npm install -g neuroloop
Get neuroskill ↗