Gitea migration - local backup 2025-12-04 18:58

StillHammer 2025-12-04 18:58:32 +08:00
parent cb938500cd
commit ce2b25a599
46 changed files with 8704 additions and 8704 deletions

.gitmodules

[submodule "external/whisper.cpp"]
path = external/whisper.cpp
url = https://github.com/ggerganov/whisper.cpp

# GROVEENGINE COMPLIANCE AUDIT - AISSIA
**Date**: 2025-11-26
**Auditor**: Claude Code
**Audited version**: commit bc3b6cb
---
## EXECUTIVE SUMMARY
**Verdict: the code massively bypasses GroveEngine's principles.**
| Module | Lines | Engine compliance | Status |
|--------|-------|-------------------|--------|
| AIModule | 306 | VIOLATION | Infrastructure in the module |
| MonitoringModule | 222 | VIOLATION | OS calls in the module |
| StorageModule | 273 | VIOLATION | SQLite in the module |
| VoiceModule | 209 | VIOLATION | TTS/COM in the module |
| SchedulerModule | 179 | COMPLIANT | Pure business logic |
| NotificationModule | 172 | COMPLIANT | Pure business logic |
**Overall score**: 2 of 6 modules compliant (33%)
---
## GROVEENGINE PRINCIPLES RECAP
Per `docs/GROVEENGINE_GUIDE.md`:
1. **Modules = pure business logic** (200-300 lines recommended)
2. **No infrastructure code in modules**: threading, networking, persistence
3. **All data via the IDataNode abstraction** (backend agnostic)
4. **Pull-based message processing** via IIO pub/sub
5. **Hot-reload ready**: serialize all state in `getState()`
---
## CRITICAL VIOLATIONS
### 1. AIModule - networking in the module
**File**: `src/modules/AIModule.cpp:146`
```cpp
nlohmann::json AIModule::agenticLoop(const std::string& userQuery) {
    // ...
    auto response = m_provider->chat(m_systemPrompt, messages, tools);
    // Blocking synchronous HTTP call!
}
```
**Violation**: synchronous HTTP calls made directly from `process()` via the agentic loop.
**Impact**:
- Blocks the main loop for the duration of every LLM request (60 s timeout)
- `isIdle()` returns false during the call, but the module still blocks
- Hot-reload is impossible while a request is in flight
- All other modules are stalled
**Required fix**: delegate LLM calls to an external infrastructure service and communicate over async IIO.
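The required fix is essentially a request/response split across a worker thread. A minimal Python sketch of the pattern (illustrative only; the real service would speak HTTP, and the queues stand in for the `ai:query_request` / `ai:response` IIO topics):

```python
import queue
import threading
import time

class LLMService:
    """Sketch: the blocking chat call runs on a worker thread; the
    module side only touches non-blocking queues, so process() and
    hot-reload are never stalled by a slow LLM request."""

    def __init__(self, chat_fn):
        self._chat = chat_fn              # the blocking call (HTTP in AISSIA)
        self._requests = queue.Queue()
        self._responses = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def publish_request(self, query):     # module side: never blocks
        self._requests.put(query)

    def poll_response(self):              # module side: never blocks
        try:
            return self._responses.get_nowait()
        except queue.Empty:
            return None

    def _run(self):                       # service side: may block freely
        while True:
            self._responses.put(self._chat(self._requests.get()))

svc = LLMService(lambda q: f"answer to: {q}")
svc.publish_request("hello")
resp = None
while resp is None:                       # the real main loop keeps ticking here
    resp = svc.poll_response()
    time.sleep(0.01)
print(resp)                               # → answer to: hello
```

The module never waits on the network; it polls the response queue on each frame, which is exactly what keeps `process()` hot-reload safe.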
---
### 2. StorageModule - persistence in the module
**File**: `src/modules/StorageModule.cpp:78-91`
```cpp
bool StorageModule::openDatabase() {
    int rc = sqlite3_open(m_dbPath.c_str(), &m_db);
    // SQLite handle held directly inside the module
}
```
**Violation**: direct SQLite management. The engine prescribes the `IDataNode` abstraction for persistence.
**Impact**:
- Hot-reload is risky (open DB handle)
- Corruption risk if a reload happens mid-transaction
- Tight coupling to SQLite
**Required fix**: a StorageService in main.cpp; modules communicate via `storage:*` topics.
---
### 3. MonitoringModule - OS calls in the module
**File**: `src/modules/MonitoringModule.cpp:78-79`
```cpp
void MonitoringModule::checkCurrentApp(float currentTime) {
    std::string newApp = m_tracker->getCurrentAppName();
    // Calls GetForegroundWindow(), OpenProcess(), etc.
}
```
**Violation**: Win32 API calls inside `process()`. Even wrapped in `IWindowTracker`, this is platform code in a hot-reloadable module.
**Impact**:
- Platform dependency inside the module
- System handles potentially left orphaned on reload
**Required fix**: a PlatformService that periodically publishes `monitoring:window_info`.
---
### 4. VoiceModule - COM/SAPI in the module
**File**: `src/modules/VoiceModule.cpp:122`
```cpp
void VoiceModule::speak(const std::string& text) {
    m_ttsEngine->speak(text, true);
    // ISpVoice::Speak call via COM
}
```
**File**: `src/shared/audio/SAPITTSEngine.hpp:26`
```cpp
HRESULT hr = CoInitializeEx(nullptr, COINIT_MULTITHREADED);
```
**Violation**: COM initialization and SAPI calls inside the module.
**Impact**:
- `CoInitializeEx` is per-thread; hot-reload can leak
- Asynchronous SAPI calls are hard to wind down at shutdown
**Required fix**: a dedicated VoiceService; modules send `voice:speak`.
---
## DESIGN ISSUES
### 5. Inconsistent topics
**SchedulerModule.h:26** uses the slash format:
```cpp
// "scheduler/hyperfocus_alert"
```
**AIModule.cpp:52** uses the colon format:
```cpp
m_io->subscribe("scheduler:hyperfocus_alert", subConfig);
```
**GroveEngine standard**: the `module:event` (colon) format
**Impact**: messages are never delivered when the formats don't match.
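A mechanical guard catches this class of bug. The sketch below (Python; the helper name is hypothetical) normalizes legacy slash-form topics to the engine's `module:event` colon form:

```python
def normalize_topic(topic: str) -> str:
    """Rewrite a legacy slash-form topic to the module:event colon form."""
    module, sep, event = topic.partition("/")
    return f"{module}:{event}" if sep else topic

# The slash form published on one side never matches the colon form
# subscribed on the other -- unless it is normalized first.
assert "scheduler/hyperfocus_alert" != "scheduler:hyperfocus_alert"
assert normalize_topic("scheduler/hyperfocus_alert") == "scheduler:hyperfocus_alert"
assert normalize_topic("scheduler:hyperfocus_alert") == "scheduler:hyperfocus_alert"
```

Running the same normalization in both `publish` and `subscribe` paths would make the mismatch impossible rather than merely unlikely.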
---
### 6. SchedulerModule - IIO unused
**File**: `src/modules/SchedulerModule.cpp:66-68`
```cpp
void SchedulerModule::checkHyperfocus(float currentTime) {
    // ...
    // Publish the alert (if IO is available)
    // Note: a complete version would publish via m_io
}
```
**Problem**: SchedulerModule holds `m_io` but NEVER publishes anything. Other modules subscribe to `scheduler:*` and will receive nothing.
---
### 7. Non-restorable state - StorageModule
**File**: `src/modules/StorageModule.cpp:246-258`
```cpp
std::unique_ptr<grove::IDataNode> StorageModule::getState() {
    state->setBool("isConnected", m_isConnected);
    // ...
}
void StorageModule::setState(const grove::IDataNode& state) {
    // Does NOT reopen the DB connection!
    m_logger->info("Etat restore...");
}
```
**Problem**: `setState()` does not restore the SQLite connection. After a hot-reload the module is left in an inconsistent state.
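A hot-reload-safe pattern serializes only what is needed to reopen the resource, and `setState()` actually reopens it. An illustrative Python sketch (sqlite3 stands in for the module's handle; method names mirror the audit's `getState`/`setState`):

```python
import sqlite3

class StorageState:
    """Sketch: the DB handle is never part of the serialized state --
    only the path needed to reopen it after a hot-reload."""

    def __init__(self, db_path=":memory:"):
        self.db_path = db_path
        self.db = sqlite3.connect(self.db_path)

    def get_state(self):
        self.db.close()                   # release the handle before unload
        return {"db_path": self.db_path}

    def set_state(self, state):
        self.db_path = state["db_path"]
        self.db = sqlite3.connect(self.db_path)   # reopen (the audited code skips this)

store = StorageState()
state = store.get_state()                 # simulate module unload
store.set_state(state)                    # simulate module reload
store.db.execute("CREATE TABLE t (x INTEGER)")    # handle is usable again
```

The key point is the asymmetry the audited code misses: `getState()` may close the handle, but `setState()` must then rebuild it from the serialized parameters.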
---
### 8. Static libraries in modules
**CMakeLists.txt:86-101**:
```cmake
add_library(AissiaLLM STATIC ...)
target_link_libraries(AIModule PRIVATE AissiaLLM)
```
**Problem**: the `AissiaLLM`, `AissiaPlatform`, `AissiaAudio` libraries are built STATIC and linked into every .so.
**Impact**:
- Code duplicated in every module
- Hot-reload does not refresh these libraries
- No state sharing between modules
---
### 9. Line-count limit exceeded
| Module | Lines | Recommended limit |
|--------|-------|-------------------|
| AIModule | 306 | 200-300 |
The overrun is minor but symptomatic: the module does too much.
---
## WHAT IS COMPLIANT
### SchedulerModule & NotificationModule
Both modules follow the principles:
- Pure business logic
- No system calls
- Serializable state
- Appropriate size
### IModule structure
All modules correctly implement:
- `process()`
- `setConfiguration()`
- `getState()` / `setState()`
- `getHealthStatus()`
- `shutdown()`
- C exports `createModule()` / `destroyModule()`
### main.cpp
The main loop is well implemented:
- FileWatcher for hot-reload
- Frame timing at 10 Hz
- Clean signal handling
- Correct module load/unload
---
## RECOMMENDED ARCHITECTURE
### Current (INCORRECT)
```
┌─────────────────────────────────────────────────────────┐
│ main.cpp │
│ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ │
│ │AIModule │ │Storage │ │Monitor │ │Voice │ │
│ │ +HTTP │ │ +SQLite │ │ +Win32 │ │ +COM │ │
│ └─────────┘ └─────────┘ └─────────┘ └─────────┘ │
└─────────────────────────────────────────────────────────┘
Infrastructure INSIDE the modules = VIOLATION
```
### Corrected (COMPLIANT)
```
┌─────────────────────────────────────────────────────────┐
│ main.cpp │
│ │
│ ┌─────────────── INFRASTRUCTURE ──────────────────┐ │
│ │ LLMService │ StorageService │ PlatformService │ │ │
│ │ (async) │ (SQLite) │ (Win32) │ │ │
│ └──────────────────────────────────────────────────┘ │
│          ↑↓ IIO pub/sub (async, non-blocking)           │
│ ┌─────────────── MODULES (hot-reload) ────────────┐ │
│ │ AIModule │ StorageModule │ MonitoringModule │ │ │
│ │ (logic) │ (logic) │ (logic) │ │ │
│ └──────────────────────────────────────────────────┘ │
└─────────────────────────────────────────────────────────┘
Infrastructure OUTSIDE the modules = COMPLIANT
```
### Corrected data flow
```
User query → voice:transcription → AIModule
AIModule → ai:query_request → LLMService (async)
LLMService → ai:response → AIModule
AIModule → ai:response → VoiceModule → voice:speak
```
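The corrected flow is plain pub/sub routing. A toy in-process bus (Python sketch; the topic names come from the flow above, everything else is hypothetical) makes the hand-offs concrete:

```python
from collections import defaultdict

class Bus:
    """Toy synchronous stand-in for IIO pub/sub."""
    def __init__(self):
        self.subs = defaultdict(list)
    def subscribe(self, topic, handler):
        self.subs[topic].append(handler)
    def publish(self, topic, payload):
        for handler in self.subs[topic]:
            handler(payload)

bus = Bus()
spoken = []

# LLMService answers ai:query_request with ai:response
bus.subscribe("ai:query_request", lambda q: bus.publish("ai:response", f"echo: {q}"))
# AIModule forwards the response to the voice topic
bus.subscribe("ai:response", lambda r: bus.publish("voice:speak", r))
# VoiceService consumes voice:speak
bus.subscribe("voice:speak", spoken.append)

bus.publish("ai:query_request", "hello")  # user query enters the flow
print(spoken)                             # → ['echo: hello']
```

Each arrow in the diagram is one `subscribe` line here; no module ever calls another directly.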
---
## REQUIRED ACTIONS
### HIGH priority
1. **Extract LLM from AIModule**
   - Create an `LLMService` in main.cpp or a dedicated service
   - AIModule publishes `ai:query_request`, receives `ai:response`
   - HTTP calls run on a separate thread
2. **Extract SQLite from StorageModule**
   - Create a `StorageService`
   - Modules publish `storage:save_*`, receive `storage:result`
3. **Extract Win32 from MonitoringModule**
   - Create a `PlatformService`
   - Publishes `platform:window_changed` periodically
4. **Extract TTS from VoiceModule**
   - Create a `VoiceService`
   - Modules publish `voice:speak`
### MEDIUM priority
5. **Fix topic format**: everything as `module:event`
6. **Implement publishing in SchedulerModule**
7. **Fix setState in StorageModule**
### LOW priority
8. **Refactor STATIC libraries into services**
9. **Bring AIModule under 300 lines**
---
## CONCLUSION
The current code **simulates** the use of GroveEngine while **bypassing** it, placing infrastructure directly inside the modules.
The modules are **not truly hot-reloadable** because they:
1. Own system resources (DB handles, COM objects)
2. Make blocking calls (HTTP with a 60 s timeout, TTS)
3. Do not communicate correctly via IIO
**Major refactoring is required** to move infrastructure out of the modules into dedicated services in main.cpp.
---
*Audit generated automatically by Claude Code*

# AISSIA MCP Configuration for Claude Code
This directory contains an example MCP (Model Context Protocol) configuration for integrating AISSIA with Claude Code.
## Quick Setup
### 1. Locate Claude Code MCP Settings
The MCP configuration file location depends on your operating system:
**Windows**:
```
%APPDATA%\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json
```
Full path example:
```
C:\Users\YourUsername\AppData\Roaming\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json
```
**macOS**:
```
~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
```
**Linux**:
```
~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
```
### 2. Copy Configuration
Copy the contents of `claude_code_mcp_config.json` to the Claude Code MCP settings file.
**Important**: Update the `command` path to point to your actual AISSIA executable:
```json
{
  "mcpServers": {
    "aissia": {
      "command": "C:\\path\\to\\your\\aissia\\build\\aissia.exe",
      "args": ["--mcp-server"],
      "disabled": false
    }
  }
}
```
### 3. Restart Claude Code
Restart VS Code (or reload window: `Ctrl+Shift+P` → "Developer: Reload Window") to apply the changes.
### 4. Verify Integration
Open Claude Code and check that AISSIA tools are available:
```
You: Can you list the available MCP servers?
Claude: I have access to the following MCP servers:
- aissia: 13 tools available
```
## Available Tools
Once configured, Claude will have access to these 13 AISSIA tools:
### AISSIA Core (5 tools)
1. **chat_with_aissia** ⭐ - Dialogue with AISSIA's AI assistant (Claude Sonnet 4)
2. **transcribe_audio** - Transcribe audio files to text
3. **text_to_speech** - Convert text to speech audio files
4. **save_memory** - Save notes to AISSIA's persistent storage
5. **search_memories** - Search through saved memories
### File System (8 tools)
6. **read_file** - Read file contents
7. **write_file** - Write content to files
8. **list_directory** - List files in a directory
9. **search_files** - Search for files by pattern
10. **file_exists** - Check if a file exists
11. **create_directory** - Create directories
12. **delete_file** - Delete files
13. **move_file** - Move or rename files
## Configuration Options
### Basic Configuration
```json
{
  "mcpServers": {
    "aissia": {
      "command": "path/to/aissia.exe",
      "args": ["--mcp-server"],
      "disabled": false
    }
  }
}
```
### With Auto-Approval
To skip confirmation prompts for specific tools:
```json
{
  "mcpServers": {
    "aissia": {
      "command": "path/to/aissia.exe",
      "args": ["--mcp-server"],
      "disabled": false,
      "alwaysAllow": ["chat_with_aissia", "read_file", "write_file"]
    }
  }
}
```
### Disable Server
To temporarily disable AISSIA without removing the configuration, set `"disabled"` to `true`:
```json
{
  "mcpServers": {
    "aissia": {
      "command": "path/to/aissia.exe",
      "args": ["--mcp-server"],
      "disabled": true
    }
  }
}
```
## Prerequisites
Before running AISSIA in MCP server mode, ensure these config files exist:
### config/ai.json
```json
{
  "provider": "claude",
  "api_key": "sk-ant-api03-...",
  "model": "claude-sonnet-4-20250514",
  "max_iterations": 10,
  "system_prompt": "Tu es AISSIA, un assistant personnel intelligent..."
}
```
### config/storage.json
```json
{
  "database_path": "./data/aissia.db",
  "journal_mode": "WAL",
  "busy_timeout_ms": 5000
}
```
### config/voice.json (optional)
```json
{
  "tts": {
    "enabled": true,
    "rate": 0,
    "volume": 80
  },
  "stt": {
    "active_mode": {
      "enabled": false
    }
  }
}
```
## Testing MCP Server
You can test the MCP server independently before integrating with Claude Code:
```bash
# Test tools/list
echo '{"jsonrpc":"2.0","id":1,"method":"tools/list"}' | ./build/aissia.exe --mcp-server
# Test chat_with_aissia tool
echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"chat_with_aissia","arguments":{"message":"What time is it?"}}}' | ./build/aissia.exe --mcp-server
```
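The same requests can be built programmatically rather than hand-written. A small Python sketch that frames the two calls above (tool and argument names as in this README; it only builds the payloads, it does not spawn the server):

```python
import json

def jsonrpc_request(req_id, method, params=None):
    """Build one JSON-RPC 2.0 request line, as piped to aissia --mcp-server."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

list_req = jsonrpc_request(1, "tools/list")
call_req = jsonrpc_request(2, "tools/call", {
    "name": "chat_with_aissia",
    "arguments": {"message": "What time is it?"},
})
print(list_req)
print(call_req)
```

Feeding the generated lines to the server's stdin is equivalent to the `echo` commands above, and avoids shell-quoting mistakes in the nested JSON.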
## Troubleshooting
### "Server not found" or "Connection failed"
1. Verify the `command` path is correct and points to `aissia.exe`
2. Make sure AISSIA compiles successfully: `cmake --build build`
3. Test running `./build/aissia.exe --mcp-server` manually
### "LLMService not initialized"
AISSIA requires `config/ai.json` with a valid Claude API key. Check:
1. File exists: `config/ai.json`
2. API key is valid: `"api_key": "sk-ant-api03-..."`
3. Provider is set: `"provider": "claude"`
### "Tool execution failed"
Some tools have limited functionality in Phase 8 MVP:
- `transcribe_audio` - Not fully implemented yet (STT file support needed)
- `text_to_speech` - Not fully implemented yet (TTS file output needed)
- `save_memory` - Not fully implemented yet (Storage sync methods needed)
- `search_memories` - Not fully implemented yet (Storage sync methods needed)
These will be completed in Phase 8.1 and 8.2.
### Server starts but tools don't appear
1. Check Claude Code logs: `Ctrl+Shift+P` → "Developer: Open Extension Logs"
2. Look for MCP server initialization errors
3. Verify JSON syntax in the MCP configuration file
## Example Use Cases
### 1. Ask AISSIA for Help
```
You: Use chat_with_aissia to ask "What are my top productivity patterns?"
Claude: [calls chat_with_aissia tool]
AISSIA: Based on your activity data, your most productive hours are 9-11 AM...
```
### 2. File Operations + AI
```
You: Read my TODO.md file and ask AISSIA to prioritize the tasks
Claude: [calls read_file("TODO.md")]
Claude: [calls chat_with_aissia with task list]
AISSIA: Here's a prioritized version based on urgency and dependencies...
```
### 3. Voice Transcription (future)
```
You: Transcribe meeting-notes.wav to text
Claude: [calls transcribe_audio("meeting-notes.wav")]
Result: "Welcome to the team meeting. Today we're discussing..."
```
## Advanced Configuration
### Multiple MCP Servers
You can configure multiple MCP servers alongside AISSIA:
```json
{
  "mcpServers": {
    "aissia": {
      "command": "C:\\path\\to\\aissia\\build\\aissia.exe",
      "args": ["--mcp-server"],
      "disabled": false
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "C:\\Users"],
      "disabled": false
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "disabled": false,
      "env": {
        "BRAVE_API_KEY": "your-brave-api-key"
      }
    }
  }
}
```
### Environment Variables
Pass environment variables to AISSIA:
```json
{
  "mcpServers": {
    "aissia": {
      "command": "C:\\path\\to\\aissia\\build\\aissia.exe",
      "args": ["--mcp-server"],
      "disabled": false,
      "env": {
        "AISSIA_LOG_LEVEL": "debug",
        "CLAUDE_API_KEY": "sk-ant-api03-..."
      }
    }
  }
}
```
## References
- **Full Documentation**: `docs/CLAUDE_CODE_INTEGRATION.md`
- **MCP Specification**: https://github.com/anthropics/mcp
- **Claude Code Extension**: https://marketplace.visualstudio.com/items?itemName=saoudrizwan.claude-dev
## Support
For issues or questions:
1. Check the full documentation: `docs/CLAUDE_CODE_INTEGRATION.md`
2. Review logs: AISSIA writes to stderr in MCP mode
3. Test manually: `./build/aissia.exe --mcp-server` and send JSON-RPC requests
@ -1,10 +1,10 @@
{
"mcpServers": {
"aissia": {
"command": "C:\\Users\\alexi\\Documents\\projects\\aissia\\build\\aissia.exe",
"args": ["--mcp-server"],
"disabled": false,
"alwaysAllow": []
}
}
}
@ -1,35 +1,35 @@
#!/usr/bin/env python3
"""Generate test audio WAV file for STT testing"""
import sys
try:
from gtts import gTTS
import os
from pydub import AudioSegment
# Generate French test audio
text = "Bonjour, ceci est un test de reconnaissance vocale."
print(f"Generating audio: '{text}'")
# Create TTS
tts = gTTS(text=text, lang='fr', slow=False)
tts.save("test_audio_temp.mp3")
print("✓ Generated MP3")
# Convert to WAV (16kHz, mono, 16-bit PCM)
audio = AudioSegment.from_mp3("test_audio_temp.mp3")
audio = audio.set_frame_rate(16000).set_channels(1).set_sample_width(2)
audio.export("test_audio.wav", format="wav")
print("✓ Converted to WAV (16kHz, mono, 16-bit)")
# Cleanup
os.remove("test_audio_temp.mp3")
print("✓ Saved as test_audio.wav")
print(f"Duration: {len(audio)/1000:.1f}s")
except ImportError as e:
print(f"Missing dependency: {e}")
print("\nInstall with: pip install gtts pydub")
print("Note: pydub also requires ffmpeg")
sys.exit(1)
@ -1,38 +1,38 @@
#!/usr/bin/env python3
"""Generate simple test audio WAV file using only stdlib"""
import wave
import struct
import math
# WAV parameters
sample_rate = 16000
duration = 2 # seconds
frequency = 440 # Hz (A4 note)
# Generate sine wave samples
samples = []
for i in range(int(sample_rate * duration)):
# Sine wave value (-1.0 to 1.0)
value = math.sin(2.0 * math.pi * frequency * i / sample_rate)
# Convert to 16-bit PCM (-32768 to 32767)
sample = int(value * 32767)
samples.append(sample)
# Write WAV file
with wave.open("test_audio.wav", "w") as wav_file:
# Set parameters (1 channel, 2 bytes per sample, 16kHz)
wav_file.setnchannels(1)
wav_file.setsampwidth(2)
wav_file.setframerate(sample_rate)
# Write frames
for sample in samples:
wav_file.writeframes(struct.pack('<h', sample))
print(f"[OK] Generated test_audio.wav")
print(f" - Format: 16kHz, mono, 16-bit PCM")
print(f" - Duration: {duration}s")
print(f" - Frequency: {frequency}Hz (A4 tone)")
print(f" - Samples: {len(samples)}")
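A generated file can be verified against the 16 kHz / mono / 16-bit layout the STT engines expect, using only the standard library:

```python
import struct
import wave

def wav_matches(path, rate=16000, channels=1, sampwidth=2):
    """Return True if the WAV file has the given frame rate, channel count, and sample width."""
    with wave.open(path, "rb") as wf:
        return (wf.getframerate() == rate
                and wf.getnchannels() == channels
                and wf.getsampwidth() == sampwidth)

# Self-check: write 10 silent 16-bit mono samples at 16 kHz, then verify them.
with wave.open("check.wav", "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)
    wf.setframerate(16000)
    wf.writeframes(struct.pack("<h", 0) * 10)
print(wav_matches("check.wav"))  # → True
```

Running this against `test_audio.wav` before an STT test quickly rules out format mismatches.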
@ -1,449 +1,449 @@
# AISSIA - Claude Code Integration (Phase 8)
## Overview
AISSIA can now be exposed as an **MCP Server** (Model Context Protocol) to integrate with Claude Code and other MCP-compatible clients. This allows Claude to use AISSIA's capabilities as tools during conversations.
**MCP server mode**: `./aissia --mcp-server`
This mode exposes AISSIA's services via JSON-RPC 2.0 over stdio, following the MCP specification.
## Available Tools
AISSIA exposes **13 tools** total:
### 1. AISSIA Core Tools (Priority)
#### `chat_with_aissia` ⭐ **PRIORITY**
Dialogue with AISSIA's built-in AI assistant (Claude Sonnet 4). Send a message and get an intelligent response with access to AISSIA's knowledge and capabilities.
**Input**:
```json
{
"message": "string (required) - Message to send to AISSIA",
"conversation_id": "string (optional) - Conversation ID for continuity",
"system_prompt": "string (optional) - Custom system prompt"
}
```
**Output**:
```json
{
"response": "AISSIA's response text",
"conversation_id": "conversation-id",
"tokens": 1234,
"iterations": 2
}
```
**Example use case**: "Hey AISSIA, can you analyze my focus patterns this week?"
#### `transcribe_audio`
Transcribe audio file to text using Speech-to-Text engines (Whisper.cpp, OpenAI Whisper API, Google Speech).
**Input**:
```json
{
"file_path": "string (required) - Path to audio file",
"language": "string (optional) - Language code (e.g., 'fr', 'en'). Default: 'fr'"
}
```
**Output**:
```json
{
"text": "Transcribed text from audio",
"file": "/path/to/audio.wav",
"language": "fr"
}
```
**Status**: ⚠️ Not yet implemented - requires STT service file transcription support
#### `text_to_speech`
Convert text to speech audio file using Text-to-Speech synthesis. Generates audio in WAV format.
**Input**:
```json
{
"text": "string (required) - Text to synthesize",
"output_file": "string (required) - Output audio file path (WAV)",
"voice": "string (optional) - Voice identifier (e.g., 'fr-fr', 'en-us'). Default: 'fr-fr'"
}
```
**Output**:
```json
{
"success": true,
"file": "/path/to/output.wav",
"voice": "fr-fr"
}
```
**Status**: ⚠️ Not yet implemented - requires TTS engine file output support
#### `save_memory`
Save a note or memory to AISSIA's persistent storage. Memories can be tagged and searched later.
**Input**:
```json
{
"title": "string (required) - Memory title",
"content": "string (required) - Memory content",
"tags": ["array of strings (optional) - Tags for categorization"]
}
```
**Output**:
```json
{
"id": "memory-uuid",
"title": "Meeting notes",
"timestamp": "2025-01-30T10:00:00Z"
}
```
**Status**: ⚠️ Not yet implemented - requires StorageService sync methods
#### `search_memories`
Search through saved memories and notes in AISSIA's storage. Returns matching memories with relevance scores.
**Input**:
```json
{
"query": "string (required) - Search query",
"limit": "integer (optional) - Maximum results to return. Default: 10"
}
```
**Output**:
```json
{
"results": [
{
"id": "memory-uuid",
"title": "Meeting notes",
"content": "...",
"score": 0.85,
"tags": ["work", "meeting"]
}
],
"count": 5
}
```
**Status**: ⚠️ Not yet implemented - requires StorageService sync methods
### 2. File System Tools (8 tools)
- `read_file` - Read a file from the filesystem
- `write_file` - Write content to a file
- `list_directory` - List files in a directory
- `search_files` - Search for files by pattern
- `file_exists` - Check if a file exists
- `create_directory` - Create a new directory
- `delete_file` - Delete a file
- `move_file` - Move or rename a file
These tools provide Claude with direct filesystem access to work with files on your system.
## Installation for Claude Code
### 1. Configure Claude Code MCP
Create or edit your Claude Code MCP configuration file:
**Windows**: `%APPDATA%\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json`
**macOS/Linux**: `~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json`
Add AISSIA as an MCP server:
```json
{
"mcpServers": {
"aissia": {
"command": "C:\\path\\to\\aissia\\build\\aissia.exe",
"args": ["--mcp-server"],
"disabled": false
}
}
}
```
**Note**: Replace `C:\\path\\to\\aissia\\build\\aissia.exe` with the actual path to your compiled AISSIA executable.
### 2. Verify Configuration
Restart Claude Code (or VS Code) to reload the MCP configuration.
Claude should now have access to all 13 AISSIA tools during conversations.
### 3. Test Integration
In Claude Code, try:
```
"Can you use the chat_with_aissia tool to ask AISSIA what time it is?"
```
Claude will call the `chat_with_aissia` tool, which internally uses AISSIA's LLM service to process the query.
## Architecture
### Synchronous Mode (MCP Server)
When running as an MCP server, AISSIA uses **synchronous blocking calls** instead of the async pub/sub architecture used in normal mode:
```cpp
// Normal mode (async)
io->publish("llm:request", data);
// ... wait for response on "llm:response" topic
// MCP mode (sync)
auto response = llmService->sendMessageSync(message, conversationId);
// immediate result
```
This is necessary because:
1. MCP protocol expects immediate JSON-RPC responses
2. No event loop in MCP server mode (stdin/stdout blocking I/O)
3. Simplifies integration with external tools
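The blocking request/response pattern these three points describe can be sketched in a few lines. This is an illustrative Python model of the loop, not the actual C++ server:

```python
import json

def handle_request(req):
    """Dispatch one JSON-RPC 2.0 request and build the matching response envelope."""
    if req.get("method") == "tools/list":
        result = {"tools": [{"name": "chat_with_aissia"}]}
        return {"jsonrpc": "2.0", "id": req.get("id"), "result": result}
    return {"jsonrpc": "2.0", "id": req.get("id"),
            "error": {"code": -32601, "message": "Method not found"}}

def serve(lines):
    """Blocking loop: read one JSON-RPC request per line, yield one response per line."""
    for line in lines:
        if line.strip():
            yield json.dumps(handle_request(json.loads(line)))

for out in serve(['{"jsonrpc":"2.0","id":1,"method":"tools/list"}']):
    print(out)
```

Each request produces its response before the next line is read, which is exactly why no event loop is needed in this mode.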
### Service Integration
```
MCPServer (stdio JSON-RPC)
  └─> MCPServerTools (tool handlers)
        └─> Services (sync methods)
              ├── LLMService::sendMessageSync()
              ├── VoiceService::transcribeFileSync()
              ├── VoiceService::textToSpeechSync()
              └── StorageService (stub implementations)
```
### Tool Registry
All tools are registered in a central `ToolRegistry`:
```cpp
ToolRegistry registry;
// 1. Internal tools (get_current_time)
registry.registerTool("get_current_time", ...);
// 2. FileSystem tools (8 tools)
for (auto& toolDef : FileSystemTools::getToolDefinitions()) {
registry.registerTool(toolDef);
}
// 3. AISSIA tools (5 tools)
MCPServerTools aissiaTools(llmService, storageService, voiceService);
for (const auto& toolDef : aissiaTools.getToolDefinitions()) {
registry.registerTool(toolDef);
}
```
Total: **13 tools**
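The registry pattern above reduces to a name → (description, handler) map. A minimal Python analogue of the C++ `ToolRegistry` (a hypothetical simplification, not the engine's API):

```python
class ToolRegistry:
    """Name -> (description, handler) map, mirroring the central registry idea."""
    def __init__(self):
        self._tools = {}

    def register(self, name, description, handler):
        self._tools[name] = (description, handler)

    def list_tools(self):
        return [{"name": n, "description": d} for n, (d, _) in self._tools.items()]

    def execute(self, name, arguments):
        if name not in self._tools:
            raise KeyError(f"Tool not found: {name}")
        return self._tools[name][1](arguments)

registry = ToolRegistry()
registry.register("get_current_time", "Return a fixed timestamp (demo)",
                  lambda args: {"time": "2025-01-30T10:00:00Z"})
print(registry.execute("get_current_time", {}))  # → {'time': '2025-01-30T10:00:00Z'}
```

A lookup miss maps naturally onto the "Tool not found" error described in Troubleshooting below.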
## Configuration Files
AISSIA MCP Server requires these config files (same as normal mode):
- `config/ai.json` - LLM provider configuration (Claude API key)
- `config/storage.json` - Database path and settings
- `config/voice.json` - TTS/STT engine settings
**Important**: Make sure these files are present before running `--mcp-server` mode.
## Limitations (Phase 8 MVP)
1. **STT/TTS file operations**: `transcribe_audio` and `text_to_speech` are not fully implemented yet
- STT service needs file transcription support (currently only streaming)
- TTS engine needs file output support (currently only direct playback)
2. **Storage sync methods**: `save_memory` and `search_memories` return "not implemented" errors
- StorageService needs `saveMemorySync()` and `searchMemoriesSync()` methods
- Current storage only works via async pub/sub
3. **No hot-reload**: MCP server mode doesn't load hot-reloadable modules
- Only services and tools are available
- No SchedulerModule, MonitoringModule, etc.
4. **Single-threaded**: MCP server runs synchronously on main thread
- LLMService worker thread still runs for agentic loops
- But overall server is blocking on stdin
## Roadmap
### Phase 8.1 - Complete STT/TTS Sync Methods
- [ ] Implement `VoiceService::transcribeFileSync()` using STT engines
- [ ] Implement `VoiceService::textToSpeechSync()` with file output
- [ ] Test audio file transcription via MCP
### Phase 8.2 - Storage Sync Methods
- [ ] Implement `StorageService::saveMemorySync()`
- [ ] Implement `StorageService::searchMemoriesSync()`
- [ ] Add vector embeddings for semantic search
### Phase 8.3 - Advanced Tools
- [ ] `schedule_task` - Add tasks to AISSIA's scheduler
- [ ] `get_focus_stats` - Retrieve hyperfocus detection stats
- [ ] `list_active_apps` - Get current monitored applications
- [ ] `send_notification` - Trigger system notifications
### Phase 8.4 - Multi-Modal Support
- [ ] Image input for LLM (Claude vision)
- [ ] PDF/document parsing tools
- [ ] Web scraping integration
## Use Cases
### 1. AI Assistant Collaboration
Claude Code can delegate complex reasoning tasks to AISSIA:
```
Claude: "I need to analyze user behavior patterns. Let me ask AISSIA."
→ calls chat_with_aissia("Analyze recent focus patterns")
AISSIA: "Based on monitoring data, user has 3 hyperfocus sessions daily averaging 2.5 hours..."
```
### 2. Voice Transcription Workflow
```
Claude: "Transcribe meeting-2025-01-30.wav"
→ calls transcribe_audio(file_path="meeting-2025-01-30.wav", language="en")
→ calls write_file(path="transcript.txt", content=result)
```
### 3. Knowledge Management
```
Claude: "Save this important insight to AISSIA's memory"
→ calls save_memory(
title="Project architecture decision",
content="We decided to use hot-reload modules for business logic...",
tags=["architecture", "project"]
)
```
### 4. File + AI Operations
```
Claude: "Read todos.md, ask AISSIA to prioritize tasks, update file"
→ calls read_file("todos.md")
→ calls chat_with_aissia("Prioritize these tasks: ...")
→ calls write_file("todos-prioritized.md", content=...)
```
## Development
### Adding New Tools
1. **Declare tool in MCPServerTools.hpp**:
```cpp
json handleNewTool(const json& input);
```
2. **Implement in MCPServerTools.cpp**:
```cpp
json MCPServerTools::handleNewTool(const json& input) {
// Extract input parameters
std::string param = input["param"];
// Call service
auto result = m_someService->doSomethingSync(param);
// Return JSON result
return {
{"output", result},
{"status", "success"}
};
}
```
3. **Register in getToolDefinitions()**:
```cpp
tools.push_back({
"new_tool",
"Description of what this tool does",
{
{"type", "object"},
{"properties", {
{"param", {
{"type", "string"},
{"description", "Parameter description"}
}}
}},
{"required", json::array({"param"})}
},
[this](const json& input) { return handleNewTool(input); }
});
```
4. **Add to execute() switch**:
```cpp
if (toolName == "new_tool") {
return handleNewTool(input);
}
```
### Testing MCP Server
Test manually by piping JSON-RPC requests over stdin:
```bash
# Send tools/list request
echo '{"jsonrpc":"2.0","id":1,"method":"tools/list"}' | ./build/aissia.exe --mcp-server
# Send tool call
echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"chat_with_aissia","arguments":{"message":"Hello AISSIA"}}}' | ./build/aissia.exe --mcp-server
```
Expected output format:
```json
{"jsonrpc":"2.0","id":1,"result":{"tools":[{"name":"chat_with_aissia","description":"...","inputSchema":{...}}]}}
```
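A response can be sanity-checked programmatically. A small sketch of the checks implied by the expected format above:

```python
import json

def valid_tools_list_response(raw, expected_id):
    """Check that `raw` is a JSON-RPC 2.0 result carrying a non-empty tools array."""
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError:
        return False
    return (msg.get("jsonrpc") == "2.0"
            and msg.get("id") == expected_id
            and isinstance(msg.get("result", {}).get("tools"), list)
            and len(msg["result"]["tools"]) > 0)

raw = '{"jsonrpc":"2.0","id":1,"result":{"tools":[{"name":"chat_with_aissia"}]}}'
print(valid_tools_list_response(raw, 1))  # → True
```

This is handy in a CI smoke test: pipe a `tools/list` request into the binary and assert on the captured line.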
## Troubleshooting
### "LLMService not initialized"
Make sure `config/ai.json` exists with valid API key:
```json
{
"provider": "claude",
"api_key": "sk-ant-...",
"model": "claude-sonnet-4-20250514"
}
```
### "VoiceService not available"
Voice tools are optional. If you don't need STT/TTS, this is normal.
### "StorageService not available"
Make sure `config/storage.json` exists:
```json
{
"database_path": "./data/aissia.db",
"journal_mode": "WAL",
"busy_timeout_ms": 5000
}
```
### "Tool not found"
Check `tools/list` output to see which tools are actually registered.
## References
- **MCP Specification**: https://github.com/anthropics/mcp
- **AISSIA Architecture**: `docs/project-overview.md`
- **GroveEngine Guide**: `docs/GROVEENGINE_GUIDE.md`
- **LLM Service**: `src/services/LLMService.hpp`
- **MCPServer**: `src/shared/mcp/MCPServer.hpp`
@ -1,268 +1,268 @@
# Speech-to-Text (STT) Setup Guide - Windows
A guide to configuring the speech-to-text (STT) engines on Windows.
## Current Status
AISSIA supports **5 STT engines** with automatic priority ordering:
| Engine | Type | Status | Requirement |
|--------|------|--------|--------|
| **Whisper.cpp** | Local | ✅ Configured | Model downloaded |
| **OpenAI Whisper API** | Cloud | ✅ Configured | API key in .env |
| **Google Speech** | Cloud | ✅ Configured | API key in .env |
| **Azure STT** | Cloud | ⚠️ Optional | API key missing |
| **Deepgram** | Cloud | ⚠️ Optional | API key missing |
**3 engines are already functional** (Whisper.cpp, OpenAI, Google) ✅
---
## 1. Whisper.cpp (Local, Offline) ✅
### Advantages
- ✅ Fully offline (no internet required)
- ✅ Excellent accuracy (OpenAI Whisper quality)
- ✅ Free, no usage limits
- ✅ Multilingual support (99 languages)
- ❌ Slower than cloud APIs (real-time is difficult)
### Installation
**Model downloaded**: `models/ggml-base.bin` (142MB)
Other available models:
```bash
cd models/
# Tiny (75MB) - Fast but less accurate
curl -L -o ggml-tiny.bin https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-tiny.bin
# Small (466MB) - Good trade-off
curl -L -o ggml-small.bin https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-small.bin
# Medium (1.5GB) - Very good quality
curl -L -o ggml-medium.bin https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-medium.bin
# Large (2.9GB) - Best quality
curl -L -o ggml-large-v3.bin https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-large-v3.bin
```
**Recommended**: `base` or `small` for most use cases.
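The model list above can be turned into a small helper that reports which models are already present locally (file names and sizes are taken from the download list; the `models/` directory is the one this guide uses):

```python
import os

# Approximate on-disk sizes from the download list above (MB).
WHISPER_MODELS = {
    "tiny": ("ggml-tiny.bin", 75),
    "base": ("ggml-base.bin", 142),
    "small": ("ggml-small.bin", 466),
    "medium": ("ggml-medium.bin", 1500),
    "large": ("ggml-large-v3.bin", 2900),
}

def available_models(models_dir="models"):
    """Return the model names whose files exist under `models_dir`."""
    return [name for name, (fname, _) in WHISPER_MODELS.items()
            if os.path.exists(os.path.join(models_dir, fname))]

print(available_models())  # e.g. ['base'] once ggml-base.bin is present
```

Running it before launching AISSIA avoids the "Whisper model not found" error described in Troubleshooting below.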
---
## 2. OpenAI Whisper API ✅
### Advantages
- ✅ Very fast (real-time)
- ✅ Excellent accuracy
- ✅ Multilingual support
- ❌ Requires internet
- ❌ Cost: $0.006/minute ($0.36/hour)
### Configuration
1. Get an OpenAI API key: https://platform.openai.com/api-keys
2. Add to `.env`:
```bash
OPENAI_API_KEY=sk-proj-...
```
**Status**: ✅ Already configured
---
## 3. Google Speech-to-Text ✅
### Advantages
- ✅ Very fast
- ✅ Good accuracy
- ✅ Multilingual support (125+ languages)
- ❌ Requires internet
- ❌ Cost: $0.006/15s ($1.44/hour)
### Configuration
1. Enable the API: https://console.cloud.google.com/apis/library/speech.googleapis.com
2. Create an API key
3. Add to `.env`:
```bash
GOOGLE_API_KEY=AIzaSy...
```
**Status**: ✅ Already configured
---
## 4. Azure Speech-to-Text (Optional)
### Advantages
- ✅ Excellent accuracy
- ✅ Multilingual support
- ✅ Free tier: 5h/month free
- ❌ Requires internet
### Configuration
1. Create an Azure Speech resource: https://portal.azure.com
2. Copy the key and region
3. Add to `.env`:
```bash
AZURE_SPEECH_KEY=your_azure_key
AZURE_SPEECH_REGION=westeurope # or your region
```
**Status**: ⚠️ Optional (not configured)
---
## 5. Deepgram (Optional)
### Advantages
- ✅ Very fast (real-time streaming)
- ✅ Good accuracy
- ✅ Free tier: $200 credit / 45,000 minutes
- ❌ Requires internet
### Configuration
1. Create an account: https://console.deepgram.com
2. Create an API key
3. Add to `.env`:
```bash
DEEPGRAM_API_KEY=your_deepgram_key
```
**Status**: ⚠️ Optional (not configured)
---
## Testing the STT Engines
### Option 1: Test with an audio file
1. Generate a test audio file:
```bash
python create_test_audio_simple.py
```
2. Run the test (once compiled):
```bash
./build/test_stt_live test_audio.wav
```
This automatically tests every available engine.
### Option 2: Test from AISSIA
The STT engines are integrated into `VoiceModule` and accessible via:
- `voice:start_listening` (pub/sub)
- `voice:stop_listening`
- `voice:transcribe` (with an audio file)
---
## Recommended Configuration
For best results, here is the recommended priority order:
### For local development/testing
1. **Whisper.cpp** (`ggml-base.bin`) - Offline, free
2. **OpenAI Whisper API** - If internet is available
3. **Google Speech** - Fallback
### For production/real-time
1. **Deepgram** - Best real-time streaming
2. **Azure STT** - Good quality, free tier
3. **Whisper.cpp** (`ggml-small.bin`) - Offline fallback
---
## Configuration Files
### .env (API Keys)
```bash
# OpenAI Whisper API (✅ configured)
OPENAI_API_KEY=sk-proj-...
# Google Speech (✅ configured)
GOOGLE_API_KEY=AIzaSy...
# Azure STT (optional)
#AZURE_SPEECH_KEY=your_key
#AZURE_SPEECH_REGION=westeurope
# Deepgram (optional)
#DEEPGRAM_API_KEY=your_key
```
### config/voice.json
```json
{
  "stt": {
    "active_mode": {
      "enabled": true,
      "engine": "whisper_cpp",
      "model_path": "./models/ggml-base.bin",
      "language": "fr",
      "fallback_engine": "whisper_api"
    }
  }
}
```
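The `fallback_engine` field implies a try-primary-then-fallback selection. A sketch of that logic (illustrative Python, not AISSIA's actual C++ code; engine names come from the config above, and `available` maps engine names to placeholder transcribe callables):

```python
import json

def pick_engine(config: dict, available: dict):
    """Return the first configured engine that is actually available,
    honoring fallback_engine when the primary is missing."""
    stt = config["stt"]["active_mode"]
    for name in (stt["engine"], stt.get("fallback_engine")):
        if name and name in available:
            return name, available[name]
    raise RuntimeError("no STT engine available")

config = json.loads("""
{"stt": {"active_mode": {"enabled": true, "engine": "whisper_cpp",
 "fallback_engine": "whisper_api"}}}
""")
# Simulate the whisper.cpp model being absent: only the cloud API is registered
name, transcribe = pick_engine(config, {"whisper_api": lambda wav: "..."})
print(name)  # whisper_api
```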
---
## Dependencies
### Whisper.cpp
- ✅ Integrated into the build (external/whisper.cpp)
- ✅ Statically linked into AissiaAudio
- ❌ Model required: downloaded into `models/`
### Cloud APIs
- ✅ httplib for HTTP requests (already in the project)
- ✅ nlohmann/json for serialization (already in the project)
- ❌ OpenSSL disabled (HTTP-only mode OK)
---
## Troubleshooting
### "Whisper model not found"
```bash
cd models/
curl -L -o ggml-base.bin https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-base.bin
```
### "API key not found"
Check that `.env` contains the keys and is actually loaded:
```bash
cat .env | grep -E "OPENAI|GOOGLE|AZURE|DEEPGRAM"
```
### "Transcription failed"
1. Check the audio format: 16kHz, mono, 16-bit PCM WAV
2. Generate a test file: `python create_test_audio_simple.py`
3. Enable debug logs: `spdlog::set_level(spdlog::level::debug)`
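The format check in step 1 can be scripted with a small stdlib helper (illustrative, not part of AISSIA):

```python
import wave

def check_wav_format(path: str) -> list[str]:
    """Return a list of format problems; an empty list means the file is STT-ready."""
    problems = []
    with wave.open(path, "rb") as w:
        if w.getframerate() != 16000:
            problems.append(f"sample rate {w.getframerate()} != 16000")
        if w.getnchannels() != 1:
            problems.append(f"{w.getnchannels()} channels, expected mono")
        if w.getsampwidth() != 2:
            problems.append(f"{w.getsampwidth() * 8}-bit samples, expected 16-bit")
    return problems
```

Note that `wave` only reads PCM WAV containers; compressed formats raise an error, which is itself a useful diagnostic here.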
---
## Next Steps
1. ✅ Whisper.cpp configured and working
2. ✅ OpenAI + Google APIs configured
3. ⚠️ Optional: add Azure or Deepgram for redundancy
4. 🔜 Test with `./build/test_stt_live test_audio.wav`
5. 🔜 Integrate into VoiceModule via pub/sub
---
## References
- [Whisper.cpp GitHub](https://github.com/ggerganov/whisper.cpp)
- [OpenAI Whisper API](https://platform.openai.com/docs/guides/speech-to-text)
- [Google Speech-to-Text](https://cloud.google.com/speech-to-text)
- [Azure Speech](https://azure.microsoft.com/en-us/services/cognitive-services/speech-to-text/)
- [Deepgram](https://developers.deepgram.com/)
# Succession Document - AISSIA
## Context
AISSIA = agentic voice assistant built on GroveEngine (C++17 hot-reload). A voice-first "Claude Code" architecture with internal tools + FileSystem + MCP.
**Last commit**: `37b62b5`
## Current State
### What works
**Full build** - `cmake -B build && cmake --build build -j4`
**6 hot-reload modules** - Scheduler, Notification, Monitoring, AI, Voice, Storage
**4 services** - LLMService, StorageService, PlatformService, VoiceService
**17 tools for the agent**:
- 11 internal tools (via IIO pub/sub)
- 6 FileSystem tools (read/write/edit/list/glob/grep)
- MCP tools (disabled by default)
**Tests** - 67/75 module+type tests pass
### Launch
```bash
# Build
cmake -B build && cmake --build build -j4
# Run (from the repo root or build/)
./build/aissia
# MCP server mode (exposes the tools over JSON-RPC stdio)
./build/aissia --mcp-server
# Tests
cmake -B build -DBUILD_TESTING=ON
./build/tests/aissia_tests "[scheduler],[notification]" # Modules
./build/tests/aissia_tests "[types]" # MCP types
```
### Environment Variables
```bash
export ANTHROPIC_API_KEY="sk-ant-..." # Required for the Claude API
```
## Architecture
```
┌─────────────────────────────────────────────────────────────┐
│ LLMService │
│ (Agentic Loop) │
├─────────────────────────────────────────────────────────────┤
│ ToolRegistry │
│ ├── InternalTools (11) ─────► IIO pub/sub │
│ ├── FileSystemTools (6) ────► Direct C++ (read/write/edit) │
│ └── MCPClient (optionnel) ──► stdio JSON-RPC │
└─────────────────────────────────────────────────────────────┘
┌──────────────┬─────┴──────┬──────────────┐
Scheduler Monitoring Storage Voice
Module Module Module Module
```
### Available Tools
| Category | Tools | Communication |
|-----------|-------|---------------|
| Scheduler | get_current_task, list_tasks, start_task, complete_task, start_break | IIO |
| Monitoring | get_focus_stats, get_current_app | IIO |
| Storage | save_note, query_notes, get_session_history | IIO |
| Voice | speak | IIO |
| FileSystem | read_file, write_file, edit_file, list_directory, glob_files, grep_files | Direct C++ |
### FileSystem Tools (New)
Implemented in `src/shared/tools/FileSystemTools.*`:
```cpp
// Read with line numbers
FileSystemTools::execute("read_file", {{"path", "/path/to/file"}, {"limit", 10}});
// Claude Code-style edit
FileSystemTools::execute("edit_file", {
    {"path", "/path/to/file"},
    {"old_string", "foo"},
    {"new_string", "bar"}
});
// Search
FileSystemTools::execute("glob_files", {{"pattern", "**/*.cpp"}});
FileSystemTools::execute("grep_files", {{"pattern", "TODO"}, {"path", "./src"}});
```
**Security**:
- Configurable allowed paths
- Blocked patterns: `*.env`, `*.key`, `*credentials*`
- Limits: 1MB read, 10MB write
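Those rules can be sketched as follows (illustrative Python; the real checks live in the C++ `FileSystemTools`, and the allowed roots here are assumptions):

```python
import fnmatch
from pathlib import Path

# Hypothetical allowlist; the actual roots are configurable in AISSIA
ALLOWED_ROOTS = [Path("./src").resolve(), Path("./config").resolve()]
BLOCKED_PATTERNS = ["*.env", "*.key", "*credentials*"]
MAX_READ_BYTES = 1_000_000  # 1MB read limit

def is_path_allowed(raw: str) -> bool:
    """Allow a path only if it resolves under an allowed root
    and its filename matches no blocked pattern."""
    path = Path(raw).resolve()
    if not any(root == path or root in path.parents for root in ALLOWED_ROOTS):
        return False
    return not any(fnmatch.fnmatch(path.name, pat) for pat in BLOCKED_PATTERNS)
```

Resolving the path before checking roots is what defeats `../` traversal out of the allowlisted directories.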
## Key Files
### New (current session)
```
src/shared/tools/FileSystemTools.hpp
src/shared/tools/FileSystemTools.cpp
PLAN_FILESYSTEM_TOOLS.md
```
### Services
```
src/services/LLMService.* # Agentic loop, tool registry
src/services/StorageService.* # SQLite persistence
src/services/PlatformService.* # Window tracking
src/services/VoiceService.* # TTS/STT
```
### Modules (hot-reload)
```
src/modules/SchedulerModule.*
src/modules/NotificationModule.*
src/modules/MonitoringModule.*
src/modules/AIModule.*
src/modules/VoiceModule.*
src/modules/StorageModule.*
```
### MCP
```
src/shared/mcp/MCPTypes.hpp
src/shared/mcp/MCPClient.* # MCP client (consumes external servers)
src/shared/mcp/MCPServer.* # MCP server (exposes AISSIA as a server)
src/shared/mcp/StdioTransport.*
config/mcp.json
```
## Tests
```bash
# Build with tests
cmake -B build -DBUILD_TESTING=ON && cmake --build build -j4
# By category
./build/tests/aissia_tests "[scheduler]" # 10 tests
./build/tests/aissia_tests "[notification]" # 10 tests
./build/tests/aissia_tests "[types]" # 15 MCP tests
# All modules
./build/tests/aissia_tests "[scheduler],[notification],[monitoring],[ai],[voice],[storage]"
```
**Current results**:
- Modules: 52/60 (87%)
- MCP Types: 15/15 (100%)
- MCP Transport/Client: needs the Python test servers fixed
## Next Steps
### High priority
1. **Test with an API key** - Verify the full agentic loop
2. **Enable MCP filesystem** - For end-to-end tests with external tools
### Medium priority
3. **Fix the MCP transport tests** - The Python servers receive EOF
4. **Add more tools** - add_task, set_reminder, etc.
5. **Streaming responses** - Real-time feedback during generation
### Low priority
6. **End-to-end tests** - Full cross-module flow
7. **CI/CD** - GitHub Actions
8. **API documentation** - Doxygen
## MCP Server Mode
AISSIA can run as an **MCP server**, exposing its tools to external clients over JSON-RPC on stdio.
```bash
./build/aissia --mcp-server
```
### Protocol
JSON-RPC 2.0 over stdin/stdout:
```json
// Client → AISSIA
{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"clientInfo":{"name":"client","version":"1.0"}}}
// AISSIA → Client
{"jsonrpc":"2.0","id":1,"result":{"protocolVersion":"2024-11-05","serverInfo":{"name":"aissia","version":"0.2.0"},...}}
// List the tools
{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}
// Call a tool
{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"list_directory","arguments":{"path":"."}}}
```
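A minimal client for smoke-testing this mode can be sketched in Python (assuming newline-delimited JSON-RPC framing on stdio, as `StdioTransport` suggests; the binary path matches the build output above):

```python
import json
import subprocess

def jsonrpc_request(req_id: int, method: str, params: dict) -> str:
    """Frame a JSON-RPC 2.0 request as one newline-terminated line."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id,
                       "method": method, "params": params}) + "\n"

def call_aissia(requests: list, binary: str = "./build/aissia") -> list:
    """Spawn AISSIA in MCP server mode, send a batch of requests on stdin,
    and parse each stdout line as a JSON-RPC response."""
    proc = subprocess.run([binary, "--mcp-server"], input="".join(requests),
                          capture_output=True, text=True, timeout=30)
    return [json.loads(line) for line in proc.stdout.splitlines() if line.strip()]

init = jsonrpc_request(1, "initialize",
                       {"clientInfo": {"name": "client", "version": "1.0"}})
tools = jsonrpc_request(2, "tools/list", {})
# responses = call_aissia([init, tools])  # requires a built ./build/aissia
```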
### Use with Claude Code
Add to the MCP config:
```json
{
  "servers": {
    "aissia": {
      "command": "/path/to/build/aissia",
      "args": ["--mcp-server"]
    }
  }
}
```
### Exposed Tools (currently)
6 FileSystem tools. TODO: expose the internal tools (scheduler, voice, etc.).
## Technical Notes
### WSL
- Window tracker unavailable (stub used)
- espeak not installed (TTS stub)
- Everything else works
### Hot-Reload
Modules are dynamically loaded `.so` files. To rebuild a single module:
```bash
cmake --build build --target SchedulerModule
# The module is reloaded on the next cycle if it changed
```
$ErrorActionPreference = "Continue"
cd "C:\Users\alexi\Documents\projects\aissia"
Write-Host "=== Running aissia_tests.exe ===" -ForegroundColor Cyan
& ".\build\tests\aissia_tests.exe" 2>&1 | Tee-Object -FilePath "test_output.txt"
$testExitCode = $LASTEXITCODE
Write-Host "`nTest exit code: $testExitCode" -ForegroundColor $(if ($testExitCode -eq 0) { "Green" } else { "Red" })
Write-Host "`n=== Running test_stt_engines.exe ===" -ForegroundColor Cyan
& ".\build\test_stt_engines.exe" 2>&1 | Tee-Object -FilePath "stt_test_output.txt" -Append
$sttExitCode = $LASTEXITCODE
Write-Host "`nSTT Test exit code: $sttExitCode" -ForegroundColor $(if ($sttExitCode -eq 0) { "Green" } else { "Red" })
Write-Host "`n=== Test Summary ===" -ForegroundColor Cyan
Write-Host "aissia_tests: $(if ($testExitCode -eq 0) { 'PASSED' } else { 'FAILED' })"
Write-Host "test_stt_engines: $(if ($sttExitCode -eq 0) { 'PASSED' } else { 'FAILED' })"
#pragma once
#include <grove/IIO.h>
#include <string>
namespace aissia {
/**
* @brief Interface for infrastructure services
*
* Services handle non-hot-reloadable infrastructure:
* - LLM HTTP calls
* - SQLite database
* - Platform APIs (Win32/X11)
* - TTS/STT engines
*
* Services communicate with modules via IIO pub/sub.
*/
class IService {
public:
virtual ~IService() = default;
/// Initialize the service with IIO for pub/sub
virtual bool initialize(grove::IIO* io) = 0;
/// Process pending work (called each frame from main loop)
virtual void process() = 0;
/// Clean shutdown
virtual void shutdown() = 0;
/// Service name for logging
virtual std::string getName() const = 0;
/// Check if service is healthy
virtual bool isHealthy() const = 0;
};
} // namespace aissia
#include "LLMService.hpp"
#include "../shared/llm/LLMProviderFactory.hpp"
#include <spdlog/sinks/stdout_color_sinks.h>
#include <fstream>
namespace aissia {
LLMService::LLMService() {
m_logger = spdlog::get("LLMService");
if (!m_logger) {
m_logger = spdlog::stdout_color_mt("LLMService");
}
}
LLMService::~LLMService() {
shutdown();
}
bool LLMService::initialize(grove::IIO* io) {
m_io = io;
if (m_io) {
grove::SubscriptionConfig config;
m_io->subscribe("llm:request", config);
}
// Start worker thread
m_running = true;
m_workerThread = std::thread(&LLMService::workerLoop, this);
m_logger->info("LLMService initialized");
return true;
}
bool LLMService::loadConfig(const std::string& configPath) {
try {
std::ifstream file(configPath);
if (!file.is_open()) {
m_logger->warn("Config file not found: {}", configPath);
return false;
}
nlohmann::json config;
file >> config;
m_provider = LLMProviderFactory::create(config);
if (!m_provider) {
m_logger->error("Failed to create LLM provider");
return false;
}
m_providerName = config.value("provider", "claude");
m_maxIterations = config.value("max_iterations", 10);
m_defaultSystemPrompt = config.value("system_prompt",
"Tu es AISSIA, un assistant personnel intelligent. "
"Tu peux utiliser des tools pour accomplir des taches: "
"gerer le planning, verifier le focus, sauvegarder des notes, "
"lire des fichiers, faire des recherches web, etc.");
m_logger->info("LLM provider loaded: {} ({})", m_providerName, m_provider->getModel());
// Initialize tools after provider is ready
initializeTools();
return true;
} catch (const std::exception& e) {
m_logger->error("Failed to load config: {}", e.what());
return false;
}
}
void LLMService::initializeTools() {
m_logger->info("Initializing tools...");
// 1. Internal tools (via GroveEngine IIO)
if (m_io) {
m_internalTools = std::make_unique<InternalTools>(m_io);
for (const auto& tool : m_internalTools->getTools()) {
m_toolRegistry.registerTool(tool);
}
m_logger->info("Registered {} internal tools", m_internalTools->size());
}
// 2. FileSystem tools (direct C++ execution)
for (const auto& toolDef : tools::FileSystemTools::getToolDefinitions()) {
std::string toolName = toolDef["name"].get<std::string>();
m_toolRegistry.registerTool(
toolName,
toolDef["description"].get<std::string>(),
toolDef["input_schema"],
[toolName](const nlohmann::json& input) -> nlohmann::json {
return tools::FileSystemTools::execute(toolName, input);
}
);
}
m_logger->info("Registered {} filesystem tools", tools::FileSystemTools::getToolDefinitions().size());
// 3. MCP tools (via external servers)
m_mcpClient = std::make_unique<mcp::MCPClient>();
if (loadMCPConfig("config/mcp.json")) {
int connected = m_mcpClient->connectAll();
if (connected > 0) {
for (const auto& tool : m_mcpClient->listAllTools()) {
// Convert MCP tool to our ToolDefinition format
m_toolRegistry.registerTool(
tool.name,
tool.description,
tool.inputSchema,
[this, toolName = tool.name](const nlohmann::json& input) -> nlohmann::json {
auto result = m_mcpClient->callTool(toolName, input);
// Convert MCP result to simple JSON
if (result.isError) {
return {{"error", true}, {"content", result.content}};
}
// Extract text content
std::string text;
for (const auto& content : result.content) {
if (content.contains("text")) {
text += content["text"].get<std::string>();
}
}
return {{"content", text}};
}
);
}
m_logger->info("Registered {} MCP tools from {} servers",
m_mcpClient->toolCount(), connected);
}
}
m_logger->info("Total tools available: {}", m_toolRegistry.size());
}
bool LLMService::loadMCPConfig(const std::string& configPath) {
return m_mcpClient->loadConfig(configPath);
}
void LLMService::registerTool(const std::string& name, const std::string& description,
const nlohmann::json& schema,
std::function<nlohmann::json(const nlohmann::json&)> handler) {
m_toolRegistry.registerTool(name, description, schema, handler);
m_logger->debug("Tool registered: {}", name);
}
void LLMService::process() {
processIncomingMessages();
publishResponses();
}
void LLMService::processIncomingMessages() {
if (!m_io) return;
while (m_io->hasMessages() > 0) {
auto msg = m_io->pullMessage();
if (msg.topic == "llm:request" && msg.data) {
Request req;
req.query = msg.data->getString("query", "");
req.systemPrompt = msg.data->getString("systemPrompt", m_defaultSystemPrompt);
req.conversationId = msg.data->getString("conversationId", "default");
req.maxIterations = msg.data->getInt("maxIterations", m_maxIterations);
// Get tools from message or use registered tools
auto* toolsNode = msg.data->getChildReadOnly("tools");
if (toolsNode) {
// Custom tools from message
// (would need to parse from IDataNode)
}
if (!req.query.empty()) {
m_logger->debug("Queueing request: {}", req.query.substr(0, 50));
std::lock_guard<std::mutex> lock(m_requestMutex);
m_requestQueue.push(std::move(req));  // req is moved-from past this point
m_requestCV.notify_one();
}
}
}
}
void LLMService::publishResponses() {
if (!m_io) return;
std::lock_guard<std::mutex> lock(m_responseMutex);
while (!m_responseQueue.empty()) {
auto resp = std::move(m_responseQueue.front());
m_responseQueue.pop();
if (resp.isError) {
auto event = std::make_unique<grove::JsonDataNode>("error");
event->setString("message", resp.text);
event->setString("conversationId", resp.conversationId);
m_io->publish("llm:error", std::move(event));
} else {
auto event = std::make_unique<grove::JsonDataNode>("response");
event->setString("text", resp.text);
event->setString("conversationId", resp.conversationId);
event->setInt("tokens", resp.tokens);
event->setInt("iterations", resp.iterations);
m_io->publish("llm:response", std::move(event));
m_logger->info("Response published: {} chars", resp.text.size());
}
}
}
void LLMService::workerLoop() {
m_logger->debug("Worker thread started");
while (m_running) {
Request req;
{
std::unique_lock<std::mutex> lock(m_requestMutex);
m_requestCV.wait_for(lock, std::chrono::milliseconds(100), [this] {
return !m_requestQueue.empty() || !m_running;
});
if (!m_running) break;
if (m_requestQueue.empty()) continue;
req = std::move(m_requestQueue.front());
m_requestQueue.pop();
}
// Process request (HTTP calls happen here)
auto resp = processRequest(req);
{
std::lock_guard<std::mutex> lock(m_responseMutex);
m_responseQueue.push(std::move(resp));
}
}
m_logger->debug("Worker thread stopped");
}
LLMService::Response LLMService::processRequest(const Request& request) {
Response resp;
resp.conversationId = request.conversationId;
if (!m_provider) {
resp.text = "LLM provider not initialized";
resp.isError = true;
return resp;
}
try {
// Get or create conversation history
auto& history = m_conversations[request.conversationId];
if (history.is_null()) {
history = nlohmann::json::array();
}
// Add user message
history.push_back({{"role", "user"}, {"content", request.query}});
// Get tool definitions
nlohmann::json tools = m_toolRegistry.getToolDefinitions();
// Run agentic loop
auto result = agenticLoop(request.query, request.systemPrompt,
history, tools, request.maxIterations);
if (result.contains("error")) {
resp.text = result["error"].get<std::string>();
resp.isError = true;
} else {
resp.text = result["response"].get<std::string>();
resp.tokens = result.value("tokens", 0);
resp.iterations = result.value("iterations", 1);
// Add assistant response to history
history.push_back({{"role", "assistant"}, {"content", resp.text}});
}
} catch (const std::exception& e) {
resp.text = e.what();
resp.isError = true;
}
return resp;
}
nlohmann::json LLMService::agenticLoop(const std::string& query, const std::string& systemPrompt,
nlohmann::json& messages, const nlohmann::json& tools,
int maxIterations) {
int totalTokens = 0;
for (int iteration = 0; iteration < maxIterations; iteration++) {
m_logger->debug("Agentic loop iteration {}", iteration + 1);
auto response = m_provider->chat(systemPrompt, messages, tools);
totalTokens += response.input_tokens + response.output_tokens;
if (response.is_end_turn) {
return {
{"response", response.text},
{"iterations", iteration + 1},
{"tokens", totalTokens}
};
}
// Execute tool calls
if (!response.tool_calls.empty()) {
std::vector<ToolResult> results;
for (const auto& call : response.tool_calls) {
m_logger->debug("Executing tool: {}", call.name);
nlohmann::json result = m_toolRegistry.execute(call.name, call.input);
results.push_back({call.id, result.dump(), false});
}
// Append assistant message and tool results
m_provider->appendAssistantMessage(messages, response);
auto toolResultsMsg = m_provider->formatToolResults(results);
if (toolResultsMsg.is_array()) {
for (const auto& msg : toolResultsMsg) {
messages.push_back(msg);
}
} else {
messages.push_back(toolResultsMsg);
}
}
}
return {{"error", "max_iterations_reached"}};
}
void LLMService::shutdown() {
m_running = false;
m_requestCV.notify_all();
if (m_workerThread.joinable()) {
m_workerThread.join();
}
m_logger->info("LLMService shutdown");
}
LLMService::SyncResponse LLMService::sendMessageSync(
const std::string& message,
const std::string& conversationId,
const std::string& systemPrompt
) {
SyncResponse syncResp;
// Create request (same as async mode)
Request request;
request.query = message;
request.conversationId = conversationId.empty() ? "mcp-session" : conversationId;
request.systemPrompt = systemPrompt.empty() ? m_defaultSystemPrompt : systemPrompt;
request.maxIterations = m_maxIterations;
// Process synchronously (blocking call)
auto response = processRequest(request);
// Convert to SyncResponse
if (!response.isError) {
syncResp.text = response.text;
syncResp.tokens = response.tokens;
syncResp.iterations = response.iterations;
} else {
// On error, return error in text
syncResp.text = "Error: " + response.text;
syncResp.tokens = 0;
syncResp.iterations = 0;
}
return syncResp;
}
} // namespace aissia


@@ -1,140 +1,140 @@
#pragma once
#include "IService.hpp"
#include "../shared/llm/ILLMProvider.hpp"
#include "../shared/llm/ToolRegistry.hpp"
#include "../shared/tools/InternalTools.hpp"
#include "../shared/tools/FileSystemTools.hpp"
#include "../shared/mcp/MCPClient.hpp"
#include <grove/IIO.h>
#include <grove/JsonDataNode.h>
#include <spdlog/spdlog.h>
#include <memory>
#include <string>
#include <queue>
#include <mutex>
#include <thread>
#include <atomic>
#include <condition_variable>
namespace aissia {
/**
* @brief LLM Service - Async HTTP calls to LLM providers
*
* Handles all LLM API calls in a background thread.
* Modules communicate via IIO:
*
* Subscribes to:
* - "llm:request" : { query, systemPrompt?, tools?, conversationId? }
*
* Publishes:
* - "llm:response" : { text, conversationId, tokens, iterations }
* - "llm:error" : { message, conversationId }
* - "llm:thinking" : { conversationId } (during agentic loop)
*/
class LLMService : public IService {
public:
LLMService();
~LLMService() override;
bool initialize(grove::IIO* io) override;
void process() override;
void shutdown() override;
std::string getName() const override { return "LLMService"; }
bool isHealthy() const override { return m_provider != nullptr; }
/// Load provider from config file
bool loadConfig(const std::string& configPath);
/// Register a tool that can be called by the LLM
void registerTool(const std::string& name, const std::string& description,
const nlohmann::json& schema,
std::function<nlohmann::json(const nlohmann::json&)> handler);
/// Load and initialize all tools (internal + MCP)
void initializeTools();
/// Load MCP server configurations
bool loadMCPConfig(const std::string& configPath);
/**
* @brief Synchronous response structure for MCP Server mode
*/
struct SyncResponse {
std::string text;
int tokens = 0;
int iterations = 0;
};
/**
* @brief Send message synchronously (blocking, for MCP Server mode)
*
* @param message User message
* @param conversationId Conversation ID (optional)
* @param systemPrompt Custom system prompt (optional)
* @return Sync response with text, tokens, iterations
*/
SyncResponse sendMessageSync(
const std::string& message,
const std::string& conversationId = "",
const std::string& systemPrompt = ""
);
private:
struct Request {
std::string query;
std::string systemPrompt;
std::string conversationId;
nlohmann::json tools;
int maxIterations = 10;
};
struct Response {
std::string text;
std::string conversationId;
int tokens = 0;
int iterations = 0;
bool isError = false;
};
// Configuration
std::string m_providerName = "claude";
std::string m_defaultSystemPrompt;
int m_maxIterations = 10;
// State
std::unique_ptr<ILLMProvider> m_provider;
ToolRegistry m_toolRegistry;
std::unique_ptr<InternalTools> m_internalTools;
std::unique_ptr<mcp::MCPClient> m_mcpClient;
std::map<std::string, nlohmann::json> m_conversations; // conversationId -> history
// Threading
std::thread m_workerThread;
std::atomic<bool> m_running{false};
std::queue<Request> m_requestQueue;
std::queue<Response> m_responseQueue;
std::mutex m_requestMutex;
std::mutex m_responseMutex;
std::condition_variable m_requestCV;
// Services
grove::IIO* m_io = nullptr;
std::shared_ptr<spdlog::logger> m_logger;
// Worker thread
void workerLoop();
Response processRequest(const Request& request);
nlohmann::json agenticLoop(const std::string& query, const std::string& systemPrompt,
nlohmann::json& messages, const nlohmann::json& tools,
int maxIterations);
// Message handling
void processIncomingMessages();
void publishResponses();
};
} // namespace aissia


@@ -1,139 +1,139 @@
#include "PlatformService.hpp"
#include <spdlog/sinks/stdout_color_sinks.h>
namespace aissia {
PlatformService::PlatformService() {
m_logger = spdlog::get("PlatformService");
if (!m_logger) {
m_logger = spdlog::stdout_color_mt("PlatformService");
}
}
bool PlatformService::initialize(grove::IIO* io) {
m_io = io;
// Create platform-specific window tracker
m_tracker = WindowTrackerFactory::create();
if (!m_tracker || !m_tracker->isAvailable()) {
m_logger->warn("Window tracker not available on this platform");
return true; // Non-fatal, module can work without tracking
}
if (m_io) {
grove::SubscriptionConfig config;
m_io->subscribe("platform:query_window", config);
}
m_logger->info("PlatformService initialized: {}", m_tracker->getPlatformName());
return true;
}
void PlatformService::configure(int pollIntervalMs, int idleThresholdSeconds) {
m_pollIntervalMs = pollIntervalMs;
m_idleThresholdSeconds = idleThresholdSeconds;
m_logger->debug("Configured: poll={}ms, idle={}s", pollIntervalMs, idleThresholdSeconds);
}
void PlatformService::process() {
if (!m_tracker || !m_tracker->isAvailable()) return;
// Use monotonic clock for timing
static auto startTime = std::chrono::steady_clock::now();
auto now = std::chrono::steady_clock::now();
float currentTime = std::chrono::duration<float>(now - startTime).count();
float pollIntervalSec = m_pollIntervalMs / 1000.0f;
if (currentTime - m_lastPollTime >= pollIntervalSec) {
m_lastPollTime = currentTime;
pollWindowInfo(currentTime);
}
// Handle query requests
if (m_io) {
while (m_io->hasMessages() > 0) {
auto msg = m_io->pullMessage();
if (msg.topic == "platform:query_window") {
publishWindowInfo();
}
}
}
}
void PlatformService::pollWindowInfo(float currentTime) {
std::string newApp = m_tracker->getCurrentAppName();
std::string newTitle = m_tracker->getCurrentWindowTitle();
// Check for app change
if (newApp != m_currentApp) {
int duration = static_cast<int>(currentTime - m_appStartTime);
if (!m_currentApp.empty() && duration > 0) {
publishWindowChanged(m_currentApp, newApp, duration);
}
m_currentApp = newApp;
m_currentWindowTitle = newTitle;
m_appStartTime = currentTime;
m_logger->debug("App: {} - {}", m_currentApp,
m_currentWindowTitle.size() > 50 ?
m_currentWindowTitle.substr(0, 50) + "..." : m_currentWindowTitle);
}
// Check idle state
bool isIdle = m_tracker->isUserIdle(m_idleThresholdSeconds);
if (isIdle && !m_wasIdle) {
m_logger->info("User idle detected ({}s)", m_idleThresholdSeconds);
if (m_io) {
auto event = std::make_unique<grove::JsonDataNode>("idle");
event->setInt("idleSeconds", m_tracker->getIdleTimeSeconds());
m_io->publish("platform:idle_detected", std::move(event));
}
}
else if (!isIdle && m_wasIdle) {
m_logger->info("User activity resumed");
if (m_io) {
auto event = std::make_unique<grove::JsonDataNode>("active");
m_io->publish("platform:activity_resumed", std::move(event));
}
}
m_wasIdle = isIdle;
// Publish periodic window info
publishWindowInfo();
}
void PlatformService::publishWindowInfo() {
if (!m_io || !m_tracker) return;
auto event = std::make_unique<grove::JsonDataNode>("window");
event->setString("appName", m_currentApp);
event->setString("windowTitle", m_currentWindowTitle);
event->setBool("isIdle", m_wasIdle);
event->setInt("idleSeconds", m_tracker->getIdleTimeSeconds());
m_io->publish("platform:window_info", std::move(event));
}
void PlatformService::publishWindowChanged(const std::string& oldApp,
const std::string& newApp,
int duration) {
if (!m_io) return;
auto event = std::make_unique<grove::JsonDataNode>("changed");
event->setString("oldApp", oldApp);
event->setString("newApp", newApp);
event->setInt("duration", duration);
m_io->publish("platform:window_changed", std::move(event));
}
void PlatformService::shutdown() {
m_tracker.reset();
m_logger->info("PlatformService shutdown");
}
} // namespace aissia


@@ -1,67 +1,67 @@
#pragma once
#include "IService.hpp"
#include "../shared/platform/IWindowTracker.hpp"
#include <grove/IIO.h>
#include <grove/JsonDataNode.h>
#include <spdlog/spdlog.h>
#include <memory>
#include <string>
namespace aissia {
/**
* @brief Platform Service - OS-specific APIs (window tracking, etc.)
*
* Handles platform-specific operations that can't be in hot-reload modules.
* Polls foreground window at configurable interval.
*
* Subscribes to:
* - "platform:query_window" : Request current window info
*
* Publishes:
* - "platform:window_info" : { appName, windowTitle, isIdle, idleSeconds }
* - "platform:window_changed" : { oldApp, newApp, duration }
* - "platform:idle_detected" : { idleSeconds }
* - "platform:activity_resumed" : {}
*/
class PlatformService : public IService {
public:
PlatformService();
~PlatformService() override = default;
bool initialize(grove::IIO* io) override;
void process() override;
void shutdown() override;
std::string getName() const override { return "PlatformService"; }
bool isHealthy() const override { return m_tracker != nullptr && m_tracker->isAvailable(); }
/// Configure polling interval and idle threshold
void configure(int pollIntervalMs = 1000, int idleThresholdSeconds = 300);
private:
// Configuration
int m_pollIntervalMs = 1000;
int m_idleThresholdSeconds = 300;
// State
std::unique_ptr<IWindowTracker> m_tracker;
std::string m_currentApp;
std::string m_currentWindowTitle;
float m_appStartTime = 0.0f;
float m_lastPollTime = 0.0f;
bool m_wasIdle = false;
// Services
grove::IIO* m_io = nullptr;
std::shared_ptr<spdlog::logger> m_logger;
// Helpers
void pollWindowInfo(float currentTime);
void publishWindowInfo();
void publishWindowChanged(const std::string& oldApp, const std::string& newApp, int duration);
};
} // namespace aissia


@@ -1,189 +1,189 @@
// CRITICAL ORDER: Include system headers first
#include <nlohmann/json.hpp>
#include <cstdlib>
#include <memory>
#include <string>
// Include local headers before spdlog
#include "STTService.hpp"
#include "../shared/audio/ISTTEngine.hpp"
// Include spdlog after local headers
#include <spdlog/spdlog.h>
#include <spdlog/sinks/stdout_color_sinks.h>
namespace aissia {
STTService::STTService(const nlohmann::json& config)
: m_config(config)
{
m_logger = spdlog::get("STTService");
if (!m_logger) {
m_logger = spdlog::stdout_color_mt("STTService");
}
// Extract language from config
if (config.contains("active_mode") && config["active_mode"].contains("language")) {
m_language = config["active_mode"]["language"].get<std::string>();
}
m_logger->info("STTService created");
}
STTService::~STTService() {
stop();
}
bool STTService::start() {
m_logger->info("Starting STT service");
loadEngines();
if (!m_activeEngine || !m_activeEngine->isAvailable()) {
m_logger->error("No active STT engine available");
return false;
}
m_logger->info("STT service started");
return true;
}
void STTService::stop() {
m_logger->info("Stopping STT service");
stopListening();
m_activeEngine.reset();
}
void STTService::setMode(STTMode mode) {
if (m_currentMode == mode) {
return;
}
m_logger->info("Switching STT mode");
m_currentMode = mode;
}
std::string STTService::transcribeFile(const std::string& filePath) {
if (!m_activeEngine || !m_activeEngine->isAvailable()) {
m_logger->warn("No STT engine available for transcription");
return "";
}
m_logger->info("Transcribing file");
try {
std::string result = m_activeEngine->transcribeFile(filePath);
m_logger->info("Transcription complete");
return result;
} catch (const std::exception& e) {
m_logger->error("Transcription failed: {}", e.what());
return "";
}
}
std::string STTService::transcribe(const std::vector<float>& audioData) {
if (!m_activeEngine || !m_activeEngine->isAvailable()) {
return "";
}
if (audioData.empty()) {
return "";
}
try {
std::string result = m_activeEngine->transcribe(audioData);
if (!result.empty() && m_listening && m_onTranscription) {
m_onTranscription(result, m_currentMode);
}
return result;
} catch (const std::exception& e) {
m_logger->error("Transcription failed: {}", e.what());
return "";
}
}
void STTService::startListening(TranscriptionCallback onTranscription,
KeywordCallback onKeyword) {
m_logger->info("Start listening");
m_onTranscription = onTranscription;
m_onKeyword = onKeyword;
m_listening = true;
m_logger->warn("Streaming microphone capture not yet implemented");
}
void STTService::stopListening() {
if (!m_listening) {
return;
}
m_logger->info("Stop listening");
m_listening = false;
}
void STTService::setLanguage(const std::string& language) {
m_logger->info("Setting language");
m_language = language;
if (m_activeEngine) {
m_activeEngine->setLanguage(language);
}
}
bool STTService::isAvailable() const {
return m_activeEngine && m_activeEngine->isAvailable();
}
std::string STTService::getCurrentEngine() const {
if (m_activeEngine) {
return m_activeEngine->getEngineName();
}
return "none";
}
void STTService::loadEngines() {
m_logger->info("Loading STT engines");
std::string engineType = "auto";
if (m_config.contains("active_mode")) {
const auto& activeMode = m_config["active_mode"];
if (activeMode.contains("engine")) {
engineType = activeMode["engine"];
}
}
std::string modelPath;
if (m_config.contains("active_mode")) {
const auto& activeMode = m_config["active_mode"];
if (activeMode.contains("model_path")) {
modelPath = activeMode["model_path"];
}
}
std::string apiKey;
if (m_config.contains("whisper_api")) {
const auto& whisperApi = m_config["whisper_api"];
std::string apiKeyEnv = "OPENAI_API_KEY";
if (whisperApi.contains("api_key_env")) {
apiKeyEnv = whisperApi["api_key_env"];
}
const char* envVal = std::getenv(apiKeyEnv.c_str());
if (envVal) {
apiKey = envVal;
}
}
m_activeEngine = STTEngineFactory::create(engineType, modelPath, apiKey);
if (m_activeEngine && m_activeEngine->isAvailable()) {
m_activeEngine->setLanguage(m_language);
m_logger->info("STT engine loaded successfully");
} else {
m_logger->warn("No active STT engine available");
}
}
} // namespace aissia
// CRITICAL ORDER: Include system headers first
#include <nlohmann/json.hpp>
#include <cstdlib>
#include <memory>
#include <string>
// Include local headers before spdlog
#include "STTService.hpp"
#include "../shared/audio/ISTTEngine.hpp"
// Include spdlog after local headers
#include <spdlog/spdlog.h>
#include <spdlog/sinks/stdout_color_sinks.h>
namespace aissia {
STTService::STTService(const nlohmann::json& config)
: m_config(config)
{
m_logger = spdlog::get("STTService");
if (!m_logger) {
m_logger = spdlog::stdout_color_mt("STTService");
}
// Extract language from config
if (config.contains("active_mode") && config["active_mode"].contains("language")) {
m_language = config["active_mode"]["language"].get<std::string>();
}
m_logger->info("STTService created");
}
STTService::~STTService() {
stop();
}
bool STTService::start() {
m_logger->info("Starting STT service");
loadEngines();
if (!m_activeEngine || !m_activeEngine->isAvailable()) {
m_logger->error("No active STT engine available");
return false;
}
m_logger->info("STT service started");
return true;
}
void STTService::stop() {
m_logger->info("Stopping STT service");
stopListening();
m_activeEngine.reset();
}
void STTService::setMode(STTMode mode) {
if (m_currentMode == mode) {
return;
}
m_logger->info("Switching STT mode");
m_currentMode = mode;
}
std::string STTService::transcribeFile(const std::string& filePath) {
if (!m_activeEngine || !m_activeEngine->isAvailable()) {
m_logger->warn("No STT engine available for transcription");
return "";
}
m_logger->info("Transcribing file: {}", filePath);
try {
std::string result = m_activeEngine->transcribeFile(filePath);
m_logger->info("Transcription complete");
return result;
} catch (const std::exception& e) {
m_logger->error("Transcription failed: {}", e.what());
return "";
}
}
std::string STTService::transcribe(const std::vector<float>& audioData) {
if (!m_activeEngine || !m_activeEngine->isAvailable()) {
return "";
}
if (audioData.empty()) {
return "";
}
try {
std::string result = m_activeEngine->transcribe(audioData);
if (!result.empty() && m_listening && m_onTranscription) {
m_onTranscription(result, m_currentMode);
}
return result;
} catch (const std::exception& e) {
m_logger->error("Transcription failed: {}", e.what());
return "";
}
}
void STTService::startListening(TranscriptionCallback onTranscription,
KeywordCallback onKeyword) {
m_logger->info("Start listening");
m_onTranscription = onTranscription;
m_onKeyword = onKeyword;
m_listening = true;
m_logger->warn("Streaming microphone capture not yet implemented");
}
void STTService::stopListening() {
if (!m_listening) {
return;
}
m_logger->info("Stop listening");
m_listening = false;
}
void STTService::setLanguage(const std::string& language) {
m_logger->info("Setting language: {}", language);
m_language = language;
if (m_activeEngine) {
m_activeEngine->setLanguage(language);
}
}
bool STTService::isAvailable() const {
return m_activeEngine && m_activeEngine->isAvailable();
}
std::string STTService::getCurrentEngine() const {
if (m_activeEngine) {
return m_activeEngine->getEngineName();
}
return "none";
}
void STTService::loadEngines() {
m_logger->info("Loading STT engines");
std::string engineType = "auto";
if (m_config.contains("active_mode")) {
const auto& activeMode = m_config["active_mode"];
if (activeMode.contains("engine")) {
engineType = activeMode["engine"];
}
}
std::string modelPath;
if (m_config.contains("active_mode")) {
const auto& activeMode = m_config["active_mode"];
if (activeMode.contains("model_path")) {
modelPath = activeMode["model_path"];
}
}
std::string apiKey;
if (m_config.contains("whisper_api")) {
const auto& whisperApi = m_config["whisper_api"];
std::string apiKeyEnv = "OPENAI_API_KEY";
if (whisperApi.contains("api_key_env")) {
apiKeyEnv = whisperApi["api_key_env"];
}
const char* envVal = std::getenv(apiKeyEnv.c_str());
if (envVal) {
apiKey = envVal;
}
}
m_activeEngine = STTEngineFactory::create(engineType, modelPath, apiKey);
if (m_activeEngine && m_activeEngine->isAvailable()) {
m_activeEngine->setLanguage(m_language);
m_logger->info("STT engine loaded successfully");
} else {
m_logger->warn("No active STT engine available");
}
}
} // namespace aissia

@ -1,348 +1,348 @@
#include "StorageService.hpp"
#include <spdlog/sinks/stdout_color_sinks.h>
#include <sqlite3.h>
#include <filesystem>
#include <ctime>
namespace fs = std::filesystem;
namespace aissia {
StorageService::StorageService() {
m_logger = spdlog::get("StorageService");
if (!m_logger) {
m_logger = spdlog::stdout_color_mt("StorageService");
}
}
StorageService::~StorageService() {
shutdown();
}
bool StorageService::initialize(grove::IIO* io) {
m_io = io;
if (m_io) {
grove::SubscriptionConfig config;
m_io->subscribe("storage:save_session", config);
m_io->subscribe("storage:save_app_usage", config);
m_io->subscribe("storage:save_conversation", config);
m_io->subscribe("storage:update_metrics", config);
}
m_logger->info("StorageService initialized");
return true;
}
bool StorageService::openDatabase(const std::string& dbPath,
const std::string& journalMode,
int busyTimeoutMs) {
m_dbPath = dbPath;
// Ensure directory exists
fs::path path(dbPath);
if (path.has_parent_path()) {
fs::create_directories(path.parent_path());
}
int rc = sqlite3_open(dbPath.c_str(), &m_db);
if (rc != SQLITE_OK) {
m_logger->error("SQLite open error: {}", sqlite3_errmsg(m_db));
sqlite3_close(m_db);  // SQLite may allocate a handle even on failure
m_db = nullptr;
return false;
}
// Set pragmas
std::string pragmas = "PRAGMA journal_mode=" + journalMode + ";"
"PRAGMA busy_timeout=" + std::to_string(busyTimeoutMs) + ";"
"PRAGMA foreign_keys=ON;";
if (!executeSQL(pragmas)) {
return false;
}
if (!initializeSchema()) {
return false;
}
if (!prepareStatements()) {
return false;
}
m_isConnected = true;
// Publish ready event
if (m_io) {
auto event = std::make_unique<grove::JsonDataNode>("ready");
event->setString("database", dbPath);
m_io->publish("storage:ready", std::move(event));
}
m_logger->info("Database opened: {}", dbPath);
return true;
}
bool StorageService::initializeSchema() {
const char* schema = R"SQL(
CREATE TABLE IF NOT EXISTS work_sessions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
task_name TEXT,
start_time INTEGER,
end_time INTEGER,
duration_minutes INTEGER,
hyperfocus_detected BOOLEAN DEFAULT 0,
created_at INTEGER DEFAULT (strftime('%s', 'now'))
);
CREATE TABLE IF NOT EXISTS app_usage (
id INTEGER PRIMARY KEY AUTOINCREMENT,
session_id INTEGER,
app_name TEXT,
duration_seconds INTEGER,
is_productive BOOLEAN,
created_at INTEGER DEFAULT (strftime('%s', 'now')),
FOREIGN KEY (session_id) REFERENCES work_sessions(id)
);
CREATE TABLE IF NOT EXISTS conversations (
id INTEGER PRIMARY KEY AUTOINCREMENT,
role TEXT,
content TEXT,
provider TEXT,
model TEXT,
tokens_used INTEGER,
created_at INTEGER DEFAULT (strftime('%s', 'now'))
);
CREATE TABLE IF NOT EXISTS daily_metrics (
date TEXT PRIMARY KEY,
total_focus_minutes INTEGER DEFAULT 0,
total_breaks INTEGER DEFAULT 0,
hyperfocus_count INTEGER DEFAULT 0,
updated_at INTEGER DEFAULT (strftime('%s', 'now'))
);
CREATE INDEX IF NOT EXISTS idx_sessions_date ON work_sessions(created_at);
CREATE INDEX IF NOT EXISTS idx_app_usage_session ON app_usage(session_id);
CREATE INDEX IF NOT EXISTS idx_conversations_date ON conversations(created_at);
)SQL";
return executeSQL(schema);
}
bool StorageService::prepareStatements() {
int rc;
// Save session statement
const char* sqlSession = "INSERT INTO work_sessions "
"(task_name, start_time, end_time, duration_minutes, hyperfocus_detected) "
"VALUES (?, ?, ?, ?, ?)";
rc = sqlite3_prepare_v2(m_db, sqlSession, -1, &m_stmtSaveSession, nullptr);
if (rc != SQLITE_OK) {
m_logger->error("Failed to prepare save_session: {}", sqlite3_errmsg(m_db));
return false;
}
// Save app usage statement
const char* sqlAppUsage = "INSERT INTO app_usage "
"(session_id, app_name, duration_seconds, is_productive) "
"VALUES (?, ?, ?, ?)";
rc = sqlite3_prepare_v2(m_db, sqlAppUsage, -1, &m_stmtSaveAppUsage, nullptr);
if (rc != SQLITE_OK) {
m_logger->error("Failed to prepare save_app_usage: {}", sqlite3_errmsg(m_db));
return false;
}
// Save conversation statement
const char* sqlConv = "INSERT INTO conversations "
"(role, content, provider, model, tokens_used) "
"VALUES (?, ?, ?, ?, ?)";
rc = sqlite3_prepare_v2(m_db, sqlConv, -1, &m_stmtSaveConversation, nullptr);
if (rc != SQLITE_OK) {
m_logger->error("Failed to prepare save_conversation: {}", sqlite3_errmsg(m_db));
return false;
}
// Update metrics statement
const char* sqlMetrics = "INSERT INTO daily_metrics "
"(date, total_focus_minutes, total_breaks, hyperfocus_count) "
"VALUES (?, ?, ?, ?) "
"ON CONFLICT(date) DO UPDATE SET "
"total_focus_minutes = total_focus_minutes + excluded.total_focus_minutes, "
"total_breaks = total_breaks + excluded.total_breaks, "
"hyperfocus_count = hyperfocus_count + excluded.hyperfocus_count, "
"updated_at = strftime('%s', 'now')";
rc = sqlite3_prepare_v2(m_db, sqlMetrics, -1, &m_stmtUpdateMetrics, nullptr);
if (rc != SQLITE_OK) {
m_logger->error("Failed to prepare update_metrics: {}", sqlite3_errmsg(m_db));
return false;
}
m_logger->debug("Prepared statements created");
return true;
}
void StorageService::finalizeStatements() {
if (m_stmtSaveSession) { sqlite3_finalize(m_stmtSaveSession); m_stmtSaveSession = nullptr; }
if (m_stmtSaveAppUsage) { sqlite3_finalize(m_stmtSaveAppUsage); m_stmtSaveAppUsage = nullptr; }
if (m_stmtSaveConversation) { sqlite3_finalize(m_stmtSaveConversation); m_stmtSaveConversation = nullptr; }
if (m_stmtUpdateMetrics) { sqlite3_finalize(m_stmtUpdateMetrics); m_stmtUpdateMetrics = nullptr; }
}
void StorageService::process() {
processMessages();
}
void StorageService::processMessages() {
if (!m_io || !m_isConnected) return;
while (m_io->hasMessages() > 0) {
auto msg = m_io->pullMessage();
if (msg.topic == "storage:save_session" && msg.data) {
handleSaveSession(*msg.data);
}
else if (msg.topic == "storage:save_app_usage" && msg.data) {
handleSaveAppUsage(*msg.data);
}
else if (msg.topic == "storage:save_conversation" && msg.data) {
handleSaveConversation(*msg.data);
}
else if (msg.topic == "storage:update_metrics" && msg.data) {
handleUpdateMetrics(*msg.data);
}
}
}
void StorageService::handleSaveSession(const grove::IDataNode& data) {
std::string taskName = data.getString("taskName", "unknown");
int durationMinutes = data.getInt("durationMinutes", 0);
bool hyperfocus = data.getBool("hyperfocus", false);
std::time_t now = std::time(nullptr);
std::time_t startTime = now - (durationMinutes * 60);
sqlite3_reset(m_stmtSaveSession);
sqlite3_bind_text(m_stmtSaveSession, 1, taskName.c_str(), -1, SQLITE_TRANSIENT);
sqlite3_bind_int64(m_stmtSaveSession, 2, startTime);
sqlite3_bind_int64(m_stmtSaveSession, 3, now);
sqlite3_bind_int(m_stmtSaveSession, 4, durationMinutes);
sqlite3_bind_int(m_stmtSaveSession, 5, hyperfocus ? 1 : 0);
int rc = sqlite3_step(m_stmtSaveSession);
if (rc == SQLITE_DONE) {
m_lastSessionId = static_cast<int>(sqlite3_last_insert_rowid(m_db));
m_totalQueries++;
m_logger->debug("Session saved: {} ({}min), id={}", taskName, durationMinutes, m_lastSessionId);
if (m_io) {
auto event = std::make_unique<grove::JsonDataNode>("saved");
event->setInt("sessionId", m_lastSessionId);
m_io->publish("storage:session_saved", std::move(event));
}
} else {
publishError(sqlite3_errmsg(m_db));
}
}
void StorageService::handleSaveAppUsage(const grove::IDataNode& data) {
int sessionId = data.getInt("sessionId", m_lastSessionId);
std::string appName = data.getString("appName", "");
int durationSeconds = data.getInt("durationSeconds", 0);
bool productive = data.getBool("productive", false);
sqlite3_reset(m_stmtSaveAppUsage);
sqlite3_bind_int(m_stmtSaveAppUsage, 1, sessionId);
sqlite3_bind_text(m_stmtSaveAppUsage, 2, appName.c_str(), -1, SQLITE_TRANSIENT);
sqlite3_bind_int(m_stmtSaveAppUsage, 3, durationSeconds);
sqlite3_bind_int(m_stmtSaveAppUsage, 4, productive ? 1 : 0);
int rc = sqlite3_step(m_stmtSaveAppUsage);
if (rc == SQLITE_DONE) {
m_totalQueries++;
} else {
publishError(sqlite3_errmsg(m_db));
}
}
void StorageService::handleSaveConversation(const grove::IDataNode& data) {
std::string role = data.getString("role", "");
std::string content = data.getString("content", "");
std::string provider = data.getString("provider", "");
std::string model = data.getString("model", "");
int tokens = data.getInt("tokens", 0);
sqlite3_reset(m_stmtSaveConversation);
sqlite3_bind_text(m_stmtSaveConversation, 1, role.c_str(), -1, SQLITE_TRANSIENT);
sqlite3_bind_text(m_stmtSaveConversation, 2, content.c_str(), -1, SQLITE_TRANSIENT);
sqlite3_bind_text(m_stmtSaveConversation, 3, provider.c_str(), -1, SQLITE_TRANSIENT);
sqlite3_bind_text(m_stmtSaveConversation, 4, model.c_str(), -1, SQLITE_TRANSIENT);
sqlite3_bind_int(m_stmtSaveConversation, 5, tokens);
int rc = sqlite3_step(m_stmtSaveConversation);
if (rc == SQLITE_DONE) {
m_totalQueries++;
} else {
publishError(sqlite3_errmsg(m_db));
}
}
void StorageService::handleUpdateMetrics(const grove::IDataNode& data) {
int focusMinutes = data.getInt("focusMinutes", 0);
int breaks = data.getInt("breaks", 0);
int hyperfocusCount = data.getInt("hyperfocusCount", 0);
std::time_t now = std::time(nullptr);
std::tm* tm = std::localtime(&now);
char dateStr[11];
std::strftime(dateStr, sizeof(dateStr), "%Y-%m-%d", tm);
sqlite3_reset(m_stmtUpdateMetrics);
sqlite3_bind_text(m_stmtUpdateMetrics, 1, dateStr, -1, SQLITE_TRANSIENT);
sqlite3_bind_int(m_stmtUpdateMetrics, 2, focusMinutes);
sqlite3_bind_int(m_stmtUpdateMetrics, 3, breaks);
sqlite3_bind_int(m_stmtUpdateMetrics, 4, hyperfocusCount);
int rc = sqlite3_step(m_stmtUpdateMetrics);
if (rc == SQLITE_DONE) {
m_totalQueries++;
} else {
publishError(sqlite3_errmsg(m_db));
}
}
bool StorageService::executeSQL(const std::string& sql) {
char* errMsg = nullptr;
int rc = sqlite3_exec(m_db, sql.c_str(), nullptr, nullptr, &errMsg);
if (rc != SQLITE_OK) {
m_logger->error("SQL error: {}", errMsg ? errMsg : "unknown");
sqlite3_free(errMsg);
return false;
}
m_totalQueries++;
return true;
}
void StorageService::publishError(const std::string& message) {
m_logger->error("Storage error: {}", message);
if (m_io) {
auto event = std::make_unique<grove::JsonDataNode>("error");
event->setString("message", message);
m_io->publish("storage:error", std::move(event));
}
}
void StorageService::shutdown() {
finalizeStatements();
if (m_db) {
sqlite3_close(m_db);
m_db = nullptr;
m_isConnected = false;
}
m_logger->info("StorageService shutdown. Total queries: {}", m_totalQueries);
}
} // namespace aissia

@ -1,91 +1,91 @@
#pragma once
#include "IService.hpp"
#include <grove/IIO.h>
#include <grove/JsonDataNode.h>
#include <spdlog/spdlog.h>
#include <memory>
#include <string>
struct sqlite3;
struct sqlite3_stmt;
namespace aissia {
/**
* @brief Storage Service - SQLite persistence
*
* Handles all database operations synchronously in main thread.
* Uses prepared statements to prevent SQL injection.
*
* Subscribes to:
* - "storage:save_session" : { taskName, durationMinutes, hyperfocus }
* - "storage:save_app_usage" : { sessionId, appName, durationSeconds, productive }
* - "storage:save_conversation" : { role, content, provider, model, tokens }
* - "storage:update_metrics" : { focusMinutes, breaks, hyperfocusCount }
* - "storage:query" : { sql, params[] }
*
* Publishes:
* - "storage:ready" : Database initialized
* - "storage:session_saved": { sessionId }
* - "storage:error" : { message }
*/
class StorageService : public IService {
public:
StorageService();
~StorageService() override;
bool initialize(grove::IIO* io) override;
void process() override;
void shutdown() override;
std::string getName() const override { return "StorageService"; }
bool isHealthy() const override { return m_isConnected; }
/// Open database with config
bool openDatabase(const std::string& dbPath,
const std::string& journalMode = "WAL",
int busyTimeoutMs = 5000);
/// Get last inserted session ID
int getLastSessionId() const { return m_lastSessionId; }
private:
// Database
sqlite3* m_db = nullptr;
std::string m_dbPath;
bool m_isConnected = false;
int m_lastSessionId = 0;
int m_totalQueries = 0;
// Prepared statements
sqlite3_stmt* m_stmtSaveSession = nullptr;
sqlite3_stmt* m_stmtSaveAppUsage = nullptr;
sqlite3_stmt* m_stmtSaveConversation = nullptr;
sqlite3_stmt* m_stmtUpdateMetrics = nullptr;
// Services
grove::IIO* m_io = nullptr;
std::shared_ptr<spdlog::logger> m_logger;
// Database operations
bool initializeSchema();
bool prepareStatements();
void finalizeStatements();
// Message handlers
void processMessages();
void handleSaveSession(const grove::IDataNode& data);
void handleSaveAppUsage(const grove::IDataNode& data);
void handleSaveConversation(const grove::IDataNode& data);
void handleUpdateMetrics(const grove::IDataNode& data);
// Helpers
bool executeSQL(const std::string& sql);
void publishError(const std::string& message);
};
} // namespace aissia

@ -1,294 +1,294 @@
// CRITICAL ORDER: Include system headers before local headers to avoid macro conflicts
#include <nlohmann/json.hpp>
#include <cstdlib>
#include <memory>
#include <string>
#include <queue>
#include <fstream>
// Include VoiceService.hpp BEFORE spdlog to avoid logger macro conflicts
#include "VoiceService.hpp"
#include "STTService.hpp"
// Include spdlog after VoiceService.hpp
#include <spdlog/sinks/stdout_color_sinks.h>
namespace aissia {
VoiceService::VoiceService() {
m_logger = spdlog::get("VoiceService");
if (!m_logger) {
m_logger = spdlog::stdout_color_mt("VoiceService");
}
}
bool VoiceService::initialize(grove::IIO* io) {
m_io = io;
// Create TTS engine
m_ttsEngine = TTSEngineFactory::create();
if (m_ttsEngine && m_ttsEngine->isAvailable()) {
m_ttsEngine->setRate(m_ttsRate);
m_ttsEngine->setVolume(m_ttsVolume);
m_logger->info("TTS engine initialized");
} else {
m_logger->warn("TTS engine not available");
}
if (m_io) {
grove::SubscriptionConfig config;
m_io->subscribe("voice:speak", config);
m_io->subscribe("voice:stop", config);
m_io->subscribe("voice:listen", config);
}
m_logger->info("VoiceService initialized");
return true;
}
void VoiceService::configureTTS(bool enabled, int rate, int volume) {
m_ttsEnabled = enabled;
m_ttsRate = rate;
m_ttsVolume = volume;
if (m_ttsEngine) {
m_ttsEngine->setRate(rate);
m_ttsEngine->setVolume(volume);
}
}
void VoiceService::configureSTT(bool enabled, const std::string& language,
const std::string& apiKey) {
m_sttEnabled = enabled;
m_language = language;
if (!apiKey.empty()) {
m_sttEngine = STTEngineFactory::create(apiKey);
if (m_sttEngine) {
m_sttEngine->setLanguage(language);
m_logger->info("STT engine configured");
}
}
}
void VoiceService::process() {
processMessages();
processSpeakQueue();
}
void VoiceService::processMessages() {
if (!m_io) return;
while (m_io->hasMessages() > 0) {
auto msg = m_io->pullMessage();
if (msg.topic == "voice:speak" && msg.data) {
handleSpeakRequest(*msg.data);
}
else if (msg.topic == "voice:stop") {
if (m_ttsEngine) {
m_ttsEngine->stop();
}
// Clear queue
while (!m_speakQueue.empty()) m_speakQueue.pop();
}
else if (msg.topic == "voice:listen" && m_sttEnabled && m_sttEngine) {
// STT would be handled here
// For now just log
m_logger->debug("STT listen requested");
}
}
}
void VoiceService::handleSpeakRequest(const grove::IDataNode& data) {
std::string text = data.getString("text", "");
bool priority = data.getBool("priority", false);
if (text.empty()) return;
if (priority) {
// Clear queue and stop current speech
while (!m_speakQueue.empty()) m_speakQueue.pop();
if (m_ttsEngine) m_ttsEngine->stop();
}
m_speakQueue.push(text);
}
void VoiceService::processSpeakQueue() {
    if (!m_ttsEnabled || !m_ttsEngine || m_speakQueue.empty()) return;
    // Only speak if not currently speaking
    if (!m_ttsEngine->isSpeaking()) {
        std::string text = m_speakQueue.front();
        m_speakQueue.pop();
        speak(text);
    }
}
void VoiceService::speak(const std::string& text) {
if (!m_ttsEngine || !m_ttsEnabled) return;
// Publish speaking started
if (m_io) {
auto event = std::unique_ptr<grove::IDataNode>(
new grove::JsonDataNode("event")
);
event->setString("text", text.size() > 100 ? text.substr(0, 100) + "..." : text);
m_io->publish("voice:speaking_started", std::move(event));
}
m_ttsEngine->speak(text, true);
m_totalSpoken++;
m_logger->debug("Speaking");
}
// Phase 7: New STT configuration with full config support
void VoiceService::configureSTT(const nlohmann::json& sttConfig) {
m_logger->info("[VoiceService] Configuring STT service (Phase 7)");
// Extract enabled flag
bool enabled = false;
if (sttConfig.contains("active_mode")) {
const auto& activeMode = sttConfig["active_mode"];
enabled = activeMode.value("enabled", true);
}
m_sttEnabled = enabled;
if (!enabled) {
m_logger->info("[VoiceService] STT disabled in config");
return;
}
// Create and start STT service
m_sttService = std::make_unique<STTService>(sttConfig);
if (!m_sttService->start()) {
m_logger->error("[VoiceService] Failed to start STT service");
m_sttService.reset();
return;
}
m_logger->info("[VoiceService] STT service started");
// Setup callbacks for transcription events
// Note: For MVP Milestone 1, we don't start streaming yet
// This will be implemented in Milestone 2 (passive mode)
}
// STT event handlers (Phase 7)
void VoiceService::handleKeyword(const std::string& keyword) {
m_logger->info("[VoiceService] Keyword detected");
// Publish keyword detection event
if (m_io) {
auto event = std::unique_ptr<grove::IDataNode>(
new grove::JsonDataNode("event")
);
event->setString("keyword", keyword);
event->setInt("timestamp", static_cast<int>(std::time(nullptr)));
m_io->publish("voice:keyword_detected", std::move(event));
}
// Auto-switch to active mode (Phase 7.2)
if (m_sttService) {
m_sttService->setMode(STTMode::ACTIVE);
}
}
void VoiceService::handleTranscription(const std::string& text, STTMode mode) {
m_logger->info("[VoiceService] Transcription received");
// Publish transcription event
if (m_io) {
std::string modeStr = (mode == STTMode::PASSIVE ? "passive" : "active");
auto event = std::unique_ptr<grove::IDataNode>(
new grove::JsonDataNode("event")
);
event->setString("text", text);
event->setString("mode", modeStr);
event->setInt("timestamp", static_cast<int>(std::time(nullptr)));
m_io->publish("voice:transcription", std::move(event));
}
}
void VoiceService::shutdown() {
if (m_ttsEngine) {
m_ttsEngine->stop();
}
if (m_sttService) {
m_sttService->stop();
}
m_logger->info("[VoiceService] Shutdown");
}
bool VoiceService::loadConfig(const std::string& configPath) {
try {
std::ifstream file(configPath);
if (!file.is_open()) {
m_logger->warn("[VoiceService] Config file not found: {}", configPath);
return false;
}
nlohmann::json config;
file >> config;
// Load TTS config
if (config.contains("tts")) {
const auto& ttsConfig = config["tts"];
m_ttsEnabled = ttsConfig.value("enabled", true);
m_ttsRate = ttsConfig.value("rate", 0);
m_ttsVolume = ttsConfig.value("volume", 80);
}
// Load STT config (Phase 7 format)
if (config.contains("stt")) {
configureSTT(config["stt"]);
}
m_logger->info("[VoiceService] Config loaded from {}", configPath);
return true;
} catch (const std::exception& e) {
m_logger->error("[VoiceService] Failed to load config: {}", e.what());
return false;
}
}
std::string VoiceService::transcribeFileSync(
const std::string& filePath,
const std::string& language
) {
m_logger->info("[VoiceService] transcribeFileSync: {}", filePath);
if (!m_sttService) {
throw std::runtime_error("STT service not initialized");
}
// Use STT service to transcribe file synchronously
// Note: This requires STT service to support file transcription
// For MVP, we'll throw not implemented
throw std::runtime_error("transcribeFileSync not yet implemented - STT service needs file transcription support");
}
bool VoiceService::textToSpeechSync(
const std::string& text,
const std::string& outputFile,
const std::string& voice
) {
m_logger->info("[VoiceService] textToSpeechSync: {} -> {}", text.substr(0, 50), outputFile);
if (!m_ttsEngine) {
throw std::runtime_error("TTS engine not initialized");
}
// For MVP, we don't support saving to file yet
// The TTS engine currently only speaks directly
throw std::runtime_error("textToSpeechSync file output not yet implemented - TTS engine needs file output support");
}
} // namespace aissia


@@ -1,117 +1,117 @@
#pragma once
// Include nlohmann/json BEFORE grove headers to avoid macro conflicts
#include <nlohmann/json.hpp>
#include "IService.hpp"
#include "ISTTService.hpp"
#include "../shared/audio/ITTSEngine.hpp"
#include "../shared/audio/ISTTEngine.hpp"
#include <grove/IIO.h>
#include <grove/JsonDataNode.h>
#include <spdlog/spdlog.h>
#include <memory>
#include <string>
#include <queue>
namespace aissia {
/**
* @brief Voice Service - TTS and STT engines
*
* Handles platform-specific audio engines (SAPI on Windows, espeak on Linux).
* Manages speak queue and processes TTS/STT requests.
*
* Subscribes to:
* - "voice:speak" : { text, priority? }
* - "voice:stop" : Stop current speech
* - "voice:listen" : Start STT recording
*
* Publishes:
* - "voice:speaking_started" : { text }
* - "voice:speaking_ended" : {}
 * - "voice:transcription" : { text, mode, timestamp }
*/
class VoiceService : public IService {
public:
VoiceService();
~VoiceService() override = default;
bool initialize(grove::IIO* io) override;
void process() override;
void shutdown() override;
std::string getName() const override { return "VoiceService"; }
bool isHealthy() const override { return m_ttsEngine != nullptr; }
/// Configure TTS settings
void configureTTS(bool enabled = true, int rate = 0, int volume = 80);
/// Configure STT settings (legacy API)
void configureSTT(bool enabled = true, const std::string& language = "fr",
const std::string& apiKey = "");
/// Configure STT with full config (Phase 7)
void configureSTT(const nlohmann::json& sttConfig);
/// Load configuration from JSON file
bool loadConfig(const std::string& configPath);
/**
* @brief Transcribe audio file synchronously (for MCP Server mode)
*
* @param filePath Path to audio file
* @param language Language code (e.g., "fr", "en")
* @return Transcribed text
*/
std::string transcribeFileSync(
const std::string& filePath,
const std::string& language = "fr"
);
/**
* @brief Convert text to speech synchronously (for MCP Server mode)
*
* @param text Text to synthesize
* @param outputFile Output audio file path
* @param voice Voice identifier (e.g., "fr-fr")
* @return true if successful
*/
bool textToSpeechSync(
const std::string& text,
const std::string& outputFile,
const std::string& voice = "fr-fr"
);
private:
// Configuration
bool m_ttsEnabled = true;
bool m_sttEnabled = true;
int m_ttsRate = 0;
int m_ttsVolume = 80;
std::string m_language = "fr";
// State
std::unique_ptr<ITTSEngine> m_ttsEngine;
std::unique_ptr<ISTTEngine> m_sttEngine; // Legacy direct engine (deprecated)
std::unique_ptr<ISTTService> m_sttService; // Phase 7: New STT service layer
std::queue<std::string> m_speakQueue;
int m_totalSpoken = 0;
// Services
grove::IIO* m_io = nullptr;
std::shared_ptr<spdlog::logger> m_logger;
// Helpers
void processMessages();
void processSpeakQueue();
void speak(const std::string& text);
void handleSpeakRequest(const grove::IDataNode& data);
// STT handlers (Phase 7)
void handleTranscription(const std::string& text, STTMode mode);
void handleKeyword(const std::string& keyword);
};
} // namespace aissia
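The pub/sub contract documented in the class comment uses flat JSON payloads. A minimal sketch of each topic's payload follows, with field names taken from the handlers and `publish()` calls in VoiceService.cpp; the example values are illustrative only.

```json
{
  "voice:speak": { "text": "Bonjour", "priority": false },
  "voice:speaking_started": { "text": "Bonjour" },
  "voice:transcription": { "text": "…", "mode": "active", "timestamp": 1733300000 },
  "voice:keyword_detected": { "keyword": "aissia", "timestamp": 1733300000 }
}
```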


@@ -1,310 +1,310 @@
#include "MCPServerTools.hpp"
#include "../../services/LLMService.hpp"
#include "../../services/StorageService.hpp"
#include "../../services/VoiceService.hpp"
#include <spdlog/spdlog.h>
namespace aissia::tools {
MCPServerTools::MCPServerTools(
LLMService* llm,
StorageService* storage,
VoiceService* voice
) : m_llmService(llm),
m_storageService(storage),
m_voiceService(voice)
{
}
std::vector<ToolDefinition> MCPServerTools::getToolDefinitions() {
std::vector<ToolDefinition> tools;
// Tool 1: chat_with_aissia (PRIORITY)
if (m_llmService) {
tools.push_back({
"chat_with_aissia",
"Dialogue with AISSIA assistant (Claude Sonnet 4). Send a message and get an intelligent response with access to AISSIA's knowledge and capabilities.",
{
{"type", "object"},
{"properties", {
{"message", {
{"type", "string"},
{"description", "Message to send to AISSIA"}
}},
{"conversation_id", {
{"type", "string"},
{"description", "Conversation ID for continuity (optional)"}
}},
{"system_prompt", {
{"type", "string"},
{"description", "Custom system prompt (optional)"}
}}
}},
{"required", json::array({"message"})}
},
[this](const json& input) { return handleChatWithAissia(input); }
});
}
// Tool 2: transcribe_audio
if (m_voiceService) {
tools.push_back({
"transcribe_audio",
"Transcribe audio file to text using Speech-to-Text (Whisper.cpp, OpenAI Whisper API, or Google Speech). Supports WAV, MP3, and other common audio formats.",
{
{"type", "object"},
{"properties", {
{"file_path", {
{"type", "string"},
{"description", "Path to audio file"}
}},
{"language", {
{"type", "string"},
{"description", "Language code (e.g., 'fr', 'en'). Default: 'fr'"}
}}
}},
{"required", json::array({"file_path"})}
},
[this](const json& input) { return handleTranscribeAudio(input); }
});
// Tool 3: text_to_speech
tools.push_back({
"text_to_speech",
"Convert text to speech audio file using Text-to-Speech synthesis. Generates audio in WAV format.",
{
{"type", "object"},
{"properties", {
{"text", {
{"type", "string"},
{"description", "Text to synthesize"}
}},
{"output_file", {
{"type", "string"},
{"description", "Output audio file path (WAV)"}
}},
{"voice", {
{"type", "string"},
{"description", "Voice identifier (e.g., 'fr-fr', 'en-us'). Default: 'fr-fr'"}
}}
}},
{"required", json::array({"text", "output_file"})}
},
[this](const json& input) { return handleTextToSpeech(input); }
});
}
// Tool 4: save_memory
if (m_storageService) {
tools.push_back({
"save_memory",
"Save a note or memory to AISSIA's persistent storage. Memories can be tagged and searched later.",
{
{"type", "object"},
{"properties", {
{"title", {
{"type", "string"},
{"description", "Memory title"}
}},
{"content", {
{"type", "string"},
{"description", "Memory content"}
}},
{"tags", {
{"type", "array"},
{"items", {{"type", "string"}}},
{"description", "Tags for categorization (optional)"}
}}
}},
{"required", json::array({"title", "content"})}
},
[this](const json& input) { return handleSaveMemory(input); }
});
// Tool 5: search_memories
tools.push_back({
"search_memories",
"Search through saved memories and notes in AISSIA's storage. Returns matching memories with relevance scores.",
{
{"type", "object"},
{"properties", {
{"query", {
{"type", "string"},
{"description", "Search query"}
}},
{"limit", {
{"type", "integer"},
{"description", "Maximum results to return. Default: 10"}
}}
}},
{"required", json::array({"query"})}
},
[this](const json& input) { return handleSearchMemories(input); }
});
}
return tools;
}
json MCPServerTools::execute(const std::string& toolName, const json& input) {
if (toolName == "chat_with_aissia") {
return handleChatWithAissia(input);
} else if (toolName == "transcribe_audio") {
return handleTranscribeAudio(input);
} else if (toolName == "text_to_speech") {
return handleTextToSpeech(input);
} else if (toolName == "save_memory") {
return handleSaveMemory(input);
} else if (toolName == "search_memories") {
return handleSearchMemories(input);
}
return {
{"error", "Unknown tool: " + toolName}
};
}
// ============================================================================
// Tool Handlers
// ============================================================================
json MCPServerTools::handleChatWithAissia(const json& input) {
if (!m_llmService) {
return {{"error", "LLMService not available"}};
}
try {
std::string message = input["message"];
std::string conversationId = input.value("conversation_id", "");
std::string systemPrompt = input.value("system_prompt", "");
spdlog::info("[chat_with_aissia] Message: {}", message.substr(0, 100));
// Call synchronous LLM method
auto response = m_llmService->sendMessageSync(message, conversationId, systemPrompt);
return {
{"response", response.text},
{"conversation_id", conversationId},
{"tokens", response.tokens},
{"iterations", response.iterations}
};
} catch (const std::exception& e) {
spdlog::error("[chat_with_aissia] Error: {}", e.what());
return {{"error", e.what()}};
}
}
json MCPServerTools::handleTranscribeAudio(const json& input) {
if (!m_voiceService) {
return {{"error", "VoiceService not available"}};
}
try {
std::string filePath = input["file_path"];
std::string language = input.value("language", "fr");
spdlog::info("[transcribe_audio] File: {}, Language: {}", filePath, language);
// Call synchronous STT method
std::string text = m_voiceService->transcribeFileSync(filePath, language);
return {
{"text", text},
{"file", filePath},
{"language", language}
};
} catch (const std::exception& e) {
spdlog::error("[transcribe_audio] Error: {}", e.what());
return {{"error", e.what()}};
}
}
json MCPServerTools::handleTextToSpeech(const json& input) {
if (!m_voiceService) {
return {{"error", "VoiceService not available"}};
}
try {
std::string text = input["text"];
std::string outputFile = input["output_file"];
std::string voice = input.value("voice", "fr-fr");
spdlog::info("[text_to_speech] Text: {}, Output: {}", text.substr(0, 50), outputFile);
// Call synchronous TTS method
bool success = m_voiceService->textToSpeechSync(text, outputFile, voice);
if (success) {
return {
{"success", true},
{"file", outputFile},
{"voice", voice}
};
} else {
return {{"error", "TTS generation failed"}};
}
} catch (const std::exception& e) {
spdlog::error("[text_to_speech] Error: {}", e.what());
return {{"error", e.what()}};
}
}
json MCPServerTools::handleSaveMemory(const json& input) {
if (!m_storageService) {
return {{"error", "StorageService not available"}};
}
try {
std::string title = input["title"];
std::string content = input["content"];
std::vector<std::string> tags;
if (input.contains("tags") && input["tags"].is_array()) {
for (const auto& tag : input["tags"]) {
tags.push_back(tag.get<std::string>());
}
}
spdlog::info("[save_memory] Title: {}", title);
// TODO: Implement saveMemorySync in StorageService
// For now, return not implemented
return json({
{"error", "save_memory not yet implemented"},
{"note", "StorageService sync methods need to be added"},
{"title", title}
});
} catch (const std::exception& e) {
spdlog::error("[save_memory] Error: {}", e.what());
return {{"error", e.what()}};
}
}
json MCPServerTools::handleSearchMemories(const json& input) {
if (!m_storageService) {
return {{"error", "StorageService not available"}};
}
try {
std::string query = input["query"];
int limit = input.value("limit", 10);
spdlog::info("[search_memories] Query: {}, Limit: {}", query, limit);
// TODO: Implement searchMemoriesSync in StorageService
// For now, return not implemented
return json({
{"error", "search_memories not yet implemented"},
{"note", "StorageService sync methods need to be added"},
{"query", query},
{"limit", limit}
});
} catch (const std::exception& e) {
spdlog::error("[search_memories] Error: {}", e.what());
return {{"error", e.what()}};
}
}
} // namespace aissia::tools
// Call synchronous TTS method
bool success = m_voiceService->textToSpeechSync(text, outputFile, voice);
if (success) {
return {
{"success", true},
{"file", outputFile},
{"voice", voice}
};
} else {
return {{"error", "TTS generation failed"}};
}
} catch (const std::exception& e) {
spdlog::error("[text_to_speech] Error: {}", e.what());
return {{"error", e.what()}};
}
}
json MCPServerTools::handleSaveMemory(const json& input) {
if (!m_storageService) {
return {{"error", "StorageService not available"}};
}
try {
std::string title = input["title"];
std::string content = input["content"];
std::vector<std::string> tags;
if (input.contains("tags") && input["tags"].is_array()) {
for (const auto& tag : input["tags"]) {
tags.push_back(tag.get<std::string>());
}
}
spdlog::info("[save_memory] Title: {}", title);
// TODO: Implement saveMemorySync in StorageService
// For now, return not implemented
return json({
{"error", "save_memory not yet implemented"},
{"note", "StorageService sync methods need to be added"},
{"title", title}
});
} catch (const std::exception& e) {
spdlog::error("[save_memory] Error: {}", e.what());
return {{"error", e.what()}};
}
}
json MCPServerTools::handleSearchMemories(const json& input) {
if (!m_storageService) {
return {{"error", "StorageService not available"}};
}
try {
std::string query = input["query"];
int limit = input.value("limit", 10);
spdlog::info("[search_memories] Query: {}, Limit: {}", query, limit);
// TODO: Implement searchMemoriesSync in StorageService
// For now, return not implemented
return json({
{"error", "search_memories not yet implemented"},
{"note", "StorageService sync methods need to be added"},
{"query", query},
{"limit", limit}
});
} catch (const std::exception& e) {
spdlog::error("[search_memories] Error: {}", e.what());
return {{"error", e.what()}};
}
}
} // namespace aissia::tools
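The tool definitions above declare JSON-Schema-style inputs with a `required` array (e.g. `save_memory` requires `title` and `content`). A minimal sketch of what that required-fields check amounts to, in Python for illustration only — the helper name and the validation strategy are assumptions, not part of the C++ codebase, which relies on the MCP client and the handlers themselves:

```python
import json

# Schema mirroring the save_memory tool definition above (illustrative copy).
SAVE_MEMORY_SCHEMA = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "content": {"type": "string"},
        "tags": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["title", "content"],
}

def check_required(schema, payload):
    """Return the list of required keys missing from the payload."""
    return [k for k in schema.get("required", []) if k not in payload]

print(check_required(SAVE_MEMORY_SCHEMA, {"title": "note", "content": "hello"}))  # []
print(check_required(SAVE_MEMORY_SCHEMA, {"title": "note"}))  # ['content']
```

A real server would typically delegate this to a JSON Schema validator rather than hand-rolling it; the sketch only shows what the `required` array in the definitions buys you.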

View File

@@ -1,76 +1,76 @@
#pragma once
#include "../llm/ToolRegistry.hpp"
#include <nlohmann/json.hpp>
#include <memory>
#include <vector>
// Forward declarations
namespace aissia {
class LLMService;
class StorageService;
class VoiceService;
}
namespace aissia::tools {
using json = nlohmann::json;
/**
* @brief MCP Server Tools - Bridge between MCP Server and AISSIA services
*
* Provides tool definitions for AISSIA modules exposed via MCP Server:
* - chat_with_aissia: Dialogue with AISSIA (Claude Sonnet 4)
* - transcribe_audio: Speech-to-text (Whisper.cpp/OpenAI/Google)
* - text_to_speech: Text-to-speech synthesis
* - save_memory: Save note/memory to storage
* - search_memories: Search stored memories
*
* Note: These tools run in synchronous mode (no IIO pub/sub, no main loop)
*/
class MCPServerTools {
public:
/**
* @brief Construct MCP server tools with service dependencies
*
* @param llm LLMService for chat_with_aissia (can be nullptr)
* @param storage StorageService for save/search memories (can be nullptr)
* @param voice VoiceService for TTS/STT (can be nullptr)
*/
MCPServerTools(
LLMService* llm,
StorageService* storage,
VoiceService* voice
);
/**
* @brief Get all tool definitions for registration
*
* @return Vector of ToolDefinition structs
*/
std::vector<ToolDefinition> getToolDefinitions();
/**
* @brief Execute a tool by name
*
* @param toolName Tool to execute
* @param input Tool arguments (JSON)
* @return Tool result (JSON)
*/
json execute(const std::string& toolName, const json& input);
private:
// Tool handlers
json handleChatWithAissia(const json& input);
json handleTranscribeAudio(const json& input);
json handleTextToSpeech(const json& input);
json handleSaveMemory(const json& input);
json handleSearchMemories(const json& input);
// Service references (nullable)
LLMService* m_llmService;
StorageService* m_storageService;
VoiceService* m_voiceService;
};
} // namespace aissia::tools

View File

@@ -1,16 +1,16 @@
{
"environment": {
"platform": "linux",
"testDirectory": "tests/integration"
},
"summary": {
"failed": 0,
"passed": 0,
"skipped": 0,
"successRate": 0.0,
"total": 0,
"totalDurationMs": 0
},
"tests": [],
"timestamp": "2025-11-29T09:01:38Z"
}

View File

@@ -1,5 +1,5 @@
#!/bin/bash
set -a
source .env
set +a
echo "Quelle heure est-il ?" | timeout 30 ./build/aissia --interactive

BIN
test_output.txt Normal file

Binary file not shown.

View File

@@ -1,237 +1,237 @@
/**
* @file test_stt_live.cpp
 * @brief Live STT testing tool - Test all 5 engines
*/
#include "src/shared/audio/ISTTEngine.hpp"
#include <spdlog/spdlog.h>
#include <iostream>
#include <fstream>
#include <vector>
#include <cstdlib>
using namespace aissia;
// Helper: Load .env file
void loadEnv(const std::string& path = ".env") {
std::ifstream file(path);
if (!file.is_open()) {
spdlog::warn("No .env file found at: {}", path);
return;
}
std::string line;
while (std::getline(file, line)) {
if (line.empty() || line[0] == '#') continue;
auto pos = line.find('=');
if (pos != std::string::npos) {
std::string key = line.substr(0, pos);
std::string value = line.substr(pos + 1);
// Remove quotes
if (!value.empty() && value.front() == '"' && value.back() == '"') {
value = value.substr(1, value.length() - 2);
}
#ifdef _WIN32
_putenv_s(key.c_str(), value.c_str());
#else
setenv(key.c_str(), value.c_str(), 1);
#endif
}
}
spdlog::info("Loaded environment from {}", path);
}
// Helper: Get API key from env
std::string getEnvVar(const std::string& name) {
const char* val = std::getenv(name.c_str());
return val ? std::string(val) : "";
}
// Helper: Load audio file as WAV (simplified - assumes 16-bit PCM)
std::vector<float> loadWavFile(const std::string& path) {
std::ifstream file(path, std::ios::binary);
if (!file.is_open()) {
spdlog::error("Failed to open audio file: {}", path);
return {};
}
// Skip WAV header (44 bytes)
file.seekg(44);
// Read 16-bit PCM samples
std::vector<int16_t> samples;
int16_t sample;
while (file.read(reinterpret_cast<char*>(&sample), sizeof(sample))) {
samples.push_back(sample);
}
// Convert to float [-1.0, 1.0]
std::vector<float> audioData;
audioData.reserve(samples.size());
for (int16_t s : samples) {
audioData.push_back(static_cast<float>(s) / 32768.0f);
}
spdlog::info("Loaded {} samples from {}", audioData.size(), path);
return audioData;
}
int main(int argc, char* argv[]) {
spdlog::set_level(spdlog::level::info);
spdlog::info("=== AISSIA STT Live Test ===");
// Load environment variables
loadEnv();
// Check command line
if (argc < 2) {
std::cout << "Usage: " << argv[0] << " <audio.wav>\n";
std::cout << "\nAvailable engines:\n";
std::cout << " 1. Whisper.cpp (local, requires models/ggml-base.bin)\n";
std::cout << " 2. Whisper API (requires OPENAI_API_KEY)\n";
std::cout << " 3. Google Speech (requires GOOGLE_API_KEY)\n";
std::cout << " 4. Azure STT (requires AZURE_SPEECH_KEY + AZURE_SPEECH_REGION)\n";
std::cout << " 5. Deepgram (requires DEEPGRAM_API_KEY)\n";
return 1;
}
std::string audioFile = argv[1];
// Load audio
std::vector<float> audioData = loadWavFile(audioFile);
if (audioData.empty()) {
spdlog::error("Failed to load audio data");
return 1;
}
// Test each engine
std::cout << "\n========================================\n";
std::cout << "Testing STT Engines\n";
std::cout << "========================================\n\n";
// 1. Whisper.cpp (local)
{
std::cout << "[1/5] Whisper.cpp (local)\n";
std::cout << "----------------------------\n";
try {
auto engine = STTEngineFactory::create("whisper_cpp", "models/ggml-base.bin");
if (engine && engine->isAvailable()) {
engine->setLanguage("fr");
std::string result = engine->transcribe(audioData);
std::cout << "✅ Result: " << result << "\n\n";
} else {
std::cout << "❌ Not available (model missing?)\n\n";
}
} catch (const std::exception& e) {
std::cout << "❌ Error: " << e.what() << "\n\n";
}
}
// 2. Whisper API
{
std::cout << "[2/5] OpenAI Whisper API\n";
std::cout << "----------------------------\n";
std::string apiKey = getEnvVar("OPENAI_API_KEY");
if (apiKey.empty()) {
std::cout << "❌ OPENAI_API_KEY not set\n\n";
} else {
try {
auto engine = STTEngineFactory::create("whisper_api", "", apiKey);
if (engine && engine->isAvailable()) {
engine->setLanguage("fr");
std::string result = engine->transcribeFile(audioFile);
std::cout << "✅ Result: " << result << "\n\n";
} else {
std::cout << "❌ Not available\n\n";
}
} catch (const std::exception& e) {
std::cout << "❌ Error: " << e.what() << "\n\n";
}
}
}
// 3. Google Speech
{
std::cout << "[3/5] Google Speech-to-Text\n";
std::cout << "----------------------------\n";
std::string apiKey = getEnvVar("GOOGLE_API_KEY");
if (apiKey.empty()) {
std::cout << "❌ GOOGLE_API_KEY not set\n\n";
} else {
try {
auto engine = STTEngineFactory::create("google", "", apiKey);
if (engine && engine->isAvailable()) {
engine->setLanguage("fr");
std::string result = engine->transcribeFile(audioFile);
std::cout << "✅ Result: " << result << "\n\n";
} else {
std::cout << "❌ Not available\n\n";
}
} catch (const std::exception& e) {
std::cout << "❌ Error: " << e.what() << "\n\n";
}
}
}
// 4. Azure Speech
{
std::cout << "[4/5] Azure Speech-to-Text\n";
std::cout << "----------------------------\n";
std::string apiKey = getEnvVar("AZURE_SPEECH_KEY");
std::string region = getEnvVar("AZURE_SPEECH_REGION");
if (apiKey.empty() || region.empty()) {
std::cout << "❌ AZURE_SPEECH_KEY or AZURE_SPEECH_REGION not set\n\n";
} else {
try {
auto engine = STTEngineFactory::create("azure", region, apiKey);
if (engine && engine->isAvailable()) {
engine->setLanguage("fr");
std::string result = engine->transcribeFile(audioFile);
std::cout << "✅ Result: " << result << "\n\n";
} else {
std::cout << "❌ Not available\n\n";
}
} catch (const std::exception& e) {
std::cout << "❌ Error: " << e.what() << "\n\n";
}
}
}
// 5. Deepgram
{
std::cout << "[5/5] Deepgram\n";
std::cout << "----------------------------\n";
std::string apiKey = getEnvVar("DEEPGRAM_API_KEY");
if (apiKey.empty()) {
std::cout << "❌ DEEPGRAM_API_KEY not set\n\n";
} else {
try {
auto engine = STTEngineFactory::create("deepgram", "", apiKey);
if (engine && engine->isAvailable()) {
engine->setLanguage("fr");
std::string result = engine->transcribeFile(audioFile);
std::cout << "✅ Result: " << result << "\n\n";
} else {
std::cout << "❌ Not available\n\n";
}
} catch (const std::exception& e) {
std::cout << "❌ Error: " << e.what() << "\n\n";
}
}
}
std::cout << "========================================\n";
std::cout << "Testing complete!\n";
std::cout << "========================================\n";
return 0;
}
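The loader's sample conversion (raw little-endian 16-bit PCM after the 44-byte header, normalized by dividing by 32768) can be checked independently of the C++ tool. A sketch in Python, with a hypothetical `pcm16_to_float` helper that mirrors the loop above:

```python
import struct

def pcm16_to_float(samples):
    # Same normalization as the C++ loader: int16 -> float in [-1.0, 1.0)
    return [s / 32768.0 for s in samples]

# A few raw little-endian int16 samples, as they appear after the WAV header.
raw = struct.pack("<4h", 0, 16384, -32768, 32767)
samples = list(struct.unpack("<4h", raw))
floats = pcm16_to_float(samples)
print(floats)  # [0.0, 0.5, -1.0, 0.999969482421875]
```

Note the asymmetry: -32768 maps exactly to -1.0, while 32767 maps just under 1.0; the C++ code shares this convention.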

View File

@@ -1,176 +1,176 @@
# ============================================================================
# AISSIA Integration Tests
# ============================================================================
# Fetch Catch2
include(FetchContent)
FetchContent_Declare(
Catch2
GIT_REPOSITORY https://github.com/catchorg/Catch2.git
GIT_TAG v3.4.0
)
FetchContent_MakeAvailable(Catch2)
# ============================================================================
# Test executable
# ============================================================================
add_executable(aissia_tests
main.cpp
# Mocks
mocks/MockIO.cpp
# Module sources (needed for testing)
${CMAKE_SOURCE_DIR}/src/modules/SchedulerModule.cpp
${CMAKE_SOURCE_DIR}/src/modules/NotificationModule.cpp
${CMAKE_SOURCE_DIR}/src/modules/MonitoringModule.cpp
${CMAKE_SOURCE_DIR}/src/modules/AIModule.cpp
${CMAKE_SOURCE_DIR}/src/modules/VoiceModule.cpp
${CMAKE_SOURCE_DIR}/src/modules/StorageModule.cpp
${CMAKE_SOURCE_DIR}/src/modules/WebModule.cpp
# Module tests (70 TI)
modules/SchedulerModuleTests.cpp
modules/NotificationModuleTests.cpp
modules/MonitoringModuleTests.cpp
modules/AIModuleTests.cpp
modules/VoiceModuleTests.cpp
modules/StorageModuleTests.cpp
modules/WebModuleTests.cpp
# MCP tests (50 TI)
mcp/MCPTypesTests.cpp
mcp/StdioTransportTests.cpp
mcp/MCPClientTests.cpp
)
target_link_libraries(aissia_tests PRIVATE
Catch2::Catch2WithMain
GroveEngine::impl
AissiaTools
spdlog::spdlog
)
# WebModule needs httplib and OpenSSL
target_include_directories(aissia_tests PRIVATE
${httplib_SOURCE_DIR}
)
# Link Winsock for httplib on Windows
if(WIN32)
target_link_libraries(aissia_tests PRIVATE ws2_32)
endif()
if(OPENSSL_FOUND)
target_link_libraries(aissia_tests PRIVATE OpenSSL::SSL OpenSSL::Crypto)
target_compile_definitions(aissia_tests PRIVATE CPPHTTPLIB_OPENSSL_SUPPORT)
endif()
# Disable module factory functions during testing
target_compile_definitions(aissia_tests PRIVATE AISSIA_TEST_BUILD)
target_include_directories(aissia_tests PRIVATE
${CMAKE_SOURCE_DIR}/src
${CMAKE_CURRENT_SOURCE_DIR}
)
# ============================================================================
# Copy test fixtures to build directory
# ============================================================================
file(COPY ${CMAKE_CURRENT_SOURCE_DIR}/fixtures/
DESTINATION ${CMAKE_BINARY_DIR}/tests/fixtures)
# ============================================================================
# CTest integration
# ============================================================================
include(CTest)
# Note: catch_discover_tests requires running the exe at build time
# which can fail due to missing DLLs. Use manual test registration instead.
add_test(NAME aissia_tests COMMAND aissia_tests)
# ============================================================================
# Custom targets
# ============================================================================
# Run all tests
add_custom_target(test_all
COMMAND ${CMAKE_CTEST_COMMAND} --output-on-failure
DEPENDS aissia_tests
COMMENT "Running all integration tests"
)
# Run module tests only
add_custom_target(test_modules
COMMAND $<TARGET_FILE:aissia_tests> "[scheduler],[notification],[monitoring],[ai],[voice],[storage],[web]"
DEPENDS aissia_tests
COMMENT "Running module integration tests"
)
# Run MCP tests only
add_custom_target(test_mcp
COMMAND $<TARGET_FILE:aissia_tests> "[mcp]"
DEPENDS aissia_tests
COMMENT "Running MCP integration tests"
)
# ============================================================================
# Integration Test Modules (Dynamic .so files)
# ============================================================================
# Helper macro to create integration test modules
macro(add_integration_test TEST_NAME)
add_library(${TEST_NAME} SHARED
integration/${TEST_NAME}.cpp
)
target_include_directories(${TEST_NAME} PRIVATE
${CMAKE_SOURCE_DIR}/src
)
target_link_libraries(${TEST_NAME} PRIVATE
GroveEngine::impl
spdlog::spdlog
)
set_target_properties(${TEST_NAME} PROPERTIES
PREFIX ""
LIBRARY_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/tests/integration
)
endmacro()
# Individual integration test modules (will be added as we create them)
# Phase 2: MCP Tests
add_integration_test(IT_001_GetCurrentTime)
add_integration_test(IT_002_FileSystemWrite)
add_integration_test(IT_003_FileSystemRead)
add_integration_test(IT_004_MCPToolsList)
# Phase 3: Flow Tests
add_integration_test(IT_005_VoiceToAI)
add_integration_test(IT_006_AIToLLM)
add_integration_test(IT_007_StorageWrite)
add_integration_test(IT_008_StorageRead)
# Phase 4: End-to-End Test
add_integration_test(IT_009_FullConversationLoop)
# Phase 5: Module Tests
add_integration_test(IT_010_SchedulerHyperfocus)
add_integration_test(IT_011_NotificationAlert)
add_integration_test(IT_012_MonitoringActivity)
add_integration_test(IT_013_WebRequest)
# Custom target to build all integration tests
add_custom_target(integration_tests
DEPENDS
IT_001_GetCurrentTime
IT_002_FileSystemWrite
IT_003_FileSystemRead
IT_004_MCPToolsList
IT_005_VoiceToAI
IT_006_AIToLLM
IT_007_StorageWrite
IT_008_StorageRead
IT_009_FullConversationLoop
IT_010_SchedulerHyperfocus
IT_011_NotificationAlert
IT_012_MonitoringActivity
IT_013_WebRequest
COMMENT "Building all integration test modules"
)

View File

@@ -1,34 +1,34 @@
#!/usr/bin/env python3
"""
Simple JSON-RPC echo server for testing StdioTransport.
Echoes back the params of any request as the result.
"""
import json
import sys
def main():
while True:
try:
line = sys.stdin.readline()
if not line:
break
request = json.loads(line.strip())
response = {
"jsonrpc": "2.0",
"id": request.get("id"),
"result": request.get("params", {})
}
sys.stdout.write(json.dumps(response) + "\n")
sys.stdout.flush()
except json.JSONDecodeError:
# Invalid JSON, ignore
pass
except Exception:
# Other errors, continue
pass
if __name__ == "__main__":
main()
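The echo server above illustrates the newline-delimited JSON-RPC framing that StdioTransport is tested against: one request per line on stdin, one response per line on stdout. A self-contained round trip, spawning an inline one-shot responder equivalent to the server (the inline `SERVER` program is an illustration, not the fixture itself):

```python
import json
import subprocess
import sys

# One-shot stdio responder: read one line, echo params back as result.
SERVER = (
    "import json,sys\n"
    "r = json.loads(sys.stdin.readline())\n"
    "print(json.dumps({'jsonrpc': '2.0', 'id': r.get('id'),"
    " 'result': r.get('params', {})}))\n"
)

proc = subprocess.run(
    [sys.executable, "-c", SERVER],
    input=json.dumps({"jsonrpc": "2.0", "id": 1, "method": "ping", "params": {"x": 1}}) + "\n",
    capture_output=True,
    text=True,
)
response = json.loads(proc.stdout)
print(response)  # {'jsonrpc': '2.0', 'id': 1, 'result': {'x': 1}}
```

The `-u` flag in the fixture configs serves the same purpose as the explicit `flush()` in the server: without unbuffered stdout, the client can deadlock waiting for a response that sits in the child's buffer.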

View File

@@ -1,19 +1,19 @@
{
"servers": {
"mock_server": {
"command": "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe",
"args": ["-u", "tests/fixtures/mock_mcp_server.py"],
"enabled": true
},
"disabled_server": {
"command": "nonexistent_command",
"args": [],
"enabled": false
},
"echo_server": {
"command": "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe",
"args": ["-u", "tests/fixtures/echo_server.py"],
"enabled": true
}
}
}
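The config format above drives which servers MCPClient launches. A sketch of the filtering a client presumably applies (assuming, as the tests suggest, that `"enabled": false` entries are skipped and a missing flag counts as disabled):

```python
import json

config = json.loads("""
{
  "servers": {
    "mock_server":     {"command": "python", "args": ["srv.py"], "enabled": true},
    "disabled_server": {"command": "none",   "args": [],         "enabled": false}
  }
}
""")

# Keep only servers explicitly enabled, mirroring the skip logic
# exercised by TI_CLIENT_005 (assumption: absent flag means disabled)
enabled = {name: cfg for name, cfg in config["servers"].items()
           if cfg.get("enabled", False)}
print(sorted(enabled))  # ['mock_server']
```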

View File

@@ -1,136 +1,136 @@
#!/usr/bin/env python3
"""
Mock MCP server for integration testing.
Implements the MCP protocol (initialize, tools/list, tools/call).
"""
import json
import sys
import os
# Tools exposed by this mock server
TOOLS = [
{
"name": "test_tool",
"description": "A test tool that echoes its input",
"inputSchema": {
"type": "object",
"properties": {
"message": {"type": "string", "description": "Message to echo"}
},
"required": ["message"]
}
},
{
"name": "get_time",
"description": "Returns the current server time",
"inputSchema": {
"type": "object",
"properties": {}
}
}
]
def handle_initialize(params):
"""Handle initialize request"""
return {
"protocolVersion": "2024-11-05",
"capabilities": {
"tools": {}
},
"serverInfo": {
"name": "MockMCPServer",
"version": "1.0.0"
}
}
def handle_tools_list(params):
"""Handle tools/list request"""
return {"tools": TOOLS}
def handle_tools_call(params):
"""Handle tools/call request"""
tool_name = params.get("name", "")
arguments = params.get("arguments", {})
if tool_name == "test_tool":
message = arguments.get("message", "no message")
return {
"content": [
{"type": "text", "text": f"Echo: {message}"}
]
}
elif tool_name == "get_time":
import datetime
return {
"content": [
{"type": "text", "text": datetime.datetime.now().isoformat()}
]
}
else:
return {
"content": [
{"type": "text", "text": f"Unknown tool: {tool_name}"}
],
"isError": True
}
def handle_request(request):
"""Route request to appropriate handler"""
method = request.get("method", "")
handlers = {
"initialize": handle_initialize,
"tools/list": handle_tools_list,
"tools/call": handle_tools_call,
}
handler = handlers.get(method)
if handler:
return handler(request.get("params", {}))
else:
return {"error": {"code": -32601, "message": f"Method not found: {method}"}}
def main():
while True:
try:
line = sys.stdin.readline()
if not line:
break
request = json.loads(line.strip())
result = handle_request(request)
response = {
"jsonrpc": "2.0",
"id": request.get("id")
}
if "error" in result:
response["error"] = result["error"]
else:
response["result"] = result
sys.stdout.write(json.dumps(response) + "\n")
sys.stdout.flush()
except json.JSONDecodeError as e:
error_response = {
"jsonrpc": "2.0",
"id": None,
"error": {"code": -32700, "message": f"Parse error: {str(e)}"}
}
sys.stdout.write(json.dumps(error_response) + "\n")
sys.stdout.flush()
except Exception as e:
# Log to stderr for debugging
sys.stderr.write(f"Error: {str(e)}\n")
sys.stderr.flush()
if __name__ == "__main__":
main()
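The mock server's `tools/call` dispatch can be exercised directly, without spawning a process. A trimmed, self-contained restatement of its echo branch and error branch:

```python
def tools_call(params: dict) -> dict:
    """Restates the mock server's tools/call handling for the echo tool."""
    if params.get("name") == "test_tool":
        msg = params.get("arguments", {}).get("message", "no message")
        return {"content": [{"type": "text", "text": f"Echo: {msg}"}]}
    # Unknown tools come back with isError set, per the MCP result shape
    return {"content": [{"type": "text", "text": "Unknown tool"}], "isError": True}

ok = tools_call({"name": "test_tool", "arguments": {"message": "hi"}})
bad = tools_call({"name": "no_such_tool"})
print(ok["content"][0]["text"])   # Echo: hi
print(bad.get("isError"))         # True
```

This mirrors the success/error contract the C++ integration tests assert against (`MCPToolResult::content` and `isError`).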

View File

@@ -1,8 +1,8 @@
// AISSIA Integration Tests - Entry Point
// Using Catch2 v3 with main provided by Catch2::Catch2WithMain
// This file is intentionally minimal.
// Catch2WithMain provides the main() function automatically.
// Include common test utilities
#include "utils/TestHelpers.hpp"

View File

@@ -1,392 +1,392 @@
/**
* @file MCPClientTests.cpp
* @brief Integration tests for MCPClient (15 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "shared/mcp/MCPClient.hpp"
#include "mocks/MockTransport.hpp"
#include <fstream>
#include <filesystem>
using namespace aissia::mcp;
using namespace aissia::tests;
using json = nlohmann::json;
// ============================================================================
// Helper: Create test config file
// ============================================================================
std::string createTestConfigFile(const json& config) {
std::string path = "test_mcp_config.json";
std::ofstream file(path);
file << config.dump(2);
file.close();
return path;
}
void cleanupTestConfigFile(const std::string& path) {
std::filesystem::remove(path);
}
// ============================================================================
// TI_CLIENT_001: Load config valid
// ============================================================================
TEST_CASE("TI_CLIENT_001_LoadConfigValid", "[mcp][client]") {
json config = {
{"servers", {
{"test_server", {
{"command", "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe"},
{"args", json::array({"server.py"})},
{"enabled", true}
}}
}}
};
auto path = createTestConfigFile(config);
MCPClient client;
bool loaded = client.loadConfig(path);
REQUIRE(loaded == true);
cleanupTestConfigFile(path);
}
// ============================================================================
// TI_CLIENT_002: Load config invalid
// ============================================================================
TEST_CASE("TI_CLIENT_002_LoadConfigInvalid", "[mcp][client]") {
// Create file with invalid JSON
std::string path = "invalid_config.json";
std::ofstream file(path);
file << "{ invalid json }";
file.close();
MCPClient client;
bool loaded = client.loadConfig(path);
REQUIRE(loaded == false);
cleanupTestConfigFile(path);
}
// ============================================================================
// TI_CLIENT_003: Load config missing file
// ============================================================================
TEST_CASE("TI_CLIENT_003_LoadConfigMissingFile", "[mcp][client]") {
MCPClient client;
bool loaded = client.loadConfig("nonexistent_file.json");
REQUIRE(loaded == false);
}
// ============================================================================
// TI_CLIENT_004: ConnectAll starts servers
// ============================================================================
TEST_CASE("TI_CLIENT_004_ConnectAllStartsServers", "[mcp][client]") {
// Use the real mock MCP server fixture
MCPClient client;
bool loaded = client.loadConfig("tests/fixtures/mock_mcp.json");
if (loaded) {
int connected = client.connectAll();
// Should connect to enabled servers
REQUIRE(connected >= 0);
client.disconnectAll();
} else {
// Skip if fixture not available
SUCCEED();
}
}
// ============================================================================
// TI_CLIENT_005: ConnectAll skips disabled
// ============================================================================
TEST_CASE("TI_CLIENT_005_ConnectAllSkipsDisabled", "[mcp][client]") {
json config = {
{"servers", {
{"enabled_server", {
{"command", "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe"},
{"args", json::array({"tests/fixtures/echo_server.py"})},
{"enabled", true}
}},
{"disabled_server", {
{"command", "nonexistent"},
{"enabled", false}
}}
}}
};
auto path = createTestConfigFile(config);
MCPClient client;
client.loadConfig(path);
client.connectAll();
// disabled_server should not be connected
REQUIRE(client.isConnected("disabled_server") == false);
client.disconnectAll();
cleanupTestConfigFile(path);
}
// ============================================================================
// TI_CLIENT_006: Connect single server
// ============================================================================
TEST_CASE("TI_CLIENT_006_ConnectSingleServer", "[mcp][client]") {
json config = {
{"servers", {
{"server1", {
{"command", "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe"},
{"args", json::array({"tests/fixtures/echo_server.py"})},
{"enabled", true}
}},
{"server2", {
{"command", "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe"},
{"args", json::array({"tests/fixtures/echo_server.py"})},
{"enabled", true}
}}
}}
};
auto path = createTestConfigFile(config);
MCPClient client;
client.loadConfig(path);
// Connect only server1
bool connected = client.connect("server1");
REQUIRE(connected == true);
REQUIRE(client.isConnected("server1") == true);
REQUIRE(client.isConnected("server2") == false);
client.disconnectAll();
cleanupTestConfigFile(path);
}
// ============================================================================
// TI_CLIENT_007: Disconnect single server
// ============================================================================
TEST_CASE("TI_CLIENT_007_DisconnectSingleServer", "[mcp][client]") {
json config = {
{"servers", {
{"server1", {
{"command", "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe"},
{"args", json::array({"tests/fixtures/echo_server.py"})},
{"enabled", true}
}}
}}
};
auto path = createTestConfigFile(config);
MCPClient client;
client.loadConfig(path);
client.connect("server1");
REQUIRE(client.isConnected("server1") == true);
client.disconnect("server1");
REQUIRE(client.isConnected("server1") == false);
cleanupTestConfigFile(path);
}
// ============================================================================
// TI_CLIENT_008: DisconnectAll cleans up
// ============================================================================
TEST_CASE("TI_CLIENT_008_DisconnectAllCleansUp", "[mcp][client]") {
json config = {
{"servers", {
{"server1", {
{"command", "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe"},
{"args", json::array({"tests/fixtures/echo_server.py"})},
{"enabled", true}
}},
{"server2", {
{"command", "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe"},
{"args", json::array({"tests/fixtures/echo_server.py"})},
{"enabled", true}
}}
}}
};
auto path = createTestConfigFile(config);
MCPClient client;
client.loadConfig(path);
client.connectAll();
client.disconnectAll();
REQUIRE(client.isConnected("server1") == false);
REQUIRE(client.isConnected("server2") == false);
REQUIRE(client.getConnectedServers().empty() == true);
cleanupTestConfigFile(path);
}
// ============================================================================
// TI_CLIENT_009: ListAllTools aggregates
// ============================================================================
TEST_CASE("TI_CLIENT_009_ListAllToolsAggregates", "[mcp][client]") {
MCPClient client;
bool loaded = client.loadConfig("tests/fixtures/mock_mcp.json");
if (loaded) {
client.connectAll();
auto tools = client.listAllTools();
// tools.size() is unsigned, so ">= 0" is vacuous; the mock server exposes tools, assert non-empty
REQUIRE_FALSE(tools.empty());
client.disconnectAll();
} else {
SUCCEED();
}
}
// ============================================================================
// TI_CLIENT_010: Tool name prefixed
// ============================================================================
TEST_CASE("TI_CLIENT_010_ToolNamePrefixed", "[mcp][client]") {
MCPClient client;
bool loaded = client.loadConfig("tests/fixtures/mock_mcp.json");
if (loaded) {
client.connectAll();
auto tools = client.listAllTools();
bool hasPrefix = false;
for (const auto& tool : tools) {
if (tool.name.find(":") != std::string::npos) {
hasPrefix = true;
break;
}
}
if (!tools.empty()) {
REQUIRE(hasPrefix == true);
}
client.disconnectAll();
} else {
SUCCEED();
}
}
// ============================================================================
// TI_CLIENT_011: CallTool routes to server
// ============================================================================
TEST_CASE("TI_CLIENT_011_CallToolRoutesToServer", "[mcp][client]") {
MCPClient client;
bool loaded = client.loadConfig("tests/fixtures/mock_mcp.json");
if (loaded) {
client.connectAll();
auto tools = client.listAllTools();
if (!tools.empty()) {
// Call the first available tool
auto result = client.callTool(tools[0].name, json::object());
// Should get some result: either content entries or the error flag set
REQUIRE((!result.content.empty() || result.isError));
}
client.disconnectAll();
} else {
SUCCEED();
}
}
// ============================================================================
// TI_CLIENT_012: CallTool invalid name
// ============================================================================
TEST_CASE("TI_CLIENT_012_CallToolInvalidName", "[mcp][client]") {
MCPClient client;
client.loadConfig("tests/fixtures/mock_mcp.json");
client.connectAll();
auto result = client.callTool("nonexistent:tool", json::object());
REQUIRE(result.isError == true);
client.disconnectAll();
}
// ============================================================================
// TI_CLIENT_013: CallTool disconnected server
// ============================================================================
TEST_CASE("TI_CLIENT_013_CallToolDisconnectedServer", "[mcp][client]") {
MCPClient client;
// Don't connect any servers
auto result = client.callTool("server:tool", json::object());
REQUIRE(result.isError == true);
}
// ============================================================================
// TI_CLIENT_014: ToolCount accurate
// ============================================================================
TEST_CASE("TI_CLIENT_014_ToolCountAccurate", "[mcp][client]") {
MCPClient client;
bool loaded = client.loadConfig("tests/fixtures/mock_mcp.json");
if (loaded) {
client.connectAll();
size_t count = client.toolCount();
auto tools = client.listAllTools();
REQUIRE(count == tools.size());
client.disconnectAll();
} else {
SUCCEED();
}
}
// ============================================================================
// TI_CLIENT_015: IsConnected accurate
// ============================================================================
TEST_CASE("TI_CLIENT_015_IsConnectedAccurate", "[mcp][client]") {
json config = {
{"servers", {
{"test_server", {
{"command", "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe"},
{"args", json::array({"tests/fixtures/echo_server.py"})},
{"enabled", true}
}}
}}
};
auto path = createTestConfigFile(config);
MCPClient client;
client.loadConfig(path);
// Not connected yet
REQUIRE(client.isConnected("test_server") == false);
// Connect
client.connect("test_server");
REQUIRE(client.isConnected("test_server") == true);
// Disconnect
client.disconnect("test_server");
REQUIRE(client.isConnected("test_server") == false);
cleanupTestConfigFile(path);
}
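TI_CLIENT_010 and TI_CLIENT_012/013 assume a `server:tool` prefixing convention for routing calls to the owning server. A sketch of the split such a router presumably performs (the actual MCPClient parsing is not shown in this diff, so names here are illustrative):

```python
def split_tool_name(prefixed: str):
    """Split 'server:tool' into (server, tool); no ':' means no routing prefix."""
    server, sep, tool = prefixed.partition(":")
    if not sep:
        return None, prefixed
    return server, tool

print(split_tool_name("mock_server:get_time"))  # ('mock_server', 'get_time')
print(split_tool_name("orphan_tool"))           # (None, 'orphan_tool')
```

An unprefixed or unknown-server name has nowhere to route, which is why `callTool("nonexistent:tool", ...)` and calls with no connected servers come back with `isError == true` in the tests above.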

View File

@@ -1,298 +1,298 @@
/**
* @file MCPTypesTests.cpp
* @brief Integration tests for MCP Types (15 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "shared/mcp/MCPTypes.hpp"
using namespace aissia::mcp;
using json = nlohmann::json;
// ============================================================================
// TI_TYPES_001: MCPTool toJson
// ============================================================================
TEST_CASE("TI_TYPES_001_MCPToolToJson", "[mcp][types]") {
MCPTool tool;
tool.name = "read_file";
tool.description = "Read a file from the filesystem";
tool.inputSchema = {
{"type", "object"},
{"properties", {
{"path", {{"type", "string"}}}
}},
{"required", json::array({"path"})}
};
json j = tool.toJson();
REQUIRE(j["name"] == "read_file");
REQUIRE(j["description"] == "Read a file from the filesystem");
REQUIRE(j["inputSchema"]["type"] == "object");
REQUIRE(j["inputSchema"]["properties"]["path"]["type"] == "string");
}
// ============================================================================
// TI_TYPES_002: MCPTool fromJson
// ============================================================================
TEST_CASE("TI_TYPES_002_MCPToolFromJson", "[mcp][types]") {
json j = {
{"name", "write_file"},
{"description", "Write content to a file"},
{"inputSchema", {
{"type", "object"},
{"properties", {
{"path", {{"type", "string"}}},
{"content", {{"type", "string"}}}
}}
}}
};
auto tool = MCPTool::fromJson(j);
REQUIRE(tool.name == "write_file");
REQUIRE(tool.description == "Write content to a file");
REQUIRE(tool.inputSchema["type"] == "object");
}
// ============================================================================
/**
* @file MCPTypesTests.cpp
* @brief Integration tests for MCP Types (15 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "shared/mcp/MCPTypes.hpp"
using namespace aissia::mcp;
using json = nlohmann::json;
// ============================================================================
// TI_TYPES_001: MCPTool toJson
// ============================================================================
TEST_CASE("TI_TYPES_001_MCPToolToJson", "[mcp][types]") {
MCPTool tool;
tool.name = "read_file";
tool.description = "Read a file from the filesystem";
tool.inputSchema = {
{"type", "object"},
{"properties", {
{"path", {{"type", "string"}}}
}},
{"required", json::array({"path"})}
};
json j = tool.toJson();
REQUIRE(j["name"] == "read_file");
REQUIRE(j["description"] == "Read a file from the filesystem");
REQUIRE(j["inputSchema"]["type"] == "object");
REQUIRE(j["inputSchema"]["properties"]["path"]["type"] == "string");
}
// ============================================================================
// TI_TYPES_002: MCPTool fromJson
// ============================================================================
TEST_CASE("TI_TYPES_002_MCPToolFromJson", "[mcp][types]") {
json j = {
{"name", "write_file"},
{"description", "Write content to a file"},
{"inputSchema", {
{"type", "object"},
{"properties", {
{"path", {{"type", "string"}}},
{"content", {{"type", "string"}}}
}}
}}
};
auto tool = MCPTool::fromJson(j);
REQUIRE(tool.name == "write_file");
REQUIRE(tool.description == "Write content to a file");
REQUIRE(tool.inputSchema["type"] == "object");
}
// ============================================================================
// TI_TYPES_003: MCPTool fromJson with missing fields
// ============================================================================
TEST_CASE("TI_TYPES_003_MCPToolFromJsonMissingFields", "[mcp][types]") {
json j = {{"name", "minimal_tool"}};
auto tool = MCPTool::fromJson(j);
REQUIRE(tool.name == "minimal_tool");
REQUIRE(tool.description == "");
REQUIRE(tool.inputSchema.is_object());
}
// ============================================================================
// TI_TYPES_004: MCPResource fromJson
// ============================================================================
TEST_CASE("TI_TYPES_004_MCPResourceFromJson", "[mcp][types]") {
json j = {
{"uri", "file:///home/user/doc.txt"},
{"name", "Document"},
{"description", "A text document"},
{"mimeType", "text/plain"}
};
auto resource = MCPResource::fromJson(j);
REQUIRE(resource.uri == "file:///home/user/doc.txt");
REQUIRE(resource.name == "Document");
REQUIRE(resource.description == "A text document");
REQUIRE(resource.mimeType == "text/plain");
}
// ============================================================================
// TI_TYPES_005: MCPToolResult toJson
// ============================================================================
TEST_CASE("TI_TYPES_005_MCPToolResultToJson", "[mcp][types]") {
MCPToolResult result;
result.content = {
{{"type", "text"}, {"text", "File contents here"}},
{{"type", "text"}, {"text", "More content"}}
};
result.isError = false;
json j = result.toJson();
REQUIRE(j["content"].size() == 2);
REQUIRE(j["content"][0]["type"] == "text");
REQUIRE(j["isError"] == false);
}
// ============================================================================
// TI_TYPES_006: MCPCapabilities fromJson
// ============================================================================
TEST_CASE("TI_TYPES_006_MCPCapabilitiesFromJson", "[mcp][types]") {
json j = {
{"tools", json::object()},
{"resources", {{"subscribe", true}}},
{"prompts", json::object()}
};
auto caps = MCPCapabilities::fromJson(j);
REQUIRE(caps.hasTools == true);
REQUIRE(caps.hasResources == true);
REQUIRE(caps.hasPrompts == true);
}
// ============================================================================
// TI_TYPES_007: MCPCapabilities empty
// ============================================================================
TEST_CASE("TI_TYPES_007_MCPCapabilitiesEmpty", "[mcp][types]") {
json j = json::object();
auto caps = MCPCapabilities::fromJson(j);
REQUIRE(caps.hasTools == false);
REQUIRE(caps.hasResources == false);
REQUIRE(caps.hasPrompts == false);
}
// ============================================================================
// TI_TYPES_008: MCPServerInfo fromJson
// ============================================================================
TEST_CASE("TI_TYPES_008_MCPServerInfoFromJson", "[mcp][types]") {
json j = {
{"name", "filesystem-server"},
{"version", "1.2.3"},
{"capabilities", {
{"tools", json::object()}
}}
};
auto info = MCPServerInfo::fromJson(j);
REQUIRE(info.name == "filesystem-server");
REQUIRE(info.version == "1.2.3");
REQUIRE(info.capabilities.hasTools == true);
}
// ============================================================================
// TI_TYPES_009: JsonRpcRequest toJson
// ============================================================================
TEST_CASE("TI_TYPES_009_JsonRpcRequestToJson", "[mcp][types]") {
JsonRpcRequest request;
request.id = 42;
request.method = "tools/call";
request.params = {{"name", "read_file"}, {"arguments", {{"path", "/tmp/test"}}}};
json j = request.toJson();
REQUIRE(j["jsonrpc"] == "2.0");
REQUIRE(j["id"] == 42);
REQUIRE(j["method"] == "tools/call");
REQUIRE(j["params"]["name"] == "read_file");
}
// ============================================================================
// TI_TYPES_010: JsonRpcResponse fromJson
// ============================================================================
TEST_CASE("TI_TYPES_010_JsonRpcResponseFromJson", "[mcp][types]") {
json j = {
{"jsonrpc", "2.0"},
{"id", 42},
{"result", {{"tools", json::array()}}}
};
auto response = JsonRpcResponse::fromJson(j);
REQUIRE(response.jsonrpc == "2.0");
REQUIRE(response.id == 42);
REQUIRE(response.result.has_value());
REQUIRE(response.result.value()["tools"].is_array());
}
// ============================================================================
// TI_TYPES_011: JsonRpcResponse isError
// ============================================================================
TEST_CASE("TI_TYPES_011_JsonRpcResponseIsError", "[mcp][types]") {
json errorJson = {
{"jsonrpc", "2.0"},
{"id", 1},
{"error", {{"code", -32600}, {"message", "Invalid Request"}}}
};
auto response = JsonRpcResponse::fromJson(errorJson);
REQUIRE(response.isError() == true);
REQUIRE(response.error.has_value());
REQUIRE(response.error.value()["code"] == -32600);
REQUIRE(response.error.value()["message"] == "Invalid Request");
}
// ============================================================================
// TI_TYPES_012: MCPServerConfig fromJson
// ============================================================================
TEST_CASE("TI_TYPES_012_MCPServerConfigFromJson", "[mcp][types]") {
json j = {
{"command", "mcp-server-filesystem"},
{"args", json::array({"--root", "/home"})},
{"env", {{"DEBUG", "true"}}},
{"enabled", true}
};
auto config = MCPServerConfig::fromJson("filesystem", j);
REQUIRE(config.name == "filesystem");
REQUIRE(config.command == "mcp-server-filesystem");
REQUIRE(config.args.size() == 2);
REQUIRE(config.args[0] == "--root");
REQUIRE(config.args[1] == "/home");
REQUIRE(config.env["DEBUG"] == "true");
REQUIRE(config.enabled == true);
}
// ============================================================================
// TI_TYPES_013: MCPServerConfig env expansion
// ============================================================================
TEST_CASE("TI_TYPES_013_MCPServerConfigEnvExpansion", "[mcp][types]") {
json j = {
{"command", "mcp-server"},
{"env", {{"API_KEY", "${MY_API_KEY}"}}}
};
auto config = MCPServerConfig::fromJson("test", j);
// Note: Actual env expansion happens in MCPClient, not in fromJson
// This test verifies the raw value is stored
REQUIRE(config.env["API_KEY"] == "${MY_API_KEY}");
}
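The note in TI_TYPES_013 says `${VAR}` expansion is deferred to MCPClient, whose implementation is not shown in this diff. As an illustration only (the function name and exact policy are assumptions, not the real MCPClient API), `${NAME}` placeholder expansion can be sketched in a few lines of Python:

```python
import re

def expand_env(value: str, env: dict) -> str:
    """Replace ${NAME} placeholders using an env mapping.

    Unknown names are left untouched here; this is a conservative
    illustration, and the real MCPClient may behave differently.
    """
    def repl(match):
        name = match.group(1)
        return env.get(name, match.group(0))
    return re.sub(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}", repl, value)
```

With this sketch, `expand_env("${MY_API_KEY}", {"MY_API_KEY": "sk-123"})` yields `"sk-123"`, while the raw `"${MY_API_KEY}"` stored by `fromJson` stays untouched until expansion time.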
// ============================================================================
// TI_TYPES_014: MCPServerConfig disabled
// ============================================================================
TEST_CASE("TI_TYPES_014_MCPServerConfigDisabled", "[mcp][types]") {
json j = {
{"command", "some-server"},
{"enabled", false}
};
auto config = MCPServerConfig::fromJson("disabled_server", j);
REQUIRE(config.enabled == false);
}
// ============================================================================
// TI_TYPES_015: JsonRpcRequest ID increment
// ============================================================================
TEST_CASE("TI_TYPES_015_JsonRpcRequestIdIncrement", "[mcp][types]") {
JsonRpcRequest req1;
req1.id = 1;
req1.method = "test";
JsonRpcRequest req2;
req2.id = 2;
req2.method = "test";
// IDs should be different
REQUIRE(req1.id != req2.id);
// Both should serialize correctly
json j1 = req1.toJson();
json j2 = req2.toJson();
REQUIRE(j1["id"] == 1);
REQUIRE(j2["id"] == 2);
}
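Taken together, TI_TYPES_009 through TI_TYPES_011 pin down the JSON-RPC 2.0 envelope that these types serialize. The same shapes in a compact Python sketch (the helper names are illustrative, not part of the C++ API):

```python
import json

def make_request(req_id: int, method: str, params=None) -> str:
    # Envelope shape asserted by TI_TYPES_009
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

def is_error(response: dict) -> bool:
    # Mirrors JsonRpcResponse::isError (TI_TYPES_011):
    # presence of an "error" member marks a failed call
    return "error" in response

req = json.loads(make_request(42, "tools/call", {"name": "read_file"}))
assert req["jsonrpc"] == "2.0" and req["method"] == "tools/call"
assert is_error({"jsonrpc": "2.0", "id": 1, "error": {"code": -32600}})
```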

/**
* @file StdioTransportTests.cpp
* @brief Integration tests for StdioTransport (20 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "shared/mcp/StdioTransport.hpp"
#include "shared/mcp/MCPTypes.hpp"
#include <thread>
#include <chrono>
using namespace aissia::mcp;
using json = nlohmann::json;
// ============================================================================
// Helper: Create config for echo server
// ============================================================================
MCPServerConfig makeEchoServerConfig() {
MCPServerConfig config;
config.name = "echo";
config.command = "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe";
config.args = {"tests/fixtures/echo_server.py"};
config.enabled = true;
return config;
}
MCPServerConfig makeMockMCPServerConfig() {
MCPServerConfig config;
config.name = "mock_mcp";
config.command = "C:\\Users\\alexi\\AppData\\Local\\Programs\\Python\\Python312\\python.exe";
config.args = {"tests/fixtures/mock_mcp_server.py"};
config.enabled = true;
return config;
}
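The fixtures spawned above (`tests/fixtures/echo_server.py`) are not shown in this diff. Based on what TI_TRANSPORT_005/006 assert, a compatible fixture reads newline-delimited JSON-RPC from stdin and echoes `params` back as `result`; the following is a sketch of that contract, not the actual fixture:

```python
import json
import sys

def handle(msg: dict):
    """Echo contract: requests get params back as result; notifications get nothing."""
    if "id" not in msg:
        return None  # notification: no response expected (TI_TRANSPORT_010)
    return {"jsonrpc": "2.0", "id": msg["id"], "result": msg.get("params", {})}

def main():
    # Newline-delimited JSON-RPC over stdio, flushed per message so the
    # client's blocking read is never starved by buffering
    for line in sys.stdin:
        line = line.strip()
        if not line:
            continue
        reply = handle(json.loads(line))
        if reply is not None:
            sys.stdout.write(json.dumps(reply) + "\n")
            sys.stdout.flush()

# Running as a script, main() would drive the stdin/stdout loop.
```

Flushing after every reply matters: without it, the child's stdout stays buffered and `sendRequest` would hit its timeout even though the server handled the message.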
// ============================================================================
// TI_TRANSPORT_001: Start spawns process
// ============================================================================
TEST_CASE("TI_TRANSPORT_001_StartSpawnsProcess", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
bool started = transport.start();
REQUIRE(started == true);
REQUIRE(transport.isRunning() == true);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_002: Start fails with invalid command
// ============================================================================
TEST_CASE("TI_TRANSPORT_002_StartFailsInvalidCommand", "[mcp][transport]") {
MCPServerConfig config;
config.name = "invalid";
config.command = "nonexistent_command_xyz";
config.enabled = true;
StdioTransport transport(config);
bool started = transport.start();
REQUIRE(started == false);
REQUIRE(transport.isRunning() == false);
}
// ============================================================================
// TI_TRANSPORT_003: Stop kills process
// ============================================================================
TEST_CASE("TI_TRANSPORT_003_StopKillsProcess", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
REQUIRE(transport.isRunning() == true);
transport.stop();
REQUIRE(transport.isRunning() == false);
}
// ============================================================================
// TI_TRANSPORT_004: IsRunning reflects state
// ============================================================================
TEST_CASE("TI_TRANSPORT_004_IsRunningReflectsState", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
REQUIRE(transport.isRunning() == false);
transport.start();
REQUIRE(transport.isRunning() == true);
transport.stop();
REQUIRE(transport.isRunning() == false);
}
// ============================================================================
// TI_TRANSPORT_005: SendRequest writes to stdin
// ============================================================================
TEST_CASE("TI_TRANSPORT_005_SendRequestWritesToStdin", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
JsonRpcRequest request;
request.id = 1;
request.method = "test";
request.params = {{"message", "hello"}};
// Echo server will echo back params as result
auto response = transport.sendRequest(request, 5000);
// If we got a response, the request was written
REQUIRE(response.isError() == false);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_006: SendRequest reads response
// ============================================================================
TEST_CASE("TI_TRANSPORT_006_SendRequestReadsResponse", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
JsonRpcRequest request;
request.id = 42;
request.method = "echo";
request.params = {{"value", 123}};
auto response = transport.sendRequest(request, 5000);
REQUIRE(response.isError() == false);
REQUIRE(response.id == 42);
REQUIRE(response.result.has_value());
REQUIRE(response.result.value()["value"] == 123);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_007: SendRequest timeout
// ============================================================================
TEST_CASE("TI_TRANSPORT_007_SendRequestTimeout", "[mcp][transport]") {
// Use cat which doesn't respond to JSON-RPC
MCPServerConfig config;
config.name = "cat";
config.command = "cat";
config.enabled = true;
StdioTransport transport(config);
transport.start();
JsonRpcRequest request;
request.id = 1;
request.method = "test";
// Very short timeout
auto response = transport.sendRequest(request, 100);
// Should timeout and return error
REQUIRE(response.isError() == true);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_008: SendRequest ID matching
// ============================================================================
TEST_CASE("TI_TRANSPORT_008_SendRequestIdMatching", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// Send request with specific ID
JsonRpcRequest request;
request.id = 999;
request.method = "test";
auto response = transport.sendRequest(request, 5000);
// Response ID should match request ID
REQUIRE(response.id == 999);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_009: Concurrent requests
// ============================================================================
TEST_CASE("TI_TRANSPORT_009_ConcurrentRequests", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
std::vector<std::thread> threads;
std::vector<bool> results(5, false);
for (int i = 0; i < 5; i++) {
threads.emplace_back([&transport, &results, i]() {
JsonRpcRequest request;
request.id = 100 + i;
request.method = "test";
request.params = {{"index", i}};
auto response = transport.sendRequest(request, 5000);
results[i] = !response.isError() && response.id == 100 + i;
});
}
for (auto& t : threads) {
t.join();
}
// All requests should succeed
for (bool result : results) {
REQUIRE(result == true);
}
transport.stop();
}
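Concurrent callers like the ones in TI_TRANSPORT_009 only work if the reader thread routes each response back to the caller that owns its `id`. A common way to implement this is a pending-request table guarded by a mutex; this Python sketch illustrates the pattern and assumes nothing about StdioTransport's actual internals:

```python
import threading

class PendingRequests:
    """Correlate JSON-RPC responses to waiting callers by request id."""

    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}  # request id -> {"event": Event, "response": dict | None}

    def register(self, req_id):
        # Called by the sender before writing the request to stdin
        entry = {"event": threading.Event(), "response": None}
        with self._lock:
            self._waiters[req_id] = entry
        return entry

    def deliver(self, response):
        # Called by the reader thread for each parsed response line
        with self._lock:
            entry = self._waiters.pop(response.get("id"), None)
        if entry is not None:
            entry["response"] = response
            entry["event"].set()

    def wait(self, entry, timeout_s):
        # Timing out yields an error result, as TI_TRANSPORT_007 expects
        if not entry["event"].wait(timeout_s):
            return None
        return entry["response"]
```

Because `deliver` looks up by `id`, five threads can block on `wait` simultaneously and each still receives exactly its own response, which is what the test's `response.id == 100 + i` check verifies.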
// ============================================================================
// TI_TRANSPORT_010: SendNotification no response
// ============================================================================
TEST_CASE("TI_TRANSPORT_010_SendNotificationNoResponse", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// Should not block or throw
REQUIRE_NOTHROW(transport.sendNotification("notification/test", {{"data", "value"}}));
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_011: Reader thread starts on start
// ============================================================================
TEST_CASE("TI_TRANSPORT_011_ReaderThreadStartsOnStart", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// If reader thread didn't start, sendRequest would hang
JsonRpcRequest request;
request.id = 1;
request.method = "test";
auto response = transport.sendRequest(request, 1000);
    // Getting a response back means the reader thread is pumping stdout
REQUIRE(response.isError() == false);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_012: Reader thread stops on stop
// ============================================================================
TEST_CASE("TI_TRANSPORT_012_ReaderThreadStopsOnStop", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
transport.stop();
// Should not hang or crash on destruction
SUCCEED();
}
// ============================================================================
// TI_TRANSPORT_013: JSON parse error handled
// ============================================================================
TEST_CASE("TI_TRANSPORT_013_JsonParseErrorHandled", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// Send valid request - server will respond with valid JSON
JsonRpcRequest request;
request.id = 1;
request.method = "test";
// Should not crash even if server sends invalid JSON
REQUIRE_NOTHROW(transport.sendRequest(request, 1000));
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_014: Process crash detected
// ============================================================================
TEST_CASE("TI_TRANSPORT_014_ProcessCrashDetected", "[mcp][transport]") {
// TODO: Need a server that crashes to test this
// For now, just verify we can handle stop
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
transport.stop();
REQUIRE(transport.isRunning() == false);
}
// ============================================================================
// TI_TRANSPORT_015: Large message handling
// ============================================================================
TEST_CASE("TI_TRANSPORT_015_LargeMessageHandling", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// Create large params
std::string largeString(10000, 'x');
JsonRpcRequest request;
request.id = 1;
request.method = "test";
request.params = {{"data", largeString}};
auto response = transport.sendRequest(request, 10000);
REQUIRE(response.isError() == false);
REQUIRE(response.result.value()["data"] == largeString);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_016: Multiline JSON handling
// ============================================================================
TEST_CASE("TI_TRANSPORT_016_MultilineJsonHandling", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// JSON with newlines in strings should work
JsonRpcRequest request;
request.id = 1;
request.method = "test";
request.params = {{"text", "line1\nline2\nline3"}};
auto response = transport.sendRequest(request, 5000);
REQUIRE(response.isError() == false);
transport.stop();
}
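TI_TRANSPORT_016 passes because JSON serialization escapes embedded newlines, so newline-delimited framing is never broken by string content. A quick check of that property with Python's standard `json` module:

```python
import json

# Raw newlines inside strings are escaped ("\n" becomes the two
# characters \ and n) during serialization, so one-message-per-line
# framing survives any payload.
payload = json.dumps({"text": "line1\nline2\nline3"})
assert "\n" not in payload

# Round-tripping restores the original characters unchanged
assert json.loads(payload)["text"] == "line1\nline2\nline3"
```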
// ============================================================================
// TI_TRANSPORT_017: Env variables passed to process
// ============================================================================
TEST_CASE("TI_TRANSPORT_017_EnvVariablesPassedToProcess", "[mcp][transport]") {
auto config = makeEchoServerConfig();
config.env["TEST_VAR"] = "test_value";
StdioTransport transport(config);
bool started = transport.start();
REQUIRE(started == true);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_018: Args passed to process
// ============================================================================
TEST_CASE("TI_TRANSPORT_018_ArgsPassedToProcess", "[mcp][transport]") {
auto config = makeMockMCPServerConfig();
// Args are already set in the helper function
StdioTransport transport(config);
bool started = transport.start();
REQUIRE(started == true);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_019: Destructor cleans up
// ============================================================================
TEST_CASE("TI_TRANSPORT_019_DestructorCleansUp", "[mcp][transport]") {
{
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// Destructor called here
}
// Should not leak resources or hang
SUCCEED();
}
// ============================================================================
// TI_TRANSPORT_020: Restart after stop
// ============================================================================
TEST_CASE("TI_TRANSPORT_020_RestartAfterStop", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
// First start/stop
transport.start();
transport.stop();
REQUIRE(transport.isRunning() == false);
// Second start
bool restarted = transport.start();
REQUIRE(restarted == true);
REQUIRE(transport.isRunning() == true);
// Verify it works
JsonRpcRequest request;
request.id = 1;
request.method = "test";
auto response = transport.sendRequest(request, 5000);
REQUIRE(response.isError() == false);
transport.stop();
}
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_011: Reader thread starts on start
// ============================================================================
TEST_CASE("TI_TRANSPORT_011_ReaderThreadStartsOnStart", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// If reader thread didn't start, sendRequest would hang
JsonRpcRequest request;
request.id = 1;
request.method = "test";
auto response = transport.sendRequest(request, 1000);
// Got response means reader thread is working
REQUIRE(response.isError() == false);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_012: Reader thread stops on stop
// ============================================================================
TEST_CASE("TI_TRANSPORT_012_ReaderThreadStopsOnStop", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
transport.stop();
// Should not hang or crash on destruction
SUCCEED();
}
// ============================================================================
// TI_TRANSPORT_013: JSON parse error handled
// ============================================================================
TEST_CASE("TI_TRANSPORT_013_JsonParseErrorHandled", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
    // The echo server only ever replies with valid JSON, so this covers the
    // happy path of the parse code; a deliberately misbehaving server would
    // be needed to exercise the malformed-JSON branch.
    JsonRpcRequest request;
    request.id = 1;
    request.method = "test";
    REQUIRE_NOTHROW(transport.sendRequest(request, 1000));
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_014: Process crash detected
// ============================================================================
TEST_CASE("TI_TRANSPORT_014_ProcessCrashDetected", "[mcp][transport]") {
// TODO: Need a server that crashes to test this
// For now, just verify we can handle stop
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
transport.stop();
REQUIRE(transport.isRunning() == false);
}
// ============================================================================
// TI_TRANSPORT_015: Large message handling
// ============================================================================
TEST_CASE("TI_TRANSPORT_015_LargeMessageHandling", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// Create large params
std::string largeString(10000, 'x');
JsonRpcRequest request;
request.id = 1;
request.method = "test";
request.params = {{"data", largeString}};
auto response = transport.sendRequest(request, 10000);
REQUIRE(response.isError() == false);
REQUIRE(response.result.value()["data"] == largeString);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_016: Multiline JSON handling
// ============================================================================
TEST_CASE("TI_TRANSPORT_016_MultilineJsonHandling", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// JSON with newlines in strings should work
JsonRpcRequest request;
request.id = 1;
request.method = "test";
request.params = {{"text", "line1\nline2\nline3"}};
auto response = transport.sendRequest(request, 5000);
REQUIRE(response.isError() == false);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_017: Env variables passed to process
// ============================================================================
TEST_CASE("TI_TRANSPORT_017_EnvVariablesPassedToProcess", "[mcp][transport]") {
auto config = makeEchoServerConfig();
config.env["TEST_VAR"] = "test_value";
StdioTransport transport(config);
bool started = transport.start();
REQUIRE(started == true);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_018: Args passed to process
// ============================================================================
TEST_CASE("TI_TRANSPORT_018_ArgsPassedToProcess", "[mcp][transport]") {
auto config = makeMockMCPServerConfig();
// Args are already set in the helper function
StdioTransport transport(config);
bool started = transport.start();
REQUIRE(started == true);
transport.stop();
}
// ============================================================================
// TI_TRANSPORT_019: Destructor cleans up
// ============================================================================
TEST_CASE("TI_TRANSPORT_019_DestructorCleansUp", "[mcp][transport]") {
{
auto config = makeEchoServerConfig();
StdioTransport transport(config);
transport.start();
// Destructor called here
}
// Should not leak resources or hang
SUCCEED();
}
// ============================================================================
// TI_TRANSPORT_020: Restart after stop
// ============================================================================
TEST_CASE("TI_TRANSPORT_020_RestartAfterStop", "[mcp][transport]") {
auto config = makeEchoServerConfig();
StdioTransport transport(config);
// First start/stop
transport.start();
transport.stop();
REQUIRE(transport.isRunning() == false);
// Second start
bool restarted = transport.start();
REQUIRE(restarted == true);
REQUIRE(transport.isRunning() == true);
// Verify it works
JsonRpcRequest request;
request.id = 1;
request.method = "test";
auto response = transport.sendRequest(request, 5000);
REQUIRE(response.isError() == false);
transport.stop();
}


@@ -1,88 +1,88 @@
#include "MockIO.hpp"
namespace aissia::tests {
void MockIO::publish(const std::string& topic, std::unique_ptr<grove::IDataNode> data) {
// Convert IDataNode to JSON for easy verification
json jsonData;
if (data) {
// Try to extract JSON from JsonDataNode
auto* jsonNode = dynamic_cast<grove::JsonDataNode*>(data.get());
if (jsonNode) {
jsonData = jsonNode->getJsonData();
} else {
// Fallback: create basic JSON from IDataNode interface
jsonData = json::object();
}
}
m_publishedMessages.emplace_back(topic, jsonData);
}
grove::Message MockIO::pullMessage() {
if (m_incomingMessages.empty()) {
throw std::runtime_error("No messages available");
}
grove::Message msg = std::move(m_incomingMessages.front());
m_incomingMessages.pop();
return msg;
}
void MockIO::injectMessage(const std::string& topic, const json& data) {
grove::Message message;
message.topic = topic;
message.data = std::make_unique<grove::JsonDataNode>("data", data);
message.timestamp = 0;
m_incomingMessages.push(std::move(message));
}
void MockIO::injectMessages(const std::vector<std::pair<std::string, json>>& messages) {
for (const auto& [topic, data] : messages) {
injectMessage(topic, data);
}
}
bool MockIO::wasPublished(const std::string& topic) const {
return std::any_of(m_publishedMessages.begin(), m_publishedMessages.end(),
[&topic](const auto& msg) { return msg.first == topic; });
}
json MockIO::getLastPublished(const std::string& topic) const {
for (auto it = m_publishedMessages.rbegin(); it != m_publishedMessages.rend(); ++it) {
if (it->first == topic) {
return it->second;
}
}
return json::object();
}
std::vector<json> MockIO::getAllPublished(const std::string& topic) const {
std::vector<json> result;
for (const auto& [t, data] : m_publishedMessages) {
if (t == topic) {
result.push_back(data);
}
}
return result;
}
size_t MockIO::countPublished(const std::string& topic) const {
return std::count_if(m_publishedMessages.begin(), m_publishedMessages.end(),
[&topic](const auto& msg) { return msg.first == topic; });
}
void MockIO::clear() {
m_publishedMessages.clear();
while (!m_incomingMessages.empty()) {
m_incomingMessages.pop();
}
m_subscriptions.clear();
}
void MockIO::clearPublished() {
m_publishedMessages.clear();
}
} // namespace aissia::tests


@@ -1,129 +1,129 @@
#pragma once
#include <grove/IIO.h>
#include <grove/JsonDataNode.h>
#include <nlohmann/json.hpp>
#include <string>
#include <vector>
#include <queue>
#include <map>
#include <algorithm>
namespace aissia::tests {
using json = nlohmann::json;
/**
* @brief Mock implementation of grove::IIO for testing
*
* Captures published messages and allows injecting incoming messages.
*/
class MockIO : public grove::IIO {
public:
// ========================================================================
// IIO Interface Implementation
// ========================================================================
void publish(const std::string& topic, std::unique_ptr<grove::IDataNode> data) override;
void subscribe(const std::string& topicPattern, const grove::SubscriptionConfig& config = {}) override {
// Mock: just record subscription
m_subscriptions.push_back(topicPattern);
}
void subscribeLowFreq(const std::string& topicPattern, const grove::SubscriptionConfig& config = {}) override {
// Mock: same as subscribe
m_subscriptions.push_back(topicPattern);
}
int hasMessages() const override {
return static_cast<int>(m_incomingMessages.size());
}
grove::Message pullMessage() override;
grove::IOHealth getHealth() const override {
return grove::IOHealth{
.queueSize = static_cast<int>(m_incomingMessages.size()),
.maxQueueSize = 1000,
.dropping = false,
.averageProcessingRate = 100.0f,
.droppedMessageCount = 0
};
}
grove::IOType getType() const override {
return grove::IOType::INTRA;
}
// ========================================================================
// Test Helpers - Message Injection
// ========================================================================
/**
* @brief Inject a message to be received by the module under test
*/
void injectMessage(const std::string& topic, const json& data);
/**
* @brief Inject multiple messages at once
*/
void injectMessages(const std::vector<std::pair<std::string, json>>& messages);
// ========================================================================
// Test Helpers - Verification
// ========================================================================
/**
* @brief Check if a message was published to a specific topic
*/
bool wasPublished(const std::string& topic) const;
/**
* @brief Get the last message published to a topic
*/
json getLastPublished(const std::string& topic) const;
/**
* @brief Get all messages published to a topic
*/
std::vector<json> getAllPublished(const std::string& topic) const;
/**
* @brief Count messages published to a topic
*/
size_t countPublished(const std::string& topic) const;
/**
* @brief Get all published messages (topic -> data pairs)
*/
const std::vector<std::pair<std::string, json>>& getPublishedMessages() const {
return m_publishedMessages;
}
/**
* @brief Clear all captured and pending messages
*/
void clear();
/**
* @brief Clear only published messages (keep incoming queue)
*/
void clearPublished();
// ========================================================================
// Test State
// ========================================================================
/// All messages published by the module under test
std::vector<std::pair<std::string, json>> m_publishedMessages;
/// Messages waiting to be received by the module
std::queue<grove::Message> m_incomingMessages;
/// Subscribed topic patterns (for verification)
std::vector<std::string> m_subscriptions;
};
} // namespace aissia::tests


@@ -1,192 +1,192 @@
#pragma once
#include "shared/mcp/MCPTransport.hpp"
#include "shared/mcp/MCPTypes.hpp"
#include <queue>
#include <vector>
#include <functional>
namespace aissia::tests {
using namespace aissia::mcp;
/**
* @brief Mock implementation of IMCPTransport for testing MCPClient
*/
class MockTransport : public IMCPTransport {
public:
// ========================================================================
// IMCPTransport Interface
// ========================================================================
bool start() override {
if (m_startShouldFail) {
return false;
}
m_running = true;
return true;
}
void stop() override {
m_running = false;
}
bool isRunning() const override {
return m_running;
}
JsonRpcResponse sendRequest(const JsonRpcRequest& request, int timeoutMs = 30000) override {
m_sentRequests.push_back(request);
// If we have a custom handler, use it
if (m_requestHandler) {
return m_requestHandler(request);
}
// Otherwise, use prepared responses
if (!m_preparedResponses.empty()) {
auto response = m_preparedResponses.front();
m_preparedResponses.pop();
response.id = request.id; // Match the request ID
return response;
}
// Default: return error
JsonRpcResponse errorResponse;
errorResponse.id = request.id;
errorResponse.error = json{{"code", -32603}, {"message", "No prepared response"}};
return errorResponse;
}
void sendNotification(const std::string& method, const json& params) override {
m_sentNotifications.emplace_back(method, params);
}
// ========================================================================
// Test Configuration
// ========================================================================
/**
* @brief Make start() fail
*/
void setStartShouldFail(bool fail) {
m_startShouldFail = fail;
}
/**
* @brief Add a response to be returned on next sendRequest
*/
void prepareResponse(const JsonRpcResponse& response) {
m_preparedResponses.push(response);
}
/**
* @brief Prepare a successful response with result
*/
void prepareSuccessResponse(const json& result) {
JsonRpcResponse response;
response.result = result;
m_preparedResponses.push(response);
}
/**
* @brief Prepare an error response
*/
void prepareErrorResponse(int code, const std::string& message) {
JsonRpcResponse response;
response.error = json{{"code", code}, {"message", message}};
m_preparedResponses.push(response);
}
/**
* @brief Set a custom handler for all requests
*/
void setRequestHandler(std::function<JsonRpcResponse(const JsonRpcRequest&)> handler) {
m_requestHandler = std::move(handler);
}
/**
* @brief Simulate MCP server with initialize and tools/list
*/
void setupAsMCPServer(const std::string& serverName, const std::vector<MCPTool>& tools) {
m_requestHandler = [serverName, tools](const JsonRpcRequest& req) -> JsonRpcResponse {
JsonRpcResponse resp;
resp.id = req.id;
if (req.method == "initialize") {
resp.result = json{
{"protocolVersion", "2024-11-05"},
{"capabilities", {{"tools", json::object()}}},
{"serverInfo", {{"name", serverName}, {"version", "1.0.0"}}}
};
} else if (req.method == "tools/list") {
json toolsJson = json::array();
for (const auto& tool : tools) {
toolsJson.push_back(tool.toJson());
}
resp.result = json{{"tools", toolsJson}};
} else if (req.method == "tools/call") {
resp.result = json{
{"content", json::array({{{"type", "text"}, {"text", "Tool executed"}}})}
};
} else {
resp.error = json{{"code", -32601}, {"message", "Method not found"}};
}
return resp;
};
}
// ========================================================================
// Test Verification
// ========================================================================
/**
* @brief Get all sent requests
*/
const std::vector<JsonRpcRequest>& getSentRequests() const {
return m_sentRequests;
}
/**
* @brief Check if a method was called
*/
bool wasMethodCalled(const std::string& method) const {
return std::any_of(m_sentRequests.begin(), m_sentRequests.end(),
[&method](const auto& req) { return req.method == method; });
}
/**
* @brief Get count of calls to a method
*/
size_t countMethodCalls(const std::string& method) const {
return std::count_if(m_sentRequests.begin(), m_sentRequests.end(),
[&method](const auto& req) { return req.method == method; });
}
/**
* @brief Clear all state
*/
void clear() {
m_sentRequests.clear();
m_sentNotifications.clear();
while (!m_preparedResponses.empty()) {
m_preparedResponses.pop();
}
m_requestHandler = nullptr;
}
// ========================================================================
// Test State
// ========================================================================
bool m_running = false;
bool m_startShouldFail = false;
std::vector<JsonRpcRequest> m_sentRequests;
std::vector<std::pair<std::string, json>> m_sentNotifications;
std::queue<JsonRpcResponse> m_preparedResponses;
std::function<JsonRpcResponse(const JsonRpcRequest&)> m_requestHandler;
};
} // namespace aissia::tests


@@ -1,293 +1,293 @@
/**
* @file AIModuleTests.cpp
* @brief Integration tests for AIModule (10 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "mocks/MockIO.hpp"
#include "utils/TimeSimulator.hpp"
#include "utils/TestHelpers.hpp"
#include "modules/AIModule.h"
#include <grove/JsonDataNode.h>
using namespace aissia;
using namespace aissia::tests;
// ============================================================================
// Test Fixture
// ============================================================================
class AITestFixture {
public:
MockIO io;
TimeSimulator time;
AIModule module;
void configure(const json& config = json::object()) {
json fullConfig = {
{"system_prompt", "Tu es un assistant personnel intelligent."},
{"max_iterations", 10}
};
fullConfig.merge_patch(config);
grove::JsonDataNode configNode("config", fullConfig);
module.setConfiguration(configNode, &io, nullptr);
}
void process() {
grove::JsonDataNode input("input", time.createInput());
module.process(input);
}
};
// ============================================================================
// TI_AI_001: Query Sends LLM Request
// ============================================================================
TEST_CASE("TI_AI_001_QuerySendsLLMRequest", "[ai][integration]") {
AITestFixture f;
f.configure();
// Send query
f.io.injectMessage("ai:query", {{"query", "Quelle heure est-il?"}});
f.process();
// Verify LLM request published
REQUIRE(f.io.wasPublished("llm:request"));
auto msg = f.io.getLastPublished("llm:request");
REQUIRE(msg["query"] == "Quelle heure est-il?");
}
// ============================================================================
// TI_AI_002: Voice Transcription Triggers Query
// ============================================================================
TEST_CASE("TI_AI_002_VoiceTranscriptionTriggersQuery", "[ai][integration]") {
AITestFixture f;
f.configure();
// Send voice transcription
f.io.injectMessage("voice:transcription", {
{"text", "Aide-moi avec mon code"},
{"confidence", 0.95}
});
f.process();
// Verify LLM request
REQUIRE(f.io.wasPublished("llm:request"));
auto msg = f.io.getLastPublished("llm:request");
REQUIRE(msg["query"] == "Aide-moi avec mon code");
}
// ============================================================================
// TI_AI_003: LLM Response Handled
// ============================================================================
TEST_CASE("TI_AI_003_LLMResponseHandled", "[ai][integration]") {
AITestFixture f;
f.configure();
// Send query to set awaiting state
f.io.injectMessage("ai:query", {{"query", "Test"}});
f.process();
REQUIRE(f.module.isIdle() == false);
// Receive response
f.io.injectMessage("llm:response", {
{"text", "Voici la reponse"},
{"tokens", 100},
{"conversationId", "default"}
});
f.process();
// Verify no longer awaiting
REQUIRE(f.module.isIdle() == true);
}
// ============================================================================
// TI_AI_004: LLM Error Handled
// ============================================================================
TEST_CASE("TI_AI_004_LLMErrorHandled", "[ai][integration]") {
AITestFixture f;
f.configure();
// Send query
f.io.injectMessage("ai:query", {{"query", "Test"}});
f.process();
REQUIRE(f.module.isIdle() == false);
// Receive error
f.io.injectMessage("llm:error", {
{"message", "API rate limit exceeded"},
{"conversationId", "default"}
});
f.process();
// Should no longer be awaiting
REQUIRE(f.module.isIdle() == true);
}
// ============================================================================
// TI_AI_005: Hyperfocus Alert Generates Suggestion
// ============================================================================
TEST_CASE("TI_AI_005_HyperfocusAlertGeneratesSuggestion", "[ai][integration]") {
AITestFixture f;
f.configure();
// Receive hyperfocus alert
f.io.injectMessage("scheduler:hyperfocus_alert", {
{"sessionMinutes", 130},
{"task", "coding"}
});
f.process();
// Verify LLM request published
REQUIRE(f.io.wasPublished("llm:request"));
auto req = f.io.getLastPublished("llm:request");
std::string convId = req["conversationId"];
// Simulate LLM response
f.io.injectMessage("llm:response", {
{"text", "Time to take a break!"},
{"conversationId", convId}
});
f.process();
// Verify suggestion published
REQUIRE(f.io.wasPublished("ai:suggestion"));
auto msg = f.io.getLastPublished("ai:suggestion");
REQUIRE(msg.contains("message"));
}
// ============================================================================
// TI_AI_006: Break Reminder Generates Suggestion
// ============================================================================
TEST_CASE("TI_AI_006_BreakReminderGeneratesSuggestion", "[ai][integration]") {
AITestFixture f;
f.configure();
// Receive break reminder
f.io.injectMessage("scheduler:break_reminder", {
{"workMinutes", 45}
});
f.process();
// Verify LLM request published
REQUIRE(f.io.wasPublished("llm:request"));
auto req = f.io.getLastPublished("llm:request");
std::string convId = req["conversationId"];
// Simulate LLM response
f.io.injectMessage("llm:response", {
{"text", "Take a short break now!"},
{"conversationId", convId}
});
f.process();
// Verify suggestion
REQUIRE(f.io.wasPublished("ai:suggestion"));
}
// ============================================================================
// TI_AI_007: System Prompt In Request
// ============================================================================
TEST_CASE("TI_AI_007_SystemPromptInRequest", "[ai][integration]") {
AITestFixture f;
f.configure({{"system_prompt", "Custom prompt here"}});
f.io.injectMessage("ai:query", {{"query", "Test"}});
f.process();
REQUIRE(f.io.wasPublished("llm:request"));
auto msg = f.io.getLastPublished("llm:request");
REQUIRE(msg["systemPrompt"] == "Custom prompt here");
}
// ============================================================================
// TI_AI_008: Conversation ID Tracking
// ============================================================================
TEST_CASE("TI_AI_008_ConversationIdTracking", "[ai][integration]") {
AITestFixture f;
f.configure();
// First query
f.io.injectMessage("ai:query", {{"query", "Question 1"}});
f.process();
auto msg1 = f.io.getLastPublished("llm:request");
std::string convId = msg1["conversationId"];
REQUIRE(!convId.empty());
// Simulate response
f.io.injectMessage("llm:response", {{"text", "Response"}, {"conversationId", convId}});
f.process();
f.io.clearPublished();
// Second query should use same conversation
f.io.injectMessage("ai:query", {{"query", "Question 2"}});
f.process();
auto msg2 = f.io.getLastPublished("llm:request");
REQUIRE(msg2["conversationId"] == convId);
}
// ============================================================================
// TI_AI_009: Token Counting Accumulates
// ============================================================================
TEST_CASE("TI_AI_009_TokenCountingAccumulates", "[ai][integration]") {
AITestFixture f;
f.configure();
// Query 1
f.io.injectMessage("ai:query", {{"query", "Q1"}});
f.process();
f.io.injectMessage("llm:response", {{"text", "R1"}, {"tokens", 50}});
f.process();
// Query 2
f.io.injectMessage("ai:query", {{"query", "Q2"}});
f.process();
f.io.injectMessage("llm:response", {{"text", "R2"}, {"tokens", 75}});
f.process();
// Verify total
auto state = f.module.getState();
// TODO: Verify totalTokens == 125
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_AI_010: State Serialization
// ============================================================================
TEST_CASE("TI_AI_010_StateSerialization", "[ai][integration]") {
AITestFixture f;
f.configure();
// Build state
f.io.injectMessage("ai:query", {{"query", "Test"}});
f.process();
f.io.injectMessage("llm:response", {{"text", "Response"}, {"tokens", 100}});
f.process();
// Get state
auto state = f.module.getState();
REQUIRE(state != nullptr);
// Restore
AIModule module2;
grove::JsonDataNode configNode2("config", json::object());
module2.setConfiguration(configNode2, &f.io, nullptr);
module2.setState(*state);
auto state2 = module2.getState();
REQUIRE(state2 != nullptr);
SUCCEED(); // Placeholder
}

View File

@@ -1,285 +1,285 @@
/**
* @file MonitoringModuleTests.cpp
* @brief Integration tests for MonitoringModule (10 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "mocks/MockIO.hpp"
#include "utils/TimeSimulator.hpp"
#include "utils/TestHelpers.hpp"
#include "modules/MonitoringModule.h"
#include <grove/JsonDataNode.h>
using namespace aissia;
using namespace aissia::tests;
// ============================================================================
// Test Fixture
// ============================================================================
class MonitoringTestFixture {
public:
MockIO io;
TimeSimulator time;
MonitoringModule module;
void configure(const json& config = json::object()) {
json fullConfig = {
{"enabled", true},
{"productive_apps", json::array({"Code", "CLion", "Visual Studio"})},
{"distracting_apps", json::array({"Discord", "Steam", "YouTube"})}
};
fullConfig.merge_patch(config);
grove::JsonDataNode configNode("config", fullConfig);
module.setConfiguration(configNode, &io, nullptr);
}
void process() {
grove::JsonDataNode input("input", time.createInput());
module.process(input);
}
};
// ============================================================================
// TI_MONITOR_001: App Changed
// ============================================================================
TEST_CASE("TI_MONITOR_001_AppChanged", "[monitoring][integration]") {
MonitoringTestFixture f;
f.configure();
// Inject window change
f.io.injectMessage("platform:window_changed", {
{"oldApp", ""},
{"newApp", "Code"},
{"duration", 0}
});
f.process();
// Verify app_changed published
REQUIRE(f.io.wasPublished("monitoring:app_changed"));
auto msg = f.io.getLastPublished("monitoring:app_changed");
REQUIRE(msg["appName"] == "Code");
}
// ============================================================================
// TI_MONITOR_002: Productive App Classification
// ============================================================================
TEST_CASE("TI_MONITOR_002_ProductiveAppClassification", "[monitoring][integration]") {
MonitoringTestFixture f;
f.configure();
f.io.injectMessage("platform:window_changed", {
{"oldApp", ""},
{"newApp", "Code"},
{"duration", 0}
});
f.process();
REQUIRE(f.io.wasPublished("monitoring:app_changed"));
auto msg = f.io.getLastPublished("monitoring:app_changed");
REQUIRE(msg["classification"] == "productive");
}
// ============================================================================
// TI_MONITOR_003: Distracting App Classification
// ============================================================================
TEST_CASE("TI_MONITOR_003_DistractingAppClassification", "[monitoring][integration]") {
MonitoringTestFixture f;
f.configure();
f.io.injectMessage("platform:window_changed", {
{"oldApp", ""},
{"newApp", "Discord"},
{"duration", 0}
});
f.process();
REQUIRE(f.io.wasPublished("monitoring:app_changed"));
auto msg = f.io.getLastPublished("monitoring:app_changed");
REQUIRE(msg["classification"] == "distracting");
}
// ============================================================================
// TI_MONITOR_004: Neutral App Classification
// ============================================================================
TEST_CASE("TI_MONITOR_004_NeutralAppClassification", "[monitoring][integration]") {
MonitoringTestFixture f;
f.configure();
f.io.injectMessage("platform:window_changed", {
{"oldApp", ""},
{"newApp", "Notepad"},
{"duration", 0}
});
f.process();
REQUIRE(f.io.wasPublished("monitoring:app_changed"));
auto msg = f.io.getLastPublished("monitoring:app_changed");
REQUIRE(msg["classification"] == "neutral");
}
// ============================================================================
// TI_MONITOR_005: Duration Tracking
// ============================================================================
TEST_CASE("TI_MONITOR_005_DurationTracking", "[monitoring][integration]") {
MonitoringTestFixture f;
f.configure();
// Start with Code
f.io.injectMessage("platform:window_changed", {
{"oldApp", ""},
{"newApp", "Code"},
{"duration", 0}
});
f.process();
f.io.clearPublished();
// Switch after 60 seconds
f.io.injectMessage("platform:window_changed", {
{"oldApp", "Code"},
{"newApp", "Discord"},
{"duration", 60}
});
f.process();
// Verify duration tracked
auto state = f.module.getState();
// TODO: Verify appDurations["Code"] == 60
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_MONITOR_006: Idle Detected Pauses Tracking
// ============================================================================
TEST_CASE("TI_MONITOR_006_IdleDetectedPausesTracking", "[monitoring][integration]") {
MonitoringTestFixture f;
f.configure();
// Start tracking
f.io.injectMessage("platform:window_changed", {
{"oldApp", ""},
{"newApp", "Code"},
{"duration", 0}
});
f.process();
// Go idle
f.io.injectMessage("platform:idle_detected", {{"idleSeconds", 300}});
f.process();
// Verify idle state
auto state = f.module.getState();
// TODO: Verify isIdle == true
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_MONITOR_007: Activity Resumed Resumes Tracking
// ============================================================================
TEST_CASE("TI_MONITOR_007_ActivityResumedResumesTracking", "[monitoring][integration]") {
MonitoringTestFixture f;
f.configure();
// Setup idle state
f.io.injectMessage("platform:window_changed", {{"oldApp", ""}, {"newApp", "Code"}, {"duration", 0}});
f.process();
f.io.injectMessage("platform:idle_detected", {});
f.process();
// Resume
f.io.injectMessage("platform:activity_resumed", {});
f.process();
// Verify not idle
auto state = f.module.getState();
// TODO: Verify isIdle == false
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_MONITOR_008: Productivity Stats
// ============================================================================
TEST_CASE("TI_MONITOR_008_ProductivityStats", "[monitoring][integration]") {
MonitoringTestFixture f;
f.configure();
// Use productive app for 60s
f.io.injectMessage("platform:window_changed", {{"oldApp", ""}, {"newApp", "Code"}, {"duration", 0}});
f.process();
f.io.injectMessage("platform:window_changed", {{"oldApp", "Code"}, {"newApp", "Discord"}, {"duration", 60}});
f.process();
// Use distracting app for 30s
f.io.injectMessage("platform:window_changed", {{"oldApp", "Discord"}, {"newApp", "Code"}, {"duration", 30}});
f.process();
// Verify stats
auto state = f.module.getState();
// TODO: Verify totalProductiveSeconds == 60, totalDistractingSeconds == 30
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_MONITOR_009: Tool Query Get Current App
// ============================================================================
TEST_CASE("TI_MONITOR_009_ToolQueryGetCurrentApp", "[monitoring][integration]") {
MonitoringTestFixture f;
f.configure();
// Set current app
f.io.injectMessage("platform:window_changed", {{"oldApp", ""}, {"newApp", "Code"}, {"duration", 0}});
f.process();
f.io.clearPublished();
// Query
f.io.injectMessage("monitoring:query", {
{"action", "get_current_app"},
{"correlation_id", "test-456"}
});
f.process();
// Verify response
REQUIRE(f.io.wasPublished("monitoring:response"));
auto resp = f.io.getLastPublished("monitoring:response");
REQUIRE(resp["correlation_id"] == "test-456");
}
// ============================================================================
// TI_MONITOR_010: State Serialization
// ============================================================================
TEST_CASE("TI_MONITOR_010_StateSerialization", "[monitoring][integration]") {
MonitoringTestFixture f;
f.configure();
// Build up some state
f.io.injectMessage("platform:window_changed", {{"oldApp", ""}, {"newApp", "Code"}, {"duration", 0}});
f.process();
f.io.injectMessage("platform:window_changed", {{"oldApp", "Code"}, {"newApp", "Discord"}, {"duration", 120}});
f.process();
// Get state
auto state = f.module.getState();
REQUIRE(state != nullptr);
// Restore to new module
MonitoringModule module2;
grove::JsonDataNode configNode2("config", json::object());
module2.setConfiguration(configNode2, &f.io, nullptr);
module2.setState(*state);
auto state2 = module2.getState();
REQUIRE(state2 != nullptr);
SUCCEED(); // Placeholder
}

View File

@@ -1,303 +1,303 @@
/**
* @file NotificationModuleTests.cpp
* @brief Integration tests for NotificationModule (10 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "mocks/MockIO.hpp"
#include "utils/TimeSimulator.hpp"
#include "utils/TestHelpers.hpp"
#include "modules/NotificationModule.h"
#include <grove/JsonDataNode.h>
using namespace aissia;
using namespace aissia::tests;
// ============================================================================
// Test Fixture
// ============================================================================
class NotificationTestFixture {
public:
MockIO io;
TimeSimulator time;
NotificationModule module;
void configure(const json& config = json::object()) {
json fullConfig = {
{"language", "fr"},
{"silentMode", false},
{"ttsEnabled", false},
{"maxQueueSize", 50}
};
fullConfig.merge_patch(config);
grove::JsonDataNode configNode("config", fullConfig);
module.setConfiguration(configNode, &io, nullptr);
}
void process() {
grove::JsonDataNode input("input", time.createInput());
module.process(input);
}
int getPendingCount() {
auto state = module.getState();
return state ? state->getInt("pendingCount", -1) : -1;
}
int getNotificationCount() {
auto state = module.getState();
return state ? state->getInt("notificationCount", -1) : -1;
}
int getUrgentCount() {
auto state = module.getState();
return state ? state->getInt("urgentCount", -1) : -1;
}
};
// ============================================================================
// TI_NOTIF_001: Queue Notification
// ============================================================================
TEST_CASE("TI_NOTIF_001_QueueNotification", "[notification][integration]") {
NotificationTestFixture f;
f.configure();
// Add notification
f.module.notify("Test Title", "Test Message", NotificationModule::Priority::NORMAL);
// Verify queue has 1 item (before processing)
REQUIRE(f.getPendingCount() == 1);
// Verify notification count incremented
REQUIRE(f.getNotificationCount() == 1);
}
// ============================================================================
// TI_NOTIF_002: Process Queue (max 3 per frame)
// ============================================================================
TEST_CASE("TI_NOTIF_002_ProcessQueue", "[notification][integration]") {
NotificationTestFixture f;
f.configure();
// Add 5 notifications
for (int i = 0; i < 5; i++) {
f.module.notify("Title", "Message " + std::to_string(i), NotificationModule::Priority::NORMAL);
}
// Verify 5 pending before process
REQUIRE(f.getPendingCount() == 5);
// Process one frame (should handle max 3)
f.process();
// Verify 2 remaining in queue
REQUIRE(f.getPendingCount() == 2);
}
// ============================================================================
// TI_NOTIF_003: Priority Ordering
// NOTE: Current implementation uses FIFO queue without priority sorting.
// This test verifies that URGENT notifications can still be added
// alongside other priorities. True priority ordering would require
// a priority queue implementation.
// ============================================================================
TEST_CASE("TI_NOTIF_003_PriorityOrdering", "[notification][integration]") {
NotificationTestFixture f;
f.configure();
// Add notifications in reverse priority order
f.module.notify("Low", "Low priority", NotificationModule::Priority::LOW);
f.module.notify("Urgent", "Urgent priority", NotificationModule::Priority::URGENT);
f.module.notify("Normal", "Normal priority", NotificationModule::Priority::NORMAL);
// Verify all 3 are queued
REQUIRE(f.getPendingCount() == 3);
// Verify urgent count is tracked
REQUIRE(f.getUrgentCount() == 1);
// Process - verify all are processed
f.process();
REQUIRE(f.getPendingCount() == 0);
}
// ============================================================================
// TI_NOTIF_004: Silent Mode Blocks Non-Urgent
// ============================================================================
TEST_CASE("TI_NOTIF_004_SilentModeBlocksNonUrgent", "[notification][integration]") {
NotificationTestFixture f;
f.configure({{"silentMode", true}});
// Add non-urgent notifications
f.module.notify("Low", "Should be blocked", NotificationModule::Priority::LOW);
f.module.notify("Normal", "Should be blocked", NotificationModule::Priority::NORMAL);
f.module.notify("High", "Should be blocked", NotificationModule::Priority::HIGH);
// Verify all were blocked (queue empty)
REQUIRE(f.getPendingCount() == 0);
    // Verify notification count was NOT incremented for blocked notifications
    // Note: notify() returns early when silentMode blocks a non-urgent
    // notification, so the counter is never incremented and stays at 0
REQUIRE(f.getNotificationCount() == 0);
}
// ============================================================================
// TI_NOTIF_005: Silent Mode Allows Urgent
// ============================================================================
TEST_CASE("TI_NOTIF_005_SilentModeAllowsUrgent", "[notification][integration]") {
NotificationTestFixture f;
f.configure({{"silentMode", true}});
// Add urgent notification
f.module.notify("Urgent", "Should pass", NotificationModule::Priority::URGENT);
// Verify URGENT notification was queued
REQUIRE(f.getPendingCount() == 1);
// Verify counts
REQUIRE(f.getNotificationCount() == 1);
REQUIRE(f.getUrgentCount() == 1);
}
// ============================================================================
// TI_NOTIF_006: Max Queue Size
// ============================================================================
TEST_CASE("TI_NOTIF_006_MaxQueueSize", "[notification][integration]") {
NotificationTestFixture f;
f.configure({{"maxQueueSize", 5}});
// Add more than max (10 notifications)
for (int i = 0; i < 10; i++) {
f.module.notify("Title", "Message " + std::to_string(i), NotificationModule::Priority::NORMAL);
}
// Verify queue is capped at maxQueueSize
REQUIRE(f.getPendingCount() <= 5);
// Notification count should still reflect all attempts
REQUIRE(f.getNotificationCount() == 10);
}
// ============================================================================
// TI_NOTIF_007: Language Config
// ============================================================================
TEST_CASE("TI_NOTIF_007_LanguageConfig", "[notification][integration]") {
NotificationTestFixture f;
f.configure({{"language", "en"}});
    // Verify the module accepted the configuration (no crash).
    // The language is stored internally for notification display and is not
    // exposed via getState/getHealthStatus, so checking overall health is
    // the best observable signal here
auto health = f.module.getHealthStatus();
REQUIRE(health != nullptr);
REQUIRE(health->getString("status", "") == "running");
}
// ============================================================================
// TI_NOTIF_008: Notification Count Tracking
// ============================================================================
TEST_CASE("TI_NOTIF_008_NotificationCountTracking", "[notification][integration]") {
NotificationTestFixture f;
f.configure();
// Add various notifications
f.module.notify("Normal1", "msg", NotificationModule::Priority::NORMAL);
f.module.notify("Urgent1", "msg", NotificationModule::Priority::URGENT);
f.module.notify("Urgent2", "msg", NotificationModule::Priority::URGENT);
f.module.notify("Low1", "msg", NotificationModule::Priority::LOW);
// Verify counts
REQUIRE(f.getNotificationCount() == 4);
REQUIRE(f.getUrgentCount() == 2);
REQUIRE(f.getPendingCount() == 4);
// Process all
f.process(); // processes 3
f.process(); // processes 1
// Verify queue empty but counts preserved
REQUIRE(f.getPendingCount() == 0);
REQUIRE(f.getNotificationCount() == 4);
REQUIRE(f.getUrgentCount() == 2);
}
// ============================================================================
// TI_NOTIF_009: State Serialization
// ============================================================================
TEST_CASE("TI_NOTIF_009_StateSerialization", "[notification][integration]") {
NotificationTestFixture f;
f.configure();
// Create some state
f.module.notify("Test1", "msg", NotificationModule::Priority::NORMAL);
f.module.notify("Test2", "msg", NotificationModule::Priority::URGENT);
f.process(); // Process some
// Get state
auto state = f.module.getState();
REQUIRE(state != nullptr);
// Verify state contains expected fields
REQUIRE(state->getInt("notificationCount", -1) == 2);
REQUIRE(state->getInt("urgentCount", -1) == 1);
// Create new module and restore
NotificationModule module2;
MockIO io2;
grove::JsonDataNode configNode("config", json::object());
module2.setConfiguration(configNode, &io2, nullptr);
module2.setState(*state);
// Verify counters were restored
auto state2 = module2.getState();
REQUIRE(state2 != nullptr);
REQUIRE(state2->getInt("notificationCount", -1) == 2);
REQUIRE(state2->getInt("urgentCount", -1) == 1);
// Note: pending queue is NOT restored (documented behavior)
REQUIRE(state2->getInt("pendingCount", -1) == 0);
}
// ============================================================================
// TI_NOTIF_010: Multiple Frame Processing
// ============================================================================
TEST_CASE("TI_NOTIF_010_MultipleFrameProcessing", "[notification][integration]") {
NotificationTestFixture f;
f.configure();
// Add 7 notifications (needs 3 frames to process at 3/frame)
for (int i = 0; i < 7; i++) {
f.module.notify("Title", "Message " + std::to_string(i), NotificationModule::Priority::NORMAL);
}
// Verify initial count
REQUIRE(f.getPendingCount() == 7);
// Frame 1: 3 processed, 4 remaining
f.process();
REQUIRE(f.getPendingCount() == 4);
// Frame 2: 3 processed, 1 remaining
f.process();
REQUIRE(f.getPendingCount() == 1);
// Frame 3: 1 processed, 0 remaining
f.process();
REQUIRE(f.getPendingCount() == 0);
// Total notification count should be unchanged
REQUIRE(f.getNotificationCount() == 7);
}

View File

@ -1,315 +1,315 @@
/**
* @file SchedulerModuleTests.cpp
* @brief Integration tests for SchedulerModule (10 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "mocks/MockIO.hpp"
#include "utils/TimeSimulator.hpp"
#include "utils/TestHelpers.hpp"
#include "modules/SchedulerModule.h"
#include <grove/JsonDataNode.h>
using namespace aissia;
using namespace aissia::tests;
// ============================================================================
// Test Fixture
// ============================================================================
class SchedulerTestFixture {
public:
MockIO io;
TimeSimulator time;
SchedulerModule module;
void configure(const json& config = json::object()) {
json fullConfig = {
{"hyperfocusThresholdMinutes", 120},
{"breakReminderIntervalMinutes", 45},
{"breakDurationMinutes", 10}
};
fullConfig.merge_patch(config);
grove::JsonDataNode configNode("config", fullConfig);
module.setConfiguration(configNode, &io, nullptr);
}
void process() {
grove::JsonDataNode input("input", time.createInput());
module.process(input);
}
void processWithTime(float gameTime) {
time.setTime(gameTime);
grove::JsonDataNode input("input", time.createInput(0.1f));
module.process(input);
}
};
// ============================================================================
// TI_SCHEDULER_001: Start Task
// ============================================================================
TEST_CASE("TI_SCHEDULER_001_StartTask", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Inject task switch message
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
// Process
f.process();
// Verify task_started was published
REQUIRE(f.io.wasPublished("scheduler:task_started"));
auto msg = f.io.getLastPublished("scheduler:task_started");
REQUIRE(msg["taskId"] == "task-1");
REQUIRE(msg.contains("taskName"));
}
// ============================================================================
// TI_SCHEDULER_002: Complete Task
// ============================================================================
TEST_CASE("TI_SCHEDULER_002_CompleteTask", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Start a task at time 0
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.io.clearPublished();
// Advance time 30 minutes (1800 seconds)
f.time.setTime(1800.0f);
// Switch to another task (completes current task implicitly)
f.io.injectMessage("user:task_switch", {{"taskId", "task-2"}});
f.process();
// Verify task_completed was published with duration
REQUIRE(f.io.wasPublished("scheduler:task_completed"));
auto msg = f.io.getLastPublished("scheduler:task_completed");
REQUIRE(msg["taskId"] == "task-1");
REQUIRE(msg.contains("duration"));
// Duration should be around 30 minutes
int duration = msg["duration"].get<int>();
REQUIRE(duration >= 29);
REQUIRE(duration <= 31);
}
// ============================================================================
// TI_SCHEDULER_003: Hyperfocus Detection
// ============================================================================
TEST_CASE("TI_SCHEDULER_003_HyperfocusDetection", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure({{"hyperfocusThresholdMinutes", 120}});
// Start a task at time 0
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.io.clearPublished();
// Advance time past threshold (121 minutes = 7260 seconds)
f.processWithTime(7260.0f);
// Verify hyperfocus alert
REQUIRE(f.io.wasPublished("scheduler:hyperfocus_alert"));
auto msg = f.io.getLastPublished("scheduler:hyperfocus_alert");
REQUIRE(msg["type"] == "hyperfocus");
REQUIRE(msg["task"] == "task-1");
REQUIRE(msg["duration_minutes"].get<int>() >= 120);
}
// ============================================================================
// TI_SCHEDULER_004: Hyperfocus Alert Only Once
// ============================================================================
TEST_CASE("TI_SCHEDULER_004_HyperfocusAlertOnce", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure({{"hyperfocusThresholdMinutes", 120}});
// Start task
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
// Trigger hyperfocus (121 min)
f.processWithTime(7260.0f);
// Count first alert
size_t alertCount = f.io.countPublished("scheduler:hyperfocus_alert");
REQUIRE(alertCount == 1);
// Continue processing (130 min, 140 min)
f.processWithTime(7800.0f);
f.processWithTime(8400.0f);
// Should still be only 1 alert
REQUIRE(f.io.countPublished("scheduler:hyperfocus_alert") == 1);
}
// ============================================================================
// TI_SCHEDULER_005: Break Reminder
// ============================================================================
TEST_CASE("TI_SCHEDULER_005_BreakReminder", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure({{"breakReminderIntervalMinutes", 45}});
// Process at time 0 (sets lastBreakTime)
f.processWithTime(0.0f);
f.io.clearPublished();
// Advance past break reminder interval (46 minutes = 2760 seconds)
f.processWithTime(2760.0f);
// Verify break reminder
REQUIRE(f.io.wasPublished("scheduler:break_reminder"));
auto msg = f.io.getLastPublished("scheduler:break_reminder");
REQUIRE(msg["type"] == "break");
REQUIRE(msg.contains("break_duration"));
}
// ============================================================================
// TI_SCHEDULER_006: Idle Pauses Session
// ============================================================================
TEST_CASE("TI_SCHEDULER_006_IdlePausesSession", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Start task
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
// Go idle
f.io.injectMessage("monitoring:idle_detected", {{"idleSeconds", 300}});
f.processWithTime(60.0f);
// Verify module received and processed the idle message
// (Module logs "User idle" - we can verify via state)
auto state = f.module.getState();
REQUIRE(state != nullptr);
// Task should still be tracked (idle doesn't clear it)
REQUIRE(state->getString("currentTaskId", "") == "task-1");
}
// ============================================================================
// TI_SCHEDULER_007: Activity Resumes Session
// ============================================================================
TEST_CASE("TI_SCHEDULER_007_ActivityResumesSession", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Start task, go idle, resume
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.io.injectMessage("monitoring:idle_detected", {});
f.processWithTime(60.0f);
f.io.injectMessage("monitoring:activity_resumed", {});
f.processWithTime(120.0f);
// Verify session continues - task still active
auto state = f.module.getState();
REQUIRE(state != nullptr);
REQUIRE(state->getString("currentTaskId", "") == "task-1");
}
// ============================================================================
// TI_SCHEDULER_008: Tool Query Get Current Task
// ============================================================================
TEST_CASE("TI_SCHEDULER_008_ToolQueryGetCurrentTask", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Start a task
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.io.clearPublished();
// Query current task
f.io.injectMessage("scheduler:query", {
{"action", "get_current_task"},
{"correlation_id", "test-123"}
});
f.processWithTime(60.0f);
// Verify response
REQUIRE(f.io.wasPublished("scheduler:response"));
auto resp = f.io.getLastPublished("scheduler:response");
REQUIRE(resp["correlation_id"] == "test-123");
REQUIRE(resp["task_id"] == "task-1");
}
// ============================================================================
// TI_SCHEDULER_009: Tool Command Start Break
// ============================================================================
TEST_CASE("TI_SCHEDULER_009_ToolCommandStartBreak", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Start task
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.io.clearPublished();
// Command to start break
f.io.injectMessage("scheduler:command", {
{"action", "start_break"},
{"duration_minutes", 15},
{"reason", "test break"}
});
f.processWithTime(60.0f);
// Verify break started was published
REQUIRE(f.io.wasPublished("scheduler:break_started"));
auto msg = f.io.getLastPublished("scheduler:break_started");
REQUIRE(msg["duration"] == 15);
REQUIRE(msg["reason"] == "test break");
// Verify response was also published
REQUIRE(f.io.wasPublished("scheduler:response"));
auto resp = f.io.getLastPublished("scheduler:response");
REQUIRE(resp["success"] == true);
}
// ============================================================================
// TI_SCHEDULER_010: State Serialization
// ============================================================================
TEST_CASE("TI_SCHEDULER_010_StateSerialization", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Setup some state
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.processWithTime(1800.0f); // 30 minutes
// Get state
auto state = f.module.getState();
REQUIRE(state != nullptr);
// Verify state content
REQUIRE(state->getString("currentTaskId", "") == "task-1");
REQUIRE(state->getBool("hyperfocusAlertSent", true) == false);
// Create new module and restore state
SchedulerModule module2;
MockIO io2;
grove::JsonDataNode configNode("config", json::object());
module2.setConfiguration(configNode, &io2, nullptr);
module2.setState(*state);
// Verify state was restored
auto state2 = module2.getState();
REQUIRE(state2 != nullptr);
REQUIRE(state2->getString("currentTaskId", "") == "task-1");
REQUIRE(state2->getBool("hyperfocusAlertSent", true) == false);
}
/**
* @file SchedulerModuleTests.cpp
* @brief Integration tests for SchedulerModule (10 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "mocks/MockIO.hpp"
#include "utils/TimeSimulator.hpp"
#include "utils/TestHelpers.hpp"
#include "modules/SchedulerModule.h"
#include <grove/JsonDataNode.h>
using namespace aissia;
using namespace aissia::tests;
// ============================================================================
// Test Fixture
// ============================================================================
class SchedulerTestFixture {
public:
MockIO io;
TimeSimulator time;
SchedulerModule module;
void configure(const json& config = json::object()) {
json fullConfig = {
{"hyperfocusThresholdMinutes", 120},
{"breakReminderIntervalMinutes", 45},
{"breakDurationMinutes", 10}
};
fullConfig.merge_patch(config);
grove::JsonDataNode configNode("config", fullConfig);
module.setConfiguration(configNode, &io, nullptr);
}
void process() {
grove::JsonDataNode input("input", time.createInput());
module.process(input);
}
void processWithTime(float gameTime) {
time.setTime(gameTime);
grove::JsonDataNode input("input", time.createInput(0.1f));
module.process(input);
}
};
// ============================================================================
// TI_SCHEDULER_001: Start Task
// ============================================================================
TEST_CASE("TI_SCHEDULER_001_StartTask", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Inject task switch message
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
// Process
f.process();
// Verify task_started was published
REQUIRE(f.io.wasPublished("scheduler:task_started"));
auto msg = f.io.getLastPublished("scheduler:task_started");
REQUIRE(msg["taskId"] == "task-1");
REQUIRE(msg.contains("taskName"));
}
// ============================================================================
// TI_SCHEDULER_002: Complete Task
// ============================================================================
TEST_CASE("TI_SCHEDULER_002_CompleteTask", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Start a task at time 0
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.io.clearPublished();
// Advance time 30 minutes (1800 seconds)
f.time.setTime(1800.0f);
// Switch to another task (completes current task implicitly)
f.io.injectMessage("user:task_switch", {{"taskId", "task-2"}});
f.process();
// Verify task_completed was published with duration
REQUIRE(f.io.wasPublished("scheduler:task_completed"));
auto msg = f.io.getLastPublished("scheduler:task_completed");
REQUIRE(msg["taskId"] == "task-1");
REQUIRE(msg.contains("duration"));
// Duration should be around 30 minutes
int duration = msg["duration"].get<int>();
REQUIRE(duration >= 29);
REQUIRE(duration <= 31);
}
// ============================================================================
// TI_SCHEDULER_003: Hyperfocus Detection
// ============================================================================
TEST_CASE("TI_SCHEDULER_003_HyperfocusDetection", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure({{"hyperfocusThresholdMinutes", 120}});
// Start a task at time 0
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.io.clearPublished();
// Advance time past threshold (121 minutes = 7260 seconds)
f.processWithTime(7260.0f);
// Verify hyperfocus alert
REQUIRE(f.io.wasPublished("scheduler:hyperfocus_alert"));
auto msg = f.io.getLastPublished("scheduler:hyperfocus_alert");
REQUIRE(msg["type"] == "hyperfocus");
REQUIRE(msg["task"] == "task-1");
REQUIRE(msg["duration_minutes"].get<int>() >= 120);
}
// ============================================================================
// TI_SCHEDULER_004: Hyperfocus Alert Only Once
// ============================================================================
TEST_CASE("TI_SCHEDULER_004_HyperfocusAlertOnce", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure({{"hyperfocusThresholdMinutes", 120}});
// Start task
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
// Trigger hyperfocus (121 min)
f.processWithTime(7260.0f);
// Count first alert
size_t alertCount = f.io.countPublished("scheduler:hyperfocus_alert");
REQUIRE(alertCount == 1);
// Continue processing (130 min, 140 min)
f.processWithTime(7800.0f);
f.processWithTime(8400.0f);
// Should still be only 1 alert
REQUIRE(f.io.countPublished("scheduler:hyperfocus_alert") == 1);
}
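The two hyperfocus tests above pin down an "alert exactly once per task" contract: nothing fires below the threshold, exactly one alert fires on the first crossing, and repeated processing stays silent until the tracker is reset. A minimal self-contained sketch of that contract — `HyperfocusTracker` is a hypothetical name for illustration, not SchedulerModule's actual internals:

```cpp
#include <cassert>

// Hypothetical helper mirroring the alert-once behavior exercised by
// TI_SCHEDULER_003/004: fire a hyperfocus alert exactly once when elapsed
// time on the current task crosses the configured threshold.
struct HyperfocusTracker {
    int thresholdMinutes = 120;  // mirrors the test config default
    bool alertSent = false;

    // Returns true exactly once, on the first check past the threshold.
    bool shouldAlert(float elapsedSeconds) {
        if (alertSent) return false;
        if (elapsedSeconds >= thresholdMinutes * 60.0f) {
            alertSent = true;
            return true;
        }
        return false;
    }

    // A task switch would clear the flag so the next task can alert again.
    void reset() { alertSent = false; }
};
```

This matches the timeline in TI_SCHEDULER_004: the alert at 121 min is counted once, and later checks at 130 and 140 min add nothing.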
// ============================================================================
// TI_SCHEDULER_005: Break Reminder
// ============================================================================
TEST_CASE("TI_SCHEDULER_005_BreakReminder", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure({{"breakReminderIntervalMinutes", 45}});
// Process at time 0 (sets lastBreakTime)
f.processWithTime(0.0f);
f.io.clearPublished();
// Advance past break reminder interval (46 minutes = 2760 seconds)
f.processWithTime(2760.0f);
// Verify break reminder
REQUIRE(f.io.wasPublished("scheduler:break_reminder"));
auto msg = f.io.getLastPublished("scheduler:break_reminder");
REQUIRE(msg["type"] == "break");
REQUIRE(msg.contains("break_duration"));
}
// ============================================================================
// TI_SCHEDULER_006: Idle Pauses Session
// ============================================================================
TEST_CASE("TI_SCHEDULER_006_IdlePausesSession", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Start task
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
// Go idle
f.io.injectMessage("monitoring:idle_detected", {{"idleSeconds", 300}});
f.processWithTime(60.0f);
// Verify module received and processed the idle message
// (Module logs "User idle" - we can verify via state)
auto state = f.module.getState();
REQUIRE(state != nullptr);
// Task should still be tracked (idle doesn't clear it)
REQUIRE(state->getString("currentTaskId", "") == "task-1");
}
// ============================================================================
// TI_SCHEDULER_007: Activity Resumes Session
// ============================================================================
TEST_CASE("TI_SCHEDULER_007_ActivityResumesSession", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Start task, go idle, resume
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.io.injectMessage("monitoring:idle_detected", {});
f.processWithTime(60.0f);
f.io.injectMessage("monitoring:activity_resumed", {});
f.processWithTime(120.0f);
// Verify session continues - task still active
auto state = f.module.getState();
REQUIRE(state != nullptr);
REQUIRE(state->getString("currentTaskId", "") == "task-1");
}
// ============================================================================
// TI_SCHEDULER_008: Tool Query Get Current Task
// ============================================================================
TEST_CASE("TI_SCHEDULER_008_ToolQueryGetCurrentTask", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Start a task
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.io.clearPublished();
// Query current task
f.io.injectMessage("scheduler:query", {
{"action", "get_current_task"},
{"correlation_id", "test-123"}
});
f.processWithTime(60.0f);
// Verify response
REQUIRE(f.io.wasPublished("scheduler:response"));
auto resp = f.io.getLastPublished("scheduler:response");
REQUIRE(resp["correlation_id"] == "test-123");
REQUIRE(resp["task_id"] == "task-1");
}
// ============================================================================
// TI_SCHEDULER_009: Tool Command Start Break
// ============================================================================
TEST_CASE("TI_SCHEDULER_009_ToolCommandStartBreak", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Start task
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.io.clearPublished();
// Command to start break
f.io.injectMessage("scheduler:command", {
{"action", "start_break"},
{"duration_minutes", 15},
{"reason", "test break"}
});
f.processWithTime(60.0f);
// Verify break started was published
REQUIRE(f.io.wasPublished("scheduler:break_started"));
auto msg = f.io.getLastPublished("scheduler:break_started");
REQUIRE(msg["duration"] == 15);
REQUIRE(msg["reason"] == "test break");
// Verify response was also published
REQUIRE(f.io.wasPublished("scheduler:response"));
auto resp = f.io.getLastPublished("scheduler:response");
REQUIRE(resp["success"] == true);
}
// ============================================================================
// TI_SCHEDULER_010: State Serialization
// ============================================================================
TEST_CASE("TI_SCHEDULER_010_StateSerialization", "[scheduler][integration]") {
SchedulerTestFixture f;
f.configure();
// Setup some state
f.io.injectMessage("user:task_switch", {{"taskId", "task-1"}});
f.processWithTime(0.0f);
f.processWithTime(1800.0f); // 30 minutes
// Get state
auto state = f.module.getState();
REQUIRE(state != nullptr);
// Verify state content
REQUIRE(state->getString("currentTaskId", "") == "task-1");
REQUIRE(state->getBool("hyperfocusAlertSent", true) == false);
// Create new module and restore state
SchedulerModule module2;
MockIO io2;
grove::JsonDataNode configNode("config", json::object());
module2.setConfiguration(configNode, &io2, nullptr);
module2.setState(*state);
// Verify state was restored
auto state2 = module2.getState();
REQUIRE(state2 != nullptr);
REQUIRE(state2->getString("currentTaskId", "") == "task-1");
REQUIRE(state2->getBool("hyperfocusAlertSent", true) == false);
}
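TI_SCHEDULER_010 above checks the hot-reload round-trip: every field needed to resume (current task, alert flags) must survive getState() → setState() into a freshly constructed module. A minimal sketch of that round-trip contract, with a plain string map standing in for grove::IDataNode to stay self-contained — field names follow the test, the rest is illustrative:

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical flat key/value snapshot mirroring the serialization contract
// TI_SCHEDULER_010 verifies: a restored module answers the same queries.
struct SchedulerState {
    std::string currentTaskId;
    bool hyperfocusAlertSent = false;

    std::map<std::string, std::string> getState() const {
        return {{"currentTaskId", currentTaskId},
                {"hyperfocusAlertSent", hyperfocusAlertSent ? "true" : "false"}};
    }

    void setState(const std::map<std::string, std::string>& s) {
        currentTaskId = s.count("currentTaskId") ? s.at("currentTaskId") : "";
        hyperfocusAlertSent = s.count("hyperfocusAlertSent")
                              && s.at("hyperfocusAlertSent") == "true";
    }
};
```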


@ -1,293 +1,293 @@
/**
* @file StorageModuleTests.cpp
* @brief Integration tests for StorageModule (10 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "mocks/MockIO.hpp"
#include "utils/TimeSimulator.hpp"
#include "utils/TestHelpers.hpp"
#include "modules/StorageModule.h"
#include <grove/JsonDataNode.h>
using namespace aissia;
using namespace aissia::tests;
// ============================================================================
// Test Fixture
// ============================================================================
class StorageTestFixture {
public:
MockIO io;
TimeSimulator time;
StorageModule module;
void configure(const json& config = json::object()) {
json fullConfig = json::object();
fullConfig.merge_patch(config);
grove::JsonDataNode configNode("config", fullConfig);
module.setConfiguration(configNode, &io, nullptr);
}
void process() {
grove::JsonDataNode input("input", time.createInput());
module.process(input);
}
};
// ============================================================================
// TI_STORAGE_001: Task Completed Saves Session
// ============================================================================
TEST_CASE("TI_STORAGE_001_TaskCompletedSavesSession", "[storage][integration]") {
StorageTestFixture f;
f.configure();
// Receive task completed
f.io.injectMessage("scheduler:task_completed", {
{"taskId", "task-1"},
{"taskName", "Coding session"},
{"durationMinutes", 45},
{"hyperfocus", false}
});
f.process();
// Verify save_session published
REQUIRE(f.io.wasPublished("storage:save_session"));
auto msg = f.io.getLastPublished("storage:save_session");
REQUIRE(msg["taskName"] == "Coding session");
REQUIRE(msg["durationMinutes"] == 45);
}
// ============================================================================
// TI_STORAGE_002: App Changed Saves Usage
// ============================================================================
TEST_CASE("TI_STORAGE_002_AppChangedSavesUsage", "[storage][integration]") {
StorageTestFixture f;
f.configure();
// Receive app changed with duration
f.io.injectMessage("monitoring:app_changed", {
{"appName", "Code"},
{"oldApp", "Discord"},
{"duration", 120},
{"classification", "productive"}
});
f.process();
// Verify save_app_usage published
REQUIRE(f.io.wasPublished("storage:save_app_usage"));
auto msg = f.io.getLastPublished("storage:save_app_usage");
REQUIRE(msg["appName"] == "Discord"); // Old app that ended
REQUIRE(msg["durationSeconds"] == 120);
}
// ============================================================================
// TI_STORAGE_003: Session Saved Updates Last ID
// ============================================================================
TEST_CASE("TI_STORAGE_003_SessionSavedUpdatesLastId", "[storage][integration]") {
StorageTestFixture f;
f.configure();
// Receive session saved confirmation
f.io.injectMessage("storage:session_saved", {
{"sessionId", 42}
});
f.process();
// Verify state updated
auto state = f.module.getState();
// TODO: Verify lastSessionId == 42
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_STORAGE_004: Storage Error Handled
// ============================================================================
TEST_CASE("TI_STORAGE_004_StorageErrorHandled", "[storage][integration]") {
StorageTestFixture f;
f.configure();
// Receive storage error
f.io.injectMessage("storage:error", {
{"message", "Database locked"}
});
// Should not throw
REQUIRE_NOTHROW(f.process());
}
// ============================================================================
// TI_STORAGE_005: Pending Saves Tracking
// ============================================================================
TEST_CASE("TI_STORAGE_005_PendingSavesTracking", "[storage][integration]") {
StorageTestFixture f;
f.configure();
// Trigger save
f.io.injectMessage("scheduler:task_completed", {
{"taskId", "t1"},
{"taskName", "Task"},
{"durationMinutes", 10}
});
f.process();
// Verify pending incremented
auto state = f.module.getState();
// TODO: Verify pendingSaves == 1
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_STORAGE_006: Total Saved Tracking
// ============================================================================
TEST_CASE("TI_STORAGE_006_TotalSavedTracking", "[storage][integration]") {
StorageTestFixture f;
f.configure();
// Save and confirm multiple times
for (int i = 0; i < 3; i++) {
f.io.injectMessage("scheduler:task_completed", {
{"taskId", "t" + std::to_string(i)},
{"taskName", "Task"},
{"durationMinutes", 10}
});
f.process();
f.io.injectMessage("storage:session_saved", {{"sessionId", i}});
f.process();
}
// Verify total
auto state = f.module.getState();
// TODO: Verify totalSaved == 3
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_STORAGE_007: Tool Query Notes
// ============================================================================
TEST_CASE("TI_STORAGE_007_ToolQueryNotes", "[storage][integration]") {
StorageTestFixture f;
f.configure();
// Add a note first
f.io.injectMessage("storage:command", {
{"action", "save_note"},
{"content", "Test note"},
{"tags", json::array({"test", "important"})}
});
f.process();
f.io.clearPublished();
// Query notes
f.io.injectMessage("storage:query", {
{"action", "query_notes"},
{"correlation_id", "query-1"}
});
f.process();
// Verify response
REQUIRE(f.io.wasPublished("storage:response"));
auto resp = f.io.getLastPublished("storage:response");
REQUIRE(resp["correlation_id"] == "query-1");
}
// ============================================================================
// TI_STORAGE_008: Tool Command Save Note
// ============================================================================
TEST_CASE("TI_STORAGE_008_ToolCommandSaveNote", "[storage][integration]") {
StorageTestFixture f;
f.configure();
// Save note
f.io.injectMessage("storage:command", {
{"action", "save_note"},
{"content", "Remember to check logs"},
{"tags", json::array({"reminder"})}
});
f.process();
// Verify note added to state
auto state = f.module.getState();
// TODO: Verify notes contains the new note
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_STORAGE_009: Note Tags Filtering
// ============================================================================
TEST_CASE("TI_STORAGE_009_NoteTagsFiltering", "[storage][integration]") {
StorageTestFixture f;
f.configure();
// Add notes with different tags
f.io.injectMessage("storage:command", {
{"action", "save_note"},
{"content", "Work note"},
{"tags", json::array({"work"})}
});
f.process();
f.io.injectMessage("storage:command", {
{"action", "save_note"},
{"content", "Personal note"},
{"tags", json::array({"personal"})}
});
f.process();
f.io.clearPublished();
// Query with tag filter
f.io.injectMessage("storage:query", {
{"action", "query_notes"},
{"tags", json::array({"work"})},
{"correlation_id", "filter-1"}
});
f.process();
// Verify filtered response
REQUIRE(f.io.wasPublished("storage:response"));
auto resp = f.io.getLastPublished("storage:response");
// TODO: Verify only work notes returned
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_STORAGE_010: State Serialization
// ============================================================================
TEST_CASE("TI_STORAGE_010_StateSerialization", "[storage][integration]") {
StorageTestFixture f;
f.configure();
// Build state with notes
f.io.injectMessage("storage:command", {
{"action", "save_note"},
{"content", "Test note for serialization"},
{"tags", json::array({"test"})}
});
f.process();
// Get state
auto state = f.module.getState();
REQUIRE(state != nullptr);
// Restore
StorageModule module2;
grove::JsonDataNode configNode2("config", json::object());
module2.setConfiguration(configNode2, &f.io, nullptr);
module2.setState(*state);
auto state2 = module2.getState();
REQUIRE(state2 != nullptr);
SUCCEED(); // Placeholder
}
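All the fixtures in this file lean on MockIO's recording API: wasPublished, countPublished, getLastPublished, and clearPublished. A minimal sketch of that recorder contract — a hypothetical re-implementation that stores plain string payloads instead of JSON to stay self-contained:

```cpp
#include <cassert>
#include <cstddef>
#include <map>
#include <string>
#include <vector>

// Records every publish per topic so tests can assert on what a module
// emitted, in the style of the MockIO calls used above.
class MessageRecorder {
    std::map<std::string, std::vector<std::string>> published_;
public:
    void publish(const std::string& topic, const std::string& payload) {
        published_[topic].push_back(payload);
    }
    bool wasPublished(const std::string& topic) const {
        auto it = published_.find(topic);
        return it != published_.end() && !it->second.empty();
    }
    std::size_t countPublished(const std::string& topic) const {
        auto it = published_.find(topic);
        return it == published_.end() ? 0 : it->second.size();
    }
    // Most recent payload for the topic, or "" when nothing was published.
    std::string getLastPublished(const std::string& topic) const {
        auto it = published_.find(topic);
        return (it == published_.end() || it->second.empty())
               ? "" : it->second.back();
    }
    void clearPublished() { published_.clear(); }
};
```

The "clear, act, assert" pattern in the tests (clearPublished() before the step under test) works because the recorder only ever accumulates; clearing is the test's way of scoping assertions to one step.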


@ -1,258 +1,258 @@
/**
* @file VoiceModuleTests.cpp
* @brief Integration tests for VoiceModule (10 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "mocks/MockIO.hpp"
#include "utils/TimeSimulator.hpp"
#include "utils/TestHelpers.hpp"
#include "modules/VoiceModule.h"
#include <grove/JsonDataNode.h>
using namespace aissia;
using namespace aissia::tests;
// ============================================================================
// Test Fixture
// ============================================================================
class VoiceTestFixture {
public:
MockIO io;
TimeSimulator time;
VoiceModule module;
void configure(const json& config = json::object()) {
json fullConfig = {
{"ttsEnabled", true},
{"sttEnabled", true},
{"language", "fr"}
};
fullConfig.merge_patch(config);
grove::JsonDataNode configNode("config", fullConfig);
module.setConfiguration(configNode, &io, nullptr);
}
void process() {
grove::JsonDataNode input("input", time.createInput());
module.process(input);
}
};
// ============================================================================
// TI_VOICE_001: AI Response Triggers Speak
// ============================================================================
TEST_CASE("TI_VOICE_001_AIResponseTriggersSpeak", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Receive AI response
f.io.injectMessage("ai:response", {
{"text", "Voici la reponse a ta question"}
});
f.process();
// Verify speak request
REQUIRE(f.io.wasPublished("voice:speak"));
auto msg = f.io.getLastPublished("voice:speak");
REQUIRE(msg["text"] == "Voici la reponse a ta question");
}
// ============================================================================
// TI_VOICE_002: Suggestion Priority Speak
// ============================================================================
TEST_CASE("TI_VOICE_002_SuggestionPrioritySpeak", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Receive suggestion (should be priority)
f.io.injectMessage("ai:suggestion", {
{"message", "Tu devrais faire une pause"},
{"duration", 5}
});
f.process();
// Verify speak with priority
REQUIRE(f.io.wasPublished("voice:speak"));
auto msg = f.io.getLastPublished("voice:speak");
REQUIRE(msg["priority"] == true);
}
// ============================================================================
// TI_VOICE_003: Speaking Started Updates State
// ============================================================================
TEST_CASE("TI_VOICE_003_SpeakingStartedUpdatesState", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Initially idle
REQUIRE(f.module.isIdle() == true);
// Receive speaking started
f.io.injectMessage("voice:speaking_started", {{"text", "Hello"}});
f.process();
// Should be speaking
REQUIRE(f.module.isIdle() == false);
}
// ============================================================================
// TI_VOICE_004: Speaking Ended Updates State
// ============================================================================
TEST_CASE("TI_VOICE_004_SpeakingEndedUpdatesState", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Start speaking
f.io.injectMessage("voice:speaking_started", {{"text", "Hello"}});
f.process();
REQUIRE(f.module.isIdle() == false);
// End speaking
f.io.injectMessage("voice:speaking_ended", {});
f.process();
// Should be idle
REQUIRE(f.module.isIdle() == true);
}
// ============================================================================
// TI_VOICE_005: IsIdle Reflects Speaking
// ============================================================================
TEST_CASE("TI_VOICE_005_IsIdleReflectsSpeaking", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Not speaking = idle
REQUIRE(f.module.isIdle() == true);
// Start speaking
f.io.injectMessage("voice:speaking_started", {});
f.process();
REQUIRE(f.module.isIdle() == false);
// Stop speaking
f.io.injectMessage("voice:speaking_ended", {});
f.process();
REQUIRE(f.module.isIdle() == true);
}
// ============================================================================
// TI_VOICE_006: Transcription Forwarded (No Re-publish)
// ============================================================================
TEST_CASE("TI_VOICE_006_TranscriptionForwarded", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Receive transcription
f.io.injectMessage("voice:transcription", {
{"text", "Test transcription"},
{"confidence", 0.9}
});
f.process();
// VoiceModule should NOT re-publish transcription
// It just updates internal state
REQUIRE(f.io.countPublished("voice:transcription") == 0);
}
// ============================================================================
// TI_VOICE_007: Total Spoken Incremented
// ============================================================================
TEST_CASE("TI_VOICE_007_TotalSpokenIncremented", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Complete one speak cycle
f.io.injectMessage("voice:speaking_started", {});
f.process();
f.io.injectMessage("voice:speaking_ended", {});
f.process();
// Complete another
f.io.injectMessage("voice:speaking_started", {});
f.process();
f.io.injectMessage("voice:speaking_ended", {});
f.process();
// Verify counter
auto state = f.module.getState();
// TODO: Verify totalSpoken == 2
SUCCEED(); // Placeholder
}
// ============================================================================
// TI_VOICE_008: TTS Disabled Config
// ============================================================================
TEST_CASE("TI_VOICE_008_TTSDisabledConfig", "[voice][integration]") {
VoiceTestFixture f;
f.configure({{"ttsEnabled", false}});
// Try to trigger speak
f.io.injectMessage("ai:response", {{"text", "Should not speak"}});
f.process();
// Should NOT publish speak request
REQUIRE(f.io.wasPublished("voice:speak") == false);
}
// ============================================================================
// TI_VOICE_009: Tool Command Speak
// ============================================================================
TEST_CASE("TI_VOICE_009_ToolCommandSpeak", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Send speak command via tool
f.io.injectMessage("voice:command", {
{"action", "speak"},
{"text", "Hello from tool"}
});
f.process();
// Verify speak published
REQUIRE(f.io.wasPublished("voice:speak"));
auto msg = f.io.getLastPublished("voice:speak");
REQUIRE(msg["text"] == "Hello from tool");
}
// ============================================================================
// TI_VOICE_010: State Serialization
// ============================================================================
TEST_CASE("TI_VOICE_010_StateSerialization", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Build state
f.io.injectMessage("voice:speaking_started", {});
f.process();
f.io.injectMessage("voice:speaking_ended", {});
f.process();
// Get state
auto state = f.module.getState();
REQUIRE(state != nullptr);
// Restore
VoiceModule module2;
grove::JsonDataNode configNode2("config", json::object());
module2.setConfiguration(configNode2, &f.io, nullptr);
module2.setState(*state);
auto state2 = module2.getState();
REQUIRE(state2 != nullptr);
SUCCEED(); // Placeholder
}
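TI_VOICE_003/004/005/007 together pin down the speaking-state bookkeeping: isIdle() flips on speaking_started/speaking_ended, and a counter tracks completed utterances. A minimal sketch of that behavior — names are illustrative, not VoiceModule's real fields:

```cpp
#include <cassert>

// Hypothetical bookkeeping mirroring the speaking-state tests above.
class SpeakingState {
    bool speaking_ = false;
    int totalSpoken_ = 0;
public:
    void onSpeakingStarted() { speaking_ = true; }
    void onSpeakingEnded() {
        if (speaking_) ++totalSpoken_;  // count only completed cycles
        speaking_ = false;
    }
    bool isIdle() const { return !speaking_; }
    int totalSpoken() const { return totalSpoken_; }
};
```

Counting only on started→ended transitions (rather than on every ended message) keeps a stray speaking_ended from inflating the counter.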
/**
* @file VoiceModuleTests.cpp
* @brief Integration tests for VoiceModule (10 TI)
*/
#include <catch2/catch_test_macros.hpp>
#include "mocks/MockIO.hpp"
#include "utils/TimeSimulator.hpp"
#include "utils/TestHelpers.hpp"
#include "modules/VoiceModule.h"
#include <grove/JsonDataNode.h>
using namespace aissia;
using namespace aissia::tests;
// ============================================================================
// Test Fixture
// ============================================================================
class VoiceTestFixture {
public:
MockIO io;
TimeSimulator time;
VoiceModule module;
void configure(const json& config = json::object()) {
json fullConfig = {
{"ttsEnabled", true},
{"sttEnabled", true},
{"language", "fr"}
};
fullConfig.merge_patch(config);
grove::JsonDataNode configNode("config", fullConfig);
module.setConfiguration(configNode, &io, nullptr);
}
void process() {
grove::JsonDataNode input("input", time.createInput());
module.process(input);
}
};
// ============================================================================
// TI_VOICE_001: AI Response Triggers Speak
// ============================================================================
TEST_CASE("TI_VOICE_001_AIResponseTriggersSpeak", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Receive AI response
f.io.injectMessage("ai:response", {
{"text", "Voici la reponse a ta question"}
});
f.process();
// Verify speak request
REQUIRE(f.io.wasPublished("voice:speak"));
auto msg = f.io.getLastPublished("voice:speak");
REQUIRE(msg["text"] == "Voici la reponse a ta question");
}
// ============================================================================
// TI_VOICE_002: Suggestion Priority Speak
// ============================================================================
TEST_CASE("TI_VOICE_002_SuggestionPrioritySpeak", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Receive suggestion (should be priority)
f.io.injectMessage("ai:suggestion", {
{"message", "Tu devrais faire une pause"},
{"duration", 5}
});
f.process();
// Verify speak with priority
REQUIRE(f.io.wasPublished("voice:speak"));
auto msg = f.io.getLastPublished("voice:speak");
REQUIRE(msg["priority"] == true);
}
// ============================================================================
// TI_VOICE_003: Speaking Started Updates State
// ============================================================================
TEST_CASE("TI_VOICE_003_SpeakingStartedUpdatesState", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Initially idle
REQUIRE(f.module.isIdle() == true);
// Receive speaking started
f.io.injectMessage("voice:speaking_started", {{"text", "Hello"}});
f.process();
// Should be speaking
REQUIRE(f.module.isIdle() == false);
}
// ============================================================================
// TI_VOICE_004: Speaking Ended Updates State
// ============================================================================
TEST_CASE("TI_VOICE_004_SpeakingEndedUpdatesState", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Start speaking
f.io.injectMessage("voice:speaking_started", {{"text", "Hello"}});
f.process();
REQUIRE(f.module.isIdle() == false);
// End speaking
f.io.injectMessage("voice:speaking_ended", {});
f.process();
// Should be idle
REQUIRE(f.module.isIdle() == true);
}
// ============================================================================
// TI_VOICE_005: IsIdle Reflects Speaking
// ============================================================================
TEST_CASE("TI_VOICE_005_IsIdleReflectsSpeaking", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Not speaking = idle
REQUIRE(f.module.isIdle() == true);
// Start speaking
f.io.injectMessage("voice:speaking_started", {});
f.process();
REQUIRE(f.module.isIdle() == false);
// Stop speaking
f.io.injectMessage("voice:speaking_ended", {});
f.process();
REQUIRE(f.module.isIdle() == true);
}
// ============================================================================
// TI_VOICE_006: Transcription Forwarded (No Re-publish)
// ============================================================================
TEST_CASE("TI_VOICE_006_TranscriptionForwarded", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Receive transcription
f.io.injectMessage("voice:transcription", {
{"text", "Test transcription"},
{"confidence", 0.9}
});
f.process();
// VoiceModule should NOT re-publish transcription
// It just updates internal state
REQUIRE(f.io.countPublished("voice:transcription") == 0);
}
// ============================================================================
// TI_VOICE_007: Total Spoken Incremented
// ============================================================================
TEST_CASE("TI_VOICE_007_TotalSpokenIncremented", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Complete one speak cycle
f.io.injectMessage("voice:speaking_started", {});
f.process();
f.io.injectMessage("voice:speaking_ended", {});
f.process();
// Complete another
f.io.injectMessage("voice:speaking_started", {});
f.process();
f.io.injectMessage("voice:speaking_ended", {});
f.process();
// Verify counter (state shape is module-defined; assert non-null until
// the counter is exposed for inspection)
auto state = f.module.getState();
REQUIRE(state != nullptr);
// TODO: Verify totalSpoken == 2 once state exposes the counter
}
// ============================================================================
// TI_VOICE_008: TTS Disabled Config
// ============================================================================
TEST_CASE("TI_VOICE_008_TTSDisabledConfig", "[voice][integration]") {
VoiceTestFixture f;
f.configure({{"ttsEnabled", false}});
// Try to trigger speak
f.io.injectMessage("ai:response", {{"text", "Should not speak"}});
f.process();
// Should NOT publish speak request
REQUIRE(f.io.wasPublished("voice:speak") == false);
}
// ============================================================================
// TI_VOICE_009: Tool Command Speak
// ============================================================================
TEST_CASE("TI_VOICE_009_ToolCommandSpeak", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Send speak command via tool
f.io.injectMessage("voice:command", {
{"action", "speak"},
{"text", "Hello from tool"}
});
f.process();
// Verify speak published
REQUIRE(f.io.wasPublished("voice:speak"));
auto msg = f.io.getLastPublished("voice:speak");
REQUIRE(msg["text"] == "Hello from tool");
}
// ============================================================================
// TI_VOICE_010: State Serialization
// ============================================================================
TEST_CASE("TI_VOICE_010_StateSerialization", "[voice][integration]") {
VoiceTestFixture f;
f.configure();
// Build state
f.io.injectMessage("voice:speaking_started", {});
f.process();
f.io.injectMessage("voice:speaking_ended", {});
f.process();
// Get state
auto state = f.module.getState();
REQUIRE(state != nullptr);
// Restore
VoiceModule module2;
grove::JsonDataNode configNode2("config", json::object());
module2.setConfiguration(configNode2, &f.io, nullptr);
module2.setState(*state);
auto state2 = module2.getState();
REQUIRE(state2 != nullptr);
}

#pragma once
#include <catch2/catch_test_macros.hpp>
#include <nlohmann/json.hpp>
#include <string>
namespace aissia::tests {
using json = nlohmann::json;
// ============================================================================
// Custom Catch2 Matchers and Macros
// ============================================================================
/**
* @brief Require that a message was published to a topic
*/
// Note: Catch2 has no REQUIRE_MESSAGE; use INFO + REQUIRE instead.
#define REQUIRE_PUBLISHED(io, topic) \
    do { \
        INFO("Expected message on topic: " << (topic)); \
        REQUIRE((io).wasPublished(topic)); \
    } while (0)
/**
* @brief Require that no message was published to a topic
*/
#define REQUIRE_NOT_PUBLISHED(io, topic) \
    do { \
        INFO("Did not expect message on topic: " << (topic)); \
        REQUIRE(!(io).wasPublished(topic)); \
    } while (0)
/**
* @brief Require specific count of messages on a topic
*/
#define REQUIRE_PUBLISH_COUNT(io, topic, count) \
REQUIRE(io.countPublished(topic) == count)
// ============================================================================
// JSON Helpers
// ============================================================================
/**
* @brief Create a minimal valid config for a module
*/
inline json makeConfig(const json& overrides = json::object()) {
json config = json::object();
for (auto& [key, value] : overrides.items()) {
config[key] = value;
}
return config;
}
/**
* @brief Check if JSON contains expected fields
*/
inline bool jsonContains(const json& j, const json& expected) {
for (auto& [key, value] : expected.items()) {
if (!j.contains(key) || j[key] != value) {
return false;
}
}
return true;
}
// ============================================================================
// Test Tags
// ============================================================================
// Module tags
constexpr const char* TAG_SCHEDULER = "[scheduler]";
constexpr const char* TAG_NOTIFICATION = "[notification]";
constexpr const char* TAG_MONITORING = "[monitoring]";
constexpr const char* TAG_AI = "[ai]";
constexpr const char* TAG_VOICE = "[voice]";
constexpr const char* TAG_STORAGE = "[storage]";
// MCP tags
constexpr const char* TAG_MCP = "[mcp]";
constexpr const char* TAG_MCP_TYPES = "[mcp][types]";
constexpr const char* TAG_MCP_TRANSPORT = "[mcp][transport]";
constexpr const char* TAG_MCP_CLIENT = "[mcp][client]";
// Common tags
constexpr const char* TAG_INTEGRATION = "[integration]";
constexpr const char* TAG_UNIT = "[unit]";
} // namespace aissia::tests

#pragma once
#include <grove/JsonDataNode.h>
#include <nlohmann/json.hpp>
#include <memory>
namespace aissia::tests {
using json = nlohmann::json;
/**
* @brief Simulates game time for testing modules
*
* Modules receive time info via process() input:
* {
* "gameTime": 123.45, // Total elapsed time in seconds
* "deltaTime": 0.1 // Time since last frame
* }
*/
class TimeSimulator {
public:
TimeSimulator() = default;
/**
* @brief Create input data for module.process()
* @param deltaTime Time since last frame (default 0.1s = 10Hz)
*/
json createInput(float deltaTime = 0.1f) {
json input = {
{"gameTime", m_gameTime},
{"deltaTime", deltaTime}
};
m_gameTime += deltaTime;
return input;
}
/**
* @brief Create input as IDataNode
*/
std::unique_ptr<grove::JsonDataNode> createInputNode(float deltaTime = 0.1f) {
return std::make_unique<grove::JsonDataNode>("input", createInput(deltaTime));
}
/**
* @brief Advance time without creating input
*/
void advance(float seconds) {
m_gameTime += seconds;
}
/**
* @brief Advance time by minutes (convenience for hyperfocus tests)
*/
void advanceMinutes(float minutes) {
m_gameTime += minutes * 60.0f;
}
/**
* @brief Set absolute time
*/
void setTime(float time) {
m_gameTime = time;
}
/**
* @brief Get current game time
*/
float getTime() const {
return m_gameTime;
}
/**
* @brief Reset to zero
*/
void reset() {
m_gameTime = 0.0f;
}
/**
* @brief Simulate multiple frames
* @param count Number of frames to simulate
* @param deltaTime Time per frame
*/
void simulateFrames(int count, float deltaTime = 0.1f) {
m_gameTime += count * deltaTime;
}
private:
float m_gameTime = 0.0f;
};
} // namespace aissia::tests