889cd24319
Improve translation system: normalization, lexicon, and coverage
Major fixes:
- Ligature normalization (œ→oe, æ→ae) to avoid token fragmentation
- Full lexicon normalization (keys + synonyms) with accents stripped (see the sketch below)
- Fixed false positive "dansent"→"dans" (radical length ≥5)
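A minimal sketch of what this normalization could look like, assuming the normalizeText() helper listed under the technical improvements handles both ligatures and accents; the actual code in lexiqueLoader.js is not shown in this log:
```javascript
// Hypothetical sketch only: lowercase, expand ligatures so they do not
// fragment into odd tokens, then strip accents via Unicode decomposition.
// The real normalizeText() in lexiqueLoader.js may differ in its details.
function normalizeText(text) {
  return text
    .toLowerCase()
    .replace(/œ/g, 'oe')              // ligature œ → oe
    .replace(/æ/g, 'ae')              // ligature æ → ae
    .normalize('NFD')                 // é → e + combining accent
    .replace(/[\u0300-\u036f]/g, ''); // drop the combining accents
}

normalizeText('Cœur brisé'); // → "coeur brise", same form for keys, synonyms and user input
```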
Lexicon enrichment (+212 entries):
- Verbs: battre (pulum), penser/réfléchir (umis), voler (aliuk)
- Grammar words: nous (tanu), possessives (sa/mon→na), demonstratives (ce→ko)
- Temporal words: hier/avant (at), demain/après (ok), autour (no)
- Conjugated forms added for manger, battre, penser
Technical improvements:
- Improved verb lemmatization (radical ≥5 letters, sketched after this list)
- normalizeText() system in lexiqueLoader.js
- Sacred liaisons for cultural compositions
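As a rough illustration of the radical-length guard behind the "dansent"→"dans" fix: the ending list and function name below are invented for the example; only the ≥5-letter rule comes from this commit.
```javascript
// Hypothetical sketch: strip a verbal ending and only trust the result as a
// radical when it keeps at least 5 letters, so a short preposition such as
// "dans" can no longer masquerade as the stem of "dansent".
const FRENCH_ENDINGS = ['aient', 'eront', 'ent', 'ons', 'er', 'ez']; // assumed list

function extractRadical(word) {
  for (const ending of FRENCH_ENDINGS) {
    if (word.endsWith(ending)) {
      const radical = word.slice(0, -ending.length);
      if (radical.length >= 5) return radical; // the ≥5 guard from this commit
    }
  }
  return null; // no safe radical; fall back to other lookups
}

extractRadical('dansent');    // → null ("dans" has only 4 letters, rejected)
extractRadical('mangeaient'); // → "mange" (5 letters, accepted)
```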
Note: known lemmatization issue to investigate (some inflected forms not found)
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-30 22:37:31 +08:00
e8d17ab0d5
Implement radical lookup system for Confluent translator (83% → 92% coverage)
Major features:
- Radical-based word matching for conjugated verbs
- Morphological decomposition for compound words
- Multi-index search (byWord + byFormeLiee)
- Cascade search strategy with confidence scoring
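A rough sketch of what such a cascade might look like: the index names byWord and byFormeLiee come from this commit, while the confidence values, the function name, and the suffix list are illustrative assumptions.
```javascript
// Hypothetical cascade: try the cheapest, most reliable lookup first and fall
// back to fuzzier strategies, each step tagged with a lower confidence score.
const VERBAL_SUFFIXES = ['aran', 'vis']; // two suffixes named in the remaining-work notes below, not the real list

function lookupToken(token, { byWord, byFormeLiee }) {
  if (byWord[token]) {
    return { entry: byWord[token], confidence: 1.0, via: 'byWord' };           // exact dictionary form
  }
  if (byFormeLiee[token]) {
    return { entry: byFormeLiee[token], confidence: 0.9, via: 'byFormeLiee' }; // bound form used in compounds
  }
  for (const suffix of VERBAL_SUFFIXES) {                                      // conjugated verb → radical
    const radical = token.slice(0, -suffix.length);
    if (token.endsWith(suffix) && byWord[radical]) {
      return { entry: byWord[radical], confidence: 0.7, via: 'radical' };
    }
  }
  // Morphological decomposition of compound words would slot in here as a
  // further fallback (see the sketch under "New files").
  return null; // unresolved: the token counts against coverage
}
```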
New files:
- ConfluentTranslator/radicalMatcher.js: Extract radicals from conjugated forms
- ConfluentTranslator/morphologicalDecomposer.js: Decompose compound words (a possible approach is sketched after this list)
- ConfluentTranslator/plans/radical-lookup-system.md: Implementation plan
- ConfluentTranslator/test-results-radical-system.md: Test results and analysis
- ancien-confluent/lexique/00-grammaire.json: Grammar particles
- ancien-confluent/lexique/lowercase-confluent.js: Lowercase utility
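For the compound-word side, a greedy left-to-right split over known bound forms is one way morphologicalDecomposer.js could work; this is a guess at the approach, not the committed implementation:
```javascript
// Hypothetical sketch: peel the longest known bound form (forme_liee) off the
// front of the word until nothing is left; if a remainder cannot be matched,
// give up and report the word as non-decomposable.
function decomposeCompound(word, byFormeLiee) {
  const parts = [];
  let rest = word;
  while (rest.length > 0) {
    let best = null;
    for (const forme of Object.keys(byFormeLiee)) {
      if (rest.startsWith(forme) && (!best || forme.length > best.length)) {
        best = forme;                 // keep the longest matching prefix
      }
    }
    if (!best) return null;           // unknown fragment → not decomposable
    parts.push(byFormeLiee[best]);
    rest = rest.slice(best.length);
  }
  return parts;                       // lexicon entries for each component
}
```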
Modified files:
- ConfluentTranslator/reverseIndexBuilder.js: Added byFormeLiee index (see the index-building sketch below)
- ConfluentTranslator/confluentToFrench.js: Cascade search with radicals
- Multiple lexique JSON files: Enhanced entries with forme_liee
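The two indexes could be built in one pass over the lexicon; the entry field names used here (confluent, forme_liee) are assumptions about the lexique JSON rather than its documented schema.
```javascript
// Hypothetical sketch of the reverseIndexBuilder.js output: byWord keyed on
// the standalone Confluent word, byFormeLiee keyed on its bound form when the
// lexique entry provides one.
function buildReverseIndexes(entries) {
  const byWord = {};
  const byFormeLiee = {};
  for (const entry of entries) {
    if (entry.confluent) byWord[entry.confluent] = entry;       // assumed field name
    if (entry.forme_liee) byFormeLiee[entry.forme_liee] = entry; // field named in this commit
  }
  return { byWord, byFormeLiee };
}
```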
Test results:
- Before: 83% coverage (101/122 tokens)
- After: 92% coverage (112/122 tokens)
- Improvement: +9 percentage points
Remaining work to reach 95%+:
- Add missing particles (ve, eol)
- Enrich VERBAL_SUFFIXES (aran, vis)
- Document missing words (tiru, kala, vulu)
🤖 Generated with Claude Code (https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-28 22:24:56 +08:00