moai/.planning/phases/04-single-model-qa/04-02-PLAN.md
Mikkel Georgsen 4ea13efe8f docs(04): create phase plans for single model Q&A
Phase 04: Single Model Q&A
- 2 plans created
- 5 total tasks defined
- Ready for execution

Plans:
- 04-01: AI client abstraction (openai dep, config, AIClient class)
- 04-02: /ask handler and bot integration (M3 milestone)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-16 19:00:50 +00:00


phase: 04-single-model-qa
plan: 02
type: execute
Implement /ask command for single model Q&A and integrate AI client into bot lifecycle.

Purpose: Complete M3 milestone - users can ask questions to individual AI models through Telegram. Output: Working /ask command that sends questions to AI models and returns responses.

<execution_context> ~/.claude/get-shit-done/workflows/execute-phase.md ~/.claude/get-shit-done/templates/summary.md </execution_context>

@.planning/PROJECT.md @.planning/ROADMAP.md @.planning/STATE.md @.planning/phases/04-single-model-qa/04-01-SUMMARY.md (will exist after 04-01)

Key files:

@src/moai/bot/main.py @src/moai/bot/config.py @src/moai/bot/handlers/__init__.py @src/moai/bot/handlers/projects.py @src/moai/core/ai_client.py (will exist after 04-01)

Tech available:

  • python-telegram-bot, sqlalchemy, httpx, aiosqlite, openai (added in 04-01)
  • AI client abstraction (04-01)

Established patterns:

  • Handler registration in handlers/__init__.py
  • get_selected_project() helper in projects.py
  • Module-level init pattern (database.py, ai_client.py)
  • Async command handlers

Constraining decisions:

  • Thin handlers delegating to core (CLAUDE.md)
  • Service layer for business logic
Task 1: Integrate AI client into bot lifecycle

Files: src/moai/bot/main.py

Modify main.py to initialize the AI client during bot startup:
  1. Import init_ai_client from moai.core.ai_client
  2. In post_init callback (or create one if needed), after database init:
    • Call init_ai_client(config)
    • Log "AI client initialized with {config.AI_ROUTER}"

Follow the existing pattern: the database is initialized in post_init, so the AI client goes right after it. Keep error handling minimal. If AI_API_KEY is missing, let it fail at first use rather than at startup, since the user may just want to test bot commands first.

Verify: python -c "import moai.bot.main; print('main imports ok')"

Done when: main.py imports and initializes the AI client alongside the database in post_init.
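The steps above can be sketched as follows. This is a minimal, self-contained sketch of the post_init wiring: the moai imports are replaced with stand-in stubs so the flow is runnable in isolation, and the init_db / init_ai_client signatures are assumptions based on the module-level init pattern described above, not confirmed APIs.

```python
"""Sketch of the post_init wiring for src/moai/bot/main.py."""
import asyncio
import logging

logger = logging.getLogger(__name__)


class Config:
    """Stand-in for moai.bot.config; only AI_ROUTER is used here."""
    AI_ROUTER = "requesty"


config = Config()
initialized: list[str] = []  # records init order for illustration


async def init_db(cfg: Config) -> None:
    """Stand-in for moai.core.database.init_db (existing pattern)."""
    initialized.append("db")


def init_ai_client(cfg: Config) -> None:
    """Stand-in for moai.core.ai_client.init_ai_client (added in 04-01)."""
    initialized.append("ai")


async def post_init(application: object) -> None:
    """Initialize shared resources once the bot application is built."""
    await init_db(config)        # existing pattern: database first
    init_ai_client(config)       # new: AI client right after
    logger.info("AI client initialized with %s", config.AI_ROUTER)


asyncio.run(post_init(None))
```

In the real main.py this callback would be attached when building the application, e.g. `Application.builder().token(...).post_init(post_init).build()`.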

Task 2: Create /ask handler for single model queries

Files: src/moai/bot/handlers/discussion.py, src/moai/bot/handlers/__init__.py

Create discussion.py with the /ask handler:
  1. Create src/moai/bot/handlers/discussion.py:

    """Discussion handlers for MoAI bot."""
    from telegram import Update
    from telegram.ext import ContextTypes
    
    from moai.bot.handlers.projects import get_selected_project
    from moai.core.ai_client import get_ai_client, MODEL_MAP
    
    async def ask_command(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
        """Handle /ask <model> <question> command.
    
        Examples:
            /ask claude What is Python?
            /ask gpt Explain async/await
        """
        args = context.args or []
    
        if len(args) < 2:
            available = ", ".join(MODEL_MAP.keys())
            await update.message.reply_text(
                f"Usage: /ask <model> <question>\n"
                f"Available models: {available}\n\n"
                f"Example: /ask claude What is Python?"
            )
            return
    
        model_name = args[0].lower()
        question = " ".join(args[1:])
    
        # Validate model
        if model_name not in MODEL_MAP:
            available = ", ".join(MODEL_MAP.keys())
            await update.message.reply_text(
                f"Unknown model: {model_name}\n"
                f"Available: {available}"
            )
            return
    
        # Get project context if available (optional for /ask)
        project = await get_selected_project(context)
        project_context = f"Project: {project.name}\n" if project else ""
    
        # Send "typing" indicator while waiting for AI
        await update.message.chat.send_action("typing")
    
        try:
            client = get_ai_client()
            response = await client.complete(
                model=model_name,
                messages=[{"role": "user", "content": question}],
                system_prompt=f"{project_context}You are a helpful AI assistant."
            )
    
            # Format response with model name
            await update.message.reply_text(
                f"*{model_name.title()}:*\n\n{response}",
                parse_mode="Markdown"
            )
        except Exception as e:
            await update.message.reply_text(f"Error: {e}")
    
  2. Update handlers/__init__.py:

    • Import ask_command from discussion
    • Add CommandHandler("ask", ask_command) to register_handlers

Verify: python -c "from moai.bot.handlers.discussion import ask_command; print('handler ok')"

Done when: /ask handler registered, validates model name, sends typing indicator, calls AI client.
Task 3: Update help text and status

Files: src/moai/bot/handlers/commands.py, src/moai/bot/handlers/status.py

  1. Update HELP_TEXT in commands.py to include:

    *Questions*
    /ask - Ask a single model

    Add this after the Project Management section.
  2. Update status.py to show AI client status:
    • Import get_ai_client (wrapped in try/except)
    • In status_command, add a line showing which AI router is configured
    • Example: "AI Router: requesty ✓" or "AI Router: not configured"

Verify: python -c "from moai.bot.handlers.commands import HELP_TEXT; print('/ask' in HELP_TEXT)"

Done when: Help shows the /ask command and status shows the AI router status.
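The status line described above can be sketched as a small helper. Whether get_ai_client raises when init_ai_client was never called is an assumption about the 04-01 implementation; the stand-in below simulates the unconfigured case.

```python
"""Sketch of the AI-router line for src/moai/bot/handlers/status.py."""


def get_ai_client():
    """Stand-in for moai.core.ai_client.get_ai_client; simulates the
    unconfigured case by raising, which is an assumed behavior."""
    raise RuntimeError("AI client not initialized")


AI_ROUTER = "requesty"  # stand-in for config.AI_ROUTER


def ai_status_line() -> str:
    """Return the AI-router line for /status output."""
    try:
        get_ai_client()
        return f"AI Router: {AI_ROUTER} ✓"
    except Exception:
        return "AI Router: not configured"
```

status_command would append `ai_status_line()` to its existing output.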
Before declaring plan complete:

  - [ ] `ruff check src` passes
  - [ ] Bot starts without errors (with valid AI_API_KEY in env)
  - [ ] /help shows /ask command
  - [ ] /status shows AI router status
  - [ ] /ask without args shows usage
  - [ ] /ask with invalid model shows available models

<success_criteria>

  • AI client initialized in bot lifecycle
  • /ask command works with model validation
  • Help text updated with /ask
  • Status shows AI configuration
  • M3 milestone: Single model Q&A working

</success_criteria>
After completion, create `.planning/phases/04-single-model-qa/04-02-SUMMARY.md`

Note: This completes Phase 4. Summary should note M3 milestone complete.