Phase 04: Single Model Q&A

- 2 plans created
- 5 total tasks defined
- Ready for execution

Plans:
- 04-01: AI client abstraction (openai dep, config, AIClient class)
- 04-02: /ask handler and bot integration (M3 milestone)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
| phase | plan | type |
|---|---|---|
| 04-single-model-qa | 02 | execute |
Purpose: Complete the M3 milestone: users can ask questions to individual AI models through Telegram.

Output: Working /ask command that sends questions to AI models and returns responses.
<execution_context>
~/.claude/get-shit-done/workflows/execute-phase.md
~/.claude/get-shit-done/templates/summary.md
</execution_context>
@.planning/PROJECT.md @.planning/ROADMAP.md @.planning/STATE.md @.planning/phases/04-single-model-qa/04-01-SUMMARY.md (will exist after 04-01)

Key files:
@src/moai/bot/main.py @src/moai/bot/config.py @src/moai/bot/handlers/__init__.py @src/moai/bot/handlers/projects.py @src/moai/core/ai_client.py (will exist after 04-01)
Tech available:
- python-telegram-bot, sqlalchemy, httpx, aiosqlite, openai (added in 04-01)
- AI client abstraction (04-01)
Established patterns:
- Handler registration in handlers/__init__.py
- get_selected_project() helper in projects.py
- Module-level init pattern (database.py, ai_client.py)
- Async command handlers
Constraining decisions:
- Thin handlers delegating to core (CLAUDE.md)
- Service layer for business logic
Task 1: Initialize AI client in bot lifecycle
Files: src/moai/bot/main.py

- Import init_ai_client from moai.core.ai_client
- In the post_init callback (or create one if needed), after database init:
  - Call init_ai_client(config)
  - Log "AI client initialized with {config.AI_ROUTER}"

Follow the existing pattern: the database is initialized in post_init, and the AI client goes right after. Keep error handling minimal. If AI_API_KEY is missing, let it fail at first use rather than at startup (the user may just want to test bot commands first).

Verify: python -c "import moai.bot.main; print('main imports ok')"

Done when: main.py imports and initializes the AI client alongside the database in post_init.
Task 2: Create /ask handler for single model queries
Files: src/moai/bot/handlers/discussion.py, src/moai/bot/handlers/__init__.py
Create src/moai/bot/handlers/discussion.py:
```python
"""Discussion handlers for MoAI bot."""

from telegram import Update
from telegram.ext import ContextTypes

from moai.bot.handlers.projects import get_selected_project
from moai.core.ai_client import get_ai_client, MODEL_MAP


async def ask_command(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    """Handle /ask <model> <question> command.

    Examples:
        /ask claude What is Python?
        /ask gpt Explain async/await
    """
    args = context.args or []
    if len(args) < 2:
        available = ", ".join(MODEL_MAP.keys())
        await update.message.reply_text(
            f"Usage: /ask <model> <question>\n"
            f"Available models: {available}\n\n"
            f"Example: /ask claude What is Python?"
        )
        return

    model_name = args[0].lower()
    question = " ".join(args[1:])

    # Validate model
    if model_name not in MODEL_MAP:
        available = ", ".join(MODEL_MAP.keys())
        await update.message.reply_text(
            f"Unknown model: {model_name}\n"
            f"Available: {available}"
        )
        return

    # Get project context if available (optional for /ask)
    project = await get_selected_project(context)
    project_context = f"Project: {project.name}\n" if project else ""

    # Send "typing" indicator while waiting for AI
    await update.message.chat.send_action("typing")

    try:
        client = get_ai_client()
        response = await client.complete(
            model=model_name,
            messages=[{"role": "user", "content": question}],
            system_prompt=f"{project_context}You are a helpful AI assistant.",
        )
        # Format response with model name
        await update.message.reply_text(
            f"*{model_name.title()}:*\n\n{response}",
            parse_mode="Markdown",
        )
    except Exception as e:
        await update.message.reply_text(f"Error: {e}")
```
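The parsing and validation logic in ask_command can be exercised in isolation. The sketch below extracts it into a pure function; `parse_ask_args` is a name introduced here for illustration (not part of the plan), and the MODEL_MAP values are placeholder IDs, not real router model IDs.

```python
# Assumed shape of MODEL_MAP from moai.core.ai_client (defined in 04-01):
# short aliases mapping to router model IDs. Values here are placeholders.
MODEL_MAP = {
    "claude": "example/claude-model-id",
    "gpt": "example/gpt-model-id",
}


def parse_ask_args(args):
    """Mirror ask_command's validation: return ((model, question), None) or (None, error)."""
    if len(args) < 2:
        return None, f"Usage: /ask <model> <question>\nAvailable models: {', '.join(MODEL_MAP)}"
    model_name = args[0].lower()  # lookup is case-insensitive
    if model_name not in MODEL_MAP:
        return None, f"Unknown model: {model_name}\nAvailable: {', '.join(MODEL_MAP)}"
    return (model_name, " ".join(args[1:])), None


ok, err = parse_ask_args(["Claude", "What", "is", "Python?"])
# ok → ("claude", "What is Python?"), err → None
```

Keeping this logic as a pure function would also make it trivially unit-testable, though the plan leaves it inline in the handler.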
Update src/moai/bot/handlers/__init__.py:
- Import ask_command from discussion
- Add CommandHandler("ask", ask_command) to register_handlers

Verify: python -c "from moai.bot.handlers.discussion import ask_command; print('handler ok')"

Done when: /ask handler is registered, validates the model name, sends a typing indicator, and calls the AI client.
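The registration step can be sketched as below. Application and CommandHandler are minimal stubs standing in for the python-telegram-bot classes of the same names, so the sketch runs standalone; the real register_handlers already registers the existing handlers.

```python
class CommandHandler:
    """Stub for telegram.ext.CommandHandler: pairs a command name with a callback."""
    def __init__(self, command, callback):
        self.command = command
        self.callback = callback


class Application:
    """Stub for telegram.ext.Application's handler registry."""
    def __init__(self):
        self.handlers = []

    def add_handler(self, handler):
        self.handlers.append(handler)


async def ask_command(update, context):
    """In the real module this is imported from moai.bot.handlers.discussion."""


def register_handlers(app):
    # ...existing handlers (start, project commands, etc.) registered here...
    app.add_handler(CommandHandler("ask", ask_command))


app = Application()
register_handlers(app)
```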
Task 3: Update help text and status output
Files: src/moai/bot/handlers/commands.py, src/moai/bot/handlers/status.py

- Add /ask to HELP_TEXT in commands.py
- Update status.py to show AI client status:
  - Import get_ai_client (wrapped in try/except)
  - In status_command, add a line showing the configured AI router
  - Example: "AI Router: requesty ✓" or "AI Router: not configured"

Verify: python -c "from moai.bot.handlers.commands import HELP_TEXT; print('/ask' in HELP_TEXT)"

Done when: help shows the /ask command and status shows the AI router status.
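The status line could be built along these lines. `ai_status_line` is a name introduced here for illustration, and the failure mode of get_ai_client (raising when uninitialized) is an assumption about the 04-01 module, not confirmed behavior.

```python
_client = None  # module-level client, per the module-level init pattern


def get_ai_client():
    """Stub for moai.core.ai_client.get_ai_client; assumed to raise if uninitialized."""
    if _client is None:
        raise RuntimeError("AI client not initialized")
    return _client


def ai_status_line(router: str) -> str:
    """One line for status_command, with get_ai_client wrapped in try/except as planned."""
    try:
        get_ai_client()
        return f"AI Router: {router} ✓"
    except Exception:
        return "AI Router: not configured"


unconfigured = ai_status_line("requesty")

_client = object()  # simulate init_ai_client having run at startup
configured = ai_status_line("requesty")
```

Catching the exception (rather than checking config directly) matches the Task 1 decision to defer API-key failures until first use.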
<success_criteria>
- AI client initialized in bot lifecycle
- /ask command works with model validation
- Help text updated with /ask
- Status shows AI configuration
- M3 milestone: Single model Q&A working
</success_criteria>
Note: This completes Phase 4. Summary should note M3 milestone complete.