Add direct model query function to orchestrator that:
- Takes a single model, a message, optional discussion context, and the project name
- Builds context from discussion history when available
- Uses a system prompt indicating this is a direct message
- Handles errors gracefully (returns an error message string)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add /next to advance discussion with full context passed to models:
- Validates active discussion state
- Auto-completes when round limit reached
- Shows Round N/M progress indicator
Add /stop to end discussion early:
- Marks discussion as COMPLETED
- Clears session state
- Suggests /consensus for summarization
Add /discuss [rounds] command that:
- Requires active discussion from /open
- Stores discussion state in user_data for /next and /stop
- Runs sequential rounds via run_discussion_round
- Shows Round N/M progress indicator
Add sequential model execution for discuss mode:
- build_context() converts discussion history to OpenAI message format
- run_discussion_round() queries models sequentially, each seeing prior responses
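The sequential flow described above could be sketched as follows. Names match the commit, but the signatures and the shape of the history are assumptions (the real code works with ORM records, not tuples):

```python
import asyncio


def build_context(history: list[tuple[str, str]]) -> list[dict]:
    """Convert (model, text) discussion turns into OpenAI-style messages.

    Each prior response becomes an assistant turn labeled with the model
    name, so the next model can see who said what.
    """
    return [{"role": "assistant", "content": f"{model}: {text}"}
            for model, text in history]


async def run_discussion_round(ask, models: list[str], topic: str,
                               history: list[tuple[str, str]]):
    """Query models one at a time; each sees all earlier responses."""
    for model in models:
        messages = ([{"role": "system", "content": "Multi-model discussion."},
                     {"role": "user", "content": topic}]
                    + build_context(history))
        history.append((model, await ask(model, messages)))
    return history
```

The key difference from the parallel /open path is the `await` inside the loop: each model's reply is appended to the history before the next model is queried.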
Tasks completed: 3/3
- Create orchestrator module with parallel query function
- Implement /open command handler with database persistence
- Update help text for discussion commands
SUMMARY: .planning/phases/05-multi-model-discussions/05-02-SUMMARY.md
Add @model mention syntax to help text under Discussion Commands section.
Documents all multi-model discussion commands for the full M4-M8 scope.
Add /open command handler that queries all project models in parallel.
Creates Discussion and Round records and persists a Message for each response.
Shows typing indicator and formats output with model names.
Add orchestrator module with SYSTEM_PROMPT constant and query_models_parallel
function that uses asyncio.gather() to query all models simultaneously.
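A rough sketch of the fan-out pattern, with the per-model query stubbed as a callable; the error-handling detail is an assumption, not taken from the actual module:

```python
import asyncio


async def query_models_parallel(ask, models: list[str], prompt: str) -> dict[str, str]:
    """Send one prompt to every model at once and collect the replies.

    asyncio.gather runs all queries concurrently; return_exceptions=True
    keeps one failing model from sinking the whole round.
    """
    results = await asyncio.gather(
        *(ask(model, prompt) for model in models),
        return_exceptions=True,
    )
    return {model: (f"Error: {r}" if isinstance(r, BaseException) else r)
            for model, r in zip(models, results)}
```

Total latency is roughly that of the slowest model rather than the sum of all of them, which is what makes /open feel responsive with several models configured.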
Add create_round, get_current_round, create_message, and get_round_messages
functions. All async with proper type hints and eager loading.
Add discussion.py service with create_discussion, get_discussion,
get_active_discussion, list_discussions, and complete_discussion.
Uses selectinload for eager loading of rounds, messages, and consensus.
Phase 05: Multi-Model Discussions
- 4 plans created
- 11 total tasks defined
- Covers M4 (open/parallel), M5 (discuss/sequential), M8 (@mentions)
- Ready for execution
Create 04-02-SUMMARY.md documenting:
- /ask command handler implementation
- AI client integration in bot lifecycle
- Help text and status updates
Update STATE.md:
- Phase 4 complete, ready for Phase 5
- 10 total plans completed
- Add 04-02 decisions
Update ROADMAP.md:
- Mark Phase 4 as complete
- Update progress table (2/2 plans)
This completes M3 milestone (Single Model Q&A) and Phase 4.
Update HELP_TEXT in commands.py with new Questions section showing
/ask <model> <question> command.
Update status_command to display AI router status (e.g., "AI Router:
requesty") or "not configured" if AI client isn't initialized.
Create discussion.py with ask_command handler that:
- Validates model name against MODEL_MAP
- Shows usage when called without arguments
- Sends typing indicator while waiting for AI
- Returns formatted response with model name
- Includes optional project context if project is selected
Register CommandHandler("ask", ask_command) in handlers/__init__.py.
Import init_ai_client and call it during post_init callback alongside
database initialization. Logs the configured AI router at startup.
- Add 04-01-SUMMARY.md with plan completion details
- Update STATE.md with Phase 4 progress and decisions
- Update ROADMAP.md with plan count and status
- Add AIClient class wrapping AsyncOpenAI for model routing
- Support Requesty and OpenRouter as backend routers
- Add MODEL_MAP with claude, gpt, gemini short names
- Add init_ai_client/get_ai_client module functions
- Include HTTP-Referer header support for OpenRouter
- Add openai package to dependencies in pyproject.toml
- Extend BotConfig with ai_router, ai_api_key, ai_referer attributes
- Load AI settings from environment with sensible defaults
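The env-loading step could be sketched as below; field names mirror the attributes named in this commit, while the env var names and defaults are assumptions:

```python
import os
from dataclasses import dataclass


@dataclass
class AIConfig:
    """AI-related settings; field names mirror the BotConfig additions."""
    ai_router: str
    ai_api_key: str
    ai_referer: str


def load_ai_config(env=os.environ) -> AIConfig:
    # Defaults shown here are illustrative, not the project's actual values.
    return AIConfig(
        ai_router=env.get("AI_ROUTER", "requesty"),
        ai_api_key=env.get("AI_API_KEY", ""),
        ai_referer=env.get("AI_REFERER", ""),
    )
```

Passing the mapping in (instead of reading os.environ directly) keeps the loader trivially testable.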
Phase 04: Single Model Q&A
- 2 plans created
- 5 total tasks defined
- Ready for execution
Plans:
- 04-01: AI client abstraction (openai dep, config, AIClient class)
- 04-02: /ask handler and bot integration (M3 milestone)
- /project models [list] - show/set AI models for current project
- /project delete <id> - delete project by ID with confirmation
- Clear user selection if deleted project was selected
M2 milestone complete: full project CRUD via Telegram.
Add update_project_models(project_id, models) and delete_project(project_id)
functions to complete the project service CRUD operations.
Tasks completed: 2/2
- Add get_project_by_name to service
- Implement /project select and /project info handlers
SUMMARY: .planning/phases/03-project-crud/03-02-SUMMARY.md
Add project selection and info display:
- /project select <id|name> stores selection in user_data
- /project info displays selected project details
- get_selected_project helper retrieves current project
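The selection mechanics reduce to a small amount of dict bookkeeping; here `user_data` stands in for python-telegram-bot's per-user `context.user_data`, and the key name is an assumption:

```python
SELECTED_KEY = "selected_project_id"  # key name is an assumption


def select_project(user_data: dict, project_id: int) -> None:
    """Record the user's current project choice (what /project select does)."""
    user_data[SELECTED_KEY] = project_id


def get_selected_project(user_data: dict, projects: dict[int, dict]):
    """Return the currently selected project, or None if nothing valid is set."""
    pid = user_data.get(SELECTED_KEY)
    return projects.get(pid) if pid is not None else None
```

Returning None for a missing or stale selection lets callers like /project info prompt the user to select a project first.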
Add case-insensitive project lookup by name using an ilike query.
This lets the /project select command find projects by name.
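With no wildcards in the pattern, `ilike` behaves as a case-insensitive exact match; a pure-Python sketch of the same behavior (the function name and dict shape are assumptions):

```python
def find_project_by_name(projects: list[dict], name: str):
    """Case-insensitive exact-name lookup, mirroring an ilike query with
    no wildcards. Returns the first match or None."""
    wanted = name.lower()
    for project in projects:
        if project["name"].lower() == wanted:
            return project
    return None
```

So `/project select demo`, `Demo`, and `DEMO` all resolve to the same project.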
Tasks completed: 2/2
- Create project service module
- Implement /projects and /project new handlers
SUMMARY: .planning/phases/03-project-crud/03-01-SUMMARY.md
- /projects lists all projects with name, ID, models
- /project new "Name" creates project with confirmation
- Registered handlers in __init__.py
- list_projects() returns all projects ordered by created_at desc
- create_project() creates project with default models
- get_project() retrieves project by ID
Phase 02: Bot Core
- 2 plans created (02-01 infrastructure, 02-02 handlers)
- 6 total tasks defined
- Ready for execution
Tasks completed: 3/3
- Create database module with async session management
- Create model tests with in-memory database
- Verify gitignore and full test suite
SUMMARY: .planning/phases/01-foundation/01-03-SUMMARY.md
Phase 01: Foundation complete (3/3 plans)
Tasks completed: 3/3
- Create base model and enums
- Create Project and Discussion models
- Create Round, Message, and Consensus models
SUMMARY: .planning/phases/01-foundation/01-02-SUMMARY.md
- Create 01-01-SUMMARY.md documenting plan execution
- Update STATE.md with current position and velocity metrics
- Update ROADMAP.md progress table