# AI Query Assistant
`rfx ask` lets you query your codebase using natural language. It combines Reflex’s search capabilities with an LLM to answer questions about your code.
Configure your LLM provider:

```shell
rfx llm config
```

This interactive wizard sets up your API key, provider, and model. Configuration is stored in `~/.reflex/config.toml`.
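For reference, the stored configuration might look like the sketch below. The exact keys depend on your Reflex version, so treat the field names as illustrative rather than the real schema:

```toml
# ~/.reflex/config.toml -- illustrative field names, not the exact schema
[llm]
provider = "anthropic"       # anthropic, openai, or openrouter
model = "<model-id>"         # e.g. a Claude Sonnet model for Anthropic
api_key = "<your-api-key>"   # prefer the interactive wizard over hand-editing
```

Running `rfx llm config` again overwrites these values, so hand-editing is rarely necessary.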
Check your current setup:
```shell
rfx llm status
```

## Supported providers

| Provider | Models |
|---|---|
| Anthropic | Claude Sonnet, Claude Opus, Claude Haiku |
| OpenAI | GPT-4o, GPT-4o-mini |
| OpenRouter | Any model available on OpenRouter |
## Interactive mode

Start a conversation about your code:
```shell
rfx ask
```

This opens an interactive session where you can ask follow-up questions with full conversation history. The assistant searches your codebase to ground its answers.
```
>> How does authentication work in this project?
>> What files would I need to change to add OAuth support?
>> Show me the error handling pattern used in the API handlers.
```

## One-shot mode
Ask a single question from the command line:

```shell
rfx ask "What does the handleRequest function do?"
```

## Answer mode (default)
Returns a direct answer grounded in your code:

```shell
rfx ask "What database does this project use?" --answer
```

## Agentic mode
Enables multi-step reasoning, where the assistant can run multiple searches to build a comprehensive answer:

```shell
rfx ask "How would I add a new API endpoint?" --agentic
```

In agentic mode, the assistant iteratively searches your codebase, follows dependency chains, and synthesizes a thorough answer.
## Execute mode

Runs the query and returns raw search results without LLM interpretation:

```shell
rfx ask "Find all TODO comments" --execute
```
## Provider selection

Override the configured provider for a single query:

```shell
rfx ask "Explain the auth flow" --provider anthropic
rfx ask "Explain the auth flow" --provider openai
```

## Project context
If your project has a `REFLEX.md` file in the root, `rfx ask` automatically includes it as context. This helps the assistant understand your project’s architecture, conventions, and terminology. See Configuration for details.
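As an illustration, a minimal `REFLEX.md` could look like the sketch below. The headings are only a suggestion, and the project details are invented for the example; the file’s contents are passed to the assistant as free-form context, so any prose that describes your project works:

```markdown
# Project overview
A REST API service written in TypeScript, backed by PostgreSQL.

## Conventions
- Handlers live under src/handlers/, one file per resource.
- Database access goes through the repository layer in src/db/.

## Terminology
- "Tenant" means a customer organization, not a deployment.
```

Keeping this file short and current tends to matter more than its exact structure, since it is prepended to every query.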
## Next steps

- AI Integration: MCP server and JSON output for agent workflows
- Configuration: LLM provider setup
- CLI Commands: full `rfx ask` reference