Master AI Development: Essential ai CLI Commands

by Alex Johnson

Welcome, engineers! Are you diving into the world of AI development and looking for a practical way to streamline your workflow? You've landed in the right spot. This article serves as your comprehensive guide to the ai CLI commands, designed not just to list them, but to integrate them seamlessly into your daily tasks. Whether you're setting up a new project, managing profiles, launching powerful AI tools, or conducting in-depth code reviews, understanding these commands is key to unlocking efficiency and maximizing your productivity. We'll walk through common workflows, breaking down each command with clear examples and explanations, making it easier than ever to navigate the complexities of AI development.

Think of this as your essential toolkit for working with the ai CLI. We've structured it to be task-oriented, helping you learn by doing. This isn't just a dry list; it's a practical worksheet crafted for real engineers. If you're new to aidev, or perhaps preparing for a demo and want to showcase the tool's capabilities effectively, this guide will be your best friend. We'll cover everything from the initial project setup and profile switching to managing MCP servers, launching your favorite AI assistants like Claude, Cursor, or Codex, and even diving deep into debugging and code analysis. Get ready to supercharge your AI development process!

1. Project Quickstart & Setup: Getting Started Swiftly

Embarking on a new project is always exciting, and setting it up correctly from the start can save you a world of pain down the line. The ai CLI commands offer a streamlined approach to this crucial first step. Specifically, the ai setup command is your go-to for creating a new `.aidev` configuration file for your repository. This file acts as the central hub for all your project-specific AI development settings. It ensures that your AI tools and configurations are tailored to the unique needs of your project, rather than relying on generic global settings. This local configuration is vital for maintaining consistency and reproducibility, especially when working in a team environment.

Once your configuration file is in place, the ai quickstart command comes into play. This powerful command performs an automatic project detection process. It intelligently analyzes your project's structure and identifies the underlying technologies and frameworks being used. Based on this analysis, it will recommend an optimized AI development profile. Imagine the time saved not having to manually figure out the best setup! The output of ai quickstart provides valuable insights, including the detected stack (like JavaScript, Python, Docker, or Kubernetes), a recommended profile tailored for your needs (such as `web`, `infra`, or `pair_programmer`), and often an optional initialization step to get you up and running even faster. This intelligent recommendation system is a cornerstone of the ai CLI's user-friendly design, ensuring that even developers new to a specific stack can quickly adopt best practices.

After receiving the recommendations, applying them is just as straightforward. The ai use command allows you to activate the suggested profile, instantly configuring your environment to suit the project's requirements. This command ensures that all subsequent AI tool interactions and configurations align with the chosen profile. Furthermore, in the chaotic world of development, having a backup is always wise. The ai backup command provides a safety net, allowing you to back up both your global and project-specific configuration files. This simple command can be a lifesaver if you ever need to revert to a previous stable configuration or migrate your settings to a new machine. By integrating these initial setup commands, the ai CLI empowers you to establish a solid foundation for your AI projects, making the entire process from creation to configuration remarkably efficient and user-friendly.
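
To make the flow concrete, here is a minimal quickstart sequence sketched end to end; the `web` profile is only an example of what quickstart might recommend for your particular repository:

```bash
# Create a .aidev configuration file for this repository
ai setup

# Detect the stack and get a recommended profile (e.g. web, infra, pair_programmer)
ai quickstart

# Apply the recommended profile -- "web" is just an example recommendation
ai use web

# Back up global and project configuration before making further changes
ai backup
```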

2. Profiles: Seamlessly Switch, Inspect, and Customize

In the dynamic landscape of AI development, flexibility is paramount. The ai CLI commands provide robust profile management capabilities, allowing you to tailor your AI environment to specific tasks or projects. Profiles act as distinct configurations for your AI tools, enabling you to switch contexts effortlessly. The command ai profiles is your window into this system; it lists all the available profiles on your machine, giving you a clear overview of your options. This is incredibly useful when you need to recall the name of a specific profile you've previously set up or simply want to explore the profiles that come bundled with the ai CLI.

Switching between these configurations is as simple as typing `ai use <profile>`. For instance, if you're working on a large, complex system, you might have a profile named `monolith_surgeon` specifically designed for navigating and refactoring such codebases. Executing ai use monolith_surgeon would instantly load all the relevant settings, tools, and configurations associated with that profile. This context-switching capability is a huge time-saver, eliminating the need to manually reconfigure tools every time you shift your focus. It ensures that your AI assistants are always primed with the most relevant context for the task at hand.

Understanding the details of your current environment is also crucial. The ai profile show command displays the specific configuration of your currently active profile, including any custom settings, enabled MCP servers, and tool configurations. This transparency helps you understand exactly how your AI tools are set up and why they behave in a certain way. If you need to make adjustments, the ai profile edit command provides an interactive way to modify the active profile directly within your preferred editor. This hands-on approach allows for fine-tuning and personalization, ensuring your AI environment perfectly matches your workflow. When things get too customized or you simply want to reset to a known state, ai use default reverts your environment back to the standard, out-of-the-box configuration, providing a reliable baseline to return to.
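
Pulled together, a typical profile round-trip might look like this, using the example profile names from above:

```bash
ai profiles                 # list every available profile
ai use monolith_surgeon     # switch to a task-specific profile
ai profile show             # inspect the active profile's settings and MCP servers
ai profile edit             # fine-tune the active profile in your editor
ai use default              # return to the out-of-the-box configuration
```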

3. MCP Server Management: Powering Your Tools

MCP servers are a critical component of the ai CLI, acting as specialized agents that extend the capabilities of your AI tools. They enable functionalities like code indexing, intelligent searching, and more. Effective management of these servers is key to unlocking the full potential of your AI development environment. The ai mcp list command is your starting point, providing a comprehensive list of all installed and available MCP servers. This helps you discover what advanced functionalities are at your disposal and whether they are currently active or need to be enabled.

To integrate an MCP server into your current workflow, you use the ai mcp enable command. For example, enabling the `grep` MCP server (`ai mcp enable grep`) might provide powerful, AI-assisted code searching capabilities within your project. Conversely, if you no longer need a particular MCP server's functionality, the ai mcp disable command allows you to deactivate it, such as disabling the `docker` MCP server (`ai mcp disable docker`) if you're not currently working with containerization. This granular control ensures that your environment remains lean and focused on the tasks you're actively performing.

Understanding how these servers are configured for your specific profile is also important. The ai mcp show command displays the profile-specific MCP configuration, detailing which servers are enabled and how they are set up. This is invaluable for troubleshooting and ensuring that your AI tools have access to the necessary backend functionalities. Whenever you make significant changes to your profiles or MCP server configurations—such as editing profiles, adding or removing MCP servers, or upgrading the aidev version—it's a good practice to regenerate the MCP configurations for all tools. The ai mcp generate command handles this efficiently, ensuring that all your integrated tools are correctly configured to utilize the enabled MCP servers. This regeneration step is crucial for maintaining a stable and functional AI development environment, especially after system updates or major configuration changes.
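
As a rough sketch, a full MCP housekeeping pass using the servers mentioned above could look like this:

```bash
ai mcp list            # see installed and available MCP servers
ai mcp enable grep     # turn on AI-assisted code search
ai mcp disable docker  # turn off a server you don't currently need
ai mcp show            # check the profile-specific MCP configuration
ai mcp generate        # regenerate tool configs after any profile or MCP change
```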

4. Launching Tools: Accessing AI Powerhouses

The true power of the ai CLI lies in its ability to seamlessly launch and integrate various AI tools directly into your development workflow. Instead of juggling multiple applications and manually configuring them, you can use a single, unified command to bring your favorite AI assistants to life. The fundamental command for this is `ai <tool>`. This simple yet powerful syntax allows you to launch any supported AI tool with your currently active profile automatically injected. This means the tool is immediately aware of your project's context, your chosen settings, and any relevant MCP servers that are enabled.

Let's look at some examples to illustrate this. If you want to leverage the capabilities of Anthropic's Claude, you simply type ai claude. For OpenAI's Codex, it's ai codex. Similarly, you can launch Cursor with ai cursor or Google's Gemini with ai gemini. Behind the scenes, this command does a lot of heavy lifting for you. It automatically injects the active profile's configurations, writes the correct MCP (Model Context Protocol) configuration for the specific tool you're launching, loads any enabled MCP servers relevant to that tool, and then launches the tool in the appropriate mode. This automation drastically reduces the setup time and cognitive load associated with using multiple AI tools, allowing you to focus more on coding and problem-solving.

Furthermore, the ai CLI understands that sometimes you need more specific control or want to access advanced features of these tools. If a tool, like Codex, has its own set of command-line arguments or flags, you can pass them directly through the ai CLI. For instance, running ai codex --help will display the help information for the Codex tool, allowing you to explore its specific options and parameters. This capability ensures that you aren't limited by the ai CLI's abstraction; you can still access the full power and nuance of each individual AI tool. By centralizing the launch and configuration of these powerful AI assistants, the ai CLI makes integrating them into your daily coding routine incredibly efficient and straightforward.
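
In practice, launching tools and passing flags through to them looks something like this:

```bash
ai claude          # launch Claude with the active profile injected
ai cursor          # launch Cursor the same way
ai gemini          # launch Gemini
ai codex --help    # flags after the tool name pass straight through to the tool itself
```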

5. Environment & Variables: Managing Your Configuration

Managing environment variables and configurations is a cornerstone of any development workflow, and the ai CLI provides a dedicated set of ai CLI commands to handle this efficiently. These commands allow you to control settings that influence how your AI tools operate, from API keys to project-specific tokens. The ai env command serves as a central point for managing these variables. Running it lists all the environment variables currently being managed by aidev, giving you a clear picture of your configuration landscape. This is particularly useful for auditing and understanding which settings are active.

Setting environment variables can be done globally or locally (specific to a repository). To set a variable that applies everywhere your ai CLI is used, you use `ai env set <KEY> <VALUE>`. For example, ai env set API_KEY 12345 would set a global API key. If you need a variable to be active only within the current project, you can add the `--local` flag: ai env set LOCAL_TOKEN abcdef --local. This distinction is crucial for security and managing different credentials for different projects. When you need to remove a variable, the ai env unset command does just that, ensuring you can clean up your configuration as needed. For instance, ai env unset API_KEY would remove the previously set global API key.

For more complex configuration needs or when you want to make significant changes, the ai env edit command opens the entire environment configuration file in your default editor. This allows for bulk edits and a more visual approach to managing your settings. Beyond simple management, diagnosing potential issues is also streamlined. The ai doctor command is an invaluable tool for diagnosing environment issues. It can help identify problems such as missing API tokens, conflicting variable values, or incorrect configurations that might be preventing your AI tools from functioning correctly. By providing these comprehensive tools for environment management and diagnostics, the ai CLI ensures that your AI development setup is robust, secure, and consistently operational.
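
Here is a sketch of the environment workflow described above, using the same placeholder values:

```bash
ai env                                  # list all variables managed by aidev
ai env set API_KEY 12345                # set a global variable (placeholder value)
ai env set LOCAL_TOKEN abcdef --local   # set a repo-local variable
ai env unset API_KEY                    # remove a variable you no longer need
ai env edit                             # bulk-edit the environment configuration in your editor
ai doctor                               # diagnose missing tokens or conflicting values
```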

6. Code Review & Analysis Workflows: Enhancing Quality

Code quality is paramount in software development, and leveraging AI for code review and analysis can significantly enhance this process. The ai CLI commands offer powerful features specifically designed for code review, allowing you to catch issues early and improve the overall codebase. The command ai review --staged is your go-to for performing a heuristic code review specifically on your staged changes. This means it analyzes the code you're about to commit, providing feedback before it even enters your version control history. This proactive approach is incredibly valuable for maintaining high code standards.

For a more comprehensive analysis, such as evaluating a large-scale refactoring effort, you can use ai review --all. This command will analyze the entire repository, providing insights into broader architectural changes or potential issues that might arise from significant modifications. The flexibility of the ai CLI extends to the choice of AI providers. You can specify which AI model or service should perform the review using the `--provider` flag. For example, ai review --provider codex --staged directs the review to be performed by Codex on your staged changes. This allows you to choose the best tool for the job, whether it's Codex, Ollama, an external service, or another provider you've configured.

If you're unsure about which providers are available or how to configure them, the command ai review providers will list all the available review providers that your ai CLI can access. Understanding your current review setup is also important, and ai review config provides a clear view of your review-related configurations, helping you verify that everything is set up as intended. By integrating these specialized commands, the ai CLI empowers developers to incorporate AI-driven code analysis and review seamlessly into their development lifecycle, leading to more robust, higher-quality software and a more efficient review process.
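
Putting the review commands together, a typical pre-commit pass might look like this:

```bash
ai review providers                   # list available review providers
ai review config                      # verify the current review configuration
ai review --staged                    # heuristic review of staged changes only
ai review --provider codex --staged   # route the same review through a specific provider
ai review --all                       # broader pass over the whole repository
```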

7. Debugging / Diagnosis Workflows: Solving Complex Problems

When development hits a snag, especially with complex AI systems, efficient debugging and diagnosis are critical. The ai CLI commands provide specialized workflows to help you pinpoint and resolve issues quickly. A common strategy is to switch to a dedicated `debugger` profile. You can activate this profile using ai use debugger. Once this profile is active, you can then launch an AI tool like Claude (ai claude) within this debugging context. This setup ensures that your AI assistant has access to logs, error messages, and potentially debugging configurations that are specific to troubleshooting.

Identifying which backend services or MCP servers are currently active is often a crucial step in diagnosis. The command ai mcp show is invaluable here, as it displays the profile-specific MCP configuration, clearly indicating which MCP servers are enabled and operational for your current context. This helps you verify if the necessary background services for debugging are running. Furthermore, ensuring that your tool integrations are correctly configured is vital. The ai doctor command acts as a diagnostic tool that validates your environment and tool configurations. It can flag issues such as missing API keys, incorrect paths, or misconfigured connections that might be causing your problems.

Sometimes, specific tools might have configuration issues that need addressing. For instance, if you suspect problems with the Codex integration, you might need to regenerate its specific MCP configuration. The command ai mcp generate --tool codex allows you to do just that, ensuring that the Codex tool is set up correctly with the latest MCP configurations. By combining profile switching, MCP server inspection, and diagnostic tools like `ai doctor` and `ai mcp generate`, the ai CLI provides a powerful and structured approach to tackling complex debugging challenges in your AI development projects, significantly reducing the time spent hunting for errors.
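
A condensed debugging pass built from the commands above might look like:

```bash
ai use debugger               # switch into the debugging profile
ai mcp show                   # confirm which MCP servers are active in this context
ai doctor                     # validate environment and tool configuration
ai mcp generate --tool codex  # regenerate the MCP config for one suspect tool
ai claude                     # launch Claude inside the debugging context
```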

8. Onboarding a New Engineer: Accelerating Integration

Bringing new engineers onto a project efficiently is vital for team productivity. The ai CLI commands can play a significant role in accelerating this onboarding process, providing new team members with the context and tools they need to become productive quickly. You can start by setting a dedicated `onboarding` profile using the command ai use onboarding. This profile can be pre-configured with settings and tools specifically useful for someone learning the project's codebase and architecture.

Once the onboarding profile is active, you can launch Claude with repo-aware instructions by simply typing ai claude. This allows the new engineer to immediately start asking questions about the project. To further enhance the onboarding experience, you can enable specific MCP servers that provide valuable project context. For example, enabling the `arch_docs` MCP server (`ai mcp enable arch_docs`) and then regenerating the configurations (`ai mcp generate`) ensures that AI tools can access and understand architectural documentation. This sets the stage for powerful insights into the project's structure.

With the environment prepared, the new engineer can then leverage Claude to generate a guided overview of the repository. A prompt like, “Give me an architectural overview of this repo. What are the key modules, boundaries, and common workflows?” can yield incredibly valuable information. This allows the new team member to quickly grasp the project's essence without needing extensive manual walkthroughs. By using the ai CLI to set up a tailored onboarding environment and facilitate context-aware AI interactions, you can drastically reduce the ramp-up time for new engineers, enabling them to contribute meaningfully much faster.
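
An onboarding setup sketch, assuming the `onboarding` profile and `arch_docs` MCP server described above exist in your installation:

```bash
ai use onboarding          # activate the onboarding profile
ai mcp enable arch_docs    # expose architectural documentation to AI tools
ai mcp generate            # regenerate tool configs so the new server is picked up
ai claude                  # then ask: "Give me an architectural overview of this repo..."
```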

9. Monolith Navigation & Refactoring: Taming Complexity

Working with large, monolithic codebases presents unique challenges. The ai CLI commands offer specialized profiles and tools to help you navigate and refactor these complex systems more effectively. To tackle monoliths, you can switch to a powerful profile designed for this purpose, such as `monolith_surgeon`, by running ai use monolith_surgeon. This profile typically includes configurations and enables tools that are optimized for understanding and manipulating large code structures.

Once in this specialized context, you can use AI tools like Claude (launched via ai claude) to analyze critical parts of the codebase. For instance, you could ask Claude to analyze pack boundaries or specific contexts within the monolith. A targeted prompt might be: “Show me everything that depends on the billing context. Identify unsafe edges.” This kind of AI-driven analysis can reveal hidden dependencies and potential areas of risk that are difficult to spot manually in vast codebases. The ability to query the codebase using natural language is a game-changer for monolith refactoring.

To further aid in navigating and understanding the code, you can enable powerful search capabilities. Enabling the `grep` MCP server (`ai mcp enable grep`) and then verifying its configuration (`ai mcp show`) provides AI-assisted code searching. This allows you to search the repository safely and efficiently, quickly finding relevant code sections, definitions, or usages. By combining specialized profiles with AI-powered analysis and advanced search functionalities, the ai CLI equips developers with the tools needed to confidently tackle the complexities of monolith navigation and refactoring, making large codebases more manageable and easier to evolve.
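
A quick monolith-exploration setup built from the commands above:

```bash
ai use monolith_surgeon   # load the monolith-focused profile
ai mcp enable grep        # add AI-assisted code search
ai mcp show               # verify the grep server is active for this profile
ai claude                 # ask about pack boundaries, dependencies, unsafe edges
```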

10. API Engineering Workflows: Streamlining Design and Documentation

API engineering is a critical discipline, and the ai CLI provides specific tools and workflows to enhance API design, development, and documentation. For API-focused tasks, you can activate the `api_engineer` profile using ai use api_engineer. Within this profile, you can enable MCP servers that are tailored for API development, such as the OpenAPI MCP server, by running ai mcp enable openapi. Following this, it's good practice to regenerate the configurations to ensure all tools recognize the new MCP server: ai mcp generate.

With the OpenAPI MCP enabled, you can leverage AI to significantly speed up the generation or updating of API documentation. Instead of manually writing or updating specification files, you can ask your AI assistant to do the heavy lifting. For example, you can prompt Claude or another integrated tool with a request like: “Generate an updated OpenAPI spec based on the definitions in `/services/users`.” The AI, with the help of the enabled MCP server, can analyze your code, identify API endpoints, parameters, and response structures, and then generate or update your OpenAPI specification accordingly. This not only saves time but also helps ensure that your documentation is accurate and consistent with your implementation.

This workflow is particularly beneficial for maintaining consistency across microservices or when iterating rapidly on API designs. The ability to use natural language prompts to generate and update machine-readable API definitions streamlines the entire API lifecycle. By integrating these specialized commands and profiles, the ai CLI empowers API engineers to focus more on the design and functionality of their APIs, while letting AI handle much of the repetitive and complex documentation tasks. This leads to faster development cycles and higher quality, well-documented APIs.
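
Sketched as a sequence, the API documentation workflow looks roughly like this:

```bash
ai use api_engineer     # switch to the API-focused profile
ai mcp enable openapi   # enable the OpenAPI MCP server
ai mcp generate         # regenerate configs so tools see the new server
ai claude               # prompt: "Generate an updated OpenAPI spec based on /services/users"
```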

11. DevOps / CI/CD Workflows: Automating and Optimizing Operations

DevOps and CI/CD pipelines are essential for modern software delivery, and the ai CLI can assist in automating and optimizing these critical processes. To engage with DevOps-related tasks, you can utilize the `devops` profile by running ai use devops. This profile can be configured with tools and settings relevant to infrastructure management, build processes, and deployment pipelines.

One common task in DevOps is managing containerization. You can use AI tools like Claude, launched within the `devops` profile (ai claude), to interactively inspect Dockerfiles. By asking questions such as, “Explain the build layers and identify opportunities for caching or size reduction,” you can leverage AI to optimize your container images for efficiency and speed. This kind of analysis can lead to faster builds, smaller deployment artifacts, and reduced infrastructure costs.

For teams working with Kubernetes, the ai CLI can also streamline interactions with your cluster. You can enable MCP servers specifically designed for Kubernetes operations, like `k8s_cluster`, using ai mcp enable k8s_cluster. After enabling, remember to regenerate the MCP configurations with ai mcp generate to ensure all tools can properly interface with your Kubernetes environment. This setup can facilitate AI-assisted management of deployments, resource monitoring, and troubleshooting within your cluster. By providing specialized profiles and MCP servers for DevOps and CI/CD tasks, the ai CLI helps teams automate complex operational workflows, optimize infrastructure, and improve the overall efficiency of their software delivery pipelines.
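
A possible Kubernetes-oriented setup, assuming the `devops` profile and `k8s_cluster` server mentioned above are available:

```bash
ai use devops               # switch to the DevOps profile
ai mcp enable k8s_cluster   # enable Kubernetes-oriented tooling
ai mcp generate             # regenerate tool configs
ai claude                   # e.g. ask it to explain Dockerfile build layers or inspect deployments
```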

12. Global Maintenance: Keeping Your Environment Up-to-Date

Maintaining your development environment and AI tool configurations is crucial for smooth operation. The ai CLI commands offer utilities for global maintenance, ensuring your aidev setup remains current and reliable. Over time, new versions of aidev may introduce changes or improvements to configuration formats. The command ai doctor --fix is designed to help you upgrade your aidev configuration to a new version. Running this command can automatically detect outdated configurations and apply necessary fixes, ensuring compatibility and leveraging the latest features.

In addition to upgrades, managing configurations and preventing data loss is essential. The ai backup command, as mentioned earlier, allows you to create backups of your global and project-specific configurations. When needed, you can restore your environment to a previous state using the command ai backup restore. This is invaluable if a configuration change causes unexpected issues or if you need to revert to a known working state quickly. These maintenance commands act as essential safeguards, helping you manage the lifecycle of your aidev configuration and ensure a stable, up-to-date development environment.
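
As a quick reference, a conservative maintenance routine might be:

```bash
ai backup            # snapshot global and project configuration first
ai doctor --fix      # detect and repair outdated configuration after an upgrade
ai backup restore    # roll back to a previous snapshot if something breaks
```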

⭐ Bonus: Daily Workflows Cheat Sheet

To wrap things up, here’s a quick-reference cheat sheet for common daily workflows using the ai CLI. Keep this handy for quick access:

| Workflow | Command |
| --- | --- |
| Daily coding | `ai use pair_programmer && ai claude` |
| Refactoring | `ai use monolith_surgeon && ai claude` |
| Debugging prod issues | `ai use debugger && ai claude` |
| API design | `ai use api_engineer && ai claude` |
| Reviewing a PR | `ai review --provider codex --staged` |
| New repo setup | `ai quickstart` |
| Fix configs | `ai doctor` |

This cheat sheet covers some of the most frequent tasks, combining profile switching with tool launching or specific commands for review and setup. Remember, the goal of these commands is to make your AI development workflow as smooth and efficient as possible. Experiment with different commands and profiles to find what works best for you!

For more in-depth information on AI development tools and best practices, you might find the following resources helpful:

  • Explore the official documentation for the ai CLI for the most up-to-date command references and advanced usage guides.
  • Visit the GitHub repository for insights into the development of the ai CLI and to contribute to the project.