kubectl-ai: AI for Kubernetes CLI Management 2025

Kubernetes just got smarter. Google engineers have built a new open-source tool called kubectl-ai that brings artificial intelligence right into the command line. Instead of typing out complex Kubernetes commands, DevOps teams and SREs can now use simple, natural language to manage their clusters. It’s like having an AI assistant for your Kubernetes CLI.

Let’s explore what kubectl-ai is, how it works, and why it could be a game-changer for the future of cloud-native operations.

What is kubectl-ai?

kubectl-ai, developed by GoogleCloudPlatform, is an innovative tool that brings AI-powered assistance to Kubernetes management, simplifying tasks like troubleshooting, configuration, and learning.

Instead of making you type out long, complex CLI commands or struggle to write valid YAML, it uses AI to interpret natural-language requests such as “show all pods in the dev namespace” or “generate a config,” and returns actionable output or explanations tailored to your cluster. The tool uses generative AI to understand the request and then translates it into the right kubectl command. It acts as a translator between you and Kubernetes, making the cluster feel like something you can have a conversation with.
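For example, a single plain-English prompt can stand in for a command you would otherwise have to look up (the resolved command below is illustrative; actual output depends on the model and your cluster):

# Ask in plain English; --quiet prints the answer without entering the interactive shell
kubectl-ai --quiet "show all pods in the dev namespace"

# The assistant typically resolves this to the equivalent kubectl command:
# kubectl get pods --namespace=dev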



Available on GitHub, it’s designed for users of all levels, from beginners seeking clarity on Kubernetes concepts to experts automating workflows. By default it queries Gemini, but it can also use OpenAI, Grok, and even your own local LLMs.

kubectl-ai Key Features

  1. Talk to Kubernetes like you talk to a Human: You don’t need to remember complex commands anymore. With kubectl-ai, you can just type something like “show all pods in the dev namespace,” and it figures out the exact command for you.
  2. No more Googling kubectl flags: kubectl-ai quickly generates the right command using AI. It’s perfect when you forget the exact syntax or want to avoid typos.
  3. You stay in control: It doesn’t run commands for you automatically. Instead, it shows you what it thinks you want, and you decide whether to run it (see the sketch after this list). Safe and smart.
  4. Works with Gemini, OpenAI, or local AI models: You can connect it to Gemini or OpenAI’s GPT models, or run it with a local model through Ollama if you prefer to keep everything private and offline.
  5. Great for privacy and security: If your team handles sensitive data, kubectl-ai can run completely on your local machine without sending anything to the cloud.
  6. Helps you when you’re stuck: kubectl-ai can suggest helpful commands or offer fixes when something’s not working. It’s like having an experienced Kubernetes buddy on standby.
  7. Easy to use right in your terminal: You don’t need to learn a new interface. kubectl-ai works inside the terminal you’re already using, keeping things simple and familiar.
  8. Open source and backed by Google Engineers: It’s free, open to everyone, and built by folks at Google who really understand Kubernetes. You can even contribute if you want.
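Here is roughly what the stay-in-control flow from item 3 looks like when the assistant wants to touch your cluster (transcript illustrative; the exact wording varies by version):

$ kubectl-ai "restart the nginx deployment in the dev namespace"

  The following command will run on your cluster:

    kubectl rollout restart deployment/nginx --namespace=dev

  Do you want to proceed? (yes/no): yes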

Prerequisites

  • A running Kubernetes cluster.
  • kubectl installed and configured to access the cluster.
  • An API key for your chosen AI model (e.g., Gemini, OpenAI, or Grok), or a local LLM set up with Ollama.
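A quick sanity check for the first two prerequisites before installing the plugin:

# Confirm kubectl can reach a cluster
kubectl cluster-info

# Show which context kubectl is currently pointed at
kubectl config current-context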

How kubectl-ai works

Here’s how it works in general:

  1. Install the kubectl-ai plugin on your local machine
    • Using Pre-built Binaries (Download the latest release from GitHub)
    • Using the install script from the project’s GitHub repository (Linux and macOS). Note that the Homebrew tap sozercan/kubectl-ai is a different, similarly named project, not the GoogleCloudPlatform tool:

curl -sSL https://raw.githubusercontent.com/GoogleCloudPlatform/kubectl-ai/main/install.sh | bash
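If you prefer the pre-built binaries, the usual pattern is to download the archive for your platform from the releases page, extract it, and put the binary on your PATH. The asset name below is hypothetical; check the releases page for the actual file names:

# Hypothetical archive name -- pick the one matching your OS/arch from the releases page
tar -zxvf kubectl-ai_Linux_x86_64.tar.gz
chmod +x kubectl-ai
sudo mv kubectl-ai /usr/local/bin/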

  2. Provide your API key via an environment variable (Gemini, OpenAI, or Grok), or point kubectl-ai at a local LLM running under Ollama.
    • Using Gemini (Default)
export GEMINI_API_KEY=your_api_key_here

  You can also specify a different Gemini model:

kubectl-ai --model gemini-2.5-pro-exp-03-25

# Using the faster 2.5 flash model

kubectl-ai --quiet --model gemini-2.5-flash-preview-04-17 "check logs for nginx app in hello namespace"
  • Using Local AI Models with Ollama
# Assuming ollama is running and you've pulled the gemma3 model

# ollama pull gemma3:12b-it-qat

kubectl-ai --llm-provider ollama --model gemma3:12b-it-qat --enable-tool-use-shim
  • Using OpenAI
export OPENAI_API_KEY=your_openai_api_key_here

kubectl-ai --llm-provider=openai --model=gpt-4.1


  3. When you run kubectl-ai, your prompt is sent to the configured model (Gemini by default). Once installed and configured, you can use kubectl-ai in several ways:
  • Interactive Mode

Simply run kubectl-ai without arguments to enter an interactive shell where you can have a conversation with the assistant, asking multiple questions while maintaining context.
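A session might look something like this (the prompt symbol and responses are illustrative):

$ kubectl-ai
>>> how many pods are running in the default namespace?
# ...the assistant answers using your cluster's state...
>>> and which of them restarted recently?
# ...follow-up questions keep the earlier context...
>>> exit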

  • Direct Commands

You can also run kubectl-ai with a specific task:

       kubectl-ai "fetch logs for nginx app in hello namespace"
  • Unix Integration
      kubectl-ai < query.txt
      # OR
      echo "list pods in the default namespace" | kubectl-ai
      # OR
      cat error.log | kubectl-ai "explain the error"
  4. The model interprets your request and returns:
    • A CLI command (like kubectl get pods --namespace=web)
    • A YAML manifest
    • An explanation or help text
  5. Finally, the response is printed to your terminal, and you can choose to copy, run, or refine it.
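For instance, asking for a manifest instead of a command (the YAML shape below is illustrative; the model’s exact output will vary):

kubectl-ai --quiet "generate a deployment manifest for nginx with 3 replicas"

# Typical shape of the response:
# apiVersion: apps/v1
# kind: Deployment
# metadata:
#   name: nginx
# spec:
#   replicas: 3
#   ...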

The plugin acts as an intelligent assistant, not just a code generator. The interactive shell also recognizes a few special keywords:

  • model: To list the current selected model.
  • models: To list all available models.
  • version: To display the kubectl-ai version.
  • reset: To clear the conversational context.
  • clear: To clear the terminal screen.
  • exit or quit: To terminate the interactive shell.
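These keywords are typed on their own line inside the interactive shell (session illustrative):

>>> models
# lists all available model names
>>> model
# shows the currently selected model
>>> reset
# clears the conversational context
>>> exit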

kubectl-ai: A Game-Changer for the Future of Cloud-Native Operations

kubectl-ai is redefining how developers and DevOps teams interact with Kubernetes. By letting you use natural language instead of memorized commands, it removes the friction of flags, syntax, and YAML boilerplate and brings the power of AI directly into the terminal. Built by Google engineers, this open-source tool is more than just a shortcut. Whether you’re a seasoned SRE or new to Kubernetes, it makes working with clusters faster, easier, and far more intuitive.
