Monday, 15 December 2025

MCP Server

Minikube + MCP Server + VS Code + Continue + Ollama


VS Code

  ↓

Continue Extension

  ↓

MCP Protocol (SSE)

  ↓

kubernetes-mcp-server

  ↓

Kubernetes API (via ServiceAccount)



VS Code
 └─ Continue Extension
     ├─ LLM (Ollama / OpenAI / etc.)
     └─ MCP Server → http://localhost:3000/mcp
         └─ kubernetes-mcp-server
             └─ Minikube cluster

🥇 OPTION 1 (RECOMMENDED): Build image locally & load into Minikube

This avoids GHCR entirely.

Step 1: Clone repo locally

git clone https://github.com/containers/kubernetes-mcp-server.git
cd kubernetes-mcp-server

Step 2: Build Docker image

If using Docker (recommended):

docker build -t kubernetes-mcp-server:local .

If using Podman:

podman build -t kubernetes-mcp-server:local .

Step 3: Load image into Minikube

minikube image load kubernetes-mcp-server:local
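To confirm the image actually landed inside the Minikube node before deploying, it can be listed from the node's image cache (a quick sanity check; `findstr` is the PowerShell equivalent of `grep`):

```shell
# List images cached inside the Minikube node and filter for the one we loaded.
minikube image ls | findstr kubernetes-mcp-server
```

If nothing is printed, the load did not succeed and the Deployment below will fail with ErrImagePull, since `imagePullPolicy: IfNotPresent` is only useful when the image is already on the node.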



apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: mcp-reader
rules:
  - apiGroups: [""]
    resources:
      - pods
      - services
      - nodes
      - events
      - namespaces
    verbs: ["get", "list", "watch"]
  - apiGroups: ["apps"]
    resources:
      - deployments
      - replicasets
      - statefulsets
    verbs: ["get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: mcp-reader-binding
subjects:
  - kind: ServiceAccount
    name: mcp-sa
    namespace: mcp
roleRef:
  kind: ClusterRole
  name: mcp-reader
  apiGroup: rbac.authorization.k8s.io
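The ClusterRoleBinding above points at a ServiceAccount `mcp-sa` in a namespace `mcp`, but neither is created by the RBAC manifest itself. A minimal manifest to create both (names taken from the binding above) — apply it before the RBAC and Deployment manifests:

```yaml
apiVersion: v1
kind: Namespace
metadata:
  name: mcp
---
apiVersion: v1
kind: ServiceAccount
metadata:
  name: mcp-sa
  namespace: mcp
```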
apiVersion: apps/v1
kind: Deployment
metadata:
  name: kubernetes-mcp-server
  namespace: mcp
spec:
  replicas: 1
  selector:
    matchLabels:
      app: kubernetes-mcp-server
  template:
    metadata:
      labels:
        app: kubernetes-mcp-server
    spec:
      serviceAccountName: mcp-sa
      containers:
        - name: mcp
          image: kubernetes-mcp-server:local
          imagePullPolicy: IfNotPresent
          args:
            - --port
            - "3000"
          env:
            - name: KUBERNETES_NAMESPACE
              valueFrom:
                fieldRef:
                  fieldPath: metadata.namespace
          ports:
            - containerPort: 3000
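The port-forward step later in these notes targets `svc/kubernetes-mcp-server`, but no Service manifest appears above. A minimal ClusterIP Service matching the Deployment's label and container port (a sketch; adjust if your setup differs):

```yaml
apiVersion: v1
kind: Service
metadata:
  name: kubernetes-mcp-server
  namespace: mcp
spec:
  selector:
    app: kubernetes-mcp-server
  ports:
    - port: 3000
      targetPort: 3000
```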


PS C:\Users\Raj Kumar Gupta\Desktop\Raj\minikube> kubectl port-forward -n mcp svc/kubernetes-mcp-server 3000:3000

Forwarding from 127.0.0.1:3000 -> 3000

Forwarding from [::1]:3000 -> 3000

Handling connection for 3000
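With the port-forward running, the endpoint can be smoke-tested outside the editor. A sketch using curl — the JSON-RPC `initialize` request shape follows the MCP specification; the exact headers accepted and the response body depend on the server version, so treat this as a connectivity check rather than a guaranteed exchange:

```shell
# Single-quoted JSON works in bash / Git Bash; PowerShell quoting differs.
curl -s -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"curl","version":"0"}}}'
```

Any JSON (or SSE) response back confirms the MCP server is reachable on the forwarded port, which is all Continue needs.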

============================================

STEP 1️⃣ Install Ollama (Windows)

Download Ollama

👉 https://ollama.com/download

  1. Download Windows installer

  2. Install (default options)

  3. Reboot recommended


Verify Ollama installation

Open PowerShell:

ollama --version


PS C:\WINDOWS\system32> cd C:\Users\"Raj Kumar Gupta"\Desktop\Raj\minikube

PS C:\Users\Raj Kumar Gupta\Desktop\Raj\minikube> ollama --version

ollama version is 0.13.0

PS C:\Users\Raj Kumar Gupta\Desktop\Raj\minikube> ollama pull llama3.1

pulling manifest

pulling 667b0c1932bc: 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 4.9 GB

pulling 948af2743fc7: 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 1.5 KB

pulling 0ba8f0e314b4: 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏  12 KB

pulling 56bb8bd477a5: 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏   96 B

pulling 455f34728c9b: 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏  487 B

verifying sha256 digest

writing manifest

success

PS C:\Users\Raj Kumar Gupta\Desktop\Raj\minikube> ollama run llama3.1

>>> hello

Hello! How are you today? Is there something I can help you with or would you like to chat?


============================================

Continue config file: C:\Users\Raj Kumar Gupta\.continue\config.json

{
  "models": [
    {
      "title": "Ollama (Local)",
      "provider": "ollama",
      "model": "llama3.1"
    }
  ],
  "mcpServers": [
    {
      "name": "kubernetes",
      "transport": "http",
      "url": "http://localhost:3000/mcp"
    }
  ]
}
