📌 Prerequisites
✔ Windows / Linux / macOS
✔ VS Code installed
✔ Kubernetes cluster (Minikube used here)
✔ kubectl configured
✔ Docker installed
✔ Basic Kubernetes knowledge
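Quick check (optional): you can confirm the CLI tooling is in place before starting. Version numbers will differ on your machine.
# Sanity-check the prerequisite CLIs
kubectl version --client
docker --version
minikube status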
🧠 Architecture Overview
VS Code (Continue Extension) ----(Ollama API)----> Local LLM (Ollama – Llama 3.1)
        |
        | (SSE / MCP)
        v
Kubernetes MCP Server
        |
        | (Kubernetes API)
        v
Minikube Cluster
Key idea:
Continue talks to Ollama for AI reasoning and to the Kubernetes MCP server for real cluster data.
🚀 Step 1: Install Ollama (Local LLM – Free)
Download Ollama
👉 https://ollama.ai/download
Verify installation
ollama --version
Pull model (IMPORTANT)
ollama pull llama3.1
Verify model
ollama list
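As a final sanity check, you can send the model a one-off prompt straight from the terminal (the reply text will vary):
# One-off prompt to confirm the model loads and responds
ollama run llama3.1 "Say hello in one sentence."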
🧩 Step 2: Install Continue Extension in VS Code
Open VS Code
Go to Extensions
Search Continue
Install Continue.dev
Reload VS Code
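If you prefer the terminal, the extension can also be installed with the VS Code CLI. The marketplace ID below (Continue.continue) is an assumption; check the extension's page in VS Code if it does not resolve.
# Install the Continue extension from the command line
# (Continue.continue is the assumed marketplace ID)
code --install-extension Continue.continue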
🤖 Step 3: Add Ollama Model in Continue
Open Continue panel (left sidebar)
Click Select model → Add Chat Model
Fill details:
Provider: Ollama
Model: Llama3.1 Chat
Click Connect
Select Llama3.1 Chat
✅ At this point, Continue works with local AI.
☸️ Step 4: Deploy Kubernetes MCP Server
Create namespace
kubectl create namespace mcp
Deployment YAML
apiVersion: apps/v1
kind: Deployment
metadata:
  name: kubernetes-mcp-server
  namespace: mcp
spec:
  replicas: 1
  selector:
    matchLabels:
      app: kubernetes-mcp-server
  template:
    metadata:
      labels:
        app: kubernetes-mcp-server
    spec:
      containers:
        - name: mcp
          image: ghcr.io/containers/kubernetes-mcp-server:latest
          args:
            - "--port"
            - "3000"
          ports:
            - containerPort: 3000
Service YAML
apiVersion: v1
kind: Service
metadata:
  name: kubernetes-mcp-server
  namespace: mcp
spec:
  selector:
    app: kubernetes-mcp-server
  ports:
    - port: 3000
      targetPort: 3000
Apply
kubectl apply -f deployment.yaml
kubectl apply -f service.yaml
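Wait for the pod to become ready before moving on. Also note that what the MCP server can see is limited by its service account: if tool calls later fail with authorization errors, a broad, demo-only fix is to bind the namespace's default service account to the built-in view role (the binding name mcp-view is arbitrary).
# Wait until the Deployment's pod is ready
kubectl rollout status deployment/kubernetes-mcp-server -n mcp

# Demo-only RBAC sketch: give the pod's service account read access
kubectl create clusterrolebinding mcp-view \
  --clusterrole=view \
  --serviceaccount=mcp:default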
🔌 Step 5: Port-forward MCP Server
kubectl port-forward -n mcp svc/kubernetes-mcp-server 3000:3000
Keep this terminal open.
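From a second terminal, you can probe the forwarded port. Any HTTP status code means the tunnel is up; a "connection refused" error means the port-forward is not running.
# Print only the HTTP status code from the MCP endpoint
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000/mcp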
⚙️ Step 6: Configure Continue MCP (config.yaml)
Path (Windows):
C:\Users\<username>\.continue\config.yaml
On Linux/macOS the file is ~/.continue/config.yaml.
Final Working Config (VERY IMPORTANT)
name: Local Config
version: 1.0.0
schema: v1

models:
  - name: Llama3.1 Chat
    provider: ollama
    model: llama3.1

mcpServers:
  - name: kubernetes
    type: sse
    url: http://localhost:3000/mcp
Reload VS Code
Ctrl + Shift + P → Reload Window
✅ Step 7: Verify Everything Works
Test the AI (type in the Continue chat):
hello
Discover MCP tools:
What tools are available?
Ask for real Kubernetes data:
List pods in the mcp namespace
🎉 If you see real cluster output → success!
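To double-check, compare the model's answer with the cluster directly; the pod list should match what Continue reports.
# Cross-check the AI's answer against kubectl
kubectl get pods -n mcp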