1. Install KCM
Pick the path that matches your current needs. You can start for free in just a few minutes and upgrade to a licensed deployment whenever you're ready.
Install free version
The Community build unlocks topic management, consumer monitoring, Kafka Connect, Schema Registry, and ACL tooling with no cost. Deploy it today with Docker; a Helm chart will be available soon for Kubernetes users.
Docker Compose (recommended)
- Clone the repository and jump into the Docker example directory:

```shell
git clone https://github.com/kcmhub/KCM.git
cd KCM/00_kcm_with_docker_compose
```

- Create the shared Docker network used by the compose stack:

```shell
docker network create kcm-net
```

- Start the services (Kafka, Schema Registry, KCM UI/API, etc.):

```shell
docker compose up -d
```
This brings the full demo environment online at http://localhost. Default
credentials and service ports are preconfigured inside the compose project.
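For orientation, a compose stack of this shape might look like the sketch below. The service names, images, and port mappings here are illustrative assumptions only; the authoritative file ships in the 00_kcm_with_docker_compose directory:

```yaml
# Illustrative sketch only -- see the repository's compose file for real values.
services:
  kafka:
    image: apache/kafka:latest          # assumed image
    networks: [kcm-net]
  schema-registry:
    image: confluentinc/cp-schema-registry:latest
    networks: [kcm-net]
  kcm:
    image: kcmhub/kcm:latest            # hypothetical image name
    ports:
      - "80:8080"                       # serves the UI at http://localhost
    networks: [kcm-net]

networks:
  kcm-net:
    external: true                      # matches `docker network create kcm-net`
```

The `external: true` entry is why the docs have you create kcm-net up front: the compose project attaches to the pre-existing network instead of creating its own.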
Pay & install license
Upgrade to the Pro plan to unlock multi-cluster management, unlimited users, advanced diagnostics, and enterprise integrations.
- Head over to the pricing page and click Subscribe. Complete the checkout form with your billing details.
- Once payment is confirmed, you'll receive an email with your license key and a link to the customer portal.
- In the KCM web console, sign in as an administrator and open Settings > License, then pick the Update tab.
- Select your preferred input method:
- File (.kcmlic): Click Browse, choose the license file from the email, and press Upload.
- Copy / Paste: Switch the toggle and paste the raw license token if your environment blocks file uploads.
- KCM validates the signature and restarts premium modules automatically. Confirm that the status badge now reads "Pro" and that Cluster Diagnostics is available.
After installing KCM, open http://localhost (or your deployed URL) and sign in with your administrator credentials from the welcome screen to unlock setup actions such as cluster registration and access control.
2. Connect your Kafka clusters
Create a group
KCM lets you organize environments or sets of Kafka clusters into groups. Use them to keep production, staging, or regional footprints separated while still working from a single control plane.
- Open the Groups page from the top navigation.
- Click the blue ADD button to create a new entry.
- Provide a descriptive name (for example cloud_prod) and pick the color that will identify the group.
- Confirm with OK, then press ENTER on the new card to dive into the group workspace.
The banner at the top of the group view automatically adopts the color you selected. That visual cue helps teams immediately recognize which environment they are managing.
Register Kafka clusters
With your group ready, capture each Kafka environment you want to manage:
- Inside the group press the blue ADD button to launch the creation wizard.
- In the General tab, give the cluster a descriptive name, list one or more bootstrap servers separated by commas, and enable metrics if you want KCM to collect built-in telemetry.
- Expand the Security section to pick your protocol (PLAINTEXT, SASL, TLS) and the matching SASL mechanism. Provide broker credentials and upload truststores as needed.
- Click Create. Once the cluster appears in the list you can reopen it at any time to tweak endpoints or security settings.
- Once created, use Test Connection to validate connectivity. A green success banner confirms KCM can reach the brokers with the provided details.
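As a concrete example, the Security section for a SASL over TLS cluster maps to standard Kafka client settings like the ones below. The mechanism, username, password, and truststore path are placeholder values; substitute your own broker credentials:

```properties
# Illustrative values only -- replace with your environment's credentials.
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="kcm-service" password="changeme";
ssl.truststore.location=/etc/kcm/truststore.jks
ssl.truststore.password=changeme
```

If Test Connection fails, these are the values to double-check first: a protocol/mechanism mismatch or an untrusted broker certificate are the most common causes.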
You can register up to five clusters on the Pro plan. Enterprise plans support unlimited Kafka clusters.
3. Explore the web console
Overview
The Overview tab gives you an instant health snapshot: controller status, online brokers, topic counters, partition distribution, and the Kafka version running in your cluster.
Topics
Browse every topic at a glance, drill into configs, and watch throughput trends update live, all without touching the CLI. Tabs surface metadata, partitions, and consumer offsets, while the Messages view lets you tail records in real time for quick debugging.
- List existing topics, create new ones, and edit or delete those you no longer need.
- Purge accumulated data in a few clicks when you need a clean slate.
- Export the current table view to CSV; the download respects any filters you've applied.
- Tune advanced configurations without memorizing property keys.
- See which brokers host each partition, track ISR and replicas, and trigger reassignment.
- Discover the consumer groups currently reading from the topic to understand downstream impact.
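Under the hood, partition reassignment in Kafka is expressed as a plan listing the desired replica set per partition, the same format used by Kafka's own kafka-reassign-partitions tool. A minimal plan moving one partition (topic name and broker IDs are examples):

```json
{
  "version": 1,
  "partitions": [
    { "topic": "orders", "partition": 0, "replicas": [1, 2, 3] }
  ]
}
```

The first broker in the replicas list becomes the preferred leader, which is why reordering the list, not just changing its members, affects leadership balance.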
Consumer Groups
Monitor consumption at the group level, from creation through ongoing lag tracking. KCM lets you bootstrap new groups, reset offsets safely, and visualize how traffic is distributed across topics and partitions without touching the command line.
- Create test or production consumer groups and attach one or many topics in a single dialog.
- Choose the offset reset strategy (latest, earliest, custom timestamp) before the group starts consuming.
- Review a recap screen prior to confirmation so you know exactly which partitions and reset modes will be applied.
- Drill into lag dashboards to spot back-pressure by topic, then zoom into partitions that need attention.
- Force kill members or trigger offset resets on demand when stuck consumers block progress.
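The lag figures in those dashboards come down to simple per-partition arithmetic: the broker's log-end offset minus the group's last committed offset. KCM computes this internally; the function below is a purely illustrative sketch of the calculation:

```python
def consumer_lag(log_end_offsets, committed_offsets):
    """Per-partition lag: records written to the log but not yet consumed.

    A partition with no committed offset is treated as fully lagging
    (committed offset 0), matching an 'earliest' reset starting point.
    """
    return {
        partition: end - committed_offsets.get(partition, 0)
        for partition, end in log_end_offsets.items()
    }

# Example: partition 0 is caught up, partition 1 is 250 records behind.
lag = consumer_lag({0: 1000, 1: 1500}, {0: 1000, 1: 1250})
```

Steadily growing lag on a single partition usually points at a hot key or a stuck member, which is exactly when the force-kill and offset-reset actions above come into play.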
Kafka Connect
Deploy, pause, resume, and delete connectors. View connector configs with syntax highlighting and change history.
ACL Management
Create, modify, and audit ACL entries with rollback support. Exporting ACL sets as YAML for GitOps pipelines is on the roadmap and will roll out in an upcoming release.
Cluster Diagnostics
Run health checks, broker config diff, log compaction scans, and produce tailored remediation guides.
4. Common workflows
Create a new topic
- Open Topics and click Create Topic.
- Fill in partitions, replication factor, and optional configs.
- Review the summary and confirm. KCM applies settings via the Kafka Admin API.
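Before confirming, it helps to sanity-check the values against the same basic constraints the Kafka Admin API enforces: partition and replication counts must be positive, and the replication factor cannot exceed the number of available brokers. A small illustrative check (not KCM code):

```python
def validate_topic_settings(partitions, replication_factor, broker_count):
    """Mirror the basic constraints Kafka enforces at topic creation."""
    if partitions < 1:
        return "partitions must be at least 1"
    if replication_factor < 1:
        return "replication factor must be at least 1"
    if replication_factor > broker_count:
        return (f"replication factor {replication_factor} "
                f"exceeds broker count {broker_count}")
    return "ok"
```

Requesting a replication factor above the broker count is the classic first-topic mistake on a single-node demo cluster, and the Admin API rejects it outright.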
Produce a Kafka message
- In Topics, open the action menu for your target topic and choose Produce.
- Fill in optional headers, the message key, and the payload value. Headers can be toggled on or off individually.
- If you have a Schema Registry connected, pick it from the dropdown. Enable Key with Schema and/or Value with Schema so KCM validates and transforms your JSON payload into the registered schema automatically.
- Click Produce to send the record. A green toast confirms the partition and offset that were written.
- Optionally use Save as Template to store the current message for future tests. Templates keep headers and payloads so repeated checks take seconds.
Apply a saved template
- In the Produce dialog, click the down arrow next to Paste from template and choose whether to browse all templates or only those linked to the current topic.
- Select the template you want and press Use this template. KCM injects the saved headers, key, value, and schema settings.
- If the template was created for a different topic, confirm whether you want to keep your current topic or switch to the template's original topic.
Consume Kafka messages
Use the Data Consumer workspace to tail records live, checkpoint what you receive, and copy interesting messages for replays or regression tests.
- Open Data Consumer, pick the cluster and topic you want to inspect, and optionally select partitions or a starting offset window.
- Click Start to begin streaming messages to the table. Press Stop any time to halt the flow and keep the records visible for review.
- Use Filter to narrow the stream by key, value, or headers, and apply the filters while the consumer keeps running.
- When you spot an interesting record, open the Headers pill to view every header/value pair, or use the row menu to Replay or Copy as template.
- Download the captured dataset with Export to CSV for offline debugging or attaching to a ticket.
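Because the export is plain CSV, post-processing is easy to script. For instance, keeping only the rows for a single key; note the column layout below is an assumed example, not KCM's exact export format:

```python
import csv
import io

# Assumed sample of an "Export to CSV" download; real column names may differ.
export = """partition,offset,key,value
0,100,order-1,created
1,101,order-2,cancelled
0,102,order-1,paid
"""

def rows_for_key(csv_text, key):
    """Return the exported rows whose 'key' column matches."""
    return [row for row in csv.DictReader(io.StringIO(csv_text))
            if row["key"] == key]

matches = rows_for_key(export, "order-1")
```

The same approach works for slicing an export by partition or offset range before attaching it to a ticket.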
Deploy a Kafka Connect connector
- Go to Kafka Connect > New Connector.
- Choose the connector class from the catalog.
- Paste or compose the JSON configuration.
- Validate with KCM's schema-aware checker and launch.
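For step 3, a minimal sink configuration looks like the snippet below. FileStreamSinkConnector ships with Kafka Connect itself, which makes it a convenient smoke test; the connector name, topic, and file path are placeholder values:

```json
{
  "name": "orders-file-sink",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
    "tasks.max": "1",
    "topics": "orders",
    "file": "/tmp/orders.txt"
  }
}
```

Once launched, records from the orders topic are appended to the target file, an easy way to confirm the Connect worker is wired up before deploying a production connector.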
Audit changes
Browse the Audit section to view who changed what. Export audits to CSV or stream them to your SIEM.
5. CLI & automation
A lightweight CLI for scripting routine operations is on the product roadmap. The commands below preview the planned experience.
```shell
# List clusters
kcmctl clusters list

# Trigger diagnostics on a specific cluster
kcmctl diagnostics run --cluster prod-eu-1

# Export ACLs to YAML
kcmctl acls export --cluster prod-eu-1 --out acls.yaml
```
Authenticate with an API token via kcmctl login --token <your-token>. Tokens
can be generated in the web console under Profile > API Tokens.
6. Troubleshooting
Health checks fail
Verify broker reachability from the KCM host (telnet broker 9092) and confirm that
TLS certificates are trusted.
Connectors not visible
Ensure the Kafka Connect REST endpoint allows CORS from the KCM domain and that the service account has the Connector:Read permission.
"Unauthorized" errors
Review RBAC assignments under Admin > Access Control. Each module (Topics, ACLs, etc.) has granular permissions.
Log locations: Application logs live under logs/kcm-app.log (or
Docker container stdout). Enable debug mode with KCM_LOG_LEVEL=debug for deeper
tracing.
7. Need help?
We're here to help you succeed with Kafka.
- Open a ticket from within the app (Help menu).
- Email: support@kcmhub.io
- Join the community: GitHub Discussions