KCM Documentation

Your guide to installing, configuring, and operating Kafka Cluster Manager.

1. Install KCM

Pick the path that matches your current needs. You can start for free in just a few minutes and upgrade to a licensed deployment whenever you're ready.

Install free version

The Community build unlocks topic management, consumer monitoring, Kafka Connect, Schema Registry, and ACL tooling with no cost. Deploy it today with Docker; a Helm chart will be available soon for Kubernetes users.

Docker Compose (recommended)

  1. Clone the repository and jump into the Docker example directory:
    git clone https://github.com/kcmhub/KCM.git
    cd KCM/00_kcm_with_docker_compose
  2. Create the shared Docker network used by the compose stack:
    docker network create kcm-net
  3. Start the services (Kafka, Schema Registry, KCM UI/API, etc.):
    docker compose up -d

This brings the full demo environment online at http://localhost. Default credentials and service ports are preconfigured inside the compose project.
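The compose project defines the whole stack for you, but it helps to know its rough shape. The sketch below is illustrative only (image names, tags, and port mappings are placeholders); the compose file shipped in 00_kcm_with_docker_compose is the authoritative definition:

```yaml
# Illustrative sketch -- the file in 00_kcm_with_docker_compose is authoritative.
services:
  kafka:
    image: apache/kafka:latest            # broker (tag is a placeholder)
    networks: [kcm-net]
  schema-registry:
    image: confluentinc/cp-schema-registry:latest   # placeholder tag
    depends_on: [kafka]
    networks: [kcm-net]
  kcm:
    image: kcmhub/kcm:latest              # hypothetical image name
    ports:
      - "80:8080"                         # serves the UI/API at http://localhost
    networks: [kcm-net]

networks:
  kcm-net:
    external: true    # matches `docker network create kcm-net` from step 2
```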

Helm chart: Native Kubernetes installation is in the final stages. Watch the release notes or subscribe to our newsletter to be notified when the Helm chart ships.

Pay & install license

Upgrade to the Pro plan to unlock multi-cluster management, unlimited users, advanced diagnostics, and enterprise integrations.

  1. Head over to the pricing page and click Subscribe. Complete the checkout form with your billing details.
  2. Once payment is confirmed, you'll receive an email with your license key and a link to the customer portal.
  3. In the KCM web console, sign in as an administrator and open Settings > License, then pick the Update tab.
  4. Select your preferred input method:
    • File (.kcmlic): Click Browse, choose the license file from the email, and press Upload.
    • Copy / Paste: Switch the toggle and paste the raw license token if your environment blocks file uploads.
  5. KCM validates the signature and restarts premium modules automatically. Confirm that the status badge now reads "Pro" and that Cluster Diagnostics is available.
KCM license upload dialog showing .kcmlic file selection
Upload your .kcmlic license file or paste the key directly from the Update tab.
Offline installs: If your cluster cannot reach the internet, contact support@kcmhub.io for an offline activation bundle.

After installing KCM, sign in with your administrator credentials from the welcome screen to unlock setup actions such as cluster registration and access control.

KCM login page requesting email and password for administrator access
The login screen is accessible at http://localhost (or your deployed URL). Use your admin account to continue configuration.

2. Connect your Kafka clusters

Create a group

KCM lets you organize environments or sets of Kafka clusters into groups. Use them to keep production, staging, or regional footprints separated while still working from a single control plane.

  1. Open the Groups page from the top navigation.
  2. Click the blue ADD button to create a new entry.
  3. Provide a descriptive name (for example cloud_prod) and pick the color that will identify the group.
  4. Confirm with OK, then press ENTER on the new card to dive into the group workspace.
KCM groups page listing existing environments
Start from the Groups page to review existing environments and create a new one.
Modal for creating a new KCM group with name and color
Use the Add Group modal to choose a name and color badge for your environment.

The banner at the top of the group view automatically adopts the color you selected. That visual cue helps teams immediately recognize which environment they are managing.

Group detail page showing colored banner matching chosen color
After entering the group workspace, the header switches to the chosen color so you always know where you are.

Register Kafka clusters

With your group ready, capture each Kafka environment you want to manage:

  1. Inside the group press the blue ADD button to launch the creation wizard.
  2. In the General tab, give the cluster a descriptive name, list one or more bootstrap servers separated by commas, and enable metrics if you want KCM to collect built-in telemetry.
  3. Expand the Security section to pick your protocol (PLAINTEXT, SASL, TLS) and the matching SASL mechanism. Provide broker credentials and upload truststores as needed.
  4. Click Create. The cluster appears in the list, and you can reopen it at any time to tweak endpoints or security settings.
  5. Use Test Connection to validate connectivity. A green success banner confirms KCM can reach the brokers with the provided details.
General tab of the Add Kafka Cluster dialog showing cluster name, bootstrap servers, and metrics toggle
Fill in the General tab with the cluster name, bootstrap servers, and optional metrics settings.
Security tab of the Add Kafka Cluster dialog highlighting protocol, SASL mechanism, and credential inputs
Select your security protocol, SASL mechanism, and provide credentials or certificates before saving.
Edit Kafka Cluster dialog with Test Connection button available
Reopen the cluster to adjust settings or trigger a live connectivity test.
Green success toast after testing the Kafka cluster connection
A green toast confirms the brokers responded and the configuration is valid.
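The fields in the Security tab map onto standard Kafka client properties. For a SASL_SSL cluster, the equivalent client configuration KCM needs looks roughly like this (host names, credentials, and paths are placeholders):

```properties
bootstrap.servers=broker-1:9093,broker-2:9093
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="kcm-service" password="<secret>";
ssl.truststore.location=/etc/kcm/truststore.jks
ssl.truststore.password=<truststore-secret>
```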

You can register up to five clusters on the Pro plan. Enterprise plans support unlimited Kafka clusters.

3. Explore the web console

Overview

The Overview tab gives you an instant health snapshot: controller status, online brokers, topic counters, partition distribution, and the Kafka version running in your cluster.

KCM overview dashboard showing broker health cards, Kafka version, and broker list
The Overview screen highlights controller, broker, and topic metrics so you can spot issues at a glance.

Topics

Browse every topic at a glance, drill into configs, and watch throughput trends update live, all without touching the CLI. Tabs surface metadata, partitions, and consumer offsets, while the Messages view lets you tail records in real time for quick debugging.

KCM topics list with simple columns showing name, partitions, and replication
The default Topics dashboard gives you a clean list with core details like partitions and replication factor.
KCM topics list with advanced columns enabled including min ISR and total size
Toggle Advanced Configs to reveal deeper metrics such as minimum ISR and total storage per topic.
Topics row action menu offering edit, produce, delete, and purge options
The row action menu lets you jump into editing, produce test messages, or delete and purge topics directly from the list.
  • List existing topics, create new ones, and edit or delete those you no longer need.
  • Purge accumulated data in a few clicks when you need a clean slate.
  • Export the current table view to CSV; the download respects any filters you've applied.
  • Tune advanced configurations without memorizing property keys.
  • See which brokers host each partition, track ISR and replicas, and trigger reassignment.
  • Discover the consumer groups currently reading from the topic to understand downstream impact.
Create Kafka Topic modal showing name, partitions, replication factor, and retention.
Creating a topic lets you set core parameters like partitions, replication factor, cleanup policy, and retention.
Topic configuration screen listing advanced per-topic settings with toggles.
After the topic exists, advanced config toggles make it easy to fine-tune retention, compression, and other broker-level options without memorizing property names.
Topic partitions tab showing leaders, replicas, ISR counts, and reassignment controls.
The Partitions tab visualizes leader and replica placement, helping you plan reassignment and monitor ISR membership.
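The advanced toggles correspond to standard per-topic Kafka configs. A few you will commonly adjust (the values shown are examples, not recommendations):

```properties
retention.ms=604800000        # keep data for 7 days
cleanup.policy=delete         # or "compact" for changelog-style topics
min.insync.replicas=2         # with acks=all, tolerate one replica outage
compression.type=producer     # keep whatever compression the producer used
segment.bytes=1073741824      # roll log segments at 1 GiB
```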

Consumer Groups

Consumer Groups list showing group name and state
The list view surfaces essential information at a glance: group state and quick access to describe actions.
Consumer Groups list with Advanced Configs enabled, showing coordinator tooltip
Enable Advanced Configs to reveal coordinator brokers, members, and export-to-CSV controls.

Monitor consumption at the group level, from creation through ongoing lag tracking. KCM lets you bootstrap new groups, reset offsets safely, and visualize how traffic is distributed across topics and partitions without touching the command line.

  • Create test or production consumer groups and attach one or many topics in a single dialog.
  • Choose the offset reset strategy (latest, earliest, custom timestamp) before the group starts consuming.
  • Review a recap screen prior to confirmation so you know exactly which partitions and reset modes will be applied.
  • Drill into lag dashboards to spot back-pressure by topic, then zoom into partitions that need attention.
  • Force kill members or trigger offset resets on demand when stuck consumers block progress.
New consumer group modal with name, offset reset mode, topic selection, and topic mode toggle
Use the creation modal to name the group, select where offsets should start, and add topics without leaving the page.
Recap dialog summarizing topic, partitions, and offset reset choice before creating the group
Confirm the reset configuration in the recap dialog before KCM provisions the group.
Consumer group overview modal showing lag by topic chart
Lag by topic charts reveal where consumers are falling behind so you can scale or rebalance quickly.
Consumer group overview filtered to a topic, displaying lag per partition
Drill down into a topic to inspect lag per partition and zero in on the slow followers.
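Lag is simply the distance between a partition's latest offset and the group's committed offset, summed across the topic. A minimal sketch of the arithmetic behind the lag charts (the function names and offset values are illustrative, not KCM internals):

```python
def partition_lag(end_offset: int, committed_offset: int) -> int:
    """Lag for one partition: records written but not yet consumed."""
    return max(end_offset - committed_offset, 0)

def topic_lag(partitions: dict) -> int:
    """Sum lag over a topic, given {partition: (end_offset, committed_offset)}."""
    return sum(partition_lag(end, committed) for end, committed in partitions.values())

# Example: partition 0 is caught up, partition 1 is 150 records behind.
offsets = {0: (1_000, 1_000), 1: (2_500, 2_350)}
print(topic_lag(offsets))  # 150
```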

Kafka Connect

Deploy, pause, resume, and delete connectors. View connector configs with syntax highlighting and change history.

ACL Management

Create, modify, and audit ACL entries with rollback support. Exporting ACL sets as YAML for GitOps pipelines is on the roadmap for an upcoming release.

Cluster Diagnostics

Run health checks, broker config diff, log compaction scans, and produce tailored remediation guides.

4. Common workflows

Create a new topic

  1. Open Topics and click Create Topic.
  2. Fill in partitions, replication factor, and optional configs.
  3. Review the summary and confirm. KCM applies settings via the Kafka Admin API.
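The summary step catches obvious mistakes before anything reaches the cluster. Here is a hedged sketch of the kind of pre-flight checks such a review performs (this function is illustrative, not KCM's actual code):

```python
def validate_topic_request(name: str, partitions: int,
                           replication_factor: int, broker_count: int) -> list:
    """Return a list of problems; an empty list means the request looks sane."""
    problems = []
    if not name or any(c in name for c in " /\\"):
        problems.append("topic name must be non-empty and contain no spaces or slashes")
    if partitions < 1:
        problems.append("partitions must be at least 1")
    if replication_factor < 1:
        problems.append("replication factor must be at least 1")
    if replication_factor > broker_count:
        problems.append("replication factor cannot exceed the number of brokers")
    return problems

# A sane request on a 3-broker cluster produces no complaints.
print(validate_topic_request("orders", partitions=6, replication_factor=3, broker_count=3))  # []
```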

Produce a Kafka message

  1. In Topics, open the action menu for your target topic and choose Produce.
  2. Fill in optional headers, the message key, and the payload value. Headers can be toggled on or off individually.
  3. If you have a Schema Registry connected, pick it from the dropdown. Enable Key with Schema and/or Value with Schema so KCM validates and transforms your JSON payload into the registered schema automatically.
  4. Click Produce to send the record. A green toast confirms the partition and offset that were written.
  5. Optionally use Save as Template to store the current message for future tests. Templates keep headers and payloads so repeated checks take seconds.
Produce dialog showing headers, key, value, and schema registry options
The Produce dialog lets you enrich messages with headers, keys, and schema-aware payloads.
Success toast confirming message produced with partition and offset details
A success toast confirms the exact partition and offset on successful publish.
Save as template dialog prompting for name, description, and sharing toggle
Save the message as a reusable template to speed up regression testing. Toggle Shared to make the template available to teammates who have write access on this topic and cluster.
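A saved template is essentially the full message form captured as data. A hypothetical template could serialize along these lines (the field names are illustrative; KCM's actual storage format is not documented here):

```json
{
  "name": "order-created-smoke-test",
  "topic": "orders",
  "headers": {"trace-id": "abc-123", "source": "kcm-ui"},
  "key": "order-42",
  "value": {"orderId": 42, "status": "CREATED"},
  "valueWithSchema": true,
  "shared": false
}
```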

Apply a saved template

  1. In the Produce dialog, click the down arrow next to Paste from template and choose whether to browse all templates or only those linked to the current topic.
  2. Select the template you want and press Use this template. KCM injects the saved headers, key, value, and schema settings.
  3. If the template was created for a different topic, confirm whether you want to keep your current topic or switch to the template's original topic.
Produce dialog showing template dropdown with options for all templates or this topic
Pick templates scoped to the current topic or scan your full template library.
Template selection modal highlighting the Use this template action
Apply the desired template to preload the message form instantly.
Confirmation dialog warning when template comes from a different topic
If the template belongs to another topic, confirm whether to stay on your current topic or switch to the template's topic before producing.

Consume Kafka messages

Use the Data Consumer workspace to tail records live, checkpoint what you receive, and copy interesting messages for replays or regression tests.

  1. Open Data Consumer, pick the cluster and topic you want to inspect, and optionally select partitions or a starting offset window.
  2. Click Start to begin streaming messages to the table. Press Stop any time to halt the flow and keep the records visible for review.
  3. Use Filter to narrow the stream by key, value, or headers, and apply the filters while the consumer keeps running.
  4. When you spot an interesting record, open the Headers pill to view every header/value pair, or use the row menu to Replay or Copy as template.
  5. Download the captured dataset with Export to CSV for offline debugging or attaching to a ticket.
Data Consumer screen showing topic selection before starting
Choose the topic, partitions, and starting position before you begin consuming.
Data Consumer filters dialog allowing key, value, and header filters
Apply filters to focus on specific keys, values, or header fields while the stream is live.
Data Consumer table listing live Kafka messages with Start, Stop, and Export buttons
Click Start to populate the table and Stop when you have enough messages; export the results to CSV at any time.
Modal displaying headers for the selected Kafka message
Drill into the Headers pill to inspect each header key/value captured on the message.
Row action menu offering replay or copy as template options
Reuse messages instantly via Replay or generate a producer template from the selected payload.
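Conceptually, the live filters keep only records whose key, value, or headers match the criteria you supplied. A small sketch of that matching logic over already-captured records (the record shape and function are illustrative, not KCM internals):

```python
def matches(record, key_contains="", value_contains="", header_equals=None):
    """True if the record satisfies every filter that was supplied."""
    if key_contains and key_contains not in (record.get("key") or ""):
        return False
    if value_contains and value_contains not in (record.get("value") or ""):
        return False
    if header_equals:
        name, expected = header_equals
        if record.get("headers", {}).get(name) != expected:
            return False
    return True

records = [
    {"key": "order-1", "value": '{"status":"CREATED"}', "headers": {"source": "web"}},
    {"key": "order-2", "value": '{"status":"FAILED"}', "headers": {"source": "batch"}},
]
failed = [r for r in records if matches(r, value_contains="FAILED")]
print(len(failed))  # 1
```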

Deploy a Kafka Connect connector

  1. Go to Kafka Connect > New Connector.
  2. Choose the connector class from the catalog.
  3. Paste or compose the JSON configuration.
  4. Validate with KCM's schema-aware checker and launch.
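Connector configurations are plain JSON, as accepted by the Kafka Connect REST API. A minimal example using the stock FileStreamSinkConnector (the connector name, topic, and file path are placeholders):

```json
{
  "name": "orders-file-sink",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
    "tasks.max": "1",
    "topics": "orders",
    "file": "/tmp/orders.out"
  }
}
```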

Audit changes

Browse the Audit section to view who changed what. Export audits to CSV or stream them to your SIEM.

5. CLI & automation

A lightweight CLI for scripting routine operations is on the product roadmap. The commands below preview the planned experience.

# List clusters
kcmctl clusters list

# Trigger diagnostics on a specific cluster
kcmctl diagnostics run --cluster prod-eu-1

# Export ACLs to YAML
kcmctl acls export --cluster prod-eu-1 --out acls.yaml

Authenticate with an API token via kcmctl login --token <your-token>. Tokens can be generated in the web console under Profile > API Tokens.

6. Troubleshooting

Health checks fail

Verify broker reachability from the KCM host (for example, telnet broker-host 9092 or nc -vz broker-host 9092) and confirm that the certificates presented by the brokers are trusted by KCM's truststore.

Connectors not visible

Ensure the Kafka Connect REST endpoint allows CORS from the KCM domain and that the service account has the Connector:Read permission.

"Unauthorized" errors

Review RBAC assignments under Admin > Access Control. Each module (Topics, ACLs, etc.) has granular permissions.

Log locations: Application logs live under logs/kcm-app.log (or Docker container stdout). Enable debug mode with KCM_LOG_LEVEL=debug for deeper tracing.

7. Need help?

We're here to help you succeed with Kafka. Reach us at support@kcmhub.io.