In an ever-evolving AI landscape, maintaining secure, scalable experimentation environments is not just an advantage; it's a necessity. In a recent live webinar, OpenMetal's Todd Robinson sat down with Emrul Islam from Kasm Technologies to explore how container-based Virtual Desktop Infrastructure (VDI) and infrastructure flexibility can empower teams tackling everything from machine learning research to high-security operations.


Here’s a breakdown of the insightful discussion on where AI meets infrastructure, what’s changing in the VDI space, and how the combination of OpenMetal and Kasm Workspaces opens up new possibilities for secure, browser-native, containerized environments.


AI and Infrastructure: The OpenMetal Perspective

Todd opened the session by sharing OpenMetal’s view of the AI space: “We’re a hardware-centric company, helping customers make sense of the infrastructure behind inference and training workloads,” he explained.

From CPUs that handle light token generation to GPU-intensive clusters running on A100s, H100s, or H200s, OpenMetal helps customers right-size their AI environments without overcommitting resources.


From Military Origins to AI Enablement: Kasm’s Story

Kasm started as a DARPA-funded project for the U.S. military with one clear mission: enable secure collaboration. Over time, that goal evolved into what Kasm offers today—a browser-based, container-native VDI solution built on Docker that enables instant, isolated, and secure computing sessions. No bulky virtual machines. No bloated golden images. Just containers that spin up in seconds and stream directly to users’ browsers.

“The VDI world was getting clunky,” Emrul shared. “We rebuilt it from the ground up using Docker, which makes it lightweight, easy to manage, and ideal for modern DevOps integrations like Terraform and Ansible.”


Why Security and AI Go Hand-in-Hand

Although Kasm didn't start as an AI company, its focus on secure, isolated environments has made it a natural fit for AI use cases—especially in industries like healthcare and defense, where data sensitivity is paramount.

“One challenge in AI is allowing researchers to work with large, regulated data sets—without exposing that data to local environments,” Emrul noted. “With Kasm, users don’t have to download anything. It all stays within the container, which is streamed to the browser.”

This approach also solves a major operational pain point: managing multiple researchers, each with different dependencies, toolkits, and access levels. Kasm’s container-based model makes it easy to spin up highly customized environments for each user—all while keeping data secure and auditable.
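That per-user customization typically happens at the image layer. As a rough sketch, a team could extend one of Kasm's published base images with its own toolkit; the base image tag, package list, and image name below are illustrative assumptions, not a prescribed setup:

```shell
# Hypothetical: build a researcher-specific workspace image on a Kasm base.
cat > Dockerfile <<'EOF'
FROM kasmweb/core-ubuntu-jammy:1.15.0
USER root

# Toolchain for one research group; another group's image can differ freely.
RUN apt-get update && apt-get install -y python3-pip \
    && pip3 install --no-cache-dir jupyterlab torch

# Return to the unprivileged session user that Kasm images expect.
USER 1000
EOF

docker build -t my-org/ml-workspace:latest .
```

Because each group gets its own image, dependency conflicts between researchers disappear, and the data itself never needs to leave the container.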


Real-World AI Use Cases: Reinforcement Learning and More

Emrul demonstrated a classic reinforcement learning project—training a virtual cheetah to walk using visual inputs from a simulated environment. Unlike traditional notebook-based work, some AI tasks require visual feedback to debug or validate models.

“In this case, being able to see why the agent was getting stuck made all the difference,” Emrul explained. “Kasm lets you run visual, GPU-accelerated environments directly in the browser without installing anything locally.”

Whether it’s RL agents, LLM experimentation, or multisensory AI research, Kasm makes it possible to interact with AI models in secure, reproducible environments—on-demand.


Under the Hood: How Kasm Workspaces Work

At the core of Kasm’s platform are three architectural elements:

  • Control Plane: Manages user sessions and orchestrates container deployment.
  • Agent Servers: Docker hosts (VM or bare metal) where user containers are spun up.
  • Docker Images: Preconfigured environments with tools like Jupyter, PyTorch, TensorFlow, and even full desktops with Visual Studio, Chrome, and more.

When a user launches a workspace, Kasm spins up a container on a GPU-capable host, streams it to the user, and tears it down when no longer needed. The process takes seconds and can be fine-tuned to support different Python builds, packages, and models.
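Kasm also publishes standalone workspace images that can be launched directly with Docker, which is a quick way to see the streaming model in action. A minimal sketch, assuming a recent `kasmweb/desktop` image (the tag and password are placeholders):

```shell
# Launch a single Kasm workspace container (image tag is illustrative).
# --shm-size gives the in-session browser enough shared memory.
docker run --rm -it \
  --shm-size=512m \
  -p 6901:6901 \
  -e VNC_PW=password \
  kasmweb/desktop:1.15.0

# The desktop then streams to your browser at https://localhost:6901
# (default username: kasm_user). Stopping the container tears the
# session down, matching the ephemeral model described above.
```

In a full deployment, the control plane performs this orchestration automatically across the agent servers, including GPU scheduling and teardown.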

Kasm also supports Windows via RDP and includes advanced features like clipboard restrictions, VPN configurations, and even support for smart card authentication.


Scaling with OpenMetal: Infrastructure That Grows With You

The discussion naturally turned toward how OpenMetal and Kasm complement each other. While Kasm provides the secure, containerized environment, OpenMetal delivers the flexible, scalable bare metal or cloud-native infrastructure to run it.

“This is where it gets interesting,” Todd said. “Whether you’re running inference workloads on modern CPUs or training models on multi-GPU nodes, OpenMetal can provide the hardware behind your secure sessions.”

OpenMetal’s infrastructure supports both cloud-native growth and hybrid workloads, ideal for organizations reaching that “public cloud tipping point” where spend becomes unpredictable or unmanageable.


Observability, Privacy, and Compliance

Kasm is built with compliance in mind. Whether you're operating in an air-gapped environment, subject to healthcare regulations, or preparing for legal audits, the platform offers tools like:

  • Screen recording (optional)
  • Clipboard control
  • Per-session GPU and memory allocation
  • Data residency and storage isolation
  • Docker integration with existing Linux security tooling

Kasm even supports private LLM interactions—ensuring prompts and embeddings stay within your environment.


What’s Next for Kasm?

Emrul offered a glimpse into Kasm’s roadmap, shaped heavily by community feedback:

  • More automation tooling (Terraform, Pulumi, etc.)
  • Networking controls (per-session VPNs)
  • Multi-monitor streaming support
  • Pre-built AI workspaces like AnythingLLM, Easy Diffusion, and more
  • One-click installs via AWS Marketplace and other platforms

And with a growing ecosystem of home lab tinkerers, YouTube reviewers, and enterprise customers, Kasm is evolving quickly—powered by curiosity and a commitment to open standards.


Getting Started with Kasm

Kasm Workspaces is free for up to 5 concurrent users, making it ideal for testing, internal dev teams, or small-scale research projects. 👉 Visit kasmweb.com to get started.


Interested in Private GPU Servers and Clusters?

  • GPU Server Pricing: High-performance GPU hardware with detailed specs and transparent pricing.
  • Schedule a Consultation: Let's discuss your GPU or AI needs and tailor a solution that fits your goals.
  • Private AI Labs: $50k in credits to accelerate your AI project in a secure, private environment.

Explore More OpenMetal AI Content


With the new OpenMetal Private AI Labs program, you can access private GPU servers and clusters tailored for your AI projects. By joining, you’ll receive up to $50,000 in usage credits to test, build, and scale your AI workloads.

GPU Servers and Clusters are now available on OpenMetal—giving you dedicated access to enterprise-grade NVIDIA A100 and H100 GPUs on fully private, high-performance infrastructure.