SecurityWeek

Enhancing Data Center Security Without Sacrificing Performance


Every data center cybersecurity team faces the same impossible equation: host-based agents consume the very CPU cycles that high-performance computing requires. For years, the industry has tried to balance this trade-off. The more security you implement, the more performance suffers; the more you prioritize performance, the greater the risk of blind spots.

For an example of such a blind spot, look no further than the gap between a virtual machine (VM) and its physical host. In March 2025, Broadcom patched a series of VMware ESXi zero-day vulnerabilities that allowed attackers to escape the VM sandbox entirely. In 2023, the ESXiArgs campaign affected an estimated 3,800 servers globally.

In both instances, a single compromise disabled or encrypted dozens of VMs simultaneously. Host-based agents were ineffective because the attack occurred in the hypervisor.

The solution is not optimization; it is reimagining the architecture by removing security from the host entirely. Data processing units (DPUs), installed on each server, provide this capability.

Executing security workloads on the DPU instead of the CPU frees host CPU and GPU cycles for the operations they were built to perform. Better still, because the DPU operates independently of the host OS, it is invisible and inaccessible to an attacker on the host.

The end result is tamper-proof security, enforced at line speed – without any negative performance impact.


Legacy Risks at a Modern Pace

Data centers have always been among the most challenging environments to secure. Physical servers host hypervisors. Hypervisors host VMs. VMs host containers. Each layer adds abstraction, and each abstraction introduces blind spots where assets go unmanaged and vulnerabilities remain undetected.

Misconfigurations compound over time. VMs get copied from outdated templates. Firewall rules accumulate exceptions that no one audits. Servers remain running for a project long completed because no one wants to risk an outage by decommissioning them.

Perimeter security is of little use in these environments. Firewalls and network security devices monitor north-south traffic (i.e., data flowing into and out of the data center). But the majority of traffic in a data center is east-west (i.e., the lateral movement between VMs).

Once an attacker breaches a single instance, perimeter defenses have no visibility into what follows. This is where dwell time accumulates and privilege escalation occurs, well beyond the reach of the traditional network perimeter.

AI data centers inherit all of these risks and then amplify them. Transient network flows exist for hours (or even just minutes) before disappearing entirely. VMs are created and terminated for individual tasks. Containers are orchestrated across nodes that redistribute resources in real time. These just-in-time assets materialize and vanish faster than any human operator or periodic scan can track.

When you consider that a single GPU cluster can represent millions of dollars in hardware and every percentage point of efficiency translates directly into a competitive advantage, installing host-based security agents does not make sense. Unfortunately, that means some operators quietly disable security on their most critical compute nodes and hope the perimeter holds. It doesn't add up.

A Blueprint for a Better Tomorrow

Shifting security from CPU-based agents to a DPU-based security architecture eliminates the security vs. productivity tradeoff by relocating the entire security stack onto dedicated silicon. The DPU functions as an embedded sensor in each server, streaming telemetry data and monitoring network traffic without any operational impact on the host.

The performance implications are significant. Continuous real-time monitoring on a DPU can operate faster than CPU-based approaches – and the speed is only half of the advantage. The separation between the DPU and the host enables zero trust security at the hardware level.

The DPU resides between the host and the network, treating both with zero trust. Every packet, every access request and every process is subject to inspection and policy enforcement. Even if the host operating system is somehow compromised, the hardware isolation of the DPU maintains control.
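The enforcement model described above amounts to default-deny filtering of every flow, applied off-host. As an illustrative sketch only (the tier names, rule format, and `enforce` function are assumptions for this example, not any vendor's API), the core logic looks like this:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Flow:
    """A simplified flow identity; real DPU rules match on far more fields."""
    src: str
    dst: str
    dst_port: int
    proto: str


# Hypothetical allow-list of permitted east-west paths. A DPU would hold
# these as compiled match/action rules applied at line rate.
ALLOWED = {
    ("web-tier", "app-tier", 8443, "tcp"),
    ("app-tier", "db-tier", 5432, "tcp"),
}


def enforce(flow: Flow) -> str:
    """Default-deny: forward a packet only if its flow matches an explicit rule."""
    key = (flow.src, flow.dst, flow.dst_port, flow.proto)
    return "forward" if key in ALLOWED else "drop"
```

The point of the sketch is the placement, not the rule syntax: because this check runs on the DPU rather than in the host OS, a compromised host cannot rewrite or bypass it, so a lateral move such as `app-tier` reaching back into `web-tier` over SSH is dropped regardless of the host's state.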

In terms of visibility, a DPU-based architecture enables continuous monitoring across physical and virtual infrastructure and across east-west (internal) traffic and north-south (external) traffic. Deep packet inspection analyzes traffic at the endpoint, eliminating bottlenecks to and from external appliances.

Simultaneously, privacy protections are built into the design. Information is only extracted from kernel-level structures and system metadata, not from user data or application-layer content. The net result is comprehensive visibility without exposing sensitive data.
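That metadata-only design can be pictured as a strict projection: every raw event is reduced to connection and process metadata before it is streamed off the server. A minimal sketch, assuming hypothetical field names and record schema (not any vendor's telemetry format):

```python
# Illustrative only: the field names below are assumptions for this sketch.
METADATA_FIELDS = ("src_ip", "dst_ip", "dst_port", "proto", "bytes", "pid")


def to_telemetry(event: dict) -> dict:
    """Project a raw event down to kernel-level and flow metadata.

    Payload and application-layer content are deliberately dropped, so the
    record can be streamed to collectors without exposing user data.
    """
    return {k: event[k] for k in METADATA_FIELDS if k in event}


raw_event = {
    "src_ip": "10.0.1.7", "dst_ip": "10.0.2.9", "dst_port": 5432,
    "proto": "tcp", "bytes": 4096, "pid": 3112,
    "payload": b"SELECT * FROM accounts;",  # sensitive content, never exported
}
record = to_telemetry(raw_event)  # payload is absent from the exported record
```

The design choice is that privacy is enforced structurally: fields not on the allow-list simply never exist in the exported record, rather than being redacted after the fact.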

Enabling Security and Performance

For two decades, data center security has been defined by an impossible equation: security or productivity. DPU-based security balances the equation. For AI data centers, where the stakes are the highest and performance constraints are the tightest, security and performance are no longer a zero-sum game.

Related: Cisco Patches Critical Vulnerability in Data Center Management Product


