NVIDIA Unveils BlueField-4 and Hardware Cyber Defense

29 October 2025

Check Point, Armis and Palo Alto Networks are collaborating on a hardware-level security layer for NVIDIA’s new DPU, which delivers 800 Gb/s throughput

At its GTC conference in Washington DC, NVIDIA introduced BlueField-4 — the latest generation in its Data Processing Unit (DPU) family, originally rooted in technologies developed by Israel’s Mellanox. A DPU is a specialized processor designed to manage data movement and operations inside modern data centers. It offloads infrastructure tasks — such as networking, storage and security — from the CPU, allowing servers to “think less about management and more about execution,” resulting in faster performance and better energy efficiency.

BlueField-4 represents the next evolutionary leap in NVIDIA’s DPU line. The chip supports data-transfer speeds of up to 800 Gb/s and is optimized for so-called “AI factories” — massive clusters that train and serve large-scale machine-learning models. It delivers roughly six times the processing power of BlueField-3 and is engineered to orchestrate architectures spanning thousands of servers and accelerators working in sync.

The new DPU combines three core elements: advanced ARM cores, dedicated acceleration engines for network and storage processing, and high-speed interfaces up to 800 Gb/s in Ethernet and InfiniBand modes. It integrates natively with NVIDIA’s ConnectX-9 network cards and the company’s DOCA software framework — an infrastructure SDK that lets developers build secure networking, storage and analytics services directly on hardware. The result is a unified control plane that handles data processing, security and communications in real time with minimal CPU involvement.

Compared with its predecessor, BlueField-4 offers not only double the bandwidth but also a fundamental architectural shift. Where BlueField-3 focused on accelerating network and storage traffic, BlueField-4 is built to serve as the operational core of large-scale AI infrastructure. It adds hardware-level security modules, dedicated encryption engines and Zero-Trust isolation between users and workloads in multi-tenant environments — ensuring that every bit of data, from packets to AI models in training, is verified and protected without compromising throughput.

A new security architecture: cyber defense moves into hardware

Security sits at the heart of the BlueField-4 launch, but NVIDIA isn’t just upgrading performance — it is redefining how data-center defense is architected. The company is moving cyber protection away from software layers that react after an incident, into the hardware fabric itself. With BlueField-4, security takes place inside the data path — in real time, as information flows through the system.

Rather than running security software on top of the operating system, NVIDIA enables those functions to run directly on the processor. Its DOCA framework lets cybersecurity vendors write micro-services that operate within the DPU itself — much like AI developers use CUDA to access GPU acceleration.
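To make the idea concrete, the toy sketch below shows the kind of inline, per-packet decision such a micro-service makes on the DPU's data path before traffic ever reaches the host. The packet structure and function names are hypothetical stand-ins chosen for illustration; they are not the actual DOCA API, and a real service would hook into the vendor's SDK rather than a main() loop.

/* Toy sketch of an inline packet-filtering micro-service of the kind a
 * security vendor might run on a DPU. All structs and function names are
 * hypothetical stand-ins for illustration; this is not the DOCA API. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Simplified view of a packet as the DPU data path might hand it over. */
struct pkt_view {
    uint32_t src_ip;   /* source IPv4 address (host byte order)      */
    uint32_t dst_ip;   /* destination IPv4 address (host byte order) */
    uint16_t dst_port; /* destination TCP/UDP port                   */
};

/* One allow-list rule: traffic to this destination and port is permitted. */
struct rule {
    uint32_t dst_ip;
    uint16_t dst_port;
};

/* Per-packet verdict. On a real DPU this decision happens in the data path,
 * before the host CPU or operating system ever sees the packet. */
static bool allow_packet(const struct pkt_view *p,
                         const struct rule *rules, size_t n_rules)
{
    for (size_t i = 0; i < n_rules; i++) {
        if (p->dst_ip == rules[i].dst_ip && p->dst_port == rules[i].dst_port)
            return true;
    }
    return false; /* default-deny: anything not explicitly allowed is dropped */
}

int main(void)
{
    const struct rule policy[] = { { 0x0A000005u, 443 } }; /* allow 10.0.0.5:443 */
    const struct pkt_view incoming = { 0xC0A80101u, 0x0A000005u, 443 };

    printf("verdict: %s\n",
           allow_packet(&incoming, policy, 1) ? "forward" : "drop");
    return 0;
}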

A lineup of leading security vendors — including Check Point Software Technologies, Palo Alto Networks, Fortinet, CrowdStrike and Armis — is already building dedicated applications for the new chip. These solutions are designed to detect attacks in real time, enable distributed hardware-level firewalls, isolate cloud containers and encrypt or filter traffic as it moves across the network.

In effect, the DPU acts as a “nervous system and immune response” for the data center — monitoring every packet, verifying identities, encrypting information and blocking intrusions before they reach the operating system or applications. It creates a distributed security mesh where hundreds or thousands of DPUs enforce policy autonomously instead of relying on a single central firewall.
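The sketch below is a rough, conceptual model of that distributed approach: each DPU keeps its own copy of the policy and renders verdicts locally, accepting versioned updates from a controller instead of forwarding traffic to a central firewall. The names and data structures are hypothetical and purely illustrative.

/* Conceptual sketch of the "distributed mesh" idea: each DPU holds its own
 * copy of the policy and enforces it locally, with no round trip to a
 * central firewall. Names and structures are hypothetical. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

#define MAX_RULES 64

/* One DPU's locally cached slice of the data-center policy. */
struct dpu_policy {
    uint64_t version;                  /* bumped on every controller push */
    uint16_t blocked_ports[MAX_RULES]; /* ports this DPU must block       */
    size_t   n_blocked;
};

/* Enforcement is purely local: the verdict never waits on a central box. */
static bool dpu_permits(const struct dpu_policy *pol, uint16_t dst_port)
{
    for (size_t i = 0; i < pol->n_blocked; i++) {
        if (pol->blocked_ports[i] == dst_port)
            return false;
    }
    return true;
}

/* When the controller pushes a newer policy, the DPU swaps it in and keeps
 * enforcing autonomously from then on. */
static void dpu_apply_update(struct dpu_policy *pol,
                             const struct dpu_policy *pushed)
{
    if (pushed->version > pol->version)
        *pol = *pushed;
}

int main(void)
{
    struct dpu_policy local = { .version = 1, .blocked_ports = { 23 }, .n_blocked = 1 };
    printf("telnet (port 23): %s\n", dpu_permits(&local, 23) ? "allow" : "block");

    const struct dpu_policy update = { .version = 2,
                                       .blocked_ports = { 23, 3389 },
                                       .n_blocked = 2 };
    dpu_apply_update(&local, &update);
    printf("rdp (port 3389):  %s\n", dpu_permits(&local, 3389) ? "allow" : "block");
    return 0;
}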

This marks a fundamental shift in cloud and AI security: protection is no longer an add-on at the edge but a native function of the hardware core. For NVIDIA, it’s also a way to make BlueField-4 a cornerstone of its data-center ecosystem — where the GPU does the compute, the CPU handles control, and the DPU becomes the gatekeeper of data itself.

Israeli WEKA introduces smart storage running directly on the DPU

Another Israeli player, WEKA IO, is embracing the DPU revolution with its new NeuralMesh storage architecture, built specifically for NVIDIA’s BlueField-4. The system leverages the chip’s massive 800 Gb/s bandwidth to run storage and data-management services directly on the DPU instead of on separate CPU servers. This approach reduces bottlenecks, shortens data-access latency and significantly improves energy efficiency across the data center.

By offloading tasks such as data-path optimization, encryption and multi-tenant isolation to the DPU, WEKA is eliminating traditional gaps between storage and compute. The result is AI infrastructure that runs faster, smarter and more securely — a clear sign that BlueField-4 is quickly becoming a central building block in the next generation of intelligent data centers.
